Eye Tracking

Scaling icons and UI elements dynamically based on where the user is looking.
When the user looks at an icon or element, the system scales it up slightly to signal that it is in focus, making it easier to select and interact with. This creates a more intuitive experience, as the user's gaze naturally guides the interface.
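Below is a minimal sketch of how this gaze-driven scaling could be wired up in Unity. It assumes a `gazeSource` transform that the eye-tracking provider rotates toward the gaze direction (for example via Meta's OVREyeGaze component) and icons with colliders; the class and field names are illustrative, not the actual project code.

```csharp
using UnityEngine;

// Illustrative sketch: scales whichever icon the gaze ray currently hits.
public class GazeIconScaler : MonoBehaviour
{
    [SerializeField] Transform gazeSource;      // transform driven by the eye-tracking provider (assumed)
    [SerializeField] float focusScale = 1.2f;   // slight scale-up to signal focus
    [SerializeField] float lerpSpeed = 8f;      // smoothing for the scale transition
    [SerializeField] float maxDistance = 10f;

    Transform focused;
    Vector3 focusedOriginalScale;

    void Update()
    {
        Transform hitIcon = null;
        if (Physics.Raycast(gazeSource.position, gazeSource.forward, out RaycastHit hit, maxDistance))
            hitIcon = hit.transform;

        // Restore the previously focused icon when the gaze moves away from it.
        if (focused != null && focused != hitIcon)
            focused.localScale = focusedOriginalScale;

        if (hitIcon != null && hitIcon != focused)
        {
            focused = hitIcon;
            focusedOriginalScale = hitIcon.localScale;
        }
        else if (hitIcon == null)
        {
            focused = null;
        }

        // Smoothly scale the focused icon up rather than snapping it.
        if (focused != null)
        {
            Vector3 target = focusedOriginalScale * focusScale;
            focused.localScale = Vector3.Lerp(focused.localScale, target, Time.deltaTime * lerpSpeed);
        }
    }
}
```

Lerping toward the target scale keeps the feedback subtle instead of popping the icon to full size the moment the gaze lands on it.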
On the left, the two green spheres represent the user's eyes and indicate which objects are currently in focus based on gaze direction.
On the right, the user selects the object they are looking at by using hand tracking and pinching to confirm the selection.
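The pinch confirmation could be detected with XR Hands by measuring the distance between the thumb tip and index tip joints, as in the sketch below; the subsystem lookup, threshold, and `ConfirmSelection` hook are assumptions rather than the project's actual code.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative pinch check: a pinch is registered when the right hand's
// thumb tip and index tip come within `pinchThreshold` of each other.
public class PinchSelector : MonoBehaviour
{
    [SerializeField] float pinchThreshold = 0.02f; // metres; assumed value
    XRHandSubsystem hands;
    bool wasPinching;

    void Update()
    {
        // Lazily grab the running hand subsystem provided by the XR plugin.
        if (hands == null || !hands.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            hands = subsystems.Count > 0 ? subsystems[0] : null;
            if (hands == null) return;
        }

        XRHand hand = hands.rightHand;
        if (!hand.isTracked) return;

        if (!hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out Pose thumb) ||
            !hand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out Pose index))
            return;

        bool isPinching = Vector3.Distance(thumb.position, index.position) < pinchThreshold;

        // Fire the selection once, on the pinch-down edge.
        if (isPinching && !wasPinching)
            ConfirmSelection();
        wasPinching = isPinching;
    }

    void ConfirmSelection()
    {
        // Hypothetical hook: select whatever object the gaze currently has in focus.
        Debug.Log("Pinch detected: confirm selection of the gazed object.");
    }
}
```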
This demo features eye tracking implemented with Meta's VR SDK. The icons were designed in Blender, and the project was developed in Unity.
Eye tracking highlights specific app icons based on the user’s gaze, while hand gestures enable swiping to the next set of icons.
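The swipe itself could be approximated by watching the palm's lateral velocity, estimated here by differencing its position between frames; the `palm` reference, speed threshold, and paging hook are illustrative assumptions, not the project's implementation.

```csharp
using UnityEngine;

// Illustrative swipe detection: pages the icon grid when the palm's
// lateral speed exceeds a threshold. `palm` is assumed to be a transform
// driven by the hand-tracking rig.
public class SwipePager : MonoBehaviour
{
    [SerializeField] Transform palm;
    [SerializeField] float swipeSpeed = 1.0f; // m/s along the lateral (x) axis; assumed value
    [SerializeField] float cooldown = 0.5f;   // seconds between page turns

    Vector3 lastPosition;
    float lastSwipeTime;

    void Start()
    {
        lastPosition = palm.position; // avoid a false swipe on the first frame
    }

    void Update()
    {
        float lateralVelocity = (palm.position.x - lastPosition.x) / Time.deltaTime;
        lastPosition = palm.position;

        if (Time.time - lastSwipeTime < cooldown || Mathf.Abs(lateralVelocity) < swipeSpeed)
            return;

        lastSwipeTime = Time.time;
        // Hypothetical hook: advance or rewind the icon grid.
        Debug.Log($"Swipe detected, paging icons in direction {Mathf.Sign(lateralVelocity)}.");
    }
}
```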
HandPalmMenu

In this project, we use an upward-facing palm to bring up a set of icons for viewing applications in 3D space, aiming to transform traditional 2D app windows into a spatially aware, immersive 3D interface.
Developed in Unity with the XR Interaction Toolkit, this project leverages XR Hands to capture data such as the position, orientation, and velocity of key points on the user’s hand. The hand shape describes the overall posture, focusing on individual finger configurations, while the hand pose adds orientation conditions to the hand shape, specifying how the hand must be positioned relative to the user.
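As a simplified stand-in for the authored hand shape and hand pose assets, the orientation condition for an upward-facing palm could be checked directly against the palm joint's pose, as sketched below; the axis convention, threshold, and menu hook are assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Simplified stand-in for the hand shape/pose assets: treats the left hand
// as "palm up" when the palm faces roughly toward world up.
public class PalmUpMenu : MonoBehaviour
{
    [SerializeField] GameObject palmMenu;                // the icon menu to show (assumed reference)
    [SerializeField, Range(0f, 1f)] float upDot = 0.7f;  // how strictly "up" the palm must face

    XRHandSubsystem hands;

    void Update()
    {
        if (hands == null || !hands.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            hands = subsystems.Count > 0 ? subsystems[0] : null;
            if (hands == null) return;
        }

        XRHand hand = hands.leftHand;
        bool palmUp = false;

        if (hand.isTracked &&
            hand.GetJoint(XRHandJointID.Palm).TryGetPose(out Pose palm))
        {
            // Assumes the OpenXR joint convention where the palm joint's +Y points
            // out of the back of the hand, so -Y roughly aligns with world up when
            // the palm faces up; flip the sign if your provider reports it differently.
            // Joint poses are in XR origin space, so this also assumes an upright XR Origin.
            palmUp = Vector3.Dot(palm.rotation * Vector3.down, Vector3.up) > upDot;
        }

        palmMenu.SetActive(palmUp);
    }
}
```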
The icons I selected are based on the ones I use frequently.
Here are the key features and workflows implemented in the project. 
The final demo is shown below.
Dynamic Hand UI

In this project, menus are created dynamically using both hands in a T-pose. When both hands are placed close together, a small menu appears, which can be resized as long as the pose is maintained. Once the UI is created, the user can move it around in 3D space, as each UI element has a holder that can be grabbed and repositioned. If the UI obstructs the view in 3D space, it can be minimized to the hand.
The project uses XR Hands, along with hand poses and hand shapes, to enable these interactions.
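A rough sketch of the two-hand spawn-and-resize behaviour is shown below. It assumes `leftPalm` and `rightPalm` transforms driven by hand tracking and a `poseHeld` flag set by the XR Hands gesture components; the prefab, distances, and scaling factor are illustrative.

```csharp
using UnityEngine;

// Illustrative two-hand menu: when both palms come close together while the
// required pose is held, a menu prefab is spawned between them and scaled
// with the distance between the hands.
public class TwoHandMenuSpawner : MonoBehaviour
{
    [SerializeField] Transform leftPalm;
    [SerializeField] Transform rightPalm;
    [SerializeField] GameObject menuPrefab;
    [SerializeField] float spawnDistance = 0.15f; // metres between palms to trigger the spawn
    [SerializeField] float scalePerMetre = 2f;    // how fast the menu grows as the hands separate

    public bool poseHeld;   // set externally by the gesture system while the T-pose is held
    GameObject activeMenu;

    void Update()
    {
        float handDistance = Vector3.Distance(leftPalm.position, rightPalm.position);
        Vector3 midpoint = (leftPalm.position + rightPalm.position) * 0.5f;

        // Spawn a small menu when the pose is held and the hands are close together.
        if (activeMenu == null && poseHeld && handDistance < spawnDistance)
            activeMenu = Instantiate(menuPrefab, midpoint, Quaternion.identity);

        // While the pose is maintained, resize the menu with the hand separation.
        if (activeMenu != null && poseHeld)
        {
            activeMenu.transform.position = midpoint;
            activeMenu.transform.localScale = Vector3.one * Mathf.Max(0.1f, handDistance * scalePerMetre);
        }
    }
}
```

Scaling from the hand separation keeps the resize gesture continuous, and releasing the pose simply freezes the menu at its current size, matching the behaviour described above.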
As shown in the examples above, the canvas can be created and then minimized by tapping the minimize button, either directly with the hands or by using near-far interaction to select it.
Windows can be stored on either hand, enabling users to organize them in a left/right-hand configuration, such as placing frequently used UI panels on the left hand for right-handed users.
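A sketch of how minimizing a window to a chosen hand could work: the minimize button calls `ToggleMinimized()`, which parents the window to a per-hand anchor and shrinks it, then restores it on the next press. The anchor transforms, scale factor, and handedness flag are assumptions.

```csharp
using UnityEngine;

// Illustrative minimize/restore behaviour for a window in the dynamic hand UI.
public class WindowMinimizer : MonoBehaviour
{
    [SerializeField] Transform leftHandAnchor;   // anchor attached to the tracked left hand (assumed)
    [SerializeField] Transform rightHandAnchor;  // anchor attached to the tracked right hand (assumed)
    [SerializeField] bool storeOnLeftHand = true; // e.g. left side for right-handed users
    [SerializeField] float minimizedScale = 0.2f;

    Transform originalParent;
    Vector3 originalPosition;
    Quaternion originalRotation;
    Vector3 originalScale;
    bool minimized;

    // Wire this to the minimize button's OnClick event.
    public void ToggleMinimized()
    {
        if (!minimized)
        {
            // Remember where the window lived in the scene.
            originalParent = transform.parent;
            originalPosition = transform.position;
            originalRotation = transform.rotation;
            originalScale = transform.localScale;

            // Park the shrunken window on the chosen hand.
            Transform anchor = storeOnLeftHand ? leftHandAnchor : rightHandAnchor;
            transform.SetParent(anchor, worldPositionStays: false);
            transform.localPosition = Vector3.zero;
            transform.localRotation = Quaternion.identity;
            transform.localScale = originalScale * minimizedScale;
        }
        else
        {
            // Restore the window to its previous place and size in 3D space.
            transform.SetParent(originalParent, worldPositionStays: true);
            transform.SetPositionAndRotation(originalPosition, originalRotation);
            transform.localScale = originalScale;
        }
        minimized = !minimized;
    }
}
```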

The final demo is shown below.