Adam Varga showcased some hand-tracking-based UI concepts running in real time on Quest Pro.
Product Designer and AR/VR Developer Adam Varga, also known as dmvrg, has recently shared a jaw-dropping demo of augmented reality (AR) user interfaces driven by hand interactions, all running in real time.
Built in Unity, the prototypes run in real time on a standalone Meta Quest Pro headset. As shown in the demo, the collection features four different concepts, including a "mid-air slider gesture for smart home app, 3D space input for rapid image and video editing, surface touch gestures for AR desktop productivity apps, and 3D 'magnifying glass' feature to reveal the spatial aspect of 2D data".
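To give a sense of how the first concept might work under the hood, here is a minimal, hypothetical sketch (not code from Varga's demo): a mid-air slider can be driven by mapping the hand's horizontal displacement since the pinch began onto a normalized 0–1 value. The function name, the assumed 20 cm slider travel, and the coordinate convention are all illustrative assumptions.

```python
# Illustrative sketch only -- not from the actual demo. Assumes hand
# positions are given in meters along a single horizontal axis.

def slider_value(start_x: float, current_x: float,
                 range_m: float = 0.2, initial: float = 0.5) -> float:
    """Map hand displacement since pinch start to a 0..1 slider value.

    start_x:   hand x-position when the pinch gesture began
    current_x: current hand x-position
    range_m:   assumed full-scale travel of the virtual slider (20 cm)
    initial:   slider value at the moment the pinch began
    """
    delta = (current_x - start_x) / range_m
    # Clamp so the slider stops at its ends no matter how far the hand moves.
    return min(1.0, max(0.0, initial + delta))

# Dragging 10 cm to the right from the midpoint fills the slider:
print(slider_value(0.0, 0.10))  # → 1.0
```

In a real headset app the same mapping would be fed by the hand-tracking runtime's joint poses each frame, with the pinch gesture acting as the "grab" and "release" events.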
"These are just some examples, barely scratching the surface of the possibilities," commented the developer. "I'm very excited about the upcoming hand-tracking-based interfaces in Spatial Computing (AR/VR) and I hope more designers and developers will explore this radically new and fascinating paradigm."
If you would like to learn more about using Unity for AR/VR projects, here are some great tutorials that might help you out: