Unity-Powered UI Concepts in Augmented Reality

Adam Varga showcased some hand-tracking-based UI concepts running in real time on Quest Pro.

Product Designer and AR/VR Developer Adam Varga, also known as dmvrg, has recently shared a jaw-dropping demo showcasing some augmented reality (AR) user interfaces with hand interactions running in real time.

Leveraging Unity, the developer prototyped some outstanding hand-tracking-based UI concepts, running them in real time on a standalone Meta Quest Pro headset. As shown in the demo, the collection features four different concepts, including a "mid-air slider gesture for smart home app, 3D space input for rapid image and video editing, surface touch gestures for AR desktop productivity apps, and 3D 'magnifying glass' feature to reveal the spatial aspect of 2D data".
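For a sense of how one of these interactions might be assembled in Unity, below is a minimal sketch of a mid-air pinch slider built on OVRHand, the hand-tracking component from Meta's Oculus Integration package. The component name MidAirPinchSlider, the drag-to-range mapping, and all tuning values are illustrative assumptions rather than Adam's actual implementation.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical mid-air slider: pinch with the index finger and drag the hand
// sideways to change a normalized 0..1 value (e.g. a smart-home brightness dial).
// Relies on OVRHand from Meta's Oculus Integration; names and tuning are illustrative.
public class MidAirPinchSlider : MonoBehaviour
{
    [SerializeField] private OVRHand hand;                      // hand-tracking component on the OVR hand prefab
    [SerializeField] private float metersForFullRange = 0.3f;   // hand travel mapped to the full 0..1 range
    [Range(0f, 1f)] public float Value = 0.5f;
    public UnityEvent<float> OnValueChanged;                    // hook up lights, volume, etc. in the Inspector

    private bool dragging;
    private Vector3 pinchStartPosition;
    private float valueAtPinchStart;

    private void Update()
    {
        if (hand == null || !hand.IsTracked)
        {
            dragging = false;
            return;
        }

        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        Vector3 handPosition = hand.transform.position;

        if (pinching && !dragging)
        {
            // Pinch started: remember where the drag began.
            dragging = true;
            pinchStartPosition = handPosition;
            valueAtPinchStart = Value;
        }
        else if (pinching && dragging)
        {
            // Map horizontal hand travel (relative to the user's view) onto the slider value.
            Vector3 delta = handPosition - pinchStartPosition;
            float sideways = Vector3.Dot(delta, Camera.main.transform.right);
            float newValue = Mathf.Clamp01(valueAtPinchStart + sideways / metersForFullRange);
            if (!Mathf.Approximately(newValue, Value))
            {
                Value = newValue;
                OnValueChanged?.Invoke(Value);
            }
        }
        else
        {
            dragging = false;
        }
    }
}
```

Measuring the drag relative to where the pinch began, rather than using the absolute hand position, keeps the value from jumping when the gesture starts, and the same pinch-and-drag pattern extends naturally to 3D inputs like the image-editing and "magnifying glass" concepts.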

"These are just some examples, barely scratching the surface of the possibilities," commented the developer. "I'm very excited about the upcoming hand-tracking-based interfaces in Spatial Computing (AR/VR) and I hope more designers and developers will explore this radically new and fascinating paradigm."

And here are some of Adam's earlier works; we highly encourage you to visit the artist's LinkedIn and Twitter pages to see more:

If you would like to learn more about using Unity for AR/VR projects, here are some great tutorials that might help you out:

Don't forget to join our 80 Level Talent platform and our Telegram channel, follow us on Instagram, Twitter, and LinkedIn, where we share breakdowns, the latest news, awesome artworks, and more.
