Eye tracking makes it possible to improve XR experiences by making interactions gaze-aware, which provides:
Hover: basic hover enter and exit events driven by where you look
Gaze selection and deselection: activated by how long you look at an object and fully configurable
Gaze assist: creates a volume around the target object for easy selection and is also fully configurable
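As a rough sketch, these three options map onto per-interactable settings in the XR Interaction Toolkit (XRI 2.3 or newer); the property names below reflect my reading of the XRI API and may differ between toolkit versions:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: configure an interactable to respond to the XRGazeInteractor.
// Assumes XRI 2.3+; property names may vary between versions.
public class GazeConfigurator : MonoBehaviour
{
    void Awake()
    {
        var interactable = GetComponent<XRBaseInteractable>();

        // Hover: gaze hover enter/exit events fire once this is enabled.
        interactable.allowGazeInteraction = true;

        // Gaze selection: select after dwelling on the object long enough.
        interactable.allowGazeSelect = true;
        interactable.overrideGazeTimeToSelect = true;
        interactable.gazeTimeToSelect = 1.5f; // seconds of dwell time

        // Gaze assist: snap volume around the target for easier selection.
        interactable.allowGazeAssistance = true;
    }
}
```

The same settings can also be tweaked directly in the Inspector on any XRBaseInteractable component.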
Useful resources for VR eye tracking:
VR eye tracking tutorial with motion SDK:
https://youtu.be/ZoySn7QlMfQ
VR tutorial for eye tracking with the XR Interaction Toolkit:
https://youtu.be/4ZgI5QhyO4Y
Unity's XR Interaction Toolkit eye-tracking subsystems are built on OpenXR, so many platforms can adhere to the same implementation, making the feature broadly portable. Note that Oculus did not follow the standard OpenXR path here and created its own custom OpenXR backend.
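On platforms that do expose the OpenXR eye-gaze extension, the raw gaze pose can be sampled through Unity's Input System once the Eye Gaze Interaction feature is enabled under the OpenXR settings. A hedged sketch; the binding paths come from the OpenXR plugin's eye-gaze device layout and may differ across plugin versions:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: sample the OpenXR eye-gaze pose each frame.
// Assumes the "Eye Gaze Interaction" feature is enabled in
// Project Settings > XR Plug-in Management > OpenXR.
public class EyeGazeReader : MonoBehaviour
{
    InputAction gazePosition;
    InputAction gazeRotation;

    void OnEnable()
    {
        // Binding paths follow the OpenXR plugin's <EyeGaze> device layout.
        gazePosition = new InputAction(binding: "<EyeGaze>/pose/position");
        gazeRotation = new InputAction(binding: "<EyeGaze>/pose/rotation");
        gazePosition.Enable();
        gazeRotation.Enable();
    }

    void Update()
    {
        var origin = gazePosition.ReadValue<Vector3>();
        var rotation = gazeRotation.ReadValue<Quaternion>();

        // Visualize the gaze ray in the Scene view for debugging.
        Debug.DrawRay(origin, rotation * Vector3.forward, Color.green);
    }

    void OnDisable()
    {
        gazePosition.Disable();
        gazeRotation.Disable();
    }
}
```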
Additional XR development training is available at https://learnxr.io/xr-courses and we are currently offering an early registration pre-sale ending March 13!
Support me by subscribing to avoid missing future videos!
https://www.youtube.com/@dilmerv
Consider becoming a patron today:
https://www.patreon.com/dilmerv and get access to my "Full Source Code" tier
What do you get from Patreon?
Access to the GitHub repos that store all the code I work on for each video
Access to a special Patreon Discord group where I can answer your questions
Get XR and game development tips from me on Twitter
https://www.twitter.com/dilmerv
Learn and get my XR training from:
https://www.learnxr.io
My blog/newsletter (Subscribe to stay up to date with XR news)
https://blog.learnxr.io
#xr #metaverse #unity #openxr
If you find this video helpful, please share it with your friends and family.