Developer Showcases a Smooth Transition from Hand to Controller Tracking in Quest
Hand tracking is set to become a standard feature on Oculus Quest. A developer now shows how the switch between hands and controllers could work more smoothly.
Oculus released its experimental hand tracking update for Oculus Quest in 2019. The update turns users' hands into an input device as soon as they enter the field of view of the headset's cameras. Hand interactions still suffer from slight delays, but the feature is sufficient for operating menus. It is not yet as refined as early experimental demos suggested, however, so you still cannot use your hands to flawlessly play piano in virtual reality, for instance.
From a technical standpoint, integrating hand tracking into a standalone headset is a major achievement: real-time hand tracking now runs entirely on the smartphone-class chip inside the Oculus Quest. This efficiency was made possible by AI-based computer vision.
However, users still have to switch between hand tracking and controller tracking in the menu. Instead of simply using the player's hands as the default input and the controllers as an advanced VR control tool, users must decide in advance whether they wish to navigate the virtual reality menu with their hands or with a Quest controller. In practice, users will usually opt for the controllers, since they will have to hold them in their hands anyway.
Developer Optimizes Oculus Quest Hand Tracking with Microsoft's Toolkit
Mixed reality developer Eric Provencher has now shown, using Microsoft's Mixed Reality Toolkit (MRTK) for Unity, how this could be done more elegantly. He adapted the HoloLens 2 hand interaction demo for Oculus Quest and integrated a smooth runtime transition from hand tracking to controller-based input.
Improved my fork of #MRTK for Quest.
Now with support for swapping hands for controllers at runtime! Toggling for hands from the system settings also works!
Also added a few other bug fixes, including correct finger tip cursor rotation and much better feeling pinching. pic.twitter.com/h33QAhhwoV
— Eric prvncher (@prvncher) December 31, 2019
In the demo, switching from hand tracking to the controllers only requires pressing a button on the controller. To return to hand control, the familiar detour through the Oculus system menu is still necessary. The ideal solution would be one where hands and controllers are tracked simultaneously at all times, which would go a long way toward enabling new use-case scenarios.
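To illustrate how such a runtime swap could look in Unity, here is a minimal sketch assuming the Oculus Integration package (`OVRInput`), which reports hands as the active controller type while bare-hand tracking is in use. The field names `handVisuals` and `controllerVisuals` are placeholders for this illustration, not part of Provencher's fork.

```csharp
using UnityEngine;

// Minimal sketch: toggle between hand and controller rigs at runtime,
// assuming the Oculus Integration for Unity is installed.
public class InputModeSwitcher : MonoBehaviour
{
    public GameObject handVisuals;        // hypothetical root of the tracked-hand rig
    public GameObject controllerVisuals;  // hypothetical root of the controller rig

    void Update()
    {
        // OVRInput reports the currently active input method;
        // Controller.Hands is active while the headset tracks bare hands.
        bool handsActive =
            OVRInput.GetActiveController() == OVRInput.Controller.Hands;

        handVisuals.SetActive(handsActive);
        controllerVisuals.SetActive(!handsActive);
    }
}
```

Polling the active controller each frame means the app follows whatever the system reports, so picking up or setting down the controllers switches the visible rig without a trip through the menu.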
To try the demo, you can download and install it for free from the SideQuest software platform. SideQuest also hosts some of the first experimental hand tracking apps for Oculus Quest.
In a second demo, Provencher augments the Oculus hand models with menus that appear as soon as the user turns a palm upward. The Mixed Reality Toolkit is available on GitHub.
– Fixed palm rotations to match what HoloLens 2 provides
– This makes hand menus work + added a nifty flip behavior to the hand menu
– Make use of Oculus hand rays, which are much more stable pic.twitter.com/QJmKQyEEZT
— Eric prvncher (@prvncher) December 30, 2019