New Quest Demos Show Networked Hand Tracking
Two new Quest hand tracking demos show just how big the implications of hand and finger tracking could be for social VR.
When two people meet in the real world, they communicate on many levels at once: spoken words, facial expressions, gestures and body language. Only a small part of this richness can be recreated in virtual reality with current technology, since today's commercial VR headsets capture only head and hand movements.
With advances in hand tracking, another component of face-to-face interaction can now be brought into virtual reality and could soon become part of the social VR experience: hand gestures.
Hand Tracking Unlocking More Expression
The standalone virtual reality headset Oculus Quest recently began supporting hand and finger tracking as a standard feature. Oculus has been running a hand tracking beta since November 2019, and over that period developers on the platform have created a number of innovative demos for the Quest's hand and finger tracking capabilities.
Now, however, we are seeing the first experiments that feature hand tracking in a social VR context. The first comes from developer Daniel Beauchamp, who previously built a VR Jenga game that players control with their own hands, as well as a miniature version of Beat Saber in which the player's fingers become the lightsabers.
Beauchamp has now programmed a networked room where Oculus Quest users can meet with hand tracking enabled, and he published the result on Twitter.
Behold! Networked hand tracking!
Words can't describe how much this adds to social VR. pic.twitter.com/ioNJKfY3S0
— Daniel Beauchamp (@pushmatrix) June 9, 2020
The short demo video shows how gestures could enrich communication in social VR by giving users more expressive range.
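The basic idea behind networked hand tracking is straightforward: each client samples the pose of every hand joint each frame and replicates that data to the other participants, who use it to animate a remote avatar's hands. Beauchamp's demo was built natively for the Quest, so the sketch below is purely illustrative; it assumes the browser's WebXR Hand Input API (with WebXR type definitions available) and an already-open WebRTC data channel as the transport.

```typescript
// Illustrative sketch only: sample one hand's joint poses each XR frame
// and broadcast them over a WebRTC data channel so remote peers can
// replicate the hand. This is not the code used in the Quest demo.

function packHand(
  frame: XRFrame,
  hand: XRHand,
  refSpace: XRReferenceSpace
): Float32Array | null {
  // 25 joints x (3 position + 4 orientation) floats
  const out = new Float32Array(hand.size * 7);
  let i = 0;
  for (const jointSpace of hand.values()) {
    const pose = frame.getJointPose(jointSpace, refSpace);
    if (!pose) return null;               // hand lost tracking this frame
    const { position: p, orientation: q } = pose.transform;
    out.set([p.x, p.y, p.z, q.x, q.y, q.z, q.w], i);
    i += 7;
  }
  return out;
}

function onXRFrame(
  frame: XRFrame,
  refSpace: XRReferenceSpace,
  channel: RTCDataChannel                 // assumed to be already open
) {
  for (const source of frame.session.inputSources) {
    if (!source.hand) continue;           // controller, not a tracked hand
    const packed = packHand(frame, source.hand, refSpace);
    if (packed && channel.readyState === "open") {
      channel.send(packed.buffer);        // roughly 700 bytes per hand per frame
    }
  }
}
```

At roughly 25 joints per hand, the per-frame payload stays small enough that joint poses can be streamed at the headset's frame rate without any special compression.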
Touch in VR
The second hand tracking demo comes from developer Jorge Gonzalez and goes a step further. It showcases convincing hand interactions between two virtual reality users, such as clapping and shaking hands. Gonzalez's software simulates even subtle touches, enabling social interactions that would not be possible without hand tracking support.
https://twitter.com/jorgejgnz/status/1270679451140001792
The interactions on display, including the physical collision model, look so realistic that only on closer inspection do you notice that the person opposite is merely a mirror image of the VR user himself.
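Once both users' hand joints exist in the same coordinate space, touch interactions like a clap or handshake can be detected with simple proximity tests: hand tracking APIs typically expose a radius for each joint, so overlapping joint spheres can be treated as contact. The sketch below illustrates that general approach under those assumptions; it is not Gonzalez's actual collision code.

```typescript
// Hypothetical sketch: detect "touch" between two users' hands by testing
// their joint spheres for overlap. Joint positions and radii would come
// from each client's replicated hand tracking data.

interface Joint { x: number; y: number; z: number; radius: number; }

function handsTouching(local: Joint[], remote: Joint[]): boolean {
  for (const a of local) {
    for (const b of remote) {
      const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
      const dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
      if (dist <= a.radius + b.radius) return true;  // spheres overlap
    }
  }
  return false;
}

// Example: react once at the start of each contact event,
// e.g. to play a clap sound or trigger a visual cue.
let touching = false;
function onHandsUpdated(local: Joint[], remote: Joint[]) {
  const now = handsTouching(local, remote);
  if (now && !touching) console.log("contact started");
  touching = now;
}
```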
In a Reddit post, Gonzalez stated that he used freely available code to create the hand tracking experiment and confirmed that he is working on a multiplayer version of the demo. It is still unclear whether, or when, it will become available on SideQuest, the sideloading platform for the Oculus Quest.