Meta Introduces New High-Quality Interactions for Quest 2
Hand interactions are the Holy Grail for many virtual reality apps, but they are also complex to build. Now Meta is offering high-quality standard solutions.
During Connect 2021, Meta introduced the Presence Platform, a broad range of machine perception and AI capabilities for building more realistic mixed reality, interaction, and voice experiences that blend virtual content seamlessly into the user's physical world. Following last year's unveiling, Meta went on to release Spatial Anchors (experimental), the Voice SDK, and Passthrough. This week, the company added to its list of mixed reality capabilities with the Interaction SDK Experimental.
With the launch of the experimental version of the Interaction SDK, developers can now try out the new interface, although they cannot yet ship it in apps on the Oculus Store or App Lab. Its capabilities must be refined further before it can be used in user-facing apps; for now, it is experimentally available rather than generally available.
The Interaction SDK supports both optical hand tracking and Touch controllers. Its purpose is to give developers a base of high-quality hand interactions, saving time and resources that studios can instead invest in content development.
The Interaction SDK Experimental
The v37 SDK release introduces the Interaction SDK Experimental. It allows hand and controller interactions to be developed and integrated into virtual reality apps easily and seamlessly. Meta says this is made possible by interaction components that are robust, composable, and modular. The Interaction SDK provides more flexible interactions than a traditional interaction framework: you can use only the pieces you need and integrate them into your existing architecture.
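As a rough illustration of what "composable and modular" interaction components mean in practice, the hypothetical Python sketch below wires a small interactor/interactable pair together. The names, types, and structure are assumptions made for this sketch only, not the Interaction SDK's actual API, which is delivered as engine components.

```python
# Illustrative sketch only: small, independent pieces (an interactor per
# hand or controller, an interactable per object) that an app wires
# together where needed. Not the real Interaction SDK API.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Interactable:
    """Something in the scene that can respond to hand or controller input."""
    name: str
    on_select: Callable[[], None] = lambda: None


@dataclass
class Interactor:
    """One input source, e.g. a tracked hand or a Touch controller."""
    hovered: Optional[Interactable] = None

    def select(self) -> None:
        # Modular by design: an app can run grab-style and poke-style
        # interactors side by side against the same interactables,
        # using only the pieces it needs.
        if self.hovered is not None:
            self.hovered.on_select()


# Wiring a component into an existing app architecture:
lever = Interactable("lever", on_select=lambda: print("lever pulled"))
right_hand = Interactor(hovered=lever)
right_hand.select()  # prints "lever pulled"
```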
The Interaction SDK Experimental offers the following capabilities:
- Grab, Resize, and Throw: Objects in VR can be grabbed, resized, thrown, or passed from one hand to the other. Objects can even be summoned and then grabbed at a distance. The grab capability also works on world-constrained objects such as levers.
- Hand Grab: The Interaction SDK can make hands conform to a virtual object. The toolkit lets developers create these poses easily, a task that is often labor-intensive.
- Pose Detection: Developers can create custom gestures from per-finger information such as flexion and curl, enabling multiple gestures (see the sketch after this list).
- Direct Touch: Developers can build virtual surfaces with buttons that feel good to use. The SDK provides button poke mechanics that are resilient to both missed and false activations. Buttons can also simulate physicality and haptics via "touch limiting", which prevents your virtual hands from passing through the buttons and so enhances realism. There is also a scrolling gesture with which developers can create sophisticated 2D UIs, including on curved surfaces.
- Targeting and Selection: Developers can implement the same targeting and selection seen in the system UI.
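To make the pose-detection idea concrete, here is a minimal, hypothetical sketch of classifying a custom gesture from per-finger curl values. The HandState layout, the thresholds, and the function names are assumptions for illustration only; the actual SDK exposes pose detection through its own components.

```python
# Illustrative only: classify a "thumbs up" gesture from normalized
# per-finger curl values. Thresholds and data layout are assumptions.
from dataclasses import dataclass


@dataclass
class HandState:
    # Normalized curl per finger: 0.0 = fully extended, 1.0 = fully curled.
    thumb: float
    index: float
    middle: float
    ring: float
    pinky: float


def is_thumbs_up(hand: HandState, curled: float = 0.7, extended: float = 0.3) -> bool:
    """Thumb extended while the four other fingers are curled."""
    return hand.thumb < extended and all(
        c > curled for c in (hand.index, hand.middle, hand.ring, hand.pinky)
    )


print(is_thumbs_up(HandState(thumb=0.1, index=0.9, middle=0.85, ring=0.9, pinky=0.8)))  # True
```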
A Versatile Tool for Hand Interactions
The SDK can be used flexibly. Developers can pick and choose the hand interactions they need for their virtual reality apps.
Among other things, the software simulates the throwing, grabbing, and passing of digital objects. With a single tool, developers can create the natural-looking hand and finger positions that arise when gripping objects, and with less effort than before.
The software also supports gesture recognition and simulates interactions with keys, virtual knobs, and user interfaces. Controllers and hands can now also be used for precise targeting and for selecting controls.
Meta tested the SDK with a number of studios, which implemented the hand interactions in their virtual reality games. Their feedback flowed into further development and improvement of the software. Partners included the studios behind ForeVR Darts, Chess Club VR, and Finger Gun.
Tracked Keyboard SDK
The Tracked Keyboard SDK is launching at the same time as the Interaction SDK. Unlike the Interaction SDK, however, it is generally available, so developers can already implement it in their store apps.
This interface integrates Meta's keyboard tracking, which visually captures a physical keyboard and brings it into virtual reality as a digital copy. The user's hands are captured and rendered as well. It currently supports the Logitech K830 and Apple's Magic Keyboard; keystrokes are transmitted via Bluetooth.
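As a very simplified, hypothetical sketch of the two paths described above (visual tracking for the digital copy, Bluetooth for the keystrokes), the Python below uses stand-in types; none of it is the Tracked Keyboard SDK's real API.

```python
# Illustrative stand-ins only: a tracked pose drives rendering of the
# keyboard's digital copy, while keystrokes arrive over ordinary Bluetooth.
from dataclasses import dataclass


@dataclass
class Pose:
    position: tuple[float, float, float]          # meters, world space
    rotation: tuple[float, float, float, float]   # quaternion (x, y, z, w)


@dataclass
class TrackedKeyboard:
    model: str        # e.g. "Logitech K830" or "Apple Magic Keyboard"
    pose: Pose        # where the cameras currently see the physical keyboard
    is_tracked: bool  # whether tracking has a fix this frame


def render_frame(keyboard: TrackedKeyboard) -> None:
    # Visual path: draw the digital copy only while the keyboard is tracked.
    if keyboard.is_tracked:
        print(f"draw {keyboard.model} model at {keyboard.pose.position}")


def on_keystroke(char: str, buffer: list) -> None:
    # Input path: keystrokes come over Bluetooth, independent of tracking.
    buffer.append(char)


typed = []
render_frame(TrackedKeyboard("Logitech K830", Pose((0.0, 0.9, -0.4), (0, 0, 0, 1)), True))
on_keystroke("h", typed)
on_keystroke("i", typed)
print("".join(typed))  # -> "hi"
```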
Keyboard tracking is designed to make working in virtual reality much easier and will primarily find applications in productivity apps. Here too, Meta tested in advance with developers, in this instance with vSpatial, which has been incorporating tracked keyboards into its productivity app.
Meta Quest 2 Now Has A Bevy of New Interfaces
Both the Interaction SDK and the Tracked Keyboard SDK are part of the Presence Platform, the group of new mixed reality and interface SDKs unveiled during Connect 2021.
Below is an overview of the Presence Platform elements announced so far, along with their status. "Generally available" means the interface is ready for integration into store apps.
- Insight SDK: For mixed reality apps. It consists of the Passthrough API (generally available) for using passthrough mode; Spatial Anchors (experimentally available) for persistently anchoring digital objects in physical space; and Scene Understanding (not yet available) for machine recognition and classification of physical objects.
- Voice SDK: This is for voice control. It is generally available.
- Interaction SDK: For improved hand interactions. It is experimentally available.
- Tracked Keyboard SDK: For the tracking of keyboards. It is generally available.
Several studios have already integrated some of the SDKs into their pre-releases. Read more on the Oculus blog.