Apple Unveils ARKit 3 Which Features Human Occlusion, Body Tracking and Motion Capture
Lost amid the announcements and reveals of the WWDC keynote was Apple's launch of its ARKit 3 augmented reality platform, which promises to elevate AR experiences to a new frontier of immersion. The new platform includes plenty of exciting new features, such as Motion Capture, which captures human motion that can then be injected into the augmented reality experience, and People Occlusion, which enables users to layer digital content in front of or behind people.
The ARKit 3 release also includes functionality such as multiple face tracking and simultaneous use of the front and back cameras, among other features. Some of these new augmented reality features were showcased on stage during the WWDC keynote.
People First AR Experience
People occlusion is perhaps one of the most consequential features in the ARKit 3 release. It gives users a more realistic augmented experience while making life easier for developers: augmented reality content can realistically appear in front of or behind people in the real world, which makes experiences more immersive and enables green screen-style effects in almost any environment.
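In code, opting into the feature is essentially a one-line change to the session configuration. Here is a minimal Swift sketch, assuming the ARSession comes from an existing AR view; the frame-semantics check guards against unsupported hardware:

```swift
import ARKit

// Minimal sketch: enable people occlusion on a world-tracking session.
func enablePeopleOcclusion(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // personSegmentationWithDepth layers people in front of or behind
    // virtual content based on their estimated distance from the camera;
    // it is only available on recent, more powerful devices.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(configuration)
}
```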
ARKit 3 can also integrate human movement as input to an augmented reality app to further enhance the immersive effect, as described under Motion Capture below. Separately, the new multiple face tracking functionality tracks up to three human faces simultaneously using the front-facing TrueDepth camera on high-end iOS devices, including the iPhone X, iPhone XS, iPhone XS Max, iPhone XR and the iPad Pro.
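On the face-tracking side, the configuration itself exposes how many faces the device can follow at once. A minimal sketch, again assuming an existing ARSession:

```swift
import ARKit

// Sketch: track as many faces as the device supports (up to three in ARKit 3).
func enableMultiFaceTracking(on session: ARSession) {
    guard ARFaceTrackingConfiguration.isSupported else { return }
    let configuration = ARFaceTrackingConfiguration()
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(configuration)
}
```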
Motion Capture
Motion Capture is a cutting-edge functionality that enables developers of immersive experiences to use gestures, poses and movements in real time as input for augmented reality experiences. ARKit 3 can now capture human motion in real time with a single camera: it perceives body position and movement as a series of joints and bones, uses those motions and poses as inputs to an augmented reality scene, and can even drive animated objects with them. A perfect example was demoed in the Minecraft Earth game, where the body movements of a player in the real world animated an in-game character in the virtual Minecraft world. Motion Capture puts people right at the center of the augmented reality experience: it is no longer a remote experience; you are part of the environment and the events that happen in it.
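At the API level, body tracking surfaces the tracked person as an anchor carrying a skeleton of named joints. A rough Swift sketch of reading the pose (driving an actual rigged character is left out):

```swift
import ARKit

final class BodyTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            let skeleton = bodyAnchor.skeleton
            // Joint transforms are expressed relative to the body's root joint.
            if let head = skeleton.modelTransform(for: .head) {
                // Use the pose to drive a character, record motion, etc.
                print("Head position:", head.columns.3)
            }
        }
    }
}

func startBodyTracking(on session: ARSession) {
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.run(ARBodyTrackingConfiguration())
}
```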
Simultaneous use of front and back cameras
ARKit 3 can use the front and back cameras simultaneously for face and world tracking, enabling the user or player to interact with augmented reality content in the back-camera view. This opens up many new possibilities for AR content creation: users will, for example, be able to use just their faces to interact with augmented reality content seen through the back camera.
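The configuration flag involved is small; a minimal sketch follows, with face anchors from the front camera then arriving through the usual session delegate callbacks:

```swift
import ARKit

// Sketch: world tracking with the rear camera while the TrueDepth
// camera simultaneously tracks the user's face.
func enableWorldTrackingWithUserFace(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsUserFaceTracking {
        configuration.userFaceTrackingEnabled = true
    }
    session.run(configuration)
}
```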
The ARKit 3 update allows up to 100 images to be detected at a time, with automatic estimation of each image's physical size. It also provides enhanced 3D object detection for more reliable object recognition, and uses machine learning to detect planes in the environment faster.
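These options also live on the world-tracking configuration. A sketch, where "Gallery" stands in for a hypothetical AR Resource Group in the app's asset catalog:

```swift
import ARKit

func enableImageDetection(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // "Gallery" is a hypothetical AR Resource Group bundled with the app.
    configuration.detectionImages = ARReferenceImage.referenceImages(
        inGroupNamed: "Gallery", bundle: .main)
    // Let ARKit estimate each image's physical size at runtime instead of
    // relying only on the size entered in the asset catalog.
    configuration.automaticImageScaleEstimationEnabled = true
    session.run(configuration)
}
```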
Collaborative Sessions
Collaborative sessions are another major addition in the ARKit 3 framework, and the functionality was showcased on stage at WWDC in the Minecraft Earth demo, an AR rendering of the original Minecraft. Live collaborative sessions between multiple people let participants build a shared world map, accelerating the creation of AR experiences, and allow players to immerse themselves in shared augmented reality experiences such as multiplayer games.
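Enabling collaboration is a configuration flag; the app itself is responsible for relaying each peer's collaboration data over a network transport such as MultipeerConnectivity. A sketch of the ARKit side only:

```swift
import ARKit

func startCollaborativeSession(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.isCollaborationEnabled = true
    session.run(configuration)
}

// In the ARSessionDelegate, outgoing data is handed to you for delivery:
//   func session(_ session: ARSession,
//                didOutputCollaborationData data: ARSession.CollaborationData)
// and data received from peers is fed back with:
//   session.update(with: collaborationData)
```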
RealityKit and Reality Composer apps
On top of the ARKit 3 augmented reality platform, Apple also announced the RealityKit framework and the Reality Composer app, which simplify the process of creating rich, interactive augmented reality experiences. They are built for developers, particularly those without extensive 3D experience.
Reality Composer is a new authoring app for iOS, iPadOS and macOS. It simplifies development, letting developers who are relatively new to 3D easily prototype and produce augmented reality experiences through a simple drag-and-drop interface and a prebuilt library of animations and 3D objects. Users can also import their own USDZ files.
It offers the flexibility to move between a desktop Mac, iPhone or iPad, and lets developers build with live linking. There is a lot you can do with it: animations can move, scale or spin virtual objects and then be set off with a chosen trigger, and AR objects can be placed anywhere, moved or rotated to create new augmented reality experiences that integrate directly into an app through Xcode, as the sketch below illustrates. These objects can also be exported to AR Quick Look.
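Loading Reality Composer output at runtime can look roughly like the following; "Scene" is a hypothetical resource name, and Xcode can also generate a typed Swift API for .rcproject files:

```swift
import RealityKit

func placeComposedScene(in arView: ARView) throws {
    // Load an entity from a bundled .reality or .usdz file
    // ("Scene" is a hypothetical resource name).
    let entity = try Entity.load(named: "Scene")
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(entity)
    arView.scene.addAnchor(anchor)
}
```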
The RealityKit framework also brings improved tools for environment mapping, photorealistic rendering, motion blur, environment reflections, camera-noise matching and grounded shadows, all of which help blend real and virtual content.
The framework supports animation, spatial audio and physics as well, and developers can drive all of it through the new RealityKit Swift API.
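As a flavor of that Swift API, here is a small sketch that drops a physically simulated box onto a detected horizontal plane; the function name and dimensions are illustrative:

```swift
import UIKit
import RealityKit

func addPhysicsDemo(to arView: ARView) {
    let box = ModelEntity(
        mesh: .generateBox(size: 0.1),
        materials: [SimpleMaterial(color: .white, isMetallic: false)])
    // Collision shapes plus a dynamic physics body let RealityKit
    // simulate and animate the box.
    box.generateCollisionShapes(recursive: true)
    box.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                           material: nil,
                                           mode: .dynamic)
    box.position.y = 0.5 // start half a meter above the plane
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}
```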
Users can also place 3D objects in the real world with the AR Quick Look feature. It supports scenes and models built with Reality Composer, so developers can use it to create interactive experiences that can be viewed and even shared on iOS 12 and later.
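Presenting a model with AR Quick Look goes through the standard QLPreviewController; "Robot.usdz" below is a hypothetical file in the app bundle:

```swift
import ARKit
import QuickLook

final class ModelPreviewDataSource: NSObject, QLPreviewControllerDataSource {
    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // "Robot.usdz" is a hypothetical bundled model.
        let url = Bundle.main.url(forResource: "Robot", withExtension: "usdz")!
        return ARQuickLookPreviewItem(fileAt: url)
    }
}
```

A QLPreviewController wired to this data source can then be presented like any other view controller.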
The ARKit 3 framework will be available as of the iOS 13 developer beta release.