Google’s ARCore Depth API Update to Make AR More Realistic
Since its inception last year, Google has been working behind the scenes to improve its augmented reality platform ARCore. The latest update, which the company is set to unveil, includes next-generation improvements to depth detection and to the physics it enables. The new API update promises to make augmented reality experiences appear more realistic.
The upgrades center on ARCore’s all-new Depth API, which will soon enable developers to perform occlusion. In occlusion, a virtual object is blocked from view by real-world objects in the scene. For example, a virtual object superimposed in a real-world living room can be hidden when the camera is angled so that real objects sit between the camera and the virtual object.
Currently, without occlusion, virtual objects always appear in front, since ARCore has no good understanding of the depth of the real objects in the scene. Similarly, positional tracking in virtual reality tracks high-contrast features in the scene rather than the scene as a whole.
The ARCore Depth API is different in that it estimates the depth of everything the camera sees. This is what enables occlusion: virtual objects can now appear behind real objects whenever the real-world objects are nearer to the camera.
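At the pixel level, occlusion reduces to a depth comparison. Below is a minimal sketch in Java of that test against an ARCore-style DEPTH16 depth image (each 16-bit pixel stores the distance to the nearest real surface in millimeters); the class and method names here are illustrative assumptions, not Google’s published implementation.

```java
// A minimal occlusion test, assuming an ARCore-style DEPTH16 depth image
// where each 16-bit pixel stores distance from the camera in millimeters.
import android.media.Image;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class OcclusionSketch {

  /** Reads the real-world depth in meters at pixel (x, y) of a DEPTH16 image. */
  static float realDepthMeters(Image depthImage, int x, int y) {
    Image.Plane plane = depthImage.getPlanes()[0];
    ByteBuffer buffer = plane.getBuffer().order(ByteOrder.nativeOrder());
    int byteIndex = y * plane.getRowStride() + x * plane.getPixelStride();
    int depthMillimeters = Short.toUnsignedInt(buffer.getShort(byteIndex));
    return depthMillimeters / 1000.0f;
  }

  /** A virtual fragment is hidden when a real surface is closer to the camera. */
  static boolean isOccluded(float virtualDepthMeters, float realDepthMeters) {
    // A depth of 0 usually means "no estimate", so never occlude on it.
    return realDepthMeters > 0f && realDepthMeters < virtualDepthMeters;
  }
}
```

In a real renderer this comparison would run per fragment on the GPU rather than pixel by pixel on the CPU, but the principle is the same: draw the virtual pixel only where nothing real is in front of it.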
In augmented reality, occlusion is as important as positional tracking is to virtual reality. Without occlusion, the augmented view generally ruins the illusion of realism, because depth conflicts arise whenever a virtual object draws over a real one that should be hiding it.
In augmented reality experiences, occlusion creates greater realism and makes scenes appear more believable. The depth detection happening under the hood lets the smartphone better understand every object in the scene, including how far each object is from the camera and from the others. Google says this depth detection is achieved by optimizing existing software, so users don’t need a phone with a dedicated depth sensor or a specific type of processor. It also runs on the device itself, without relying on the cloud. As long as end users have a phone that supports ARCore, they can access these features, and practically all Android phones launched in the last few years support ARCore, spreading the benefits of depth detection to a wide base of users.
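Because depth is computed in software, an app only needs to ask the ARCore session whether depth is available on the current device before turning it on. Here is a hedged sketch of that check in Java; the Config.DepthMode names mirror the ARCore SDK’s Session/Config pattern and should be read as assumptions until the Depth API documentation is public.

```java
// A hedged sketch of enabling depth on an ARCore session.
import com.google.ar.core.Config;
import com.google.ar.core.Session;

public final class DepthSetup {

  /** Turns on depth only if this ARCore phone can estimate it in software. */
  static void enableDepthIfSupported(Session session) {
    Config config = session.getConfig();
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
      // Depth comes from the existing camera software, not a dedicated sensor.
      config.setDepthMode(Config.DepthMode.AUTOMATIC);
    }
    session.configure(config);
  }
}
```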
This is not the first time we have seen occlusion on mobile phones. In July 2018, Niantic, creator of the wildly popular Pokémon Go, showcased a video of an occlusion demo in which a tiny virtual Pikachu darted about an urban plaza, dashing between objects and blending seamlessly with the surrounding environment. However, that was simply a video, not an actual demo the audience could see and interact with in real time. Google’s team, by contrast, has built real-time demos that showcase the depth detection technology; some people have already tried them in Google’s test environment and confirmed that the technology actually works. The feature is available from today as part of the Houzz app update and Google’s AR in Search feature.
With the Houzz update, the furniture in the home improvement portal’s app that is part of its “View in My Room 3D” feature now has occlusion support. Google has also stated that over 200 million Android devices will get occlusion for any object with an AR model in Google Search.
Some of the Depth API’s new capabilities won’t be coming to commercial apps and services today. However, Google has announced that these advancements will be made available to developers in the future, once the tech giant has refined some of its approaches in close cooperation with developers and other collaborators. These advancements will go beyond occlusion and venture into physics and 3D mapping.
Not even Apple’s ARKit for iOS and iPadOS provides end users with full depth occlusion yet, although some of Apple’s more recent and powerful devices do enable occlusion for human bodies. Before the current ARCore Depth API update, ARCore and ARKit were on the same plane as far as occlusion capabilities are concerned. With the ARCore update, they will begin to diverge as Google’s augmented reality platform starts to offer developers and users greater realism. For developers, the diverging capabilities of ARCore and ARKit may create some difficulties, but they also create opportunities to build innovations specific to each platform.
The update creates a real-time understanding of scene depth that could also be very useful for virtual reality experiences. The current generation of virtual reality headsets keeps users aware of their real-world surroundings by having them draw a playspace (boundary) during setup. When the wearer moves close to that boundary, it is shown in the headset, warning them that they are about to step outside the playspace. Depth-based detection could make this automatic and three-dimensional: when a user walks too close to a real object, the object could simply appear in the headset, as in the sketch below.
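To make the idea concrete, here is a hypothetical illustration in Java of such a proximity warning built on a per-frame DEPTH16 depth image. The 0.5 m threshold, the coarse sampling stride, and the class and method names are all assumptions for the sketch; no headset ships this API today.

```java
// Hypothetical proximity warning from a DEPTH16 depth image, replacing a
// hand-drawn boundary with automatic detection of nearby real surfaces.
import android.media.Image;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class ProximityGuard {
  private static final int WARN_MILLIMETERS = 500; // warn within 0.5 m (assumed)

  /** Returns true if any sampled pixel of the depth image is closer than the threshold. */
  static boolean obstacleNearby(Image depthImage) {
    Image.Plane plane = depthImage.getPlanes()[0];
    ByteBuffer buffer = plane.getBuffer().order(ByteOrder.nativeOrder());
    int width = depthImage.getWidth();
    int height = depthImage.getHeight();
    for (int y = 0; y < height; y += 8) {      // coarse sampling keeps this cheap
      for (int x = 0; x < width; x += 8) {
        int index = y * plane.getRowStride() + x * plane.getPixelStride();
        int depthMm = Short.toUnsignedInt(buffer.getShort(index));
        if (depthMm > 0 && depthMm < WARN_MILLIMETERS) {
          return true; // a real surface is close enough to show in the headset
        }
      }
    }
    return false;
  }
}
```

A shipping system would want to smooth this over several frames and localize where the obstacle is before rendering it, but even this crude check shows why per-pixel depth makes a drawn boundary unnecessary.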
Facebook has also showcased real-time depth mapping, although that product hasn’t shipped yet. For now, the ARCore Depth API itself isn’t publicly available to developers, and Google is yet to announce a release window; developers can, however, sign up for approval if they wish to try it out.