Google first launched ARCore, its software development kit for augmented reality apps, two years ago. To date, the kit has been used to build thousands of AR titles for Android, iOS and Chrome OS, running on hundreds of millions of devices.

This week, Google announced that it is improving its ARCore augmented reality platform on multiple fronts. The first enhancement is to the Augmented Faces API, which launched earlier this year as part of ARCore 1.7. Google has now streamlined the creation process by adding a new face effects template, and the API now works on iOS as well, enabling XR developers to build effects for more than a billion users.

The Augmented Faces API works with front-facing Android cameras: it detects a face and applies a 468-point 3D mesh that can be used to add mask-like objects, textures and facial retouching. The API can track a person's facial movements, and even alter their appearance, automatically and without requiring a depth camera.
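For Android developers, that mesh is exposed through ARCore's AugmentedFace trackable. Below is a minimal Java sketch of a front-camera session consuming the mesh; the class and method names are illustrative, the activity reference is assumed, and rendering boilerplate is omitted:

    import android.app.Activity;
    import com.google.ar.core.AugmentedFace;
    import com.google.ar.core.Config;
    import com.google.ar.core.Pose;
    import com.google.ar.core.Session;
    import com.google.ar.core.TrackingState;
    import java.nio.FloatBuffer;
    import java.util.EnumSet;

    class FaceEffectsSketch {
      Session session;

      void setUp(Activity activity) throws Exception {
        // Front-facing camera session with 3D face meshes enabled.
        session = new Session(activity, EnumSet.of(Session.Feature.FRONT_CAMERA));
        Config config = new Config(session);
        config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
        session.configure(config);
      }

      void onFrame() throws Exception {
        session.update();  // ARCore detects faces and refreshes their meshes.
        for (AugmentedFace face : session.getAllTrackables(AugmentedFace.class)) {
          if (face.getTrackingState() != TrackingState.TRACKING) {
            continue;
          }
          // The 468-vertex mesh (x, y, z per vertex), relative to the face's center pose.
          FloatBuffer vertices = face.getMeshVertices();
          // Named region poses are handy for attaching mask-like props.
          Pose noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP);
          // ...hand the vertices and poses to a renderer to draw textures or props.
        }
      }
    }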

The next enhancement is to the Cloud Anchors API, which lets multiple devices tap into the cloud and share information about objects in a real-world scene. Thanks to improvements in cloud anchor generation and visual processing, the API can now host and resolve anchors more efficiently. When creating an anchor, users can capture more angles across larger areas of the scene, producing a more comprehensive 3D feature map. According to Google, the visual data is deleted once the map has been created, leaving only the anchor IDs, which are shared so that tracked objects display correctly from each perspective. Multiple anchors in a scene can now also be resolved simultaneously, cutting down the time required to launch a shared augmented reality experience.
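In the ARCore SDK, this flow maps onto two session calls: hostCloudAnchor uploads the mapping data for a locally placed anchor and returns a shareable ID, while resolveCloudAnchor turns a received ID back into a tracked anchor on another device. A rough Java sketch, with error handling omitted and the locally placed anchor assumed:

    import com.google.ar.core.Anchor;
    import com.google.ar.core.Anchor.CloudAnchorState;
    import com.google.ar.core.Config;
    import com.google.ar.core.Session;

    class CloudAnchorSketch {
      // Enable Cloud Anchors on an existing session.
      void enableCloudAnchors(Session session) {
        Config config = new Config(session);
        config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED);
        session.configure(config);
      }

      // Device A: host a locally placed anchor. ARCore uploads the visual data,
      // builds the feature map server-side, and hands back a shareable ID.
      String hostAndGetId(Session session, Anchor localAnchor) {
        Anchor hosted = session.hostCloudAnchor(localAnchor);
        // In a real app, check getCloudAnchorState() once per frame instead.
        if (hosted.getCloudAnchorState() == CloudAnchorState.SUCCESS) {
          return hosted.getCloudAnchorId();  // share this ID with other devices
        }
        return null;  // still TASK_IN_PROGRESS, or an error state
      }

      // Device B: turn a received ID back into a tracked anchor. Several of
      // these calls can now be in flight at once, which is what shortens the
      // start-up time of a shared AR session.
      Anchor resolveShared(Session session, String cloudAnchorId) {
        return session.resolveCloudAnchor(cloudAnchorId);
      }
    }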

Google has also stated that it is working to broaden the use of shared augmented reality experiences through persistent Cloud Anchors. These act like a "Save" button for augmented reality, letting users store the locations of their AR creations indefinitely and giving virtual creations the same permanence as the physical objects around them. For example, you could leave an AR creation in a park, and the next time someone views the park through their smartphone camera, they would see the digital creations you left behind.

Once placed, these anchors remain in position indefinitely, regardless of surface or distance. Google hasn't yet decided how long "persistent" anchors will stick around, but it has stated that anchors in active use won't be deleted, and it plans to work with developers to settle on a retention period. The dream is to create a portal onto a hidden world layered beneath the real one, where users can discover layer upon layer of user-generated virtual content. Together with Google's Augmented Faces API for iOS, these anchors should also become a lot easier to use.
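The early-access API for persistence hadn't been published at the time of the announcement, so the sketch below is illustrative only: it shows the shape the feature eventually took in the public ARCore SDK, a hosting call that accepts a time-to-live in days, and should not be read as the preview API:

    import com.google.ar.core.Anchor;
    import com.google.ar.core.Session;

    class PersistentAnchorSketch {
      // Illustrative only: in the SDK that later shipped, persistence is a
      // host call taking a time-to-live in days instead of the default lifetime.
      String saveCreation(Session session, Anchor localAnchor) {
        Anchor persistent = session.hostCloudAnchorWithTtl(localAnchor, /* ttlDays= */ 365);
        // Once hosting succeeds, the returned ID can be stored and resolved
        // days or months later, so the creation reappears at the same
        // physical spot for anyone who looks.
        return persistent.getCloudAnchorId();
      }
    }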

One app already leveraging these anchors is MarkAR, a social app developed by iDreamSky and Sybo that lets users create augmented reality art, or virtual graffiti, at real-world locations; other users can then discover and view it, or even modify it over a long period of time.

Persistent Cloud Anchors are available in private preview from today. If you're interested in taking part, you can sign up for early access.

The ARCore enhancements announced today follow hot on the heels of improvements to the Augmented Images API, which lets users point their cameras at 2D images to bring them to life, and the Light Estimation API, which provides a single ambient light intensity to simulate real-world lighting in digital scenes. Since May this year, developers have also been able to track moving images and use the new Environmental HDR mode, which relies on machine learning to decode high dynamic range illumination in 360 degrees from the available light data and extend it into a virtual scene.
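On the ARCore side, Environmental HDR is an opt-in light estimation mode; each frame then carries an estimate of the main light's direction and intensity plus ambient lighting as spherical-harmonics coefficients, which a renderer can apply to its virtual lights. A minimal Java sketch, with the renderer integration left out and the class name purely illustrative:

    import com.google.ar.core.Config;
    import com.google.ar.core.Frame;
    import com.google.ar.core.LightEstimate;
    import com.google.ar.core.Session;

    class HdrLightingSketch {
      // Opt in to the ML-based Environmental HDR light estimation mode.
      void enableEnvironmentalHdr(Session session) {
        Config config = new Config(session);
        config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
        session.configure(config);
      }

      // Each frame, read the estimated lighting and feed it to the renderer.
      void onFrame(Frame frame) {
        LightEstimate estimate = frame.getLightEstimate();
        if (estimate.getState() != LightEstimate.State.VALID) {
          return;
        }
        // Dominant directional light inferred by the ML model.
        float[] mainLightDirection = estimate.getEnvironmentalHdrMainLightDirection();
        float[] mainLightIntensity = estimate.getEnvironmentalHdrMainLightIntensity();
        // Ambient light encoded as spherical-harmonics coefficients.
        float[] ambientSh = estimate.getEnvironmentalHdrAmbientSphericalHarmonics();
        // Apply these to the virtual scene's lights so digital objects
        // match the real-world illumination.
      }
    }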
