New iPhone to Be Released This Year Will Reportedly Have a Rear-Facing 3D Camera
One of the iPhones set to be released this year will have some impressive augmented reality capability, according to a report by Fast Company.
According to the report, one of the upcoming Apple iPhones will be fitted with a 3D depth camera on its back. This world-facing 3D scanner will offer users better augmented reality experiences. The feature has been rumored for years, and this may be the year it finally comes to fruition. The Fast Company report cites Apple-related sources.
The rear-facing camera will combine a laser, a sensor, and software, and it will work on the time-of-flight principle. A time-of-flight sensor sends out a light pulse and measures how long the pulse takes to reach an object and travel back to the sensor. Using this principle, the camera can map rooms and objects in 3D.
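The arithmetic behind the principle is straightforward: because the pulse travels to the object and back, the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (illustrative only; the function name is ours, not from any Apple API):

```python
# Illustrative time-of-flight distance calculation; standard physics,
# not Apple's implementation.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the object given the light pulse's round-trip time.

    The pulse travels to the object and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after ~13.3 nanoseconds corresponds to roughly 2 meters:
print(distance_from_round_trip(13.34e-9))  # ≈ 2.0 meters
```

The nanosecond timescales involved are why a dedicated sensor is needed: resolving centimeters of depth means resolving fractions of a nanosecond.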
The detailed and accurate depth information captured by the time-of-flight camera allows for the creation of new photo and video effects. It also enables much-improved AR experiences.
Apple has been working on a rear-facing (world-facing) 3D camera for about two years, and rumors were rife that it would appear in previous iPhone launches. For close to two years it sat on tech analysts' shortlists of features next in line for the iPhone, but it never materialized. According to these reports, the feature is now in the design, and we will see it later in the year if the COVID-19 outbreak doesn't interfere with Apple's production schedules and launch plans.
Apple is purchasing the laser needed for the new 3D time-of-flight camera from Lumentum, which is based in San Jose. The same company supplies the laser used in the iPhone's front-facing 3D camera.
Some manufacturers have already incorporated rear-facing depth cameras into their smartphones. Samsung's Galaxy S20+, Galaxy S20 Ultra, and Galaxy Note 10+ have 3D depth cameras, as do a constellation of other Android smartphones. The hope is that Apple will find novel ways of leveraging time-of-flight camera technology to enable new user experiences. Going by past experience, Apple is also likely to be showier in how it brands and markets the 3D camera and its AR experiences.
iPhones already have a front-facing depth camera (TrueDepth). It is used mainly for the Face ID security feature as well as for some of Apple's fun messaging effects, such as Animoji.
Last year, Apple launched the iPhone 11 Pro and iPhone 11 Pro Max, which feature three camera lenses on the back: a 12-megapixel wide-angle camera, a 12-megapixel 2x telephoto lens, and a 12-megapixel ultra-wide-angle lens. These three lenses give users a breadth of photo-taking options, but a 3D depth camera would add depth information, taking things a notch higher.
The main depth effect currently available on the iPhone is Portrait mode, which gives photos a "bokeh" effect: the background layer is blurred while the foreground subject stays in sharp focus.
The 3D depth camera will add depth data that allows for a better-looking bokeh effect and makes it easier to accurately distinguish the foreground from the background layers. It also adds additional depth layers that can be blurred or focused. This could eventually make it possible to adjust, during editing, which layers of a photo are blurred and which are in focus.
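To illustrate how a per-pixel depth map enables this kind of layered effect, here is a rough sketch (the function names and parameters are hypothetical, not Apple's implementation; it assumes a grayscale image and a depth map as NumPy arrays):

```python
import numpy as np

# Hypothetical depth-based "bokeh" sketch: pixels near the chosen focus
# plane stay sharp, everything else is blurred.

def box_blur(image: np.ndarray, radius: int) -> np.ndarray:
    """Naive box blur: each pixel becomes the mean of its (2r+1)^2 neighborhood."""
    padded = np.pad(image, radius, mode="edge")
    h, w = image.shape
    k = 2 * radius + 1
    acc = np.zeros_like(image, dtype=float)
    for dy in range(k):
        for dx in range(k):
            acc += padded[dy:dy + h, dx:dx + w]
    return acc / (k * k)

def layered_bokeh(image, depth_map, focus_depth, tolerance=0.5, blur_radius=1):
    """Keep pixels within `tolerance` meters of the focus plane sharp;
    blur everything else.

    image:      H x W grayscale array
    depth_map:  H x W array of per-pixel depths in meters (e.g. from a ToF sensor)
    """
    blurred = box_blur(image, blur_radius)
    in_focus = np.abs(depth_map - focus_depth) <= tolerance
    return np.where(in_focus, image, blurred)
```

A real implementation would blur each depth band by a different amount, which is exactly what accurate depth layers make possible after the fact.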
3D mapping could also be used together with the iPhone's photo software features, but the 3D depth camera will have its biggest impact on the quality of AR apps. Apple's ARKit framework for building augmented reality apps was released three years ago but has yet to catch on with consumers. The depth information from 3D sensing allows apps to place objects in space more precisely, which is likely to result in much more powerful app functionality.
Apple is also building an augmented reality app for iOS 14 that will let users point their phones at items in Apple Stores and Starbucks and see digital information about those items displayed around them on the phone screen.
Rear-facing 3D cameras also let users create content that is easily shareable on social media. The phone's camera and software could, for example, be used to share images of oneself interacting with holographic projections of celebrities or even animals. This is not so different from the functionality of the Holo app developed by 8i. With depth data, however, the experience can be much improved: depth sensing could, for instance, anchor a hologram's feet to a tabletop in a very convincing way.
The 3D camera technology in Samsung phones powers AR features such as Live Focus, which lets the user blur out backgrounds in still images and videos to emphasize the subject in the foreground, and Quick Measure, which approximates the dimensions, area, and volume of objects in the camera frame. Apple's depth-sensing camera may follow a similar path, highlighting features enabled by the rear-facing 3D camera.
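Measurement features like these become possible because depth resolves the scale ambiguity of a single camera. A hedged sketch using the standard pinhole-camera approximation (not Samsung's or Apple's actual algorithm; the function name is illustrative):

```python
# Illustrative depth-assisted measurement via the pinhole-camera model;
# not any vendor's actual algorithm.
def real_width(depth_m: float, pixel_span: float, focal_length_px: float) -> float:
    """Approximate the real-world width of an object.

    depth_m:          distance to the object, in meters (from the depth sensor)
    pixel_span:       how many pixels wide the object appears in the image
    focal_length_px:  the camera's focal length, expressed in pixels

    Under the pinhole model, apparent size scales inversely with distance,
    so real width = depth * (pixels spanned / focal length in pixels).
    """
    return depth_m * pixel_span / focal_length_px

# An object 500 pixels wide, 2 meters away, seen through a lens with a
# 1000-pixel focal length, is about 1 meter wide:
print(real_width(2.0, 500, 1000))  # → 1.0
```

Without depth data, the same 500-pixel span could belong to a small object nearby or a large object far away; the time-of-flight reading is what pins down the answer.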
AR on smartphones unlocks plenty of opportunities, but there is always the limitation of awkwardly viewing AR effects through a smartphone screen held in front of the user's face. A better AR experience would come via normal-looking AR glasses or a headset, which Apple is reportedly quietly working on. In the future, such a device might be Apple's primary spatial computing platform.