Apple Reportedly Launching Its Sensor-Packed Mixed Reality Headset in 2020
Reports emerged last week indicating that Apple’s much-vaunted mixed reality headset project might have been killed, but a patent filing discovered yesterday suggests the project is still on course and that Apple will most likely release the headset in 2020, as reported last year.
For the past year, Apple has been suspected of discreetly working on a sophisticated headset capable of both virtual reality (VR) and augmented reality (AR) experiences, and reports suggest the device could be unveiled as soon as 2020.
A newly filed patent reveals some of the ideas Apple may be exploring. These include sensors on a head-mounted display that capture the user’s head pose, gaze, hand gestures and surrounding light, as well as facial expressions such as movements of the wearer’s mouth, eyebrows and jaw.
The Apple AR glasses will reportedly have an 8K display per eye, along with sets of sensors and cameras that track the wearer’s face and the environment. The project has been codenamed T288.
A mixed reality headset is essentially a virtual reality headset equipped with cameras that capture the real environment around the user. Instead of simply overlaying 3D graphics on the user’s real-world view, as augmented reality devices do, mixed reality headsets take both the camera’s video capture and the synthetic image and merge them into a single video.
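At its simplest, that per-pixel merge can be thought of as an alpha blend of the two image streams. The sketch below is a minimal illustration only; the patent does not describe Apple’s actual compositing pipeline, and the function and mask here are hypothetical:

```python
import numpy as np

def composite_mixed_reality(camera_frame, synthetic_frame, alpha_mask):
    """Merge a real camera frame with a rendered synthetic frame.

    camera_frame, synthetic_frame: HxWx3 float arrays in [0, 1].
    alpha_mask: HxW float array; 1.0 where synthetic content should
    fully cover the camera view, 0.0 where reality shows through.
    """
    alpha = alpha_mask[..., np.newaxis]  # broadcast mask over colour channels
    return alpha * synthetic_frame + (1.0 - alpha) * camera_frame

# Tiny 2x2 example: left column shows reality, right column shows graphics.
camera = np.zeros((2, 2, 3))    # black camera feed
graphics = np.ones((2, 2, 3))   # white rendered overlay
mask = np.array([[0.0, 1.0],
                 [0.0, 1.0]])
merged = composite_mixed_reality(camera, graphics, mask)
```

In practice a headset would compute the mask per frame from depth and scene understanding, but the blending step itself is this simple weighted sum.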
Hopes for an Apple mixed reality headset have come alive following revelations that the tech giant filed a patent with the US Patent and Trademark Office (USPTO). The patent was filed in March 2019 and published on July 18, 2019. It describes a mixed reality headset with a display for each eye, along with a camera and sensors that capture the user’s face. This setup would let the device merge reality and 3D graphics into one view while also tracking the user’s expressions so their avatar can be rendered in the virtual environment.
The headset will include multiple outward-facing sensors that collect information about the user’s environment, such as depth data. These images will then be rendered on the user’s display.
The concept involves representing the wearer’s gestures, body and facial expressions in a 3D view visible through the headset’s viewer. Apple says the device will use some of the input collected by the user-facing sensors to render an avatar of the user in the 3D virtual viewer, which will then be blended with the information captured by the world-facing sensors.
Apple’s idea, as described in the patent, is to capture the real world and show a representation of it on the display of the mixed reality device. This approach differs markedly from that of other mixed reality hardware makers such as Magic Leap and Microsoft with its HoloLens, which overlay graphics on a see-through view of the real world rather than re-rendering captured images.
The emphasis on using sensors to capture and render facial expressions would also augur well for Apple’s Animoji and Memoji avatars, featured in the company’s latest iPhones.
Apple has filed numerous AR-related patents and made several AR-focused acquisitions, strong indications that the company is working on something in this space. Apple CEO Tim Cook is also a big fan of AR: he considers virtual reality too closed and too isolating, while AR offers the opportunity to connect users in the virtual world with others in the real world. This AR-focused strategy has prompted the company to make multiple acquisitions over the past four years.
These acquisitions date back to 2015, when the company acquired Metaio along with the software that went on to serve as the basis for ARKit, Apple’s iOS AR development kit.
In 2017, Apple made a number of AR-related acquisitions, including the US quantum dot-based image sensor manufacturer InVisage Technologies, the French computer vision company Regaind, the Canadian AR display hardware manufacturer Vrvana, and the eye-tracking hardware and software maker SensoMotoric Instruments. One key recent acquisition was Akonia Holographics, which held more than 200 AR patents at the time of the deal.
In July last year, Apple filed an AR patent titled “Augmented Reality Device to Warp Images”, which was published by the USPTO. This patent described augmented reality glasses designed to assist people with visual impairment. The glasses do this by capturing the user’s view and then “warping” part of the captured reality to make it bigger. Our brains do this naturally whenever we need to focus on a fast-moving object, such as a ball coming towards us: the brain instinctively enlarges the ball, enabling us to catch it more efficiently. The Apple system offers similar functionality, albeit as a digital implementation.
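The magnification idea can be sketched in a few lines: take a region of interest from the captured frame, enlarge it, and paste it back centred on its original position. This is a hypothetical illustration only; the patent’s actual warping method is not described, and the function name and nearest-neighbour upscaling here are assumptions:

```python
import numpy as np

def magnify_region(frame, top, left, size, scale=2):
    """Crudely 'warp' a square region of the frame by enlarging it in place.

    The size x size region at (top, left) is upscaled by integer factor
    `scale` with nearest-neighbour sampling, then pasted back centred on
    its original position, overwriting the surrounding pixels.
    """
    region = frame[top:top + size, left:left + size]
    big = size * scale
    # Nearest-neighbour upscale via index repetition.
    zoomed = region.repeat(scale, axis=0).repeat(scale, axis=1)
    out = frame.copy()
    # Clamp the paste window to the frame bounds.
    r0 = max(0, top - (big - size) // 2)
    c0 = max(0, left - (big - size) // 2)
    r1 = min(out.shape[0], r0 + big)
    c1 = min(out.shape[1], c0 + big)
    out[r0:r1, c0:c1] = zoomed[:r1 - r0, :c1 - c0]
    return out

# Example on a synthetic 8x8 grayscale frame.
frame = np.arange(64, dtype=float).reshape(8, 8)
zoomed_view = magnify_region(frame, top=2, left=2, size=2, scale=2)
```

A real implementation would use a smooth warp (so the magnified area blends into its surroundings instead of abruptly overwriting them), but the principle of remapping pixels to enlarge part of the view is the same.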
It is still not clear what design thinking Apple will embrace for its headset, and this will only become apparent once the device is released. The current patent points to a device packed with sensors, some of them drawn from the camera setup used for Face ID in iPhones such as the X, XS and XR.
The Apple illustrations also point to a dedicated sensor for each eyebrow, two sensors for the lower jaw, and multiple downward-facing infrared hand sensors. There will also be eye-tracking sensors, two cameras for capturing video, and approximately three world-facing sensors, including a LiDAR sensor for depth information.