Apple has been actively working on virtual reality and augmented reality for quite some time, but a just-published patent shows that it is now turning its attention to recording. The patent describes a fully recorded AR/VR experience that can be captured and then replayed later.

It is already possible to record livestreams such as virtual reality gameplay. Apple, however, is aiming to capture not just the computer-generated elements but the real-world ones as well; these real-world elements make up the additional content.

Conventional 2D video streams, such as YouTube streams of TV shows, movies, trailers, or game commentaries, are visually flat.

Apple wants to change this for augmented reality and virtual reality streaming. The patent application, titled “Media Compositor for Computer Generated Reality,” shows that Apple has been working on compositing multiple streams to create new possibilities, such as letting AR and VR viewers watch streamed content from their preferred angles.

The technology required for this is complex, but the end result is easy to grasp. Instead of recording video and audio from a single perspective, as in the composited 2D view seen when an iPhone or iPad camera films AR content such as a Pokémon placed in a scene, Apple wants to record separate streams that capture the entire experience from the recorder's perspective, along with time-stamped extra data that can later be used to change the experience for viewers. The viewer is therefore not restricted to the recorded digital content; they can also pull up extra information from the real-world scene.
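As a rough sketch of that idea (not Apple's actual implementation, and with all type names assumed purely for illustration), the recording side can be pictured as several independent streams written against a shared clock: one for the rendered frames, and one or more for time-stamped real-world metadata.

```swift
import Foundation

// Hypothetical sketch of separately captured, time-stamped streams.
// All names and types here are illustrative assumptions, not Apple's API.

struct TimestampedSample<Payload> {
    let timestamp: TimeInterval    // shared clock so streams can be re-aligned later
    let payload: Payload
}

struct RenderedFrame {
    let imageData: Data            // the 2D frame the recorder actually saw
}

struct SceneMetadata {
    let cameraTransform: [Float]         // recorder's viewpoint (flattened 4x4 matrix)
    let virtualObjectIDs: [String]       // which CGR objects were in the scene
    let audioSourcePositions: [[Float]]  // world-space positions for spatial audio
}

// Each stream is captured on its own but against the same timeline,
// so extra data can be pulled up or re-interpreted at playback time.
struct Recording {
    var frameStream: [TimestampedSample<RenderedFrame>] = []
    var metadataStream: [TimestampedSample<SceneMetadata>] = []
}
```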

According to the Apple patent, including 3D models or virtual objects in a composited stream gives users an enhanced experience of a recording or live stream. It could allow viewers to experience a scene from viewpoints other than the creator's, and even to move or rotate objects captured in the scene.

In the Apple system, a viewer of an AR/VR recording would be able to move their head to see different viewpoints and also “experience sounds based on their own head orientation, relative positioning to the audio sources, etc.”

The patent application describes how any device or computer with sufficient processing power and storage can capture these elements separately and simultaneously. They can subsequently be recombined “to form a composited stream” containing the entire experience.

The device captures first data consisting of rendered frames, as well as one or more additional data streams that constitute the additional data. The rendered frame content, such as 2D images or 3D models, represents the real or virtual content rendered in a computer-generated reality (CGR) experience.

Apple refers to the computer that records these streams as a “media compositor,” capable of recording multiple separate data streams and keeping them synchronized.
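To make the synchronization idea concrete, here is a minimal sketch, assuming every stream is stamped against a common clock; the function name and types are hypothetical and not taken from the patent. It simply merges independently captured streams into one time-ordered composited stream.

```swift
import Foundation

// Minimal sketch of a "media compositor": it merges independently captured,
// time-stamped streams into a single time-ordered composited stream.
// The design here is an assumption for illustration, not the patented implementation.

struct CompositedEvent {
    let timestamp: TimeInterval
    let streamID: String
    let payload: Data
}

func composite(streams: [String: [(TimeInterval, Data)]]) -> [CompositedEvent] {
    var events: [CompositedEvent] = []
    for (streamID, samples) in streams {
        for (timestamp, payload) in samples {
            events.append(CompositedEvent(timestamp: timestamp,
                                          streamID: streamID,
                                          payload: payload))
        }
    }
    // Sorting by the shared timestamps keeps video, audio, and scene metadata
    // aligned when the composited stream is replayed or re-rendered.
    return events.sorted { $0.timestamp < $1.timestamp }
}

// Usage: a rendered-frame stream and a scene-metadata stream recorded separately
// at roughly 30 fps, recombined into one synchronized stream.
let composited = composite(streams: [
    "renderedFrames": [(0.000, Data()), (0.033, Data())],
    "sceneMetadata":  [(0.000, Data()), (0.033, Data())]
])
```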

One use case for the technology is letting a viewer experience the AR/VR content from an angle different from the streamer's, using the same 3D models and the embedded positional audio details to recreate the scene appropriately from the perspective of the viewer's mixed reality headset.

Another implementation could involve separately recording multiple viewing angles that viewers can subsequently switch between. Depending on the capabilities of the AR/VR hardware, other possible implementations could include bringing aspects of the original scene, such as the lighting, temperature, and human participants, into or out of the stream.

Ultimately, Apple wants AR/VR recordings that can be stored easily and have a “rich” depth that viewers can experience in multiple ways. This could appear as a web plugin that passively plays the composited content like a video but can then morph into more interactive content that users engage with.

Apple has a history of creating interactive iAds, and this could be its bet for next-level user engagement with advertisements, or for the next evolution of Twitch-style video streams for augmented reality games.

This is only a patent application, though, and there is no set timeline for the rollout of this technology. Still, it could tie in nicely with Apple's current AR/VR hardware development efforts, about which it has been tight-lipped so far.

Five inventors have been credited in the patent application: M. Movshovich, Ranjit Desai, Perry A. Caro, Gurjeet S. Saund and Venu M. Duggineni.
