Google Maps’ Immersive View Launches in Five Cities
Google Maps’ Immersive View feature was first revealed at the I/O 2022 event. The feature uses computer vision and artificial intelligence to fuse Street View and aerial imagery into a 3D representation, generating a detailed, street-level perspective of an area, including its buildings and richer contextual details such as how busy it is and its traffic conditions.
Google is now rolling out the feature in five cities – London, New York, Los Angeles, Tokyo, and San Francisco.
Are you the sort of person who needs to get the feel of somewhere before you commit? 🗺
With immersive view on Google Maps, you can see what a neighborhood is like before you even set foot there 📍
— Google Europe (@googleeurope) February 8, 2023
In September 2022, Google began providing a preview of the feature in the five cities where it has now been rolled out. Over the next few months, Immersive View will be rolled out in additional cities including Venice, Amsterdam, Florence, and Dublin.
Google pitches Immersive View as a travel planner. It adds contextual information to locations, including weather conditions, traffic, and how busy a particular place is at different times of the day.
The feature will enable you to hover above the buildings in a location and spot details such as the entrances to attractions, helping you get your bearings a lot faster.
Immersive View was built with an AI technique known as neural radiance fields (NeRF), which converts ordinary photographs into 3D representations. NeRF enables Google to “recreate the full context of a place,” including its lighting, the texture of materials, and what sits in the background. In a blog post, Google says the feature could, for instance, let you see a bar in its moody lighting so you can decide whether it offers the right vibe for a date night.
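To make the NeRF idea concrete, here is a minimal sketch of its core mechanic: a function maps any 3D point (and viewing direction) to a color and a volume density, and an image is produced by compositing those samples along each camera ray. This is a generic toy illustration, not Google's implementation; the `toy_radiance_field` function below hand-codes a fuzzy sphere where a real NeRF would use a trained neural network.

```python
import numpy as np

def toy_radiance_field(points, view_dir):
    """Stand-in for NeRF's learned MLP: maps 3D points (plus a view
    direction, used in real NeRF for view-dependent effects like
    specular highlights) to an RGB color and a volume density.
    Here we hand-code a single opaque reddish sphere of radius 1."""
    dist = np.linalg.norm(points, axis=-1)
    density = np.where(dist < 1.0, 5.0, 0.0)          # dense inside the sphere
    rgb = np.tile([0.8, 0.3, 0.2], (len(points), 1))  # constant reddish color
    return rgb, density

def render_ray(origin, direction, n_samples=64, near=0.0, far=4.0):
    """Volume-render one camera ray:
    C = sum_i T_i * (1 - exp(-sigma_i * delta_i)) * c_i,
    where T_i is the transmittance accumulated before sample i."""
    t = np.linspace(near, far, n_samples)
    delta = np.append(np.diff(t), 1e10)               # spacing between samples
    points = origin + t[:, None] * direction
    rgb, sigma = toy_radiance_field(points, direction)
    alpha = 1.0 - np.exp(-sigma * delta)              # per-sample opacity
    trans = np.cumprod(np.append(1.0, 1.0 - alpha[:-1]))  # transmittance T_i
    weights = trans * alpha
    return (weights[:, None] * rgb).sum(axis=0)       # composited ray color

# A ray from in front of the sphere, looking straight at it, returns
# (approximately) the sphere's color once the samples are composited.
color = render_ray(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]))
```

Training a real NeRF amounts to adjusting the network behind `toy_radiance_field` until rays rendered this way reproduce the input photographs, which is how a collection of Street View and aerial images can be turned into a scene you can fly through.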
Google announced other updates to Maps as well. A feature called “glanceable directions” lets you track your trip directly from your lock screen or the route overview, whether you are walking, biking, or taking public transit.
The glanceable directions feature will even notify you of where to turn and keep your estimated arrival time up to date. If you switch routes, the information updates accordingly.
Glanceable directions is set to roll out on Android and on iOS, where it can be accessed via Live Activities, over the next few months.
Google is also planning additional features for electric vehicles that have Google built in. The system will factor charging stops into shorter trips that need them, and will suggest the best charging station based on variables such as traffic, the car’s current battery level, and how much energy the EV is using. It will give users a “very fast” filter for locating stations with chargers of “150 kilowatts or higher.” EV drivers will also be able to restrict search results to places with charging stations, so they can quickly find, for instance, supermarkets where they can top up the battery while shopping.
In the next few months, Google will also expand Search with Live View as well as Indoor Live View to several more locations.
Google has also provided a peek into Bard, its upcoming Search chatbot.