Billions of devices around the world run the Android operating system. Phones make up the majority, but the OS also powers tablets, smartwatches, televisions, automobiles, and assorted IoT devices. While Google officially supports Android only on phones, tablets, watches, TVs, and automobiles, it is now extending the OS to extended reality (XR) devices: Google has announced Android XR, a new platform designed specifically for VR headsets and AR smart glasses.
What is Android XR?
Android XR is a new version of the Android operating system created specifically for XR devices such as AR smart glasses and VR headsets. It is based on AOSP, the open-source project that underpins all Android devices, but it has been significantly modified to support XR experiences.
The Android XR logo.
It's been years since Google released an entirely new version of Android; the last time it did so was with Android Automotive in 2017, long before generative AI became popular. Unlike other Android versions, Android XR was built with Google Gemini at its center. (If you're unfamiliar, Gemini is Google's AI chatbot and family of large language models.) Indeed, according to Google, Android XR is the "first Android platform built for the Gemini AI era."
Google hasn't been alone in creating Android XR, however. To make the platform a reality, it worked closely with Qualcomm, Samsung, and other partners. Samsung is developing the first Android XR hardware, which is expected to launch in 2025, while Qualcomm is building the chipsets that will power these devices.
Google sees Android XR as a single unifying platform for a variety of XR scenarios, from VR headsets for gaming and productivity to smart glasses for lifestyle and healthcare. Qualcomm and Samsung are just the first of many companies working on Android XR hardware; companies like Sony, XREAL, and Lynx, for instance, are already developing their own Android XR devices. But since Google hasn't made a "full determination" on whether to make the Android XR source code publicly available, it's unclear whether enterprising startups will be able to produce hardware for it, at least not without partnering with Google.
Why is Google developing Android XR?
Regular readers are likely aware that Google is not new to XR: the company launched and then abandoned both the AR-focused Google Glass project and the VR-focused Daydream VR platform. Google believes its initial vision for XR was "correct" but that the technology simply wasn't ready yet. After ending both projects, the company continued to pursue its XR goals, shifting its focus to phone-based augmented reality efforts like ARCore.
Thanks to recent advances in AI, Google is now confident it can finally make XR popular. According to the company, engaging in multimodal interactions with AI chatbots, i.e., through both speech and vision, is going to be the "killer app" of this era, but it's simply too awkward to do with a smartphone at the moment. Having seen the Project Astra demonstrations earlier this year, we can understand the company's belief that VR headsets and AR smart glasses are a far more natural form factor for these kinds of interactions.
Project Astra demonstrated on an Android phone. Source: Google.
Google thinks this is the ideal moment to release Android XR. Given its "unique" position in AI, the company believes it is well positioned to launch the platform: from state-of-the-art AI models in the cloud to on-device AI models to an ecosystem of developers it can reach, Google has a "full stack" of technology at its disposal to bring AI to XR platforms.
What can you do on an Android XR headset?
Developers won't embrace Android XR if there are no users to sell apps to, and users won't buy Android XR headsets unless there are amazing apps and experiences available when the devices launch. Google gave us a preview today of some of the experiences Android XR headsets will provide, including demonstrations of how the operating system enables an infinitely customizable, immersive viewing experience that can be managed through natural, multimodal AI interactions.
For instance, Android XR can display apps such as YouTube, Google Photos, and Google TV in windows that float over real-world objects. Using hand gestures, you can move these windows around, drag and drop them, and minimize or close them. Each app in Android XR has a header bar and, in some cases, a bottom bar with a number of buttons that can be operated through hand gestures or Gemini's conversational controls. Because Android XR can "see what you see, hear what you hear, and react to your gestures alongside your voice," Gemini can interact with and even control your apps.
The first demo shows how Google Photos has been tailored for Android XR. The app presents a familiar tablet-style interface, but pressing the "Immersive" button at the bottom expands a photo into a borderless view. Tapping another button lets you swipe with ease through a carousel of photos and videos.
The second demo shows the Google TV app's immersive user interface. TV series and films appear as high-resolution thumbnails on big, expansive cards. For a more immersive experience, trailers can be viewed in large, borderless, floating windows that can be placed inside a virtual theater.
Another demo shows that Android XR devices can access the entire library of 180° and 360° videos available through the YouTube app. And because YouTube integrates Gemini, you can even ask questions about a video and receive answers.
Google Maps and Chrome will also support Android XR: Chrome lets you browse the web across multiple virtual screens, while Maps uses its Immersive View feature to let you explore cities and landmarks in virtual space.
You will even be able to use Google's Circle to Search feature on Android XR: select text and images in your view with Circle to Search to look them up on Google, and then place 3D objects in your surroundings.
Google says these are only a handful of the many experiences Android XR devices will offer. Hopefully, many more will become available as third-party developers obtain prototype hardware and begin building apps with the new Android XR SDK. We likely won't grasp the full scope of what's possible until hardware actually arrives on store shelves, which is fortunately scheduled for sometime next year.
Which device will be the first to run Android XR?
Samsung's VR headset, known as Project Moohan, will be the first device to run Android XR. It will go on sale sometime next year at an as-yet-undisclosed price. Details are scarce, but you can read more about the headset here.
When will Android XR-powered smart glasses arrive?
Even if Samsung manages to show off its smart glasses next month, it will not ship them until after the "Moohan" VR headset is released. That is a deliberate decision: Samsung and Google are prioritizing VR headsets over smart glasses.
According to both companies, VR headsets are the "most suitable form factor" to begin developing a core XR ecosystem, because they offer a greater degree of immersion and higher-resolution displays than smart glasses. They can also switch seamlessly between mixed and virtual reality and provide hand, head, and eye tracking. Conversely, because XR smart glasses are smaller and therefore can't pack in as much hardware, their immersion and input options are somewhat more constrained.
Their lightweight, compact design is what makes smart glasses practical to wear every day.
For this reason, development of Android XR smart glasses is focused on creating distinct experiences. Rather than gaming or productivity, Google wants smart glasses to be more of a lifestyle accessory. Like regular glasses, smart glasses are meant to be worn in public, and you're expected to talk to them as you move around. AR smart glasses will support voice controls through Gemini just as VR headsets do, but voice interactions are even more crucial for smart glasses than for headsets.
You can, for instance, ask Gemini to send messages to contacts, look up information about local businesses and restaurants in Google Maps, get turn-by-turn navigation directions to a destination, and summarize group chats in Google Messages.
You will also be able to point your smart glasses at a sign and ask Gemini to translate it, ask questions about the sign's content, and receive real-time translations while conversing with someone.
Finally, you can ask Gemini general questions. It will have access to what you're viewing for context, and it may even recall what it has seen before.
Because smart glasses typically have weaker processors and much smaller batteries than VR headsets, much of the computation that enables these features actually takes place on your phone rather than on the glasses themselves. Android XR on smart glasses uses what Google calls a "split-compute configuration": the glasses stream sensor and pixel data to your smartphone, which handles a large portion of the processing. This lets smart glasses be built without bulky hardware. According to rumors, Samsung's XR smart glasses might weigh only 50 grams, which is very close to the much-praised Ray-Ban Meta smart glasses.
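To make the division of labor concrete, here is a minimal, purely illustrative sketch of one tick of a split-compute loop. All class names, method names, and data shapes below are invented for this example; Google has not published Android XR's actual split-compute API, so treat this only as a model of the idea, not of the implementation.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """What the glasses capture each tick (hypothetical data shape)."""
    camera_pixels: bytes   # raw camera capture from the glasses
    imu_heading: float     # head orientation in degrees

class Glasses:
    """Lightweight device: captures sensors and renders results, no heavy compute."""
    def capture(self) -> SensorFrame:
        return SensorFrame(camera_pixels=b"\x00" * 64, imu_heading=90.0)

    def render(self, overlay: str) -> str:
        # Only draws what the phone has already computed.
        return f"[HUD] {overlay}"

class Phone:
    """Companion device: does the expensive processing (e.g. AI inference)."""
    def process(self, frame: SensorFrame) -> str:
        # Stand-in for a real perception/assistant call on the phone.
        facing = "north" if 45 <= frame.imu_heading < 135 else "elsewhere"
        return f"You are facing {facing} ({len(frame.camera_pixels)} px analyzed)"

def split_compute_tick(glasses: Glasses, phone: Phone) -> str:
    frame = glasses.capture()      # 1. glasses stream sensor data out
    result = phone.process(frame)  # 2. phone does the heavy lifting
    return glasses.render(result)  # 3. glasses display the compact result

print(split_compute_tick(Glasses(), Phone()))
```

The point of the pattern is visible in the loop: the glasses only ever touch a small `SensorFrame` going out and a short overlay string coming back, so they need neither a big battery nor a fast processor.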
We do not yet know when Samsung's smart glasses will launch or whether they will even have an in-lens display, although Google's announcement strongly suggests the first batch of Android XR smart glasses will have one, as seen in the demo videos Google shared with us. Android XR will eventually run on glasses without screens, but for now, Google believes displays are crucial to the form factor because they enable richer content and more output-side capabilities.