Google accelerates with Android XR: new AI glasses, Galaxy XR headsets, and Project Aura at the heart of the ecosystem

Last update: 09/12/2025

  • Google enhances Android XR with features like PC Connect, travel mode, and realistic avatars for Galaxy XR.
  • In 2026, two types of AI glasses with Android XR will arrive: one without a screen and one with an integrated screen, in collaboration with Samsung, Gentle Monster, and Warby Parker.
  • XREAL is preparing Project Aura, lightweight wired XR glasses with a 70-degree field of view and a focus on productivity and entertainment.
  • Google has opened Developer Preview 3 of the Android XR SDK so developers can easily adapt their Android apps to the spatial environment.

Android XR glasses

Google has decided to step on the gas with Android XR and its new AI glasses, charting a roadmap that combines mixed reality headsets, wearable glasses, and developer tools in a single ecosystem. After years of low-key experiments in augmented reality, the company is back on the scene with more mature offerings designed for everyday use.

In recent months, the firm has detailed new features for Samsung's Galaxy XR headset, shown progress on the first AI glasses based on Android XR, and previewed Project Aura, wired XR glasses developed in collaboration with XREAL. All of this is built around Gemini, Google's AI model, which becomes the core of the experience.

Android XR takes shape: more features for the Galaxy XR headset

During the event "The Android Show: XR Edition," held on December 8th in Mountain View and followed closely in Europe, Google confirmed that Android XR is now operational on the Galaxy XR headset, with over 60 games and experiences available on Google Play. The goal is to turn this system into a common layer that unifies headsets, smart glasses, and other spatial devices.

One of the biggest new features is PC Connect, an application that lets users connect a Windows computer to the Galaxy XR and display its desktop within the immersive environment as just another window. This way, users can work on their PC, move windows, run office applications, or play games, all on virtual screens floating in space in front of them.

Android XR also incorporates a travel mode, designed for those who use the headset while in motion, for example on a train, plane, or car (always as a passenger). This function stabilizes on-screen content so that windows do not "escape" when you move your head or the vehicle jolts, reducing the feeling of dizziness and making it more comfortable to watch movies, work, or browse the internet on long journeys.

Another relevant piece is Your Likeness, a tool that generates a three-dimensional avatar of the user's face. This digital model is created from a scan performed with a phone and replicates, in real time, facial expressions, head gestures, and even mouth movements during video calls on Google Meet and other compatible platforms, offering a more natural presence than classic cartoon avatars.

PC Connect and travel mode are now available to Galaxy XR owners, while Your Likeness is currently in beta and will roll out in the coming months. Google has also announced System Autospatialization, a feature planned for 2026 that will automatically convert 2D windows into immersive 3D experiences, allowing videos or games to be transformed into spatial scenes in real time without the user having to do anything.


Two families of AI-powered glasses: with and without a screen

Android XR models with and without screen

Beyond the headsets, Google has confirmed that it will launch its first AI-powered glasses based on Android XR in 2026, in collaboration with partners such as Samsung, Gentle Monster, and Warby Parker. The strategy rests on two product lines with distinct but complementary approaches: screenless glasses focused on audio and camera, and glasses with an integrated display for lightweight augmented reality.

The first type of device is AI glasses without a screen, designed for those who want smart assistance without changing how they see the world. These frames incorporate microphones, speakers, and cameras, and rely on Gemini to respond to voice commands, analyze the surroundings, or perform quick tasks. Intended uses include taking photos without pulling out your phone, receiving spoken directions, asking for product recommendations, or asking questions about a specific place.

The second model goes a step further and adds a display integrated into the lens, capable of showing information directly in the user's field of vision. This version lets you see Google Maps directions, real-time translation subtitles, notifications, or reminders superimposed on the real world. The idea is to offer a lightweight augmented reality experience, without the weight or bulk of a mixed reality headset, but with enough visual information to be useful.

During internal demonstrations, some testers have used monocular prototypes (a single display in the right lens) and binocular versions (a display for each eye). In both cases it is possible to see floating interfaces, video calls in virtual windows, and interactive maps that follow the direction of the gaze, taking advantage of the microLED technology Google has been developing since its acquisition of Raxium.

These prototypes have been used to test, for example, music playback with on-screen controls, video calls with the other person's image floating in view, and real-time translation with superimposed subtitles. Google's Nano Banana Pro model has even been used to edit photos taken with the glasses themselves and see the result in a matter of seconds, without taking the phone out of the pocket.

Integration with Android, Wear OS and the Better Together ecosystem

One of the advantages Google wants to exploit with these Android XR glasses is integration with the Android and Wear OS ecosystem. The company insists that any developer already programming for Android has a head start: mobile applications can be projected from the phone to the glasses, offering rich notifications, media controls, and spatial widgets without requiring major initial changes.

In pre-launch demonstrations, photos taken with the screenless glasses could be previewed on a Wear OS watch through an automatic notification, reinforcing the "Better Together" idea of a connected ecosystem. Google has also shown hand gestures and head movements being used to control the Android XR interface, reducing reliance on physical controls.

In the area of navigation, Android XR adapts the Google Maps Live View experience to the glasses. When looking straight ahead, the user sees only a small card with the next turn; tilting the head downward unfolds a larger map with a compass indicating the direction they are facing. According to those who have tried it, the transitions are smooth and the feeling is reminiscent of a video game guide, but integrated into the real environment.


Google is also encouraging third parties, such as transportation services, to take advantage of these capabilities. One example shown was integration with transportation apps like Uber, where the user can follow, step by step, the route to the pick-up point at an airport, seeing instructions and visual references directly in their field of vision.

Looking ahead to 2026, the company plans to deliver development kits for monocular Android XR glasses to selected programmers, while everyone will be able to experiment with an optical see-through emulator in Android Studio. The user interface has been designed with a complexity similar to a home screen widget, a fit better suited to quick, contextual uses than to traditional desktop applications.

Project Aura: XR glasses with cable and expanded field of view

Xreal Google AR Project Aura-3

Alongside the development of lightweight AI glasses, Google is collaborating with XREAL on Project Aura, wired XR glasses powered by Android XR that aim to sit between a bulky headset and everyday glasses. The device focuses on a lightweight design, but relies on an external battery and a connection to a computer for extra power.

Project Aura offers a field of view of about 70 degrees and uses optical see-through technology that superimposes digital content directly onto the real environment. With this, the user can distribute multiple work or entertainment windows across the physical space without blocking what is happening around them, something especially useful for productivity tasks or for following instructions while doing other activities.

One practical use would be following a cooking recipe in a floating window placed over the countertop while preparing the actual ingredients, or consulting technical documentation while working hands-free. The device draws power from an external battery or directly from a computer, which can also project its desktop into the mixed reality environment, turning the glasses into a kind of spatial monitor.

Regarding control, Project Aura adopts a hand-tracking system similar to the Galaxy XR's, although with fewer cameras; this should help users adapt quickly if they have already tried other XR devices. Google has announced that it will offer more details about the launch throughout 2026, when the device is expected to start arriving on the market.

This category of wired glasses reinforces the idea that Android XR is not limited to a single type of device. The same software base aims to span everything from immersive headsets to lightweight glasses, including hybrid solutions like Aura, so that users can choose the level of immersion and comfort they need at any given moment.

Partnerships with Samsung, Gentle Monster and Warby Parker

Google Android XR Gentle Monster

To avoid repeating the mistakes of Google Glass, the company has opted to collaborate with brands specializing in optics and fashion. Samsung handles much of the hardware and electronics, while Gentle Monster and Warby Parker contribute their expertise in frame designs that can pass for conventional glasses and remain comfortable for many hours.


During The Android Show | XR Edition, Warby Parker confirmed that it is working with Google on lightweight, AI-enabled glasses, with a planned launch in 2026. Although details on pricing and distribution channels have not yet been released, the company speaks of frames designed for everyday use, far removed from the experimental look of Google's first attempts a decade ago.

In this context, Android XR and Gemini provide the technological layer, while the partners focus on discreet frames with a good fit and manageable weight. The goal is clear: the glasses should look and feel like any other commercial model, but with integrated AI and augmented reality capabilities that add value without drawing too much attention.

These alliances put Google in direct competition with Meta and its Ray-Ban Meta glasses, as well as with Apple's advances in spatial computing. The company's strategy, however, leans on open platforms and industrial collaboration, trying to bring traditional eyewear makers and developers into the Android XR ecosystem.

Tools and SDKs: Android XR opens up to developers

Android XR Show

To make all these pieces fit together, Google has launched Developer Preview 3 of the Android XR SDK, which officially opens up the APIs and tools needed to create spatial applications for both headsets and XR glasses. The interface follows Material 3 and the design guidelines Google internally calls Glimmer, adapted to floating elements, cards, and 3D panels.

The message for the sector is clear: those who already develop for Android are, to a large extent, ready to make the leap to Android XR. Through the SDK and emulators, programmers can begin porting their mobile applications, adding augmented reality layers, integrating gesture controls, or customizing how notifications appear in space.
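As an illustration of what that porting can look like, here is a minimal sketch using Google's Jetpack Compose for XR libraries (the `androidx.xr.compose` preview artifacts). Names such as `Subspace`, `SpatialPanel`, and `SubspaceModifier` come from the published developer documentation, but since this is preview software, treat exact packages and signatures as indicative rather than final; `ExistingAppScreen` stands in for whatever 2D Compose UI an app already has.

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun SpatialHome() {
    // Subspace switches to 3D layout when the device supports it;
    // on a regular phone the content can still render as a flat screen.
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1024.dp)   // panel size within the environment
                .height(640.dp)
                .movable()        // let the user reposition the panel
                .resizable()      // let the user resize it
        ) {
            // The existing 2D mobile UI, reused without major changes.
            ExistingAppScreen()
        }
    }
}

@Composable
fun ExistingAppScreen() { /* existing mobile UI goes here */ }
```

The key point the sketch illustrates is the one Google is making: the spatial layer wraps an unchanged 2D composable, so an Android team's first XR build can be mostly additive.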

Google insists it doesn't want to overwhelm users with complex interfaces. That's why many Android XR elements are deliberately simple: lightweight cards, floating controls, and contextual widgets that appear when needed and disappear when they no longer provide relevant information. The aim is to avoid the feeling of a "permanent screen" in front of the eyes and to foster a more natural relationship with the environment.

The company has made it clear that Android XR is an open platform, and that hardware manufacturers, game studios, productivity companies, and cloud services will have room to experiment. In Europe, the hope is that this approach will help new business, educational, and communication applications adopt mixed reality without having to build solutions from scratch.

Google's move with Android XR and the new AI glasses points to a scenario in which mixed reality and intelligent assistance spread across different device formats: headsets like the Galaxy XR for immersive experiences, lightweight glasses for everyday use, and wired models like Project Aura for those who prioritize productivity and image quality. If the company manages to square the circle of design, privacy, and usability, it's likely that in the coming years these glasses will cease to be seen as an experiment and will become a technological accessory as commonplace as the smartphone is today.

Related article:
XR Controllers and Accessories: What's Worth Buying and What to Skip