
Google is developing an ecosystem to support several
prototypes of smart glasses under the Android XR brand. The most prominent one is a collaboration with XREAL called Project Aura.
The AI-powered smart glasses are expected to be given a final retail name before launching to consumers in 2026, replacing Google's
first attempt in this market.
Google’s partnerships with Samsung, Warby Parker and Gentle Monster will now allow it to compete with Meta Platforms, which dominates the smart glasses
market today.
Meta has already made progress with its smart glasses, which had sold two million pairs as of February, according to Ray-Ban parent company EssilorLuxottica, as reported by several media outlets.
Google has confirmed the integration of immersive experiences into the
smart glasses -- including the ability to use maps and chat with Gemini.
Although Google has not officially confirmed plans to run ads on its smart glasses, it is highly likely that
advertising will become a core part of the company’s long-term strategy, given that advertising is the primary driver of its revenue.
The move toward advertising in smart glasses would give advertisers another path to connect with consumers -- not through traditional ad units, but through something more like product placements or agentic search with purchase features that allow consumers to connect with brands and complete transactions through apps like PayPal. In September, Google and PayPal announced a multiyear strategic partnership focused on agentic commerce and shopping services.
Google will offer two types of what the company describes as "lightweight glasses." One is phone-powered, and the other is a more advanced stand-alone model with enhanced
displays.
The AI glasses will have built-in speakers, microphones and cameras that allow users to chat naturally with Gemini, take photos and get help. The display models add an in-lens display that privately shows users helpful information when they need it, such as turn-by-turn navigation or translation captions.
The smart glasses will be powered by AI technology that differs significantly from Google's previous models, which ended in failure.
The latest version will enable users to interact with its AI products such as its Gemini chatbot, which in the future
will likely provide agentic purchase capabilities.
Android XR will serve as the glasses’ operating system, while XREAL’s Project Aura supports a 70-degree field of view and optical see-through technology that layers digital content directly onto the user’s view of the physical world, according to Google’s blog post.
Measurement challenges will surface, and Google will need to learn how to measure this new kind of engagement. Ads will not appear for consumers to click on; instead, they will signal Gemini to find the products in a store and buy them.
Technology companies have been thinking about smart
glasses for years. The journey for Google began with consumer applications, pivoting to industrial and enterprise use and now back again to consumers.
Google unveiled Google Glass in 2013, billing it as the future of technology -- despite its immature features and the bulky screen positioned above the right eye.
Many acquisitions brought Google to this point. In 2016, it acquired Eyefluence to bolster its virtual-reality efforts. The technology enabled Google to track eye movements and control images on digital screens -- what Eyefluence called “eye-interaction technology.”
Google is believed to have integrated Eyefluence’s technology into its hardware, including Daydream VR and future AR glasses, although there are few
details on specific product launches.
The technology became part of Google's internal research and development for immersive experiences, according to Google AI Mode.
Google has taken
a more strategic approach, focusing on new AI-powered smart glasses that provide an immersive experience and run on the Android XR platform, planned to arrive in 2026.
The new
technology will offer advanced tools and libraries for developers. The glasses will have a small display that brands can use to privately present information to users.
Two libraries for the
Jetpack XR SDK will power augmented experiences on AI glasses.
“Jetpack Projected” will bridge mobile devices and AI glasses with features that allow developers to access sensors, speakers and displays on the glasses. “Jetpack Compose Glimmer,” a new design language and user interface toolkit, will help developers craft augmented experiences on the display.
