Google's New AI Glasses

Written by: Sitara Nair

Google, one of the largest and most successful American tech companies, plans to release AI glasses: lightweight eyewear that puts artificial intelligence directly in your line of sight. Let’s dive into Google’s vision for AI eyewear and what it could mean for users.

You may have heard of ChatGPT, OpenAI’s advanced chatbot designed to interpret and produce human-like text on a wide range of topics, from answering user questions to composing long-form content or simply holding a conversation. While ChatGPT is one of the best-known and most user-friendly AI chatbots, Google has its own AI chatbot, called Gemini. Gemini offers multimodal understanding, meaning it can process and combine text, images, audio, video, and code. Much like ChatGPT, users can converse with Gemini and get feedback and help on nearly any topic they can think of. Google, however, being a multibillion-dollar tech company, is known for expansion, and it has integrated Gemini with many of its other apps, like Gmail, Maps, and Docs, to improve efficiency and management across them!

Now, Google plans to expand its use of Gemini further by working with Samsung and eyewear partners Gentle Monster and Warby Parker to create lightweight, normal-looking glasses for customers. Google announced the glasses, expected in 2026, in a blog post on Monday, December 8th. The company explained that they are designed to work without a screen, using built-in speakers, microphones, and cameras so users and Gemini can communicate. The glasses will also be able to take photos in real time, much as today’s smart glasses, like the Ray-Ban Meta, can record. Google describes these as offering “screen-free assistance,” but it also discussed a second type, “display” AI glasses, which have a somewhat different identity. Display AI glasses will, as the name suggests, show “helpful information, right when you need it, like turn-by-turn navigation or translation captions,” according to the Google blog. However, Google has not given a specific time frame for when the display glasses will be available to the public.

This isn’t the first time Google has released smart glasses; in 2013, it launched its first: Google Glass. Google Glass was an augmented-reality device with a small prism display just above the right eye lens, featuring WiFi, Bluetooth, bone-conduction audio, and simple apps (navigation, photos, etc.), all controlled by voice and touch. The idea of being able to access apps and see information right in front of your face, wherever you are, without an extra device, was exciting for many. Unfortunately, the glasses weren’t a hit. Many consumers saw little use for them, believing a phone could do the exact same things but better, and the $1,500 price made them a hard buy. Now, Google is preparing an AI-enhanced, sleeker version for next year, one that might finally reveal the true potential of Google Glass. Only time will tell!

Both of these new AI-driven glasses will run on Android XR, “the first Android platform built in the Gemini era,” which “powers an ecosystem of headsets, glasses, and everything in between,” Google says in its blog. On December 8th, the tech giant also described a line of wired Android XR glasses called Project Aura, from XREAL. Project Aura’s main selling point is its blend of “headset-like immersion” and “real-world presence” in a portable form. While it may sound like the glasses mentioned earlier, it is not quite the same. Project Aura is a lightweight “optical see-through” XR (extended reality) glasses project, developed by XREAL in partnership with Google and built on the Android XR platform. The compute and battery are offloaded into a tethered, puck-like device that the glasses connect to, so the glasses themselves can stay relatively light. In short, Project Aura is a full AR/XR headset-style product that overlays digital content onto the real world using optical see-through displays, while the newly announced AI glasses are simpler, everyday smart glasses focused on hands-free AI assistance (voice, camera, captions), with some models not using a display at all, which makes them more casual and less immersive than Aura.

So, are you excited for AI Glasses? Do you think they will take the place of the normal day-to-day glasses we know and love?

References

Snider, Mike. “Google AI Glasses Coming in 2026, Company Confirms.” USA Today, December 9, 2025. https://www.usatoday.com/story/tech/2025/12/09/google-ai-glasses-2026-release/87689145007/

Google. “Experiencing AI Naturally with Android XR Glasses.” Google Blog, 2025. https://blog.google/products/android/android-xr-gemini-glasses-headsets/

BBC News. “Google Reveals New AI-Powered Glasses Coming in 2026.” BBC News, 2025. https://www.bbc.com/news/articles/cwyx83n00k6o

Google. “Android Show XR Edition: Updates for Developers.” Google Blog, 2025. https://blog.google/products/android/android-show-xr-edition-updates/

CBS News. “Google Glass Cost $1,500 to Buy — but Only $80 to Make, Court Documents Reveal.” CBS News, 2023. https://www.cbsnews.com/news/google-glass-1500-to-buy-80-to-make/