Google recently confirmed it is working on two versions of smart glasses that combine artificial intelligence, everyday convenience, and a push toward seamless wearable tech. The company is targeting a 2026 release date and aims to make these smart glasses lightweight, stylish, and practical enough for daily use worldwide.
One version of the glasses will be “screen-free.” This model will rely on built-in cameras, microphones, and speakers so users can tap into the power of Google’s AI assistant, Gemini, without staring at a display. Even without a screen, wearers will be able to take photos, ask questions about their surroundings, get contextual assistance, and control functions hands-free.
The second version ups the ante with an in-lens display. With it, users could potentially see directions, translations, live captions, or other useful overlays directly within their field of vision. Both models will run on Android XR, Google’s platform for next-gen wearable and spatial computing devices, and will connect to a smartphone for processing.
To bring this to life, Google has teamed up with several established eyewear and tech-hardware partners: Warby Parker, Gentle Monster, Samsung, and others. The idea is to blend cutting-edge AI and XR hardware with the kind of style and comfort people expect from everyday eyewear.
Google’s 2026 smart glasses initiative could be a serious turning point for wearable tech. By combining AI, lightweight design, and real-world utility, the glasses aim to turn science fiction into everyday reality.
If the company delivers on its promises of comfort, seamless AI assistance, and reliable privacy controls, these glasses will be more than just cool tech. They could shift how we use technology throughout the day: less screen time in hand, more interaction through voice and vision.
For the tech-curious, it’s time to keep eyes on Android XR.
Are AI smart glasses finally ready for mainstream use in 2026?
Let us know!
If you liked this article, check out our other articles on Google.
