Google, AI and Gemini
Highlights
In a series of video demos, Google showed off how people wearing Android XR glasses might interact with apps such as Google Maps. A user asked their glasses’ Gemini AI chatbot for directions, and the device brought up a small hologram-like map at the bottom of the internal display.
Google finally addressed Android XR hardware, saying at I/O 2025 that Gemini AI smart glasses are coming soon.
At Google I/O 2025, the company shared more about how Android XR will be integrated into the smart glasses form factor, including helpful Gemini features.
During its Google I/O 2025 keynote on Tuesday, Google tossed around the Gemini name nonstop, to no one's surprise. It also spent some time talking about something called Project Astra, a key part of its visual AI technology.
At its I/O developer conference, Google announced two new ways to access its AI-powered "Live" mode, which lets users search for and ask about anything they can point their camera at. The feature will arrive in Google Search as part of its expanded AI Mode and is also coming to the Gemini app on iOS.
Google says Gemini will make interactions in the car feel far more fluid. Instead of issuing stiff, robotic commands — “Navigate to Starbucks” — you’ll be able to speak naturally and combine multiple tasks. For example: “Text Sarah, I’m on my way, and tell me how long it’ll take to get to the Thai place we went to last month.”
Warby Parker Inc. has joined forces with Alphabet Inc.'s Google to develop AI-powered glasses for all-day wear, in a bid to take on Facebook parent Meta's smart glasses. Warby Parker and Google said Tuesday that the first line of smart glasses is expected to launch after this year and will incorporate artificial intelligence into both prescription and non-prescription lenses.