Android XR: Google demonstrates AR glasses at TED conference

Google demonstrated smart glasses with an integrated display and Gemini functions at a TED conference. The company is apparently making progress in development.

Google appears to be preparing for the launch of the first AR glasses running the new Android XR operating system. At a TED conference in Vancouver, Canada, Google executives demonstrated a prototype of AI glasses with an integrated display live on stage. The smart glasses are barely distinguishable from conventional glasses.

As the tech portals Axios and Goodgoodgood report, Android XR head Shahram Izadi and Nishta Bathia, product manager for Glasses and AI, demonstrated various functions and scenarios on stage.

During the event, Izadi showed live translation from Farsi to English and demonstrated image recognition by scanning a book. The translation appeared as subtitles on the glasses' display.

Izadi explained that real-time features such as seeing, hearing and reacting are still "quite rudimentary". A function called "Memory" represents the next stage: according to the Android XR head, the AI glasses use a "rolling contextual window in which the AI remembers what you see without having to be told what to keep an eye on". A camera integrated into the glasses records the surroundings.
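To make the idea more concrete, here is a minimal Kotlin sketch of what such a rolling visual memory could look like in principle: a bounded window of recent observations that can later be queried. The class, capacity and lookup logic are illustrative assumptions, not details Google has disclosed.

```kotlin
import java.time.Instant

// Hypothetical sketch of a "rolling context window": keep only the most
// recent scene observations so a later question can be answered from memory.
data class Observation(val timestamp: Instant, val description: String)

class RollingVisualMemory(private val capacity: Int = 100) {
    private val window = ArrayDeque<Observation>()

    // Add a new observation; evict the oldest once the window is full.
    fun record(description: String) {
        if (window.size == capacity) window.removeFirst()
        window.addLast(Observation(Instant.now(), description))
    }

    // Naive lookup: the most recent observation mentioning the query term.
    fun recall(query: String): Observation? =
        window.lastOrNull { it.description.contains(query, ignoreCase = true) }
}

fun main() {
    val memory = RollingVisualMemory(capacity = 5)
    memory.record("hotel key card on the shelf, left of a vinyl record")
    memory.record("coffee cup on the desk")
    println(memory.recall("card")?.description)
}
```

In a real system the "descriptions" would come from a vision model and the lookup from the language model rather than a simple text match; the sketch only shows the bounded-window bookkeeping the quote hints at.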

To find a misplaced hotel key card, Bathia asked Gemini: "Do you know where I last put the card?" "The hotel card is to the left of the record," Gemini replied, pointing via the glasses to the objects on the shelf behind her.

The "Memory" function was first demonstrated in a video at Google I/O 2024 and is part of Google's Project Astra. Just a few days ago, Google gave the AI assistant Gemini Live "eyes" to enable the function described.

Izadi also explained that the glasses do not work standalone but must be paired with a smartphone. Data flows back and forth between the devices, with the smartphone acting as the computing hub. The advantage of this approach is that the glasses can be kept light and compact.
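Conceptually this is a thin-client split: the glasses capture and display, while the paired phone does the heavy lifting. The following Kotlin sketch illustrates that division of labor; the interface and class names are assumptions for illustration and not based on any published Android XR API.

```kotlin
// Hypothetical split between lightweight glasses and a companion phone.
interface CompanionPhoneLink {
    // Ship a captured camera frame to the phone, get back text to display.
    fun process(frameJpeg: ByteArray): String
}

class GlassesClient(private val phone: CompanionPhoneLink) {
    fun onFrameCaptured(frameJpeg: ByteArray) {
        // All model inference happens on the phone; the glasses only render.
        val caption = phone.process(frameJpeg)
        renderSubtitle(caption)
    }

    private fun renderSubtitle(text: String) = println("DISPLAY: $text")
}

fun main() {
    // Stand-in for the phone side, which would normally run the AI model.
    val fakePhone = object : CompanionPhoneLink {
        override fun process(frameJpeg: ByteArray) =
            "Subtitle (${frameJpeg.size} bytes processed on phone)"
    }
    GlassesClient(fakePhone).onFrameCaptured(ByteArray(1024))
}
```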

Demos like those from Google and Meta indicate that the technology has been miniaturized to the point where augmented reality in a glasses form factor is now technically feasible.

The first potentially mass-market smart glasses are expected to reach the market this year. Despite the current demo, one of the first smart glasses based on Android XR may not necessarily come from Google itself, although that is not entirely ruled out, but rather from hardware partner Samsung.

"Project Haean" is said to be smart glasses with a display based on Android XR, set to be presented by the end of 2025. In early 2025, in the course of the Galaxy S25 launch, Samsung confirmed that it has smart glasses in the works. First, however, the presentation of the XR headset "Project Moohan" is expected.

Meta is also planning similar glasses this year. Developed under the code name Hypernova, they are said to be a further development of the Ray-Ban glasses, this time with an integrated display. According to Bloomberg, they are expected to cost more than $1,000, making them noticeably more expensive than the previous Ray-Ban models, which are available in Germany from 329 euros.

