Google won't ship technology from Project Astra, a broad effort to build AI apps and “agents” that enable real-time, multimodal understanding, until next year at the earliest.
Google CEO Sundar Pichai revealed this timeline during his remarks on Google's third-quarter earnings call today. “[Google is] building experiences where AI can see and reason about the world around you,” he said. “Project Astra is a glimpse of that future, and we're working to ship experiences like this in 2025.”
Project Astra, which Google demoed at its I/O developer conference in May 2024, encompasses a wide range of technologies, from a smartphone app that recognizes the world around it and answers relevant questions to an AI assistant that can take actions on a user's behalf.
In a pre-recorded demo at I/O, Google showed a Project Astra prototype answering questions about what's in view of a smartphone's camera, such as which neighborhood the user is in or the name of a broken part on a bicycle.
The Information reported this month that Google plans to launch a consumer-focused agent experience, one that can buy products, book flights, and handle other chores, as early as December of this year. That now seems unlikely, unless the experience in question is separate from Project Astra.
Anthropic recently became one of the first companies to ship a large-scale generative AI model that can control apps and web browsers on a PC. But the model struggles with many basic tasks, illustrating just how difficult AI agents are to build.