Google won't ship technology from Project Astra, its wide-ranging effort to develop AI apps and "agents" for real-time, multimodal understanding, until next year at the earliest.
Google CEO Sundar Pichai announced the timeline in remarks during Google's third-quarter earnings call on Tuesday. "[Google] is building out experiences where AI can see and reason about the world around you," he said. "Project Astra is a glimpse of that future. We are working to ship experiences like this as early as 2025."
Project Astra, which Google unveiled at its I/O developer conference in May 2024, encompasses a range of technologies, from smartphone apps that can recognize the world around them and answer relevant questions to AI assistants that can take actions on a user's behalf.
In a pre-recorded demo at I/O, Google showed off a Project Astra prototype that answered questions about things in the field of view of a smartphone's camera, such as what neighborhood a user might be in or the name of a part on a broken bike.
The Information reported this month that Google plans to launch a consumer-focused agent experience, one that can buy a product, book a flight, and handle other such tasks, as early as December. That now seems unlikely, unless the experience in question is separate from Project Astra.
Anthropic recently became one of the first companies to release a large generative AI model capable of controlling apps and web browsers on a PC. But illustrating just how difficult developing AI agents can be, Anthropic's model struggles with many basic tasks.