Why Google and OpenAI Don't See Eye to Eye on Voice Assistants

Google has spent the past year and a half chasing OpenAI’s conversational artificial intelligence. But as the technical gap narrows between their products, the companies this week revealed an important difference in how their AIs will interact with people. (To read the five main takeaways from Google’s two-hour spate of announcements, see this article we published last night.)
Here’s what’s going on. OpenAI on Monday unveiled an emotionally expressive, female-sounding AI voice assistant reminiscent of “Samantha” from the film “Her,” complete with digressions and humor that made it seem human. On Tuesday at its annual developer conference, Google took a different approach with an assistant, Project Astra, that also had a female voice but spoke in a more matter-of-fact tone and focused on practical tasks. It identified a neighborhood based on a user’s view from their window and named a technical component of a sound speaker, both of which the assistant could see through the user’s phone camera. In case you’re wondering, Astra stands for “advanced seeing and talking responsive agent,” and Google executives on Tuesday used the adjective “agentive” no fewer than five times, so get ready to hear that word a lot!