Google Gemini Live is the first AI that almost encourages you to be rude
Google has made its Gemini AI assistant a little more human today by letting you interrupt or switch topics mid-conversation. The tech giant announced the release of the long-promised Gemini Live for mobile devices at its Made by Google 2024 event. Instead of the specific commands common to Google Assistant or Alexa, Gemini Live will respond to casual language and can even simulate speculation and brainstorming. The idea is to make conversations with the AI feel more natural.
Gemini Live is a bit like being on the phone with a really fast personal assistant. The AI can talk and complete tasks at the same time. The multitasking is currently available to Gemini Advanced subscribers on Android devices, but Google said it will expand to iOS soon. The personalization extends to what Gemini sounds like, too, with 10 new voice options in a range of styles. Google claims the upgraded speech engine behind them also delivers more emotionally expressive and realistic interactions.
Despite similarities, Gemini Live isn’t just Google’s version of OpenAI’s ChatGPT Advanced Voice Mode. ChatGPT in Voice Mode can struggle with long conversations. Gemini Live is built with a larger context window, making it better at remembering what was said earlier in the conversation.
Gemini Live forever
Google also unveiled a longer list of Gemini extensions, integrating the AI more deeply with Google’s suite of apps and services. Upcoming extensions will include integrations with Google Keep, Tasks, and expanded features on YouTube Music. The company described how you could ask Gemini Live to retrieve a recipe from Gmail and add the ingredients to a shopping list in Keep, or create a playlist of songs from a specific era using YouTube Music. This level of integration allows Gemini to interact more seamlessly with the apps and content on a user’s device, offering assistance tailored to the context of their activities.
Still, Gemini Live isn’t quite where the demo at Google I/O 2024 suggested it would be. The visual processing capabilities showcased there are still to come. They will allow Gemini to see and respond to users’ surroundings via photos and video captured with the mobile device, which could significantly expand the utility of Gemini Live. The AI assistant’s new features fit well with Google’s efforts to integrate Gemini into every part of your life. Google’s vision is a conversation with Gemini that never ends.
Source: www.techradar.com