- Gemini can now chain actions together to complete complex tasks
- Gemini Live gets multimodal capabilities on the latest phones
- Gemini will evolve into a full-featured AI assistant with Project Astra
To coincide with the launch of the Samsung S25 series of devices, Google announced some impressive updates to its Gemini AI platform at today’s Galaxy Unpacked event. Many of the improvements are specific to new devices like the Samsung S25, but some also work on the older Samsung S24 and on Pixel 9 phones.
The standout feature is Gemini’s new ability to chain actions together. This means you can now do things like connect to Google Maps to search for nearby restaurants, then draft a text in Google Messages to send to the people you’d like to invite to lunch, all through Gemini.
The chaining capability will be added to all devices running Gemini “dependent on extensions”, meaning a developer must write an extension linking an app to Gemini before that app can take part in a chain. All the major Google apps already have Gemini extensions, and extensions are also available for the Samsung Reminder, Samsung Calendar, Samsung Notes, and Samsung Clock apps.
Gemini Live goes multimodal
Google’s Gemini Live, the part of Gemini that lets you have a natural, human-like conversation with AI, is also getting some big multimodal upgrades. You will now be able to add pictures, files, and YouTube videos to the conversation you are having, so, for example, you can ask Gemini Live, “Hey, take a look at this picture of my school project and tell me how I could make this better”, then upload the picture and get a response.
However, the multimodal Gemini Live enhancements are not available on every device: they require a Galaxy S24, Galaxy S25, or Pixel 9 to work.
Project Astra
Finally, Google has announced that Project Astra capabilities will be coming in the next few months, arriving first on the Galaxy S25 and Pixel phones. Project Astra is Google’s prototype AI assistant that lets you interact with the world around you, using your phone’s camera to ask questions about what you’re looking at and where you are. You can simply point your phone at something and ask Gemini to tell you about it, or ask it when the next stop on your bus route is.
Project Astra works on mobile phones, but takes your experience to the next level when combined with Google’s prototype hands-free AI glasses, so you can simply start asking Gemini questions about what you’re looking at without having to interact with a screen at all.

While there’s still no news on a release date for these next-gen Google Glasses, they’ll join Meta’s Ray-Ban glasses in the emerging market for AI wearables when they finally become available.