- Meta has launched a new standalone app for its Meta AI assistant, powered by Llama 4
- The app connects Meta's platforms and devices, including Ray-Ban Meta smart glasses
- Meta AI personalizes its behavior based on your Instagram and Facebook activity
Meta AI is moving into its own space with the launch of a new standalone app. Powered by Meta's new Llama 4 AI model, the app is both an independent product and a replacement for Meta View, which was previously used to connect to Ray-Ban Meta smart glasses.
Meta is making a big bet here, positioning voice interaction as the most intuitive and natural way to engage with its AI. The app supports hands-free chat and even includes a demo of full-duplex speech, a feature that lets you talk and listen at the same time.
That's useful given how keen Meta is to connect Meta AI with the company's broader product portfolio, especially the Ray-Ban Meta smart glasses. The AI-enabled glasses now work via the Meta AI app, which replaces the Meta View app they previously relied on.
This means you can start a conversation on one platform and easily continue it on another. All you have to do is open the Devices tab in the app, and your settings and saved information carry over.
Ask a question through your smart glasses, get an answer from Meta AI, and then pick up the same thread on your phone or desktop later. You can switch from a voice chat in your glasses to reading the conversation in the app's History tab. For example, you could be on a trip and ask Meta AI through your glasses to find a nearby bookstore; the answer is stored in your Meta AI app for later review.
The other major element of the Meta AI app is the Discover feed. There you can see publicly shared prompts and the images people have generated from them, then remix them for your own purposes.
In addition, the desktop version of Meta AI has been refreshed with a new interface and more image-generation options. There is also an experimental document editor for composing and editing text, adding visuals, and exporting the result as a PDF.
Meta has spent many months spreading Meta AI across Instagram, Facebook, Messenger, and WhatsApp, but this is the first time Meta AI isn't hosted inside another mobile app.
The AI's connection to Meta's other apps gives it an edge (or a flaw, depending on your view) by letting it tailor its behavior to what you do elsewhere. Meta AI draws on your Instagram and Facebook activity to customize its answers.
Ask it where to go for dinner and it may suggest the ramen spot your friend shared last week. Ask for tips for an upcoming holiday and it will remember that you once posted that you love to "travel light but overpack emotionally" and suggest an itinerary to match.
Meta clearly wants Meta AI to be central to all your digital activities. The way the company is positioning the app suggests it expects you to check in with it constantly, whether on your phone or on your head.
There are obvious stylistic parallels with the ChatGPT app. But Meta seems to want to differentiate its app from OpenAI's creation by emphasizing the personal over the broader utility of an AI assistant.
And if there is one thing Meta has more of than almost anyone, it's personal data. Meta AI tapping your social data, voice habits, and even your smart glasses to deliver answers tailored to you feels very on-brand.
The idea of Meta AI forming a mental scrapbook of your life based on what you liked on Instagram or sent on Facebook may not appeal to everyone, of course. But if you're worried, you can always put on the smart glasses and ask Meta AI for help.



