Meta launches standalone AI app


This week, at its inaugural LlamaCon developer conference, Meta launched the first version of the Meta AI app. Meta says the standalone app, built on Llama 4, is designed to offer a personalised and socially integrated AI experience.

What does this mean? The app leverages data from users’ Facebook and Instagram accounts to provide more personalised responses, and it includes a "Discover" feed where users can share and explore AI-generated prompts. The app supports text, voice, and image inputs, and it ships with a demo of full-duplex voice, which allows simultaneous speaking and listening for a more natural conversational experience. According to Meta: “It doesn’t have access to the web or real-time information, but we wanted to provide a glimpse into the future by letting people experiment with this.” The app is also being merged with the Meta View companion app for Ray-Ban Meta glasses.

The idea is to demystify AI and show “people what they can do with it,” Meta’s VP of Product, Connor Hayes, told The Verge. A paid tier for the AI assistant, offering further features, is also planned for later this year.

Meta has been playing catch-up with ChatGPT and others, as a TechCrunch article points out, so the launch of this app shows Meta leveraging what makes it different from other AI companies. Meta already holds a wealth of data on its users — their social interactions, likes, and dislikes — drawn from their Facebook and Instagram accounts. As the company says: “Your Meta AI assistant also delivers more relevant answers to your questions by drawing on information you’ve already chosen to share on Meta products, like your profile, and content you like or engage with.”