Meta AI, Meta's AI-powered assistant available on Facebook, Instagram, Messenger and the web, can now converse in more languages and take stylish selfies. Also, starting today, Meta AI users can route questions to Meta's latest flagship model, Llama 3.1 405B, which the company says can handle more complex queries than the previous models underpinning the assistant.
The question is whether these upgrades are enough to meaningfully improve the overall Meta AI experience. Many reviewers, including TechCrunch's Devin Coldewey, found Meta AI disappointing at launch: early versions struggled with facts, figures, and web searches, and often failed at basic tasks like finding recipes or airfares.
Llama 3.1 405B could be a game changer: Meta claims the new model is particularly strong at math and coding, making it well suited to solving math homework, explaining scientific concepts, debugging code, and more.
But there's a catch: Meta AI users will have to manually switch to Llama 3.1 405B, and they'll be limited to a certain number of queries per week before Meta AI automatically falls back to a less capable model, Llama 3.1 70B.
Image credit: Meta
Meta is calling the Llama 3.1 405B integration a “preview” for the time being.
Generative selfies
Llama 3.1 405B isn't the only new model: Meta AI is also gaining a generative AI model that powers a selfie feature.
Called “Imagine Yourself,” the model generates images based on a photo of a person and prompts like “Imagine me surfing” or “Imagine me on vacation at the beach.” Available in beta, Imagine Yourself can be invoked in Meta AI by typing “Imagine me” followed by any prompt, so long as it's safe-for-work.
Meta didn't disclose what data Imagine Yourself was trained on, but the company's terms of service allow it to train on public posts and images across its platforms. That policy, and its convoluted opt-out process, hasn't sat well with all users.
Meta AI's new editing tools for Imagine Yourself let users add, remove, and change objects in images with prompts like “change the cat to a corgi.” Starting next month, Meta AI will add an “Edit with AI” button to surface additional fine-tuning options. And in the coming days, Meta says, users will see new shortcuts for sharing Meta AI-generated images to feeds, stories, and comments across Meta's apps.
Support for new languages and Quest headsets
Meta AI is also replacing the voice command feature on Meta Quest VR headsets, rolling out in "experimental mode" in the US and Canada next month. Meta says that with pass-through enabled, users will be able to ask Meta AI questions about objects in their physical surroundings, such as "can you tell me what kind of top would complete this outfit?" while holding up a pair of shorts.
Meta says Meta AI is now available in 22 countries, including Argentina, Chile, Colombia, Ecuador, Mexico, Peru, and Cameroon. The assistant currently supports French, German, Hindi, Romanized Hindi, Italian, Portuguese, and Spanish, with Meta promising more languages to come.