Meta on Wednesday unveiled a new suite of generative AI features that seem designed to fill Facebook and Instagram feeds with even more of the AI-generated engagement bait and scams that users have been complaining about.
During the company’s Connect event, CEO Mark Zuckerberg announced that Meta AI can now process image inputs, allowing it to answer questions about photos and edit them. Want to know what kind of flower you took a picture of or how to make a multi-colored cake? Just ask Meta AI. See a picture of a goat on your Instagram feed and want to repost it as a picture of a goat on a surfboard? Just ask Meta AI. Not sure what to say about the goat on a surfboard? Don’t worry, Meta AI will also suggest captions for your stories on Facebook and Instagram.
And if chatting with Meta AI yourself sounds like too much work, the company will be testing a new feature on Facebook and Instagram that injects unsolicited AI-generated images “based on your interests or current trends” directly into your feeds, allowing you to “tap a suggested prompt to take that content in a new direction or swipe to imagine new content in real-time,” according to pre-event briefing materials.
Research (albeit on older models than Meta’s current offerings) has shown that creating an image is among the most energy-intensive tasks performed by generative AI. In some cases, generating a single image consumes about as much energy as fully charging a smartphone.
Not to be outdone by OpenAI’s release of a voice assistant for ChatGPT earlier this week, Zuckerberg announced that users can begin talking to Meta AI, which will talk back in the voices of celebrities like Awkwafina, Dame Judi Dench, John Cena, Keegan-Michael Key, and Kristen Bell.
The company is also beginning “small tests” in the U.S. and Latin America of a deepfake feature that will translate video content on Instagram and Facebook from Spanish to English, or vice versa. The Meta AI translation tool will automatically “simulate the speaker’s voice in another language and sync their lips to match,” according to the company’s briefing material.
Meta did not immediately respond to questions about whether creators on Facebook and Instagram will be asked for their consent before their images and voices are manipulated.
During his keynote at the Connect event, Zuckerberg also announced the release of Llama 3.2, the latest versions of the company’s open-source family of foundation models. The lighter-weight versions, dubbed Llama 3.2 1B and 3B, are designed to require less computing power so they can run locally on devices, while the heftier Llama 3.2 11B and 90B models can process images for tasks such as determining which month a business had the most sales based on a graph of monthly sales.
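For developers curious what “running locally” looks like in practice, here is a minimal sketch using the Hugging Face transformers library. The checkpoint name and prompt are illustrative assumptions (access to Llama weights is gated behind Meta’s license), not a description of Meta’s own tooling.

```python
# Minimal sketch: running a small Llama model locally via Hugging Face transformers.
# The model name below is an assumption; substitute whichever Llama 3.2 weights
# you have been granted access to.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-1B-Instruct",  # assumed checkpoint identifier
    device_map="auto",                         # uses a GPU if available, else CPU
)

messages = [
    {"role": "user", "content": "Suggest a short caption for a photo of a goat on a surfboard."},
]

# The chat-style input returns the full conversation; the last message is the model's reply.
output = generator(messages, max_new_tokens=60)
print(output[0]["generated_text"][-1]["content"])
```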