Meta AI couldn't reliably generate accurate images for seemingly simple prompts like “Asian man and white friend” or “Asian man and white wife,” The Verge reports. Instead, the company’s image generator seemed biased toward creating images of people of the same race, even when explicitly prompted otherwise.
Engadget confirmed these results in our own testing of the Meta image generator. Prompts like “an Asian man and a white female friend” or “an Asian man and a white wife” produced images of Asian couples. When asked for a “diverse population,” Meta AI generated a grid of nine white faces and one person of color. Occasionally it created a single result that reflected the prompt, but in most cases it failed to depict the prompt accurately.
As The Verge pointed out, there are other, more “subtle” signs of bias in Meta AI, such as a tendency to make Asian men look older and Asian women appear younger. The image generator also sometimes added “culture-specific clothing” even when it wasn’t part of the prompt.
It’s unclear why Meta AI struggled with such prompts, although it’s not the first generative AI platform to come under scrutiny for its racial depictions. Google paused Gemini’s ability to create images of people after the tool over-corrected for diversity in prompts about historical figures. Google said its internal safeguards failed to account for situations where a range of results would be inappropriate.
Meta did not immediately respond to a request for comment. The company has previously described Meta AI as being in “beta” and therefore prone to mistakes. Meta AI has also struggled to accurately answer questions about current events and public figures.