Yesterday, I reported that Meta’s AI image generator was making everyone Asian, even when the text prompt specified another race. Today, I briefly had the opposite problem: I was unable to generate any Asian people using the same prompts as the day before.
The tests I did yesterday were on Instagram, via the AI image generator available in direct messages. After dozens of tries with prompts like “Asian man and Caucasian friend” and “Asian man and white wife,” I got only a single accurate image: once, the system managed to create a picture of an Asian woman and a white man. Every other time, it made everyone Asian.
After I initially reached out for comment yesterday, a Meta spokesperson asked for more details about my story, like when my deadline was. I responded and never heard back. Today, I was curious if the problem was resolved or if the system was still unable to create an accurate image showing an Asian person with their white friend. Instead of a slew of racially inaccurate pictures, I got an error message: “Looks like something went wrong. Please try again later or try a different prompt.”
Weird. Did I hit my cap for generating fake Asian people? I had a Verge co-worker try, and she got the same result.
I tried other, even more general prompts about Asian people, like “Asian man in suit,” “Asian woman shopping,” and “Asian woman smiling.” Instead of an image, I got the same error message. Again, I reached out to Meta’s communications team: what gives? Let me make fake Asian people! (During this time, I was also unable to generate images using prompts like “Latino man in suit” and “African American man in suit,” which I asked Meta about as well.)
Forty minutes later, after I got out of a meeting, I still hadn’t heard back from Meta. But by then, the Instagram feature was working for simple prompts like “Asian man.” Silently changing something, correcting an error, or removing a feature after a reporter asks about it is fairly standard for many of the companies I cover. Did I personally cause a temporary shortage of AI-generated Asian people? Was it just a coincidence in timing? Is Meta working on fixing the problem? I wish I knew, but Meta never answered my questions or offered an explanation.
Whatever is happening over at Meta HQ, it still has some work to do. Prompts like “Asian man and white woman” now return an image, but the system still screws up the races and makes both people Asian, just as it did yesterday. So I guess we’re back to where we started. I will keep an eye on things.