AI-generated Asians were briefly unavailable on Instagram

Yesterday, I reported that Meta’s AI image generator was making everyone Asian, even when the text prompt specified another race. Today, I briefly had the opposite problem: I was unable to generate any Asian people using the same prompts as the day before.

The tests I did yesterday were on Instagram, via the AI image generator available in direct messages. After dozens of tries, I was unable to generate a single accurate image using prompts like “Asian man and Caucasian friend” and “Asian man and white wife.” Only once did the system successfully create a picture of an Asian woman and a white man; otherwise, it kept making everyone Asian.

After I initially reached out for comment yesterday, a Meta spokesperson asked for more details about my story, like when my deadline was. I responded and never heard back. Today, I was curious if the problem was resolved or if the system was still unable to create an accurate image showing an Asian person with their white friend. Instead of a slew of racially inaccurate pictures, I got an error message: “Looks like something went wrong. Please try again later or try a different prompt.”

Weird. Did I hit my cap for generating fake Asian people? I had a Verge co-worker try, and she got the same result.

I tried other even more general prompts about Asian people, like “Asian man in suit,” “Asian woman shopping,” and “Asian woman smiling.” Instead of an image, I got the same error message. Again, I reached out to Meta’s communications team — what gives? Let me make fake Asian people! (During this time, I was also unable to generate images using prompts like “Latino man in suit” and “African American man in suit,” which I asked Meta about as well.)

Forty minutes later, after I got out of a meeting, I still hadn’t heard back from Meta. But by then, the Instagram feature was working for simple prompts like “Asian man.” Silently changing something, correcting an error, or removing a feature after a reporter asks about it is fairly standard for many of the companies I cover. Did I personally cause a temporary shortage of AI-generated Asian people? Was it just a coincidence in timing? Is Meta working on fixing the problem? I wish I knew, but Meta never answered my questions or offered an explanation.

Whatever is happening over at Meta HQ, it still has some work to do: prompts like “Asian man and white woman” now return an image, but the system still screws up the races and makes them both Asian, just like yesterday. So I guess we’re back to where we started. I will keep an eye on things.

Screenshots by Mia Sato / The Verge
