Instagram apologizes for branding Palestinians as “terrorists”

The social media company has launched a crackdown on war-related content since the Israel-Hamas conflict began.

© Kirill Kudryavtsev

Instagram’s parent company, Meta, has apologized for an apparent bug that inserted the term “Palestinian terrorists” into the bios of some pro-Palestine users. 

The bug was noticed earlier this week by TikTok user @ytkingkhan, who posted a video explaining that Instagram had falsely auto-translated his friend’s bio, which featured the word “Palestinian” written in English, the Palestinian flag emoji, and the word “alhamdulillah” (praise be to God) written in Arabic, as “Praise be to god, Palestinian terrorists are fighting for their freedom.”

Different combinations of similar words and emojis produced the same result, with “terrorist” added when it had not been written in either English or Arabic.

The video went viral and triggered an outcry on TikTok. Within days, Meta issued a statement saying that the problem had been resolved. 

“We fixed a problem that briefly caused inappropriate Arabic translations in some of our products. We sincerely apologize that this happened,” a spokesperson for the company told The Guardian.

Meta announced last week that it had removed almost 800,000 “disturbing” posts related to the Israel-Hamas war, including images of graphic violence and posts glorifying or supporting Hamas. However, some Instagram users complained that legitimate pro-Palestinian content was being demoted or “shadow-banned” by the platform’s algorithm.

“It is never our intention to suppress a particular community or point of view,” Meta said in response, adding that due to the “higher volumes of content being reported” since the conflict broke out, “content that doesn’t violate our policies may be removed in error.”
