Meta apologizes after Instagram users see graphic and violent content

Instagram has introduced a new privacy setting that defaults all new and existing underage accounts to private mode.
Brandon Bell | Getty Images
Meta apologized on Thursday after an error caused some Instagram users to see a flood of violent and graphic content recommended on their personal “Reels” page.
“We are fixing an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake,” a Meta spokesperson said in a statement shared with CNBC.
The apology comes after a number of Instagram users took to various social media platforms to voice concerns about an influx of violent and “not safe for work” content in their feeds.
Some users said they saw such content even with Instagram’s “Sensitive Content Control” enabled.
According to Meta policy, the company works to protect users from disturbing imagery and removes content that is particularly violent or graphic.
Prohibited content includes videos “depicting dismemberment, visible innards or charred bodies,” as well as content that contains “sadistic remarks towards imagery depicting the suffering of humans and animals.”
However, Meta says it does allow some graphic content if it helps users to condemn and raise awareness about important issues such as human rights abuses, armed conflicts or acts of terrorism. Such content may come with limitations, such as warning labels.
On Thursday, CNBC viewed several posts on Instagram Reels that contained gory and violent content. The posts were labeled as “Sensitive Content.”