Facebook to remove content if it celebrates, glorifies violence
Facing flak for temporarily removing the video of Philando Castile, an African-American man who was fatally shot by a police officer during a traffic stop last week, social media giant Facebook has said that it removes content only if it celebrates or glorifies violence, not merely because it is graphic or disturbing.
The video of the incident was broadcast live by Castile’s girlfriend, Diamond Reynolds, immediately after he was shot by police in his car, and it created a furore across the US and on social media.
The video disappeared from Facebook Live after nearly one million views on social media and news websites, and the social media giant said it was temporarily unavailable due to a technical glitch. However, Facebook declined to say exactly what caused the glitch.
The video, however, returned to the site after about an hour, carrying a warning label marking it “disturbing”.
On Saturday, Facebook explained its censorship policy for Live video, which “contradicts theories that the video disappeared due to Facebook waffling on whether it should stay up, a high volume of reports of it containing violent content, a deletion by police who had taken possession of Castile’s girlfriend’s phone and Facebook account, or a request from police to remove it”, technology website TechCrunch reported.
The temporary removal raised questions about Facebook’s roles and responsibilities for hosting citizen journalism that could be controversial or graphic.
According to Facebook’s Community Standards, which define what is and is not allowed on the social network, the rules covering everything from pornography to violence to hate speech apply to Live video just as they do to photos and other videos.
“The policy on graphic content is that Facebook does not allow and will take down content depicting violence if it’s celebrated, glorified or mocks the victim. However, violent content that is graphic or disturbing is not a violation if it’s posted to bring attention to the violence or condemn it,” the company was quoted as saying.
Users can report any content, including Live videos in progress, as offensive for one of a variety of reasons, including that it depicts violence.
Facebook’s Community Standards team, which operates 24/7 worldwide, reviews content even if it has received only a single report flag.
The social media giant clarified that the team can review content whether it’s public or privately shared.
“The volume of flags has no bearing on whether content is or is not reviewed, and a higher number of flags will not trigger an automatic take-down,” the company noted.
There is no option to report content as “graphic but newsworthy,” or any other way to report that content could be disturbing and should be taken down, the report said.
“Instead, Facebook asks that users report the video as violent, or with any of the other options. It will then be reviewed by team members trained to determine whether the content violates Facebook’s standards,” it added.