Facebook, Preparing for Chauvin Verdict, Will Limit Posts That Might Incite Violence
The company said it planned to limit posts that contain misinformation and hate speech related to the trial to keep them from spilling over into real-world harm.
April 19, 2021, 12:34 p.m. ET
By Davey Alba
Facebook on Monday said it planned to limit posts that contain misinformation and hate speech related to the trial of Derek Chauvin, the former Minneapolis police officer charged with the murder of George Floyd, to keep them from spilling over into real-world harm.
As closing arguments began in the trial and Minneapolis braced for a verdict, Facebook said it would identify and remove posts on the social network that urged people to bring arms to the city. It also said it would protect members of Mr. Floyd’s family from harassment and take down content that praised, celebrated or mocked his death.
“We know this trial has been painful for many people,” Monika Bickert, Facebook’s vice president of content policy, wrote in a blog post. “We want to strike the right balance between allowing people to speak about the trial and what the verdict means, while still doing our part to protect everyone’s safety.”
Facebook, which has long positioned itself as a site for free speech, has become increasingly proactive in policing content that might lead to real-world violence. The Silicon Valley company has been under fire for years over the way it has handled sensitive news events. That includes last year’s presidential election, when online misinformation about voter fraud galvanized supporters of former President Donald J. Trump. Believing the election to have been stolen from Mr. Trump, some supporters stormed the Capitol building on Jan. 6.
Leading up to the election, Facebook took steps to fight misinformation, foreign interference and voter suppression. The company displayed warnings on more than 150 million posts with election misinformation, removed more than 120,000 posts for violating its voter interference policies and took down 30 networks that posted false messages about the election.
But critics said Facebook and other social media platforms did not do enough. After the storming of the Capitol, the social network stopped Mr. Trump from being able to post on the site. The company’s independent oversight board is now debating whether the former president will be allowed back on Facebook and has said it plans to issue its decision “in the coming weeks,” without giving a definite date.
The death of Mr. Floyd, who was Black, led to a wave of Black Lives Matter protests across the nation last year. Mr. Chauvin, who is white, faces charges of manslaughter, second-degree murder and third-degree murder in Mr. Floyd’s death. The trial began in late March. Mr. Chauvin did not testify.
Facebook said on Monday that it had determined that Minneapolis was, at least temporarily, “a high-risk location.” It said it would remove pages, groups, events and Instagram accounts that violated its violence and incitement policy; take down attacks against Mr. Chauvin and Mr. Floyd; and label misinformation and graphic content as sensitive.
The company declined to comment further.
“As the trial comes to a close, we will continue doing our part to help people safely connect and share what they are experiencing,” Ms. Bickert said in the blog post.