Software doesn't get satire. In monitoring speech, Facebook sometimes acts against opponents of extremism. As one cartoonist put it: ''Those of us speaking truth to power are being caught in the net intended to capture hate speech.''

Since 2013, Matt Bors has made a living as a left-leaning cartoonist on the Internet. His site, The Nib, runs cartoons from him and other contributors that regularly skewer right-wing movements and conservatives with political commentary steeped in irony.

One cartoon in December took aim at the Proud Boys, a far-right extremist group. With tongue planted firmly in cheek, Mr. Bors titled it ''Boys Will Be Boys'' and depicted a recruitment session in which new Proud Boys were trained to be ''stabby guys'' and to ''yell slurs at teenagers'' while playing video games.

Days later, Facebook sent Mr. Bors a message saying that it had removed ''Boys Will Be Boys'' from his Facebook page for ''advocating violence'' and that he was on probation for violating its content policies.

It wasn't the first time Facebook had dinged him. Last year, the company briefly took down another Nib cartoon - an ironic critique of then-President Donald J. Trump's pandemic response, the substance of which supported wearing masks in public - for ''spreading misinformation'' about the coronavirus.

Instagram, which Facebook owns, removed one of his sardonic anti-violence cartoons in 2019 because, the photo-sharing app said, it promoted violence.

What Mr. Bors encountered was the result of two opposing forces unfolding at Facebook. In recent years, the company has become more proactive at restricting certain kinds of political speech, clamping down on posts about fringe extremist groups and on calls for violence.

In January, Facebook barred Mr. Trump from posting on its site altogether after he incited the crowd that stormed the U.S. Capitol.

At the same time, misinformation researchers said, Facebook has had trouble identifying the slipperiest and subtlest of political content: satire. While satire and irony are common in everyday speech, the company's artificial intelligence systems - and even its human moderators - can have difficulty distinguishing them.

That's because such discourse relies on nuance, implication, exaggeration and parody to make a point.

That means Facebook has sometimes misunderstood the intent of political cartoons, leading to takedowns. The company has acknowledged that some of the cartoons it expunged - including those from Mr. Bors - were removed by mistake, and it later reinstated them.

''If social media companies are going to take on the responsibility of finally regulating incitement, conspiracies and hate speech, then they are going to have to develop some literacy around satire,'' Mr. Bors, 37, said in an interview.

Emerson T. Brooking, a resident fellow for the Atlantic Council who studies digital platforms, said Facebook ''does not have a good answer for satire because a good answer doesn't exist.''

Satire shows the limits of content moderation policy and may mean that a social media company needs to become more hands-on to identify that type of speech, he added.

Many of the political cartoonists whose commentary was taken down by Facebook were left-leaning, in a sign of how the social network has sometimes clipped liberal voices. Conservatives have previously accused Facebook and other Internet platforms of suppressing only right-wing views.

In a statement, Facebook did not address whether it had trouble spotting satire. Instead, the company said it made room for satirical content - but only up to a point. 

Posts about hate groups and extremist content, it said, are allowed only if the posts clearly condemn or neutrally discuss them, because the risk for real-world harm is otherwise too great. 

The World Students Society thanks authors Mike Isaac and Cade Metz.
