Fixing Facebook groups may help stop false claims. The QAnon conspiracy theory, promotions of bogus health treatments and calls for violence based on false claims of election fraud have a common thread: Facebook groups.

Those forums for people with shared interests can be wonderful communities for avid gardeners in the same neighborhood or parents whose children have a rare disease.

But for years, it's also been clear that the groups turbocharge some people's inclinations to get into heated online fights, spread engrossing information whether it's true or not and scapegoat others.

I don't want to oversimplify and blame Facebook groups for every bad thing in the world. And mitigating the harms of Facebook is not as simple as the company's critics believe.

But many of the toxic side effects of Facebook groups are a result of the company's choices. I asked experts in online communications what they would do to reduce the downsides of the groups.

Here are some of their suggestions:


Facebook has said it would extend a temporary pause on computerized recommendations for people to join groups related to politics. Some experts said that Facebook should end computer-aided group suggestions entirely.

It's nice that Facebook suggests a forum about growing roses to someone who posts about gardening. But for years, Facebook's group recommendations have proved to be easily manipulated and to have pushed people toward increasingly fringe ideas.

In 2016, Facebook's research found that two-thirds of people who joined extremist groups did so at Facebook's recommendation, The Wall Street Journal reported.

Automated group recommendations were one of the ways that the QAnon conspiracy theory spread, my colleague Sheera Frenkel has said.

Ending these computerized suggestions isn't a silver bullet. But it's nuts how often activists and academics have screamed about how harmful recommendations are, and Facebook has only tinkered at the margins.


The social media researchers Nina Jankowicz and Cindy Otis have proposed not allowing groups above a certain number of members to be private, meaning newcomers must be invited, without regular human review of their content.

"A lot of truly toxic groups are unsearchable and invite-only, and that's hugely problematic," Jankowicz told me.


Renee DiResta, a disinformation researcher at the Internet Observatory at Stanford, said that Facebook needs to "take more decisive action" against the groups that repeatedly engage in harassment or otherwise break Facebook's rules. Facebook did take some steps in this direction last year.

Jade Magnus Ogunnaike, a senior director at the racial justice organization Color of Change, also said that Facebook should stop using contractors to review material on the site.

She said converting those workers to employees would be fairer and could help improve the quality of oversight over groups.


Joan Donovan, the research director of Harvard University's Shorenstein Center on Media, Politics and Public Policy, has suggested that big Internet companies should hire thousands of librarians to provide people with vetted information.

The World Students Society thanks author Shira Ovide.

