6/03/2025

THE ETHICIST TAP : [ A ] META 1/2

 


I'M NOT an expert on Meta's internal workings, but like many I've long assumed that social media plays a role in political polarization.

Meta's platforms, notably Facebook, are engineered to keep users engaged, and outrage would seem to be a reliable tool for doing that. Still, recent research complicates the picture.

A study published in Science found that during the 2020 election cycle, Facebook users who were switched to a raw, chronological feed were no less politically polarized than those using the algorithmic one.

That doesn't let Facebook off the hook - some of its design choices have clearly intensified the worst dynamics online. A telling example:

In 2017, not long after the "angry" emoji was introduced, Facebook's algorithm gave it five times the weight of a "like," figuring that stronger emotion meant stronger engagement.

Internal research later showed that anger-generating posts were often hateful or misleading; maybe a bigger problem for Facebook was that users didn't like seeing that emoji on their posts. By 2020, the company zeroed out its weight.

A result was less misinformation - and, reportedly, no drop in engagement. The people at Facebook didn't set out to promote extremism, but the company's model encouraged it, until they decided to course-correct.

Some people are worried that Facebook is now drifting in the opposite direction - making more room for political extremism and hate speech.

Earlier this year, Meta rolled out "community notes" for U.S. users, a crowd-sourced system for "adding context," modeled on an X feature and replacing the use of professional third-party fact checkers.

Many see this as a response to sustained conservative pressure, particularly to the charge that Facebook's fact checkers disproportionately targeted right-wing content. [Fact checkers have countered that conservative outlets simply produced more misinformation.]

Meta's revised content guidelines have raised further concerns. One recent change, for instance, allows users to impute mental illness or call people "weird" on the basis of gender or sexual orientation, citing "political and religious discourse" and "common nonserious usage."

Now, I support broad latitude for free expression online.

If someone wants to call someone else "weird," for any reason, I don't think Facebook needs to step in.

I'd just block the user if this became a habit.

The analysis of the [ Q ] and the answer [ A ] continues in Part 2. The World Students Society thanks Professor Kwame Anthony Appiah.
