Facebook and other social media: The problems are deeper than they seem

2025-01-20 16:50:49


Meta CEO Mark Zuckerberg has faced a lot of criticism since announcing last week that his company would abandon its fact-checking process and limit content moderation on its platforms. The criticism is understandable, given the uncertainty over how Meta's new rules will address misinformation and other harmful material.

It’s worth remembering, though, that the company’s content moderation strategies — and those of nearly all other social media platforms — haven’t worked as expected. As Zuckerberg noted in a video about the changes, Meta’s automated moderation systems have often gotten it wrong. Even when they correctly identified “misinformation” — a vague term that’s much harder to define than you might think — they’ve struggled to remove it, given the volume and persistence of “bad” actors.

In any case, the problems that social media creates for its users go much deeper than content moderation. The biggest concerns relate to how the platforms distribute content. Technology companies should help address these concerns by doing much more to disclose their algorithms to the public, allowing for greater scrutiny of how they work.

Companies should also provide access to their data so that researchers and policymakers can analyze the impact that social networks have on users and on society itself.

The need for such transparency has become even more apparent since Zuckerberg’s announcement. Instead of using external fact-checkers to determine the accuracy of content, he said, Meta will adopt a system similar to X's Community Notes feature, which relies on users themselves to debunk false claims. At the same time, the company will relax its content filters so that review is focused on "illegal and high-risk" violations such as terrorism, child sexual exploitation, and fraud.

It remains to be seen how effective the Community Notes system will be at combating misinformation. The idea has potential, but X (formerly Twitter) has had mixed results with the model, and some critics say it struggles to keep pace with the influx of false claims. It is also unclear how Meta's algorithms will treat content that previously would have been removed. Will they allow material that is abusive toward certain groups, for example, to spread unhindered? Will content flagged as inaccurate be deprioritized?
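"User-based verification" here refers to what X has publicly described as bridging-based ranking: a note is surfaced only when raters who usually disagree both find it helpful. The sketch below is a minimal Python illustration of that idea, with toy data, variable names, and parameter choices that are entirely illustrative rather than anything from Meta's or X's actual systems. Each rating is modeled as a mix of rater generosity, note quality, and a shared "viewpoint" factor, and notes are ranked by the quality term that remains once viewpoint-driven agreement has been factored out.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy ratings: (rater, note, rating); 1 = "helpful", 0 = "not helpful".
    # Raters 0-1 and raters 2-3 represent two opposing "camps".
    ratings = [
        (0, 0, 1), (1, 0, 1), (2, 0, 1), (3, 0, 1),  # note 0: endorsed by both camps
        (0, 1, 1), (1, 1, 1), (2, 1, 0), (3, 1, 0),  # note 1: endorsed by one camp only
    ]

    n_raters, n_notes = 4, 2
    mu = 0.0                                   # global intercept
    rater_bias = np.zeros(n_raters)            # how generous each rater is
    note_bias = np.zeros(n_notes)              # "bridging" quality of each note
    rater_fac = rng.normal(0, 0.1, n_raters)   # latent viewpoint of each rater
    note_fac = rng.normal(0, 0.1, n_notes)     # latent slant of each note

    lr, reg = 0.05, 0.03  # learning rate, L2 penalty that keeps the factors small
    for _ in range(3000):
        for u, n, r in ratings:
            pred = mu + rater_bias[u] + note_bias[n] + rater_fac[u] * note_fac[n]
            err = r - pred
            mu += lr * err
            rater_bias[u] += lr * (err - reg * rater_bias[u])
            note_bias[n] += lr * (err - reg * note_bias[n])
            new_u = rater_fac[u] + lr * (err * note_fac[n] - reg * rater_fac[u])
            new_n = note_fac[n] + lr * (err * rater_fac[u] - reg * note_fac[n])
            rater_fac[u], note_fac[n] = new_u, new_n

    # Cross-camp agreement ends up in note_bias; one-camp agreement is absorbed
    # by the factor terms, so note 0 should outscore note 1.
    for n in range(n_notes):
        print(f"note {n}: bridging score = {note_bias[n]:.2f}")

The design choice worth noting is the factor term: agreement that can be explained by a shared viewpoint earns a note nothing. That is precisely the property critics worry breaks down when false claims arrive faster than cross-camp consensus can form.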

The answers to these questions will have real-world consequences. Multiple studies have shown that frequent social media use among teens is associated with greater exposure to cyberbullying and, in turn, with more frequent suicidal thoughts. Allowing public scrutiny of Meta’s algorithms would help ensure that the company takes any potential harm from its content seriously.

There is ample evidence that social media platforms create echo chambers that reinforce what users already believe. Internal Facebook research disclosed by whistleblower Frances Haugen in 2021, for example, suggests that misinformation about coronavirus vaccines continued to spread on the platform despite the company’s efforts to remove anti-vaccine content.

Such studies show how much insight can be gleaned from the data that online platforms collect. In the right hands, this data could help society identify and address the negative effects of social media use. Imagine, for example, if public health researchers could analyze how vaccine skeptics consume information and which messages resonate with them most. They could develop better strategies to counter vaccine skepticism, combating misinformation more effectively than content moderation alone ever could.

A few years ago, a bipartisan group of lawmakers in Congress proposed legislation that would force platforms to share their data with university-affiliated researchers while establishing standards for privacy and cybersecurity.

That proposal deserves to be revisited. Lawmakers should also consider requiring social media companies to be more transparent about how their algorithms work.

Experience has shown that social media companies cannot eliminate all harmful content from their platforms. That does not mean their efforts have been in vain; it means that even with millions of dollars in investment, there is a limit to what moderation can accomplish.

If Meta and other social media companies want to rebuild trust with their users, openness is essential. Although it has become less common to acknowledge the good that social media can bring, there was once great optimism that it would improve society. With transparency that reaches all the way down to the foundations of these platforms, that potential may still be within reach.
