A 2024 survey of eight countries by the Reuters Institute[1] shows that people generally want platforms themselves to set the rules on what content appears online. Across the countries, between 51 and 70 percent of respondents said platforms should manage their own policies. Only minorities in any country supported governments taking the lead.

The UK recorded the highest support for government intervention, with close to half of respondents saying authorities should play a stronger role across platform types. Germany also leaned in that direction, a reflection of its 2017 Network Enforcement Act (NetzDG), which obliges platforms to remove content deemed illegal. By contrast, in countries like Brazil and Argentina, well over 60 percent preferred companies to hold responsibility.

Differences by Platform Type

When asked about social media networks such as Facebook, X, or TikTok, around six in ten respondents said these platforms should set their own rules. Video sites such as YouTube showed a similar pattern, with roughly 58 to 65 percent across countries assigning responsibility to the companies themselves. Search engines like Google and Bing recorded comparable results. Messaging apps including WhatsApp, Messenger, and WeChat followed the same trend.

Generative AI platforms produced a more divided response. In the UK, 50 percent wanted government oversight of AI companies, compared with only about 40 percent who trusted the companies to regulate themselves. In the US and Germany, the split was much closer, with small gaps of just a few percentage points. In Japan, South Korea, and Brazil, clear majorities still placed responsibility with the companies.

Little Variation by Demographics

When broken down by gender, age, or education, differences were narrow. Men and younger respondents were slightly more likely to trust AI platforms to manage their own content, while women and older people leaned a bit more toward public oversight. Yet even in these groups, support for company responsibility stayed above half.

Political affiliation mattered more but not enough to change the overall picture. Among people on the political left, about 43 percent wanted governments more involved, compared with only 25 percent on the right. Even so, 57 to 69 percent across all political groups said platforms should hold the main responsibility.

Views on Misinformation

When asked specifically about false or misleading content, majorities in every country wanted platforms held accountable. Depending on the country, between 56 and 76 percent said companies should be responsible when users spread misinformation. Opposition was much smaller, ranging from 14 to 33 percent.

In the US, about two-thirds supported platform responsibility for falsehoods, lower than the levels seen in Spain or Argentina, where roughly three-quarters agreed. Germany showed a similar pattern to the US, while in South Korea and Japan majorities of around 60 percent said platforms should carry responsibility.

Age differences were modest. Older respondents were more likely to hold companies accountable, with rates climbing above 70 percent in some countries, compared with closer to 60 percent among younger groups.

One notable result came from YouTube. Users of the platform were more likely than non-users to say the company should be held responsible for misleading content, suggesting first-hand exposure shaped views.

Political Splits

Politics again shaped the results. On the left, 80 percent across countries said social networks should be accountable for false information. On the right the figure was 65 percent, and among centrists it fell between those two points. A similar gap appeared for video sites, with 78 percent of left-leaning respondents favoring accountability compared with 65 percent of those on the right.

Real-World Consequences

The survey findings connect with recent events that showed how damaging misinformation can be. In the UK, false rumors in 2024 about the identity of a murder suspect sparked anti-immigration riots. In the US, false claims of electoral fraud fed into the January 6 Capitol attack. Brazil faced a comparable incident in 2023, when crowds stormed the National Congress after similar claims.

Current Models of Oversight

Different platforms already use different approaches. X allows users to add “Community Notes” to give context on posts. Meta dropped outside fact-checkers in the US and now relies more on user-led systems. Reddit and Mastodon use community moderators, while search engines usually downrank poor-quality pages through algorithms.

General Picture

Across countries and groups, the findings suggest people are skeptical of government control and more comfortable with platforms setting their own rules, though they also expect those same companies to take responsibility for false content. Generative AI remains the one area where more people favor a government role, especially in the UK.

Notes: This post was edited/created using GenAI tools.

Read next: A New Encyclopedia in the Making? Musk Thinks Grok Can Replace Wikipedia Editors[2]

References

  1. ^ Reuters Institute (reutersinstitute.politics.ox.ac.uk)
  2. ^ A New Encyclopedia in the Making? Musk Thinks Grok Can Replace Wikipedia Editors (www.digitalinformationworld.com)

By admin