An anonymous reader quotes a report from Gizmodo: The Royal Society is the UK’s national academy of sciences. On Wednesday, it published a report on what it calls the “online information environment,” challenging some key assumptions behind the movement to de-platform conspiracy theorists spreading hoax info on topics like climate change, 5G, and the coronavirus. Based on literature reviews, workshops and roundtables with academic experts and fact-checking groups, and two surveys in the UK, the Royal Society reached several conclusions. The first is that while online misinformation is rampant, its influence may be exaggerated, at least as far as the UK goes: “the vast majority of respondents believe the COVID-19 vaccines are safe, that human activity is responsible for climate change, and that 5G technology is not harmful.” The second is that the impact of so-called echo chambers may be similarly exaggerated and there’s little evidence to support the “filter bubble” hypothesis (basically, algorithm-fueled extremist rabbit holes). The researchers also highlighted that many debates about what constitutes misinformation are rooted in disputes within the scientific community and that the anti-vax movement is far broader than any one set of beliefs or motivations.

One of the main takeaways: The government and social media companies should not rely on “constant removal” of misleading content [because it is] not a “solution to online scientific misinformation.” It also warns that if conspiracy theorists are driven out of places like Facebook, they could retreat into parts of the web where they are unreachable. Importantly, the report makes a distinction between removing scientific misinformation and other content like hate speech or illegal media, where removals may be more effective: “… Whilst this approach may be effective and essential for illegal content (eg hate speech, terrorist content, child sexual abuse material) there is little evidence to support the effectiveness of this approach for scientific misinformation, and approaches to addressing the amplification of misinformation may be more effective. In addition, demonstrating a causal link between online misinformation and offline harm is difficult to achieve, and there is a risk that content removal may cause more harm than good by driving misinformation content (and people who may act upon it) towards harder-to-address corners of the internet.”

Instead of removal, the Royal Society researchers advocate developing what they call “collective resilience.” Pushing back on scientific disinformation may be more effective via other tactics, such as demonetization, systems to prevent amplification of such content, and fact-checking labels. The report encourages the UK government to continue fighting scientific misinformation but to emphasize the society-wide harms that may arise from issues like climate change rather than the potential risk to individuals who take the bait. Other strategies the Royal Society suggests are continuing the development of independent, well-financed fact-checking organizations; fighting misinformation “beyond high-risk, high-reach social media platforms”; and promoting transparency and collaboration between platforms and scientists. Finally, the report suggests that regulating recommendation algorithms may be effective.


Read more of this story at Slashdot.