How Algorithms Amplify Right Wing Disinformation
Technology Policy #118 | By: Mindy Spatt | October 14, 2024
Featured Photo: abcnews.go.com
__________________________________
Donald Trump’s bizarre rants about Haitian immigrants eating dogs, televised during the US presidential debate, were immediately challenged and disproven on air and afterwards. That didn’t stop the false and dangerous story from spreading online, thanks in part to algorithms. Algorithms determine which content a social media user will see based on their previous engagement, often amplifying disinformation and highly ideological content for Trump’s already right-wing base.
Analysis
Algorithms are the automated systems used by social media platforms to recommend content to users based on their previous online activities.
It is well-known that these distribution systems allow sensational and extreme disinformation to travel quickly in a variety of ways:
- Through fake accounts, which are prevalent in online conservative forums
- Through “echo chambers” that show users content they’ve already engaged with
- By manipulating metrics that push disinformation to the top of social media platforms and searches
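The “echo chamber” mechanism can be made concrete with a toy sketch. The snippet below is purely illustrative (it is not any platform’s actual code, and the function and topic names are invented for this example): it shows how ranking a feed by a user’s prior engagement naturally pushes more of the same content to the top, regardless of that content’s accuracy.

```python
# Illustrative sketch only: engagement-based ranking tends to surface
# whatever a user has already interacted with, accurate or not.
from collections import Counter

def rank_feed(posts, user_history):
    """Order posts by how often the user engaged with each post's topic.

    posts: list of (post_id, topic) tuples
    user_history: list of topics the user previously engaged with
    """
    topic_weight = Counter(user_history)  # more past engagement -> higher weight
    # Posts on already-engaged topics float to the top; nothing in the
    # scoring considers whether the content is true.
    return sorted(posts, key=lambda p: topic_weight[p[1]], reverse=True)

feed = rank_feed(
    [("a", "sports"), ("b", "election-rumor"), ("c", "cooking")],
    ["election-rumor", "election-rumor", "sports"],
)
# The "election-rumor" post ranks first because of prior engagement alone.
```

A user who clicks on one rumor is then shown more rumors, which generates more clicks, and the loop feeds itself; that feedback dynamic is what the studies cited below describe at platform scale.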
The disinformation is traveling in a distinctly right-wing direction. According to a study published in Nature, content labeled as deceptive by fact checkers is more likely to be seen by conservative Facebook users, and a majority of the political websites shared on Facebook reflect conservative views.
Researchers at Queens University of Charlotte, in North Carolina, found that after a user watched 20 popular videos suggesting election systems are rigged, TikTok’s algorithms began pushing more “election disinformation, polarizing content, far-right extremism, QAnon conspiracy theories and false Covid-19 narratives” toward that user, regardless of what terms they searched under.
All of this can keep extremely partisan political content at the top of users’ feeds, and means they are unlikely to see contrary information, regardless of its veracity.
And it pays. Going viral online can translate into real money for the people who amplify disinformation. According to the New York Times, in the days after Trump promoted his false pet-eating story on TV, videos that repeated the story appeared on YouTube with advertising from major corporations including Adobe and Mazda, likely meaning a huge payday for the creators of the disinformation and for YouTube’s owner, Google.
In a report published in March of this year, Public Knowledge made several recommendations for reducing election disinformation, including increased transparency around platforms’ algorithms and the underlying data they are based on, their paid advertising, and the outcomes of their “algorithmic decision-making.” That information, the report argues, should be made available to “researchers, regulators, independent auditors, and/or the public.”
The report highlighted that product design can make a difference: it can affect how quickly disinformation spreads and how deeply users engage with it, and it can push increasingly extreme content over time. In the group’s view, a dedicated regulator is needed to implement liability for defective designs that cause harm.
But right now those guardrails are not in place, which may be why Hillary Clinton has warned of an October surprise aimed at Vice President Harris, similar to the ridiculous “Pizzagate” story about Clinton that gained enormous traction online in 2016. “This is dangerous stuff,” she said. “It starts online often on the dark web. It migrates. It’s picked up by the pro-Trump media. It’s then reported on by everybody else, which makes sure it has about 100 percent coverage, and people believe it.”
Engagement Resources:
- Misinformation on Social Media, Queens University of Charlotte, https://library.queens.edu/misinformation-on-social-media/algorithms
- Lies All the Way Down – Combating 2024 Election Disinformation By Lisa Macpherson and Rajan Srinivasan, March 18, 2024, https://publicknowledge.org/lies-all-the-way-down/
- Social Media Algorithms Distort Social Instincts and Fuel Misinformation, Aug. 3, 2023, https://neurosciencenews.com/social-media-behavior-misinformation-23752/
- Tweaking Facebook Feeds Is No Easy Fix for Polarization, Studies Find, by Jeff Tollefson, 27 July 2023, https://www.nature.com/articles/d41586-023-02420-z
Stay in the know with the latest updates from our reporters by subscribing to the U.S. Resist News Weekly Newsletter. We depend on support from readers like you to aid in protecting fearless independent journalism, so please consider donating to keep democracy alive today!