Technology Brief #34

Facebook Profits From Political Polarization and Violence 

By Scout Burchill

January 28, 2021

Summary:

Facebook has been targeting members of online “patriot” and militia groups with ads for military gear such as body armor and weapon accessories. Despite a letter to Facebook CEO Mark Zuckerberg from members of Congress, calls from state attorneys general, and internal warnings from Facebook employees, research by the Tech Transparency Project reveals that these ads were still being served to users as late as January 17th.

According to the report published by the Tech Transparency Project, Facebook not only targeted users with ads for military tactical gear; the company’s algorithms also actively recommended other nationalist and militia groups to those users.

In the wake of the violence at the Capitol, social media companies are coming under greater scrutiny for their role in the polarization, radicalization and division of American society. Although there have been plenty of debates about specific decisions companies like Facebook have made in recent days, the dire state of affairs has put renewed emphasis on the built-in features of Facebook’s business model, which profits from conflict and division and has been criticized as an engine of radicalization.

Analysis:

Facebook makes money by selling targeted advertising. It sounds simple, but the results can be deadly. In order to sell targeted ads, Facebook surveils its users, collecting massive amounts of data on each one so that advertisers can target them more effectively. The company is very good at targeting potential customers simply because it knows so much about us. Because Facebook profits when users see ads, the platform is designed to keep people on the site for as long as possible. To do this, its algorithms promote content that fosters ‘engagement,’ and engaging content tends to be incendiary, sensationalistic and conflict-driven.

For Facebook, it’s a winning business formula. The more we scroll, the more information the company collects, the more ads we see and the better equipped it is to keep us scrolling while plastering our eyeballs with ads for the perfect pair of shoes we never knew we needed. Businesses like these have inspired new economic terms like the “attention economy,” which describes how companies vie for people’s dwindling attention as if they were mining a scarce resource. The truth is, it’s an exploitative business that is wreaking havoc on our society’s collective mind. A mounting pile of evidence suggests that Facebook rakes in profits while it radicalizes individuals and contributes to the growing political violence in our society.

The radicalization machine works in two ways. First, Facebook’s algorithm feeds users radicalizing, sensationalist content to capture their attention and keep them online. Content is promoted not by standards of quality but by how much engagement it receives, a method that naturally favors addictive and even incendiary material. Conspiracy theories, misinformation and conflict are part of the product. It’s no secret that over the past decade, social media has turned fringe actors into stars. Besides Facebook, one of the most extreme engines of radicalization is YouTube, which is owned by Google. Tristan Harris, co-founder of the Center for Humane Technology and star of Netflix’s excellent documentary The Social Dilemma, illustrates this built-in function of attention-maximizing algorithms by pointing out that Alex Jones’ YouTube videos have been recommended to users 15 billion times. In 2019, the New York Times published an article detailing how YouTube radicalized young people in Brazil by systematically recommending far-right figures, priming the public and paving the way for Jair Bolsonaro, the country’s current president, who openly pines for the days of its military dictatorship.

The second way the attention-maximizing business model radicalizes users is by rewarding those who post radical content. Earlier this month, the New York Times published a remarkable op-ed showing how Facebook’s algorithms rewarded users who embraced increasingly extremist views. One of the people featured in the article was a 26-year-old college student and veteran from Georgia. For years his Facebook feed was relatively standard and garnered little attention from others. That changed during the 2020 presidential election, when he began posting his thoughts about supposedly suspicious activity surrounding the vote. He instantly saw a sharp rise in engagement, mainly in the form of likes and comments, which encouraged him to lean further into these views. That was November. By January he was marching on the Capitol to ‘stop the steal.’ By the end of the article, he has been disowned by his family for his views and actions and is left with nowhere to turn but back to Facebook, chasing the feelings of community he once found there. This anecdote presents the flip side of the algorithm: by seeking to extract as much attention from users as possible, it constantly nudges them toward increasingly conspiratorial and fringe views.

These perverse incentive structures are baked into the business model of targeted advertising, and they come with myriad other negative consequences. For one, surveillance and data collection are central to the business: every post a user reads, the algorithm reads too. The personalization of each user’s experience also means the algorithm treats everyone differently, which makes it difficult to maintain a shared sense of reality, something a self-governing society depends on. Another serious harm is that these companies have monopolized digital advertising and absorb the bulk of its revenues, effectively sucking money away from good journalism and allocating it toward clickbait instead. This has wide-ranging effects on society, as good information is the lifeblood of healthy democracies. Finally, many smaller businesses now depend on social media for customers, which can leave them entirely at the mercy of small, imperceptible changes in policy or algorithmic emphasis. Tinkering with the news feed algorithm at Facebook headquarters could decimate your favorite niche outlet, which is exactly what happened to LittleThings, a small digital publisher that shared positive stories and informative articles for mothers and parents.

If the Biden Administration wants to tackle the harms of Big Tech, taking on the exploitative, ad-driven, attention-mining business model would be a good place to start. Doing so avoids the highly politicized debates about specific instances of deplatforming and gets to the root of why fringe conspiracies and violent groups proliferate on social media. Many organizations and individuals are already pushing for this kind of reform, with different ideas about how to accomplish it. Some argue for reforming Section 230 so that content promoted by a platform’s algorithms is no longer immune from liability; others argue for treating social media like other communications infrastructure and regulating it as such; others favor a combination of antitrust and regulation; and still others advocate rules that promote more ethical design. The administration should consider all of these ideas, because nearly anything is better than what we have now.

 

Engagement Resources:

https://www.humanetech.com/

https://www.economicliberties.us/big-tech-monopolies/

 

Learn More:

Tech Transparency Project Report

https://www.techtransparencyproject.org/articles/how-facebook-profits-insurrection

https://www.buzzfeednews.com/article/ryanmac/facebook-profits-military-gear-ads-capitol-riot

 

Accountability and Action Letter From Congress to Facebook

https://beta.documentcloud.org/documents/20457291-210115-duckworth-blumenthal-brown-joint-letter-to-facebook-accountability-and-action-following-january-6-attack

 

Tristan Harris on Alex Jones

https://twitter.com/tristanharris/status/1187403044062760960?lang=en

 

How YouTube Radicalized Brazil

https://www.nytimes.com/2019/08/11/world/americas/youtube-brazil.html

 

NYT Op-Ed on Facebook Promoting Far-Right Views

https://www.nytimes.com/2021/01/14/opinion/facebook-far-right.html

 
