
How Algorithms Change How We Think

Algorithmic narrowing can limit our worldview.

Key points

  • Algorithms are employed by social media and search engines to quickly and simply present the most relevant information to a user.
  • Algorithms have a prioritised objective that drives what information they present, but it doesn't necessarily align with what we need.
  • The more we look at the same information, the more that same information is presented to us, narrowing our worldviews.

In a society that vehemently protects our right to free speech, we lack the same vigour when it comes to freedom of thought. Every day, the majority of us succumb to the simplicity of algorithmic manipulation, volunteering our minds to potent social experimentation.

An algorithm is simply a process or set of rules followed by a computer as part of its problem-solving capabilities. Algorithms are not inherently good or bad. Yet they actively manipulate billions of people across the globe daily, making them the single most compelling propaganda machine to have ever existed. An analysis by MIT Technology Review in 2021 found that content from troll farms in Eastern Europe reached 140 million Americans via Facebook in the month before the 2020 U.S. elections; 75 percent of those users had never before engaged with the pages issuing the content.

Beyond sowing discord, the purpose of the content remains unclear, but what quickly became obvious is that its reach was down purely to the algorithms that power the popular social media platform. As computer science professor Stuart Russell OBE has observed, social media platforms, and therefore algorithms, have control over more people globally than any dictator has ever had.

For both social media and search engines, algorithms are employed to present the most relevant information to a user quickly and simply. They are extremely useful for disseminating large amounts of information swiftly and picking out what we are looking for. But they are also flawed. Algorithms typically have two purposes: to present you with information that you find relevant, and to achieve their written objective, such as maximising clicks or shares. This means that, as well as prioritising content similar to what you have consumed before, an algorithm will suggest content that is highly engaging but only tangentially connected. That choice has nothing to do with the quality of the content and everything to do with the algorithm's prioritised objective.
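To make the two objectives concrete, here is a minimal, hypothetical sketch in Python of how such a blended ranking score might look. Every field, weight, and function name is an illustrative assumption; no real platform's ranker is this simple.

```python
# A toy content-ranking sketch. All names, fields, and weights are
# illustrative assumptions, not any platform's actual implementation.

def similarity(item_topics: set, history_topics: set) -> float:
    """Jaccard overlap between an item's topics and the user's history."""
    if not item_topics or not history_topics:
        return 0.0
    return len(item_topics & history_topics) / len(item_topics | history_topics)

def score(item: dict, history_topics: set,
          w_relevance: float = 0.4, w_engagement: float = 0.6) -> float:
    """Blend the two objectives: familiarity and predicted engagement.
    Note that content quality appears nowhere in this score."""
    return (w_relevance * similarity(item["topics"], history_topics)
            + w_engagement * item["predicted_click_rate"])

feed = [
    {"title": "Balanced analysis", "topics": {"politics"}, "predicted_click_rate": 0.05},
    {"title": "Outrage bait", "topics": {"politics"}, "predicted_click_rate": 0.30},
]
history = {"politics", "economics"}
ranked = sorted(feed, key=lambda i: score(i, history), reverse=True)
print([i["title"] for i in ranked])  # the engaging item outranks the balanced one
```

Two items of equal relevance are separated purely by predicted engagement, which is exactly why tangentially connected but provocative content rises to the top.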

As users of algorithmic platforms, we are focused on consuming information that we find interesting, insightful, entertaining, or personally relevant. An algorithm's objective is much more mathematical, such as maximising clicks or driving engagement. Therefore, while you value content that interests or excites you, the algorithm favours what you are most likely to click on, and those two things are not necessarily aligned.

It is through this mismatch that algorithms can manipulate us. Not only do they focus on what you want to consume and what will garner the biggest reaction, but they actively seek to change you, making you more predictable and more aligned with their original purpose. In practice, this presents as an increase in addictive and reactive content while simultaneously exacerbating behavioural extremes. Over time, it continues to narrow your field of content, slowly but surely shaping the way you think.

As humans, we are predisposed to cognitive processing that simplifies the consumption of mass data into manageable outputs. This means we exhibit:

  • Confirmation bias, favouring content that feels familiar and aligns with our current beliefs.
  • Truth bias, leading us to believe people are telling us the truth, even when we have some evidence to the contrary.
  • The illusory truth effect, whereby information becomes more believable the more often we encounter it, regardless of whether it is true or false.

We are perfectly primed for manipulation by algorithms. The more of the same content we see, the narrower our viewpoint becomes and the more predictable we become. It is a self-reinforcing, ever-narrowing cycle.
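This feedback loop is easy to simulate. The toy Python sketch below, with entirely made-up update rules and numbers, shows how a small initial preference compounds once the feed rewards every click with more of the same and repeated exposure (the illusory truth effect) strengthens the preference itself.

```python
# A toy simulation of the narrowing feedback loop. The update rules
# and numbers are illustrative assumptions only.
import random

random.seed(0)
topics = ["news", "sport", "science", "gossip", "politics"]
feed_weights = {t: 1.0 for t in topics}   # the platform starts with a uniform feed
user_pref = {t: 1.0 for t in topics}      # the user starts with a mild bias...
user_pref["gossip"] = 2.0                 # ...towards one topic

for _ in range(50):
    # The platform serves a topic in proportion to current feed weights.
    shown = random.choices(topics, weights=[feed_weights[t] for t in topics])[0]
    # The user clicks with probability proportional to their preference.
    if random.random() < user_pref[shown] / 3.0:
        feed_weights[shown] *= 1.2   # clicked: the feed shows more of the same
        user_pref[shown] *= 1.05     # repeated exposure strengthens the preference

total = sum(feed_weights.values())
for t in topics:
    print(f"{t:9s} {feed_weights[t] / total:.0%} of the feed")
# After 50 rounds, the feed is dominated by the initially favoured topic.
```

Nothing in the loop is malicious; the narrowing falls straight out of the objective.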

For many, this reality may seem insignificant; seeing more of the content you enjoy might even sound appealing. The challenge is that this automated narrowing is demonstrably contributing to the rise in extremist views, reactive behaviours, and the spread of powerful conspiracy theories. Election outcomes, the Brexit vote, the Capitol riots, and even #Pizzagate have all been linked, to some extent, to algorithmic narrowing. We should be anything but satisfied with this systemic influence.

So, what can we do about it?

The best thing we can do as individuals is employ critical thinking and critical ignoring. Critical thinking is clear, rational thinking applied through the intentional consideration of content, reducing our susceptibility to coercion. Critical ignoring is deliberately selecting what to consume and what to ignore in the face of information overabundance.

This means employing every tool we can to maximise the breadth and depth of the content we consume. Employing media literacy techniques, understanding the source of content, considering who benefits from it, and actively seeking alternative sources can all help. Lateral reading is also extremely powerful: actively using multiple sources to evaluate the credibility of what you are reading. At the same time, it is essential to consider your own biases and question whether your assumptions are correct. This will broaden your viewpoint, prevent the inherent narrowing, and actively push back against the algorithm responsible.

There is also hope from the platforms themselves. Facebook is launching reporting and flagging tools to tackle fake news. Similarly, Google is actively seeking ways to minimise this coercive control. One key area of current research is pre-bunking: warning and teaching people about common manipulative techniques before they encounter them, to minimise the opportunity for manipulation.

Whatever you do, active participation in what you consume is essential. Vigorously pursuing quality content and alternative viewpoints makes you less predictable to algorithms and actively expands the content you are presented with over time.
