
Verified by Psychology Today

When Twitter Banned Trump

The hopeful act of disrupting disinformation.

This post is in response to
Why Disinformation Campaigns Are Dangerous

Disinformation campaigns are dangerous. People become convinced of false information and act on it. Refusing to wear masks. Or invading the Capitol. But there is hope. We can disrupt disinformation.

As I described in a companion post, many people have been swimming in a sea of lies, constantly exposed to a series of disinformation campaigns. Because most sources in their environment spread the disinformation, some people have become convinced that the election was stolen. Others believe that Covid isn't very dangerous, or deny the reality of climate change. And these beliefs lead to actions. Maybe everyday things, such as refusing to wear a mask when shopping. Or maybe more violent actions, such as invading the US Capitol.

Disinformation is dangerous. When people live in a climate of constant and repeated lies and conspiracy theories, they will often reject true information. And they often reject the people trying to share a different and more accurate version of the world. Perhaps you've lost contact with friends and family members who have fallen into the sea of disinformation.

As someone who studies the effects of misinformation, I have written a lot about this topic in the last few years. It has felt hopeless at times. The reason: people find themselves in a media environment in which false beliefs are repeated and the truth is excluded.

But I am becoming hopeful. I am hopeful because people are both talking about and acting on disrupting the flow of disinformation campaigns. In the days following the invasion of the US Capitol, multiple social media companies banned President Trump from their platforms. They did so based on his repeated violation of their terms of service. The social media companies have focused on the promotion of violence surrounding the attack on the Capitol. But some have also noted the repeated lies he presented.

This is why I am becoming hopeful. I want to highlight a critical thing about misinformation and disinformation campaigns on the Internet: it doesn't take much to disrupt the flow of misinformation, disinformation, and conspiracy theories. Drop a few people. Change a few algorithms. Just a few changes, probably very few, and we can emphasize truth on the Internet.

First, as the social media companies have done with Trump, they should de-platform others who are the primary sources of disinformation campaigns. In new research, Kate Starbird and her research team have found that most false information is funneled through a small number of accounts. Remove the people who repeatedly share false information. How many? Probably fewer than 100. These are the people and accounts that most disinformation campaigns funnel through. As Starbird and her team have noted, people who shared election disinformation also shared false information on other topics. Many quickly started sharing false information about last week's attack on the Capitol. In keeping with this goal, Twitter this week clarified its rules for responding to people who repeatedly share disinformation. Removing, or de-platforming, a few repeat offenders dramatically reduces the likelihood that new individuals will be repeatedly exposed to disinformation campaigns.

Second, change the algorithms to stop promoting lies. Let's face it, the lies are often compelling. They make interesting stories. Conspiracy theories hang together in entertaining ways. People seem to enjoy reading them. For this reason, the algorithms that various Internet companies use may often surface misinformation, disinformation, and conspiracy theories. Search for information about vaccines, and after a few links, you may find yourself down a rabbit hole of anti-vaccine websites. Social media platforms and Internet search engines already rely on algorithms. They already choose what information you see and what is easier to find. They do so with the goal of keeping you engaged and eventually leading you to products of some sort. Perhaps the algorithms should also value the reliability of the information presented. I know some companies have been working on this already.

These aren’t big changes. And I am hopeful because this week, some of the leading social media and Internet companies seem to be moving in this direction.

I want to close by noting that I am not leaving this post open for comments, although I generally do. But I suspect some people will find themselves wanting to argue with me. If you disagree with the statements regarding the outcome of the election or the risks of Covid, I would like to make a prediction and a suggestion.

First, the prediction. You've found yourself in changing social groups over the last few years. You've probably lost contact with some friends and family. Maybe you got tired of them trying to change your mind. Maybe they dropped you because you have changed. You've lost touch with people because of your belief in disinformation (although you say they are the ones following fake news). And you've found a new group of people and news sources that echo your views. This happens when you believe disinformation campaigns. Your friends and family probably miss you.

Second, the suggestion. Change your information feed for the next month. Drop the extreme media sources and groups. Try relying on the mainstream media you've disdained. Try reconnecting with friends and family. Try switching to more reliable information. In essence, try disrupting the flow of disinformation for yourself. Then use your critical thinking skills to reconsider the disinformation in light of a different information context. Maybe you will change your mind. And maybe you'll reconnect with friends and family.

More from Ira Hyman Ph.D.
More from Psychology Today