

The Internet Needs New Rules: My Top Three

Deleting Facebook will not solve our problems.

Key points

  • The current algorithms are designed to secretly push precisely personalized experiences.
  • Secret algorithms will be replaced by a participatory model.
  • The new algorithms need to be designed to collectively push for balanced and diverse experiences.

Facebook has hogged the headlines in the past few weeks. The congressional hearings cover only the tip of a deeply rooted issue. Yes, Facebook's algorithms are not good for mental health, social cohesion, or democracy. But such algorithms are not unique to Facebook: they are baked into the current design of the Internet.

This year marks the 32nd anniversary of the World Wide Web. From a simple communication platform, the Internet has evolved into a global movement propelled by algorithms. It is not a fair game. Through secretive manipulation, the Internet's algorithms inveigle users into precisely personalized experiences. Precise personalization means that everyone plays but only a few see where the ball is.

Fining Facebook for its badly designed algorithms is a repair at the margins that won't fix the Internet for everyone. We need new rules that will transform not only social media but the whole Internet. We need to optimize Internet algorithms for less commercial and more social gain.

Here are the top three rules to achieve that:

Personalized and diversified

The current algorithms adapt generic content to individual users. They use personal data (such as your search and browsing history from websites, or likes and views from social media) to tailor content to what interests you most. This personalization process works wonders for surfacing the most relevant information. When you see something relevant, you react to it. The people behind the algorithms know that each reaction means more attention, more data, more money.
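As a rough illustration of this mechanism, here is a minimal, hypothetical sketch of engagement-driven ranking. The function name, the feed, and the topic labels are invented for illustration; real platform algorithms are proprietary and vastly more complex.

```python
# Illustrative sketch only: rank a feed by how often the user has
# engaged with each item's topic before, so familiar topics rise.
from collections import Counter

def personalize(feed, liked_topics):
    """Rank items so topics the user reacted to before come first."""
    history = Counter(liked_topics)  # missing topics count as zero
    return sorted(feed, key=lambda item: history[item["topic"]], reverse=True)

feed = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "gardening"},
    {"id": 3, "topic": "politics"},
]
ranked = personalize(feed, liked_topics=["politics", "politics", "sports"])
# Items on "politics" float to the top; "gardening" sinks out of sight.
```

Each reaction the user gives feeds back into `liked_topics`, so the loop of attention, data, and revenue described above tightens with every click.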

While great for engagement, personalization will not give you serendipitous insights. It will not broaden your horizons. Instead, it will put you into an echo chamber with people who have similar likes and viewpoints.

The first rule, therefore, is to combine personalized information with content that is relevant not to individuals but to collectives. Just as the Internet brings us relevant content, it needs to be designed to bring us ideas and opinions that surprise us: ideas purposefully selected to be different from our own and from those of our friends. In research, we refer to this as the personalization-pluralization principle. In practice, it could look like a search engine that deliberately includes some results for what you are not looking for.
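A hedged sketch of how such a personalization-pluralization blend might work is below. The function name and the 30 percent diversity share are illustrative assumptions, not a prescribed design.

```python
# Illustrative sketch: blend a personalized feed with deliberately
# unfamiliar content drawn from outside the user's echo chamber.
import random

def pluralize(personalized, diverse_pool, diversity_share=0.3, seed=0):
    """Mix surprising items into a personalized feed (assumed 30% share)."""
    rng = random.Random(seed)  # fixed seed keeps the example reproducible
    n_diverse = max(1, int(len(personalized) * diversity_share))
    surprises = rng.sample(diverse_pool, k=min(n_diverse, len(diverse_pool)))
    mixed = personalized + surprises
    rng.shuffle(mixed)  # interleave, so surprises aren't buried at the end
    return mixed

personalized = ["cats", "dogs", "hamsters", "parrots"]
pool = ["astrophysics", "opera", "woodworking"]
mixed = pluralize(personalized, pool)
```

The design choice worth noting is that the diverse items are sampled independently of the user's history, which is exactly what makes them serendipitous rather than merely "more of the same."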

Sinusoidal growth

The commercial interest behind the most popular websites (Facebook, YouTube, Google) is clearly visible in how these companies treat your personal data. All your likes, views, clicks, and taps feed a never-ending growth of data. Think of an exponential curve that stretches ever further as more and more data arrive from more and more people connecting online.

The data are collected on multiple platforms, stored in multiple places. The algorithms are designed with the intention to get more and more information because “more” is perceived as “better” in the business world.

The more data the algorithms have about you, the more precisely they can personalize content for you. This works great for dating apps, where a precise match can result in a relationship, or for shopping vouchers, where a discount on the exact type of tea you like can result in you actually purchasing the tea.

But exponential data growth is not sustainable. There must be a ceiling, a point where we say that the algorithms have enough information about us. If we don't design for such a point, we get unexpected negative consequences. Take personalized education as an example: collecting too much data on children, in the form of tests or screening, turns the tests themselves into the teaching target. In such cases, schools do not teach children to develop holistically but simply to pass an exam.

Yes, we need some data to even out the differences between children. But too much data reduces children's skills to a set of data points and test scores. Yes, teachers need some data to supplement their insights. But too much data, whether in the form of online dashboards or personalized learning platforms, reduces teachers' role from learning mentors to progress monitors.

One size won't fit all: the new algorithms need to be much more nuanced about what kinds of data are collected, for what purpose, and for whom. Exponential data growth needs to be replaced by periodic ups and downs. Instead of an ever-rising curve, imagine a sinusoid that adapts to individuals and contexts.
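To make the metaphor concrete, a sinusoidal collection budget could look like the following sketch. The base rate of 100 data points per day and the 30-day period are made-up parameters; the point is only that collection oscillates between a floor and a ceiling instead of growing without bound.

```python
# Illustrative sketch: a data-collection budget that rises and falls
# periodically instead of growing exponentially forever.
import math

def collection_rate(day, base=100, period=30):
    """Data points collected on a given day, bounded between 0 and base."""
    return base * (1 + math.sin(2 * math.pi * day / period)) / 2

# Collection peaks mid-cycle and ebbs back toward zero, so the system
# periodically stops accumulating and can make do with what it has.
```

In practice, the "down" phases of such a curve would be the moments when a platform deletes, aggregates, or simply stops collecting, which is precisely the ceiling point the paragraph above argues for.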

Source: An edited version of a Shutterstock image by the artist Marisha

Collaborative and participatory

The current algorithms are developed, updated, and changed behind closed doors. The secrecy allows Facebook, Google, and Twitter to act as ‘oligarchs of speech’ and influence opinions. To restore democracy in communication and information, the new algorithms need to be about collaboration and participation.

Their workings cannot be designed by a few developers alone; they must be shaped by the communities of people using them. Not in a Marxist way where everyone designs their own website, but in a democratic way where all Internet users have the option to administer and the right to edit their online content.

If we implement these three rules, we will change the Internet. Secret algorithms will be replaced by a participatory model. Precise personalization will be balanced with serendipitous diversity. The Internet will strike an optimal balance between artificial and human intelligence. Our online activities will be both automated by algorithms and driven by human imagination. I hope you will join in the game.


