Imagine a stack of newspapers on the tube, each one different depending on its reader. The stories have been chosen just for you. Would this be a dreamy utopia where you get exactly what you want?
The reality is that we increasingly find our news online via search engines that give us what we want, when we want it. This is the feel-good world of algorithms.
Tech giants like Google hold masses of search data and use it to guide users towards specific searches. When you start typing into a search engine, you are served suggestions for completing the query you began. This may seem harmless, but in reality companies and advertisers are steering users towards whatever generates the 'best' result or the most income.
What counts as best? Ever spent real time reading something? That's an upvote. Ever seen an opposing perspective and quickly hit back to navigate elsewhere? That's a downvote. It's like a giant piece of spyware, scoring your likes and dislikes, reinforcing your existing beliefs and optimising to make the most money.
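The upvote/downvote scoring described above can be sketched as a toy heuristic. Everything here is an illustrative assumption — the function names, thresholds and topic labels are invented for the example and are not any real engine's logic:

```python
# Toy sketch of engagement-based scoring: long reads count as an
# 'upvote', a quick bounce back to the results page as a 'downvote'.
# All thresholds and names are illustrative assumptions.

def engagement_score(dwell_seconds: float, bounced: bool) -> int:
    """Score a single page view from dwell time and bounce behaviour."""
    if bounced and dwell_seconds < 10:
        return -1   # opposing view, quickly abandoned: downvote
    if dwell_seconds > 120:
        return +1   # real time spent reading: upvote
    return 0        # neutral signal

def update_profile(profile: dict, topic: str, score: int) -> None:
    """Accumulate per-topic scores; positives steer future ranking."""
    profile[topic] = profile.get(topic, 0) + score

profile = {}
update_profile(profile, "team_news", engagement_score(300, bounced=False))
update_profile(profile, "opposing_view", engagement_score(4, bounced=True))
```

Over thousands of page views, a profile like this drifts towards whatever the user already lingers on — which is the reinforcement loop the article describes.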
Collecting personal data strengthens this process, as users are served results based on their similarity to others in their 'bucket'. This algorithmic advertising traps users in virtual spaces where people interact with others who hold similar views. These echo chambers form bubbles where individuals are not exposed to diverse perspectives or challenging viewpoints, whether it's your favourite sports team or your core ideological beliefs.
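The 'bucketing' idea can be illustrated with a minimal similarity check: users whose interest profiles point the same way get grouped together and served the same kind of results. The vectors and threshold below are illustrative assumptions, not a description of any real system:

```python
# Minimal sketch of grouping users into 'buckets' by the
# cosine similarity of their interest profiles.
import math

def cosine(a, b):
    """Cosine similarity between two interest vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def same_bucket(user_a, user_b, threshold=0.9):
    """Similar profiles land in the same bucket -- the echo chamber."""
    return cosine(user_a, user_b) >= threshold

# Hypothetical per-topic interest scores, e.g. (sport, view_a, view_b)
alice = [5, 1, 0]
bob   = [4, 1, 0]
carol = [0, 0, 5]
```

Here `alice` and `bob` share a bucket while `carol` does not, so the first two keep seeing each other's kind of content and never see Carol's.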
This should worry us. Extreme opinions are amplified over time, people become entrenched in their positions and society's divisions widen. That leads to intolerance, hostility and political unrest. Consider the war in Ukraine: if a user only follows news sources with a particular bias, they may miss critical information that challenges their pre-existing beliefs and fail to understand what's actually going on.
The same is true for politics and climate change. Users don't realise the results they're being served are biased; they take them as fact, which in the case of politics can inform their voting decisions. Add all of those votes up and you have a huge societal impact.
At a time when we are experiencing a cost-of-living crisis, the impact biased information can have on different groups is huge. A user's economic situation dictates the information they consume about the state of the economy. You might believe the highest earners in our society are creating the crisis through their inflated wages, or that no one is being paid a living wage in the face of rising costs. Each side has its perspective amplified rather than engaging with divergent views to reach a solution that works for all.
AI is going to take this to another level. Computer systems more sophisticated than anything we've seen before will analyse huge amounts of data to target ordinary people. The result will be hyper-personalisation of every single search query and every single retrieval of information, done so subtly that we won't even know these digital bubbles are being created.
The way to fight these biases is less personalisation. The more personalised the search experience becomes, the more data companies like Google collect, and the more biased people's perspectives and viewpoints become. With less personalisation, users face a greater diversity of opinion, which challenges them to think differently.
We deserve the right to live our digital lives on our own terms. Big tech corporations stop us from doing this by taking what little control we have over our personal data out of our hands.
Michael Levit is co-founder and CEO of the private search browser and engine Tempest