The Filter Bubble: Why Every Internet User Ends Up In It Sooner Or Later

Filter Bubble: Algorithms are designed to get to know us and our interests better, and to filter content on the web according to our preferences. This is precisely where a great danger lies, as two students at a University of Applied Sciences found out. Sooner or later, we all end up in the filter bubble.

Elvis lives! The moon landing was staged, and the city of Bielefeld doesn’t exist. Our world is full of conspiracy theories – some more plausible than others. But while these ideas can sometimes be exciting or entertaining to explore, they also have a dangerous side.

Anyone Online Will Eventually End Up In The Filter Bubble

Reinforced in particular by the Internet, people can end up in a thematic “filter bubble.” There their ideas – no matter how extreme, absurd, or irrational they may be – are only ever confirmed by other users and by the content they see. Critical opinions or contact with those who think differently no longer occur. This applies not only to conspiracy theories but also to consumer preferences, political opinions, and social views.

But how do people end up in a filter bubble? And what role do the Internet and the algorithms of certain websites play in this? Two industrial engineering students at a University of Applied Sciences, Jonas Wieskamp and Jan-Eric Müller, wanted to find out. To do so, they tested the bubble effect in the Google News app and on YouTube.

The Internet Knows (Almost) Everything About You

To do this, they created a brand-new user profile and then deliberately steered their topic searches. Intelligent algorithms remember very precisely what we search for on the Internet and when, which websites we visit, how long we stay on a page, which browser we use, and what we buy. In other words, the Internet knows (almost) everything about you. Self-learning algorithms use this knowledge deliberately to get to know users better. They are programmed by website operators such as Netflix, Amazon, Google, and YouTube.
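How such a profile comes about can be pictured with a toy sketch in Python. The real systems are proprietary and vastly more sophisticated; every signal name and weight below is an assumption made up purely for illustration, not any platform's actual code.

```python
from collections import defaultdict

# Illustrative signal weights: a purchase says more about a user's
# interests than a single search. All numbers are invented for this sketch.
SIGNAL_WEIGHTS = {"search": 1.0, "click": 2.0, "watch_minute": 0.5, "purchase": 5.0}

class InterestProfile:
    """Toy user profile that accumulates a weighted score per topic."""

    def __init__(self):
        self.scores = defaultdict(float)

    def record(self, topic, signal, amount=1.0):
        # Every tracked action nudges the score for its topic upward.
        self.scores[topic] += SIGNAL_WEIGHTS[signal] * amount

    def top_topics(self, n=3):
        # The topics the algorithm now believes you care about most.
        return sorted(self.scores, key=self.scores.get, reverse=True)[:n]

profile = InterestProfile()
profile.record("smartphones", "search")
profile.record("smartphones", "click")
profile.record("smartphones", "watch_minute", amount=12)  # watched 12 minutes
profile.record("gardening", "search")
print(profile.top_topics())  # ['smartphones', 'gardening']
```

Every search, click, and minute of watch time feeds the profile; the platform never has to ask what you are interested in.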

Why?

Because they can make more money that way. If Amazon knows you very well, the shopping portal can suggest products that interest you – and that you are more likely to buy. Social media portals such as YouTube, on the other hand, earn their money from advertisers: the more users a portal attracts and the longer they stay, the more attractive it is to advertisers.

And how do you get users to stay on a website for as long as possible? By suggesting topics – or in this case, videos – that interest them. How many times have you searched for something specific on YouTube, only to get completely lost in the suggestions in the sidebar?

But if an algorithm suggests the content you are most likely interested in, an elimination process has also taken place beforehand. The algorithm has decided which content it will not show you, which it will withhold. “Algorithms filter out content for us users without this being transparent or users being aware of it,” the two students observe.
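Continuing the toy sketch from above (again an illustration under invented assumptions, not any real platform's pipeline), that elimination step is simple to picture: every candidate is scored against the profile, the best few are shown, and the rest are silently dropped.

```python
def rank_and_filter(profile, candidates, k=2):
    """Toy selection step: score every candidate by how well its topic
    matches the profile, show the top k, and silently withhold the rest."""
    ranked = sorted(candidates,
                    key=lambda c: profile.scores.get(c["topic"], 0.0),
                    reverse=True)
    return ranked[:k], ranked[k:]

candidates = [
    {"title": "New flagship phone review", "topic": "smartphones"},
    {"title": "Pruning roses in autumn",   "topic": "gardening"},
    {"title": "Local election explained",  "topic": "politics"},
]
shown, withheld = rank_and_filter(profile, candidates)
# "politics" scores zero for this profile and lands in `withheld` --
# the user never even learns that it was filtered out.
```

The `withheld` list is the invisible half of the bubble: content you are never shown and never know existed.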

Once You’re In The Filter Bubble, It’s Hard To Get Out

They deliberately searched for corona conspiracy theories on both Google News and YouTube to teach the algorithm that this was what they were interested in. They wanted to test how long it would take to end up in a corresponding filter bubble. “In the beginning, we were mainly shown reputable sources. We had to make the search very specific, and it still took a surprisingly long time before we got into the filter bubble.”

Once in the filter bubble, it didn’t take long before they were in an “echo chamber.” In an echo chamber, your views are reinforced while opposing opinions remain entirely hidden. At some point, Jonas Wieskamp and Jan-Eric Müller received almost nothing but YouTube suggestions for videos about the relevant conspiracy theories.

This shows two things: It is not quite as easy to end up in a filter bubble as one might think. But anyone with the relevant interests will end up in one sooner or later. These can be harmless topic bubbles, such as “Apple products” or “organic cuisine,” or more questionable ones, such as conspiracy theories or politically extremist ideologies.

The two students admit that they deliberately steered the algorithms. But the underlying mechanisms of AI remain the same. With typical Internet use, it simply takes longer to end up in such a filter bubble.

More Transparency For Algorithms

Of course, the strength of the bubble effect varies from portal to portal. On platforms such as Netflix, Amazon, or YouTube, it is much more pronounced than, for example, in regular Google search, where SEO criteria play a role in addition to your preferences. But even there, the algorithm will show you particular suggestions – and deliberately not others – for instance via the autocomplete function.

Ideally, website operators would disclose how these algorithms work and which suggestions have been filtered out, and give users more freedom of choice, as Jonas Wieskamp and Jan-Eric Müller suggest. This could be done in different ways. As a user, you could intervene in the algorithm’s settings yourself and, for example, activate or deactivate topics of interest. Of course, that would require users to break out of their comfortable filter bubble.

It would also be possible for the algorithm to automatically suggest new topics and sources when someone’s searches on the web become very one-sided. Also conceivable: a brief display by the algorithm of which content has been filtered out. In theory, even a legally mandated minimum of diversity for algorithms is conceivable, although that would certainly not be easy to implement.
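As a sketch of the first two proposals, the toy ranker from above could reserve a slot for topics the profile knows nothing about and report what it withheld. This reuses the hypothetical `profile` and `rank_and_filter` from the earlier sketches; no platform exposes such a function today.

```python
import random

def recommend_with_diversity(profile, candidates, k=3, diverse_slots=1):
    """Variant of the toy ranker: most slots go to the best matches, but
    a few are reserved for topics the profile knows nothing about.
    A hypothetical sketch of 'minimum diversity', not a real feature."""
    shown, withheld = rank_and_filter(profile, candidates, k - diverse_slots)
    unfamiliar = [c for c in withheld
                  if profile.scores.get(c["topic"], 0.0) == 0.0]
    picks = random.sample(unfamiliar, min(diverse_slots, len(unfamiliar)))
    shown = shown + picks
    # Transparency step: tell the user what was filtered out.
    for item in withheld:
        if item not in picks:
            print(f"withheld (low interest score): {item['title']}")
    return shown
```

A few lines of extra logic would be enough to burst a hole in the bubble; the obstacle is the business model, not the technology.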

In the long term, the two students hope that trained industrial engineers in companies will create more transparency and traceability in how algorithms work. In any case, after their seminar project, they are very aware of how algorithms control the content displayed on the Internet.

What Can You Do About The Filter Bubble?

But knowing how algorithms push us into the filter bubble is one thing; taking action against it is another, because algorithm transparency is still a thing of the future. So what can you do as a user, right now, to surround yourself with more diversity on the Internet?

For example, you can disable cookies in your browser or partially turn off personalized content on various platforms. There is also the option of surfing in incognito mode or via a VPN server, which offers you more protection from omniscient algorithms. In addition, some Internet browsers offer more anonymity than others.

Last but not least, you can of course also actively step out of your filter bubble and engage with other opinions. After all, that is often precisely the problem with bubbles: we lose the ability to exchange ideas rationally and calmly with those who think differently, and thus to let new views in.
