Now that’s way too much information! And as many of us (especially those who spend 10 to 15 hours a day online) would agree, almost 80 per cent of that information is useless. That being said, the fact is that this rapidly growing content grabs the attention of internet gatekeepers long before we even realize its existence. This eventually leads us to a tailor-made internet, or in the words of political activist and researcher Eli Pariser, our very own internet ‘filter bubble’.
What is a filter bubble?
When we were first introduced to the idea of the Internet, or Google for that matter (aren’t the two synonymous now?), we thought of Google as a huge digital library that, based on our queries, suggested matching results – right? Well, to a certain degree.
According to research conducted by Eli Pariser (for his book The Filter Bubble), things are not exactly as they seem. For instance, if you Google “Egypt” and your friend also Googles it at the exact same time, the two sets of search results are likely to be very different from each other. This is because Google uses hundreds of signals to figure out who you really are. Here are just some of the (many) things it investigates:
- Whether you’re using a Mac or a PC
- What kind of browser you use – Google Chrome, Mozilla Firefox or Internet Explorer (that’s still there, right?)
- Your location at the time you made your search query
- Your browsing history
- The time you spent on each page
- Your number of clicks (choices)
These are just some of the many factors that help Google construct your own personalized and filtered bubble of information.
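The actual ranking systems are opaque, but the general idea – blending many weak per-user signals into a relevance score – can be sketched. Everything below is purely illustrative: the signal names, weights and scoring function are invented for this example, not Google’s actual algorithm.

```python
# Illustrative only: a toy re-ranker that personalizes generic search
# results using weak per-user signals (browsing history, clicks, etc.).
# The signals and weights are invented; real ranking is far more complex.

def personalize(results, user_signals):
    """Re-order results by a score blending global rank with user signals.

    results: list of dicts with 'url', 'base_rank' (0 = best) and 'topics'.
    user_signals: dict mapping a topic to an interest weight derived from
    the user's history, dwell time, click patterns, and so on.
    """
    def score(result):
        # Start from the generic (non-personalized) ranking.
        s = -result["base_rank"]
        # Boost results matching topics the user has engaged with.
        for topic in result["topics"]:
            s += user_signals.get(topic, 0.0)
        return s

    return sorted(results, key=score, reverse=True)


results = [
    {"url": "egypt-travel.example", "base_rank": 0, "topics": ["travel"]},
    {"url": "egypt-protests.example", "base_rank": 1, "topics": ["politics"]},
]

# Two users type the same query; their histories differ, so their
# top result differs.
traveller = personalize(results, {"travel": 2.0})
activist = personalize(results, {"politics": 2.0})
```

The point of the sketch is only that identical queries plus different histories yield different orderings – exactly the “Egypt” effect Pariser describes.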
What’s the scary part?
Unlike television, where you enjoy control over the programs you would like to watch, here, well-crafted computer algorithms do it for you. And surprisingly, it’s not just Google: all major websites, including Facebook, MSN and Bing, decide the content that’s ‘fit’ for you (according to them!). Now this control or filtering makes sense when it comes to product-based websites like Netflix or Amazon; for example, if I purchase Steve Jobs by Walter Isaacson on Amazon, the website would also recommend other books to me, based on my taste and the time and money I spend on the website. Now that doesn’t sound bad at all, right? Here a user is being facilitated and provided with choices that save him or her further search effort. But one should realize the difference between browsing for a commodity and browsing for content.
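The commodity case is the easiest to picture. A very common approach is co-purchase counting: suggest items that often appear in the same basket as something you already bought. The sketch below is a deliberately tiny version of that idea; the book titles and data are invented, and real recommenders weigh far more (ratings, browsing time, and so on).

```python
# Illustrative only: a toy "customers also bought" recommender based on
# co-purchase counts. The baskets are invented sample data.
from collections import Counter


def recommend(purchases, user_history, top_n=2):
    """Suggest items frequently bought alongside the user's items.

    purchases: list of baskets (each a set of item titles).
    user_history: set of items the user already owns.
    """
    counts = Counter()
    for basket in purchases:
        if basket & user_history:  # basket shares an item with the user
            for item in basket - user_history:
                counts[item] += 1
    return [item for item, _ in counts.most_common(top_n)]


baskets = [
    {"Steve Jobs", "The Innovators"},
    {"Steve Jobs", "Elon Musk"},
    {"Steve Jobs", "The Innovators", "Zero to One"},
]

# A buyer of the Isaacson biography gets the most co-purchased titles.
suggested = recommend(baskets, {"Steve Jobs"})
```

For a commodity, this kind of filtering genuinely saves effort, which is why it feels benign; the article’s worry is about the same mechanism applied to news and ideas.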
To the surprise of many, Facebook’s algorithms read your clicking patterns and, based on your browsing history on the website, take unannounced actions that filter your association with your friends. In his research, Eli Pariser mentions how one day he realized that all his politically-conservative friends had stopped appearing on his Facebook news feed, unlike his liberal friends, who appeared a lot. After researching the matter, he came to the conclusion that Facebook (without our permission) wipes out updates from friends we pay less attention to. Google engineer Jonathan McPhie explains that search results are different for every person; in fact, even Google doesn’t fully know how this plays out at the individual level. They can see what people are clicking on at an aggregate level, but they can’t predict how each individual’s information environment is altered.
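The behavior Pariser describes can be reduced to a very small rule: updates from friends you rarely interact with quietly drop below a visibility threshold. The sketch below is a guess at the shape of such a filter, not Facebook’s actual system; the threshold and interaction counts are invented.

```python
# Illustrative only: a toy news-feed filter in the spirit of what Pariser
# describes -- updates from rarely-contacted friends silently disappear.
# The threshold and interaction data are invented for this example.

def filter_feed(updates, interactions, min_interactions=3):
    """Keep only updates from friends the user interacts with often.

    updates: list of (friend, message) tuples.
    interactions: dict friend -> count of recent likes/clicks/comments.
    """
    return [
        (friend, msg)
        for friend, msg in updates
        if interactions.get(friend, 0) >= min_interactions
    ]


updates = [("alice", "hello"), ("bob", "news"), ("carol", "photo")]
interactions = {"alice": 10, "bob": 1}  # carol: no recent interaction

# bob and carol fall below the threshold and vanish without notice.
visible = filter_feed(updates, interactions)
```

Nothing tells the user that bob and carol were dropped, which is the “unannounced” part of the problem.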
How is that even a problem?
Google, Bing, Aardvark and other search engines are investing heavily in the evolving social-search phenomenon. After all, in real life we often seek the advice of our personal connections and experts before making a purchase. What’s so wrong with that? If anything, I am being spared irrelevant information, right? To answer that, we need to go back and look at the founding ideology of the internet, the greatest global phenomenon of our time. Besides connecting us to an overwhelming amount of information, it facilitates interaction between people by bypassing geographical and social differences: religion, language, gender, culture, political viewpoints, and so on. A tailored, highly-personalized internet restricts us to our own bubble and defeats its purpose: democracy.
When asked about the personalization of Facebook Timeline, Mark Zuckerberg said, “A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” And unfortunately, this is true. But does this mean that we shouldn’t know anything about the people dying in Africa? Should it really be up to those algorithms to decide what does and does not interest us? According to Eric Schmidt, “It will be very hard for people to watch or consume something that has not, in some sense, been tailored for them.” Filtered versions of Facebook or Google search are just small examples of how our Web-surfing preferences can be used by websites to enhance their business. We do now, knowingly or otherwise, share much more information on Facebook than we did five years ago. And in doing so, we provide them with more information about ourselves than they could have gotten out of any survey or focus group.
Understandably, as individual users among the millions on Facebook and Google, we cannot always control the things that silently influence our interests and shape our views of the world. But in our own personal capacity, we should raise awareness and make an effort to safeguard our freedom to choose. We can start by demanding that these websites be more transparent about their policies and actions. At an individual level, protecting our online presence by following some basic internet-security rules is extremely important. As a democratic pool of information, the internet is a wonderful thing. But if any technology starts to narrow our understanding of the world, it must be understood and dealt with accordingly.