The Impact of Filter Bubbles on Society

My topic is filter bubbles. I chose this phenomenon because it clearly illustrates how modern tools of online marketing and advertising, in other words, modern tools of communication, actually work. Since there is considerable debate about this method of relaying information about goods and services to consumers, and about how well it helps them find what they want, I felt it would be worthwhile to discuss it in detail, drawing on what we explored in class. I will begin by defining filter bubbles and then examine their significance, as well as their flaws, in the field of communication. This paper will therefore present a detailed account of the advantages and disadvantages of the filter bubble, together with its impact on society.


Discussion


A filter bubble is a state of intellectual isolation that can arise when websites use algorithms to guess which information a user wants to see, with little direct input from the user, saving the user the time of searching for an item by offering options based on those guesses. Websites make these guesses from user-related data such as previous click behavior, past browsing patterns, and search and location history. As a result, many online advertising and marketing companies present only information that matches a user's previous actions on their site. A filter bubble can therefore leave users with appreciably less exposure to conflicting viewpoints, leaving them intellectually isolated.
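The mechanism described above can be sketched in a few lines of code. This is a deliberately minimal, illustrative toy, not any real site's algorithm: the item list, topic labels, and frequency-based scoring are all assumptions made for the example.

```python
# Toy sketch of personalization filtering: score candidate items by how
# often their topic appears in the user's click history, then show only
# the top-scoring items. Illustrative only; real recommenders are far
# more complex, and these names and data are hypothetical.
from collections import Counter

def rank_items(items, click_history, top_n=3):
    """Return the top_n items whose topics best match past clicks."""
    topic_counts = Counter(click_history)  # missing topics count as 0
    return sorted(items,
                  key=lambda item: topic_counts[item["topic"]],
                  reverse=True)[:top_n]

items = [
    {"title": "BP investment outlook",   "topic": "finance"},
    {"title": "Deepwater Horizon spill", "topic": "environment"},
    {"title": "BP quarterly dividend",   "topic": "finance"},
    {"title": "Oil spill cleanup study", "topic": "environment"},
]

# A user whose history is mostly finance clicks sees finance items first;
# the environment stories never make the cut.
finance_user = ["finance", "finance", "sports", "finance"]
print([i["title"] for i in rank_items(items, finance_user, top_n=2)])
# → ['BP investment outlook', 'BP quarterly dividend']
```

Note that the user never asked to have the environment stories removed: the narrowing happens silently, purely as a by-product of past behavior, which is exactly what makes the bubble hard to notice.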


Personalized search results and personalized news streams from Google and Facebook are classic examples of the filter bubble phenomenon. The idea was discussed by the internet activist Eli Pariser in his 2011 book The Filter Bubble, which presents the bubble as something the internet hides from its users. He relates a situation in which one user searches Google for "BP" and receives investment information about British Petroleum, while another user entering the same keyword gets details about the Deepwater Horizon oil spill. The two users' results are noticeably dissimilar and could, in fact, shape each searcher's impression of the company in very different ways. According to Pariser, this filter bubble effect could have undesirable consequences for public communication, although others feel the effect is insignificant.


In the aftermath of the 2016 United States presidential election, which appeared to distress many citizens, a large number of Americans asked how Donald Trump's popularity could have remained invisible throughout the campaign. The unexpected outcome was due in part to personalization tools that had isolated people from opposing viewpoints, namely the "filter bubble" effect. Many people were left with no clear picture of how the popularity they had observed translated into the results they received. Eli Pariser, CEO of Upworthy, coined the phrase "filter bubble" and popularized it in a TED Talk. Since the launch of News Feed, one of Facebook's signature features, its algorithm is understood to systematically filter out content it deems "of little interest" to a given group of users, showing them only the content it predicts they want.


In theory, by filtering out irrelevant information, the algorithm helps users avoid being overwhelmed by the sheer abundance of online content. In practice, however, this personalized filter bubble increases the chance that people come to feel their personal interests are all that exist, blocking access to diverse information (Baer, 2016). From my own perspective, the filter bubble created by the Facebook News Feed algorithm reinforces our biases to some extent, because we cannot see the full picture of a story.


From another point of view, filter bubbles may not sound like a threatening prospect in themselves, yet they can lead to two distinct though related problems. Because Facebook makes it effortless to filter what we see and hear, most people are likely to end up following only sources of information that match their personal political leanings. One obvious consequence is that seeing only the things you already believe produces a confirmation bias that builds up gradually over time. A further problem arising from these divergent information diets is that they can create real disconnects between people, because each side becomes unable to recognize how anyone could think differently from themselves. Together, these two consequences push the political climate toward greater separation and individualization, at the expense of our ability to relate to the other side.


By sheltering us from information, Facebook's algorithm helps reinforce our confirmation biases. Although Facebook refuses to acknowledge its identity as a news source, it has increasingly influenced how people learn about the world. According to a survey conducted by the Pew Research Center, sixty-one percent of millennials rely on Facebook as their main source of information about politics and government (Bixby, 2016). As a major connector of people, Facebook is meant to be a platform for online interaction: ideally, we share our opinions while coming to understand others' viewpoints. Yet within Facebook's ecosystem lies a labyrinth of walled precincts created by its News Feed feature, which emphasizes news shared by friends at the expense of posts from the branded pages of news websites. The more we click, like, and share material that resonates with our personal worldview, the more Facebook feeds us posts of the same nature. This has produced a kind of tunnel vision, trapping people with opposing views in their own separate bubbles.
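The feedback loop just described, where every click makes similar content more likely to be shown, can be simulated with a simple update rule. The rule, the boost value, and the two-feed setup below are illustrative assumptions for the sake of the example, not Facebook's actual ranking logic.

```python
# Hedged sketch of the click -> more-of-the-same feedback loop. Each
# click boosts the weight of the clicked topic, and weights are then
# renormalized into a probability distribution over what the feed shows.
# The boost value and topics are hypothetical, purely for illustration.
def update_weights(weights, clicked_topic, boost=0.5):
    """Boost the clicked topic's weight, then renormalize to sum to 1."""
    weights = dict(weights)            # work on a copy
    weights[clicked_topic] += boost
    total = sum(weights.values())
    return {t: w / total for t, w in weights.items()}

# Start with an even split between two political feeds.
weights = {"red_feed": 0.5, "blue_feed": 0.5}
for _ in range(5):                     # five clicks on the same side
    weights = update_weights(weights, "blue_feed")

print(round(weights["blue_feed"], 2)) # → 0.93: the feed has tilted blue
```

After only five clicks, a user who started with a balanced feed sees the other side's content less than a tenth of the time, which is the "tunnel vision" the paragraph above describes.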


As a result, we rarely question our own political beliefs, and our arguments betray a lack of understanding of alternative views. Wittgenstein's lion poses a question to us: if our news sources, information updates, and social feeds all differ from one another, can we ever hope to appreciate one another's positions? (Martindale, 2017). One typical example is the widespread shock at Trump's election victory. People's views of the contest between Donald Trump and his counterpart Hillary Clinton really did appear to be taking place on different planets. Why? This is, in part, because of Facebook's algorithm, the autonomous sorting of information that governs which articles end users see. The political storyline was divided into two separate filter bubbles, one for conservative supporters and one for liberal supporters (a red feed and a blue feed) (Solon, 2016). If one's Facebook feed is filled with pro-Hillary and anti-Trump views, one may be insulated from the rise of Trumpism; if one's feed is the exact opposite, stressing only Hillary's negatives and Trump's benefits, one encounters the reverse scenario. Curating our news to give us what we want to see, rather than what we perhaps need to see, confines us to an extreme thought pattern that feeds our confirmation bias.


Nevertheless, this does not mean we should seek out the contrarian opinion on every single occasion; all the same, we require a healthy supply of alternative information to position our arguments well, and only then can democracy really take effect in society. Conversely, a lack of communication across cultural lines strikes at the crux of democracy. In his book The Filter Bubble: What the Internet Is Hiding from You, Eli Pariser argues that democracy requires citizens to see things from one another's point of view, but unfortunately we are too enclosed in our own bubbles. Democracy relies on shared facts and beliefs; instead, we are being presented with parallel but separate universes (Usman, 2016).


Ad networks such as AdSense are no different from filter bubbles, since they too aim to make an impression on end users by playing on their previous click history. Once users click on an ad, it redirects them to a page on a site they would presumably wish to know more about.


Conclusion


To some extent, the selective feed algorithms of Facebook and of Google's search engine help create a political homogeneity in which we lose the ability to integrate other people's views and to form a rounded sense of the debate. Roughly speaking, confirmation bias in the political sphere runs counter to the ethics of democracy. It is time to burst our filter bubbles and find more sources for our daily news feeds. We need to be aware that personalization is not simply convenient and fun; it also confines us to bubbles of extremity. However useful and helpful the phenomenon may be, it would be wise to take a more organized and solid stand on the extent to which these algorithms should be put into practice. Failure to do so could lead to a serious rift between people who happen to hold dissimilar beliefs about a given issue.


Works Cited


Baer, D. (2016, November 9). "The Filter Bubble Explains Why Trump Won and You Didn't See It Coming." The Cut. Retrieved from https://www.thecut.com/2016/11/how-facebook-and-the-filter-bubble-pushed-trump-to-victory.html


Bixby, S. (2016, October 1). "The End of Trump: How Facebook Deepens Millennials' Confirmation Bias." The Guardian. Retrieved from https://www.theguardian.com/us-news/2016/oct/01/millennials-facebook-politics-bias-social-media


Martindale, J. (2017, December 6). "Forget Facebook and Google, Burst Your Own Filter Bubble." Digital Trends. Retrieved from https://www.digitaltrends.com/social-media/fake-news-and-filter-bubbles/


Solon, O. (2016, November 10). "Facebook's Failure: Did Fake News and Polarised Politics Get Trump Elected?" The Guardian. Retrieved from https://www.theguardian.com/technology/2016/nov/10/facebook-fake-news-election-conspiracy-theories


Usman, O. (2016, August 25). "How Invisible Filter Bubbles Shape Your Social, Political, and Religious Views." Ibn Abee Omar. Retrieved from https://www.theguardian.com/technology/2016/nov/10/facebook-fake-news-election-conspiracy-theories
