Filter bubble
A filter bubble is a result of personalized search, in which a website algorithm selectively guesses what information a user would like to see based on information about the user (such as location, past click behavior, and search history[1][2]). As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. Prime examples are Google Personalized Search results and Facebook's personalized news stream. The term was coined by internet activist Eli Pariser in his book of the same name; according to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble. Pariser related an example in which one user who searched Google for "BP" got investment news about British Petroleum while another got information about the Deepwater Horizon oil spill, and noted that the two search results pages were "strikingly different".[3][4][5][6] The bubble effect may have negative implications for civic discourse, according to Pariser, but there are contrasting views suggesting the effect is minimal[6] and addressable.[7]
Concept
Pariser defined his concept of the filter bubble in more formal terms as "that personal ecosystem of information that's been catered by these algorithms".[3] Other terms have been used to describe this phenomenon, including "ideological frames"[4] and "a figurative sphere surrounding you as you search the Internet".[8] A past search history is built up over time as an Internet user indicates interest in topics by "clicking links, viewing friends, putting movies in your queue, reading news stories" and so forth.[8] An Internet firm then uses this information to target advertising to the user or to make certain kinds of content appear more prominently in search results pages.[8] Pariser's concern is somewhat similar to one made by Tim Berners-Lee in a 2010 report in The Guardian along the lines of a Hotel California effect, which occurs when Internet social networking sites wall off content from competing sites––as a way of grabbing a greater share of all Internet users––such that the "more you enter, the more you become locked in" to the information within a specific site. The site becomes a "closed silo of content" with the risk of fragmenting the World Wide Web, according to Berners-Lee.[9]
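The feedback loop described above — clicks accumulate into an interest profile, and the profile then reranks what the user is shown next — can be illustrated with a toy sketch. All function names, topics, and the scoring rule here are illustrative assumptions for exposition, not the method of any real search engine:

```python
from collections import Counter

def update_profile(profile: Counter, clicked_topics: list) -> None:
    """Record each click as evidence of interest in a topic."""
    profile.update(clicked_topics)

def rank(results: list, profile: Counter) -> list:
    """Boost results whose topic the user has clicked before.

    Score = base relevance + number of past clicks on the result's
    topic, so past behavior increasingly dominates what appears first.
    """
    return sorted(results,
                  key=lambda r: r["relevance"] + profile[r["topic"]],
                  reverse=True)

profile = Counter()
results = [
    {"title": "BP investment news", "topic": "finance", "relevance": 1.0},
    {"title": "Deepwater Horizon oil spill", "topic": "environment", "relevance": 1.0},
]

# A user who repeatedly clicks finance stories...
update_profile(profile, ["finance", "finance", "finance"])
# ...now sees the finance result ranked first, despite equal base relevance.
print([r["title"] for r in rank(results, profile)])
```

Even this crude rule exhibits the self-reinforcing quality Pariser describes: the more one topic is clicked, the higher it ranks, and the more likely it is to be clicked again.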
In The Filter Bubble, Pariser warns that a potential downside of filtered searching is that it "closes us off to new ideas, subjects, and important information"[10] and "creates the impression that our narrow self-interest is all that exists".[4] It is potentially harmful to both individuals and society, in his view. He criticized Google and Facebook for offering users "too much candy, and not enough carrots".[11] He warned that "invisible algorithmic editing of the web" may limit our exposure to new information and narrow our outlook.[11] According to Pariser, the detrimental effects of filter bubbles include harm to society at large, in that they risk "undermining civic discourse" and making people more vulnerable to "propaganda and manipulation".[4] He wrote:
A world constructed from the familiar is a world in which there’s nothing to learn ... (since there is) invisible autopropaganda, indoctrinating us with our own ideas.
— Eli Pariser in The Economist, 2011[12]
A filter bubble has been described as exacerbating a phenomenon called splinternet or cyberbalkanization,[13] which happens when the Internet becomes divided into sub-groups of like-minded people who become insulated within their own online community and fail to get exposure to different views; the term cyberbalkanization was coined in 1996.[14][15][16]
Reactions
There are conflicting reports about the extent to which personalized filtering happens and whether such activity is beneficial or harmful. Analyst Jacob Weisberg, writing in Slate, did a small non-scientific experiment to test Pariser's theory: five associates with different ideological backgrounds conducted exactly the same searches, and the results of all five queries were nearly identical across four different searches. This suggested to him that a filter bubble was not in effect, and he wrote that the notion of everyone "feeding at the trough of a Daily Me" was overblown.[4] A scientific study from Wharton that analyzed personalized recommendations also found that these filters can actually create commonality, not fragmentation, in online music taste.[17] Consumers apparently use the filter to expand their taste, not limit it.[17] Book reviewer Paul Boutin did a similar experiment among people with differing search histories and, like Weisberg, found nearly identical search results.[6] Harvard law professor Jonathan Zittrain disputed the extent to which personalization filters distort Google search results, saying "the effects of search personalization have been light".[4] Further, there are reports that users can shut off personalization features on Google if they choose,[18] by deleting their Web history and by other methods.[6] A spokesperson for Google suggested that algorithms were deliberately added to Google search engines to "limit personalization and promote variety".[4]
Nevertheless, there are reports that Google and other sites hold vast amounts of information that would enable them to further personalize a user's Internet experience if they chose to do so. One account suggested that Google can keep track of users' past histories even if they don't have a personal Google account or are not logged into one.[6] One report stated that Google had collected "10 years' worth" of information amassed from varying sources, such as Gmail, Google Maps, and other services besides its search engine,[5] although a contrary report held that trying to personalize the Internet for each user was technically challenging for an Internet firm to achieve despite the huge amounts of available web data. Analyst Doug Gross of CNN suggested that filtered searching seemed to be more helpful for consumers than for citizens: it would help a consumer looking for "pizza" find local delivery options based on a personalized search, appropriately filtering out distant pizza stores.[5] There is agreement that sites such as the Washington Post, The New York Times, and others are pushing efforts to create personalized information engines, with the aim of tailoring results to those that users are likely to like or agree with.[4]
Further reading
- Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You, Penguin Press (New York, May 2011) ISBN 978-1-59420-300-8
- Friedman, Ann. "Going Viral." Columbia Journalism Review 52.6 (2014): 33-34. Communication & Mass Media Complete.
See also
- Selective exposure theory
- Confirmation bias
- Communal reinforcement
- Echo chamber
- Group polarization
- Media consumption
- Content farm
- Search engine manipulation effect
- Serendipitous discovery, an antithesis of the filter bubble
- Search engines that claim to avoid the filter bubble: DuckDuckGo, Ixquick, MetaGer, and Startpage.
References
External links
- Filter bubbles in internet search engines, Newsnight / BBC News, June 22, 2011
- ↑ Web bug (slang)
- ↑ Website visitor tracking
- ↑ Note: the term cyber-balkanization (sometimes with a hyphen) is a hybrid of cyber, relating to the Internet, and Balkanization, referring to that region of Europe that was historically subdivided by languages, religions and cultures; the term was coined in a paper by MIT researchers Van Alstyne and Brynjolfsson.