Filter Bubble
A filter bubble is a state in which a website algorithm selectively guesses what
information a user would like to see based on information about the user (such as
location, past click behavior and search history); as a result, users become separated
from information that disagrees with their viewpoints and are effectively isolated
in their own cultural or ideological bubbles. Prime examples are Google’s
personalized search results and Facebook’s personalized news stream.
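To make the mechanism concrete, the following is a minimal sketch in Python of how
ranking content by overlap with a user’s past clicks can silently exclude dissenting
material. All data, field names and the personalized_feed function are hypothetical
illustrations, not the actual algorithm of any real platform.

    from collections import Counter

    def personalized_feed(items, click_history, top_k=3):
        # Build a topic profile from the user's past clicks.
        profile = Counter(topic for item in click_history
                          for topic in item["topics"])
        # Score each candidate by how often its topics were clicked before.
        def score(item):
            return sum(profile[t] for t in item["topics"])
        # Keep only the highest-scoring items; the rest are filtered out.
        return sorted(items, key=score, reverse=True)[:top_k]

    # Hypothetical example: the user has only ever clicked one side of an issue.
    history = [
        {"topics": ["policy_a", "opinion_x"]},
        {"topics": ["opinion_x"]},
    ]
    candidates = [
        {"title": "More on opinion X", "topics": ["opinion_x"]},
        {"title": "The case for opinion Y", "topics": ["opinion_y"]},
        {"title": "Policy A update", "topics": ["policy_a"]},
    ]

    for item in personalized_feed(candidates, history, top_k=2):
        print(item["title"])  # the dissenting "opinion Y" item never appears

In this toy run the feed shows only items matching the existing profile, and nothing
signals to the user that the dissenting item was excluded, which is the invisibility
problem discussed below.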
The term was coined by internet activist Eli Pariser as “that personal ecosystem
of information that’s been catered by these algorithms” (Pariser 2011); according to
Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually
in their own informational bubble. For Pariser, the detrimental effects of filter
bubbles include harm to society at large, in the sense that they risk
“undermining civic discourse” and making people more vulnerable to “propaganda
and manipulation”. This constitutes a concrete problem in particular for social network
service users and for their possibility of acting as a community: according
to Miconi (2013), since the bubble is built upon individual tastes and preferences, it does
not allow any kind of sharing; in short, everybody is ‘alone’ in the bubble, condemned
to find his or her own way to knowledge. Moreover, the bubble is invisible and,
unlike traditional media, does not reveal its bias and selectiveness. For the same
reason, whether users like it or not, they cannot choose whether to enter the bubble:
participants are not allowed to actively select the filter.
In addition to this problem, the filter bubble presents the same privacy issues connected
to algorithms collecting information about users: once a user has been observed,
profiled and recognized on subsequent visits, according to Pariser the risks posed by the
filter bubble are not undone with a simple ‘privacy settings’ adjustment.