For those not familiar with the ‘Filter Bubble’, the term was coined by Internet activist Eli Pariser around 2010 and refers to a state of intellectual isolation resulting from personalisation applied to the delivery of web content. The suggestion is that users become separated from information that disagrees with their viewpoints. This effectively isolates individuals within their own cultural or ideological filter bubble.
While we could debate at length the extent to which this is a problem, few would deny its existence. In his 2011 TED Talk, Eli Pariser suggests that search engines should curate results by criteria beyond relevance alone, such as whether content is important, uncomfortable, challenging, or presents other points of view.
I personally don’t fully subscribe to this view. It requires search algorithms to somehow curate results according to these ethics, which means we have to trust someone to decide which ethical parameters are good for us and to encode them into the algorithms. Alternatively, we could go the other way and remove personalisation from results entirely; some search engines, such as DuckDuckGo, do not personalise results and so already offer this option. However, we then lose the ability to have more relevant results ranked higher, and more manual sifting of results by the user is needed. My view is that the Filter Bubble can be largely avoided if we provide visibility of the personalisation process and control over how our personal data is being applied.
Intellectual isolation resulting from the delivery of personalised content is clearly not peculiar to the Internet. In the western world we choose every day who to listen to, what to read and which media channels to watch, and we can switch these on and off at will. If someone chooses only to read or listen to right-wing views, that is their personal right; they are at least aware that not everyone shares those views, and they can see that alternative reading and viewing material is available. The real problem with Internet personalisation is that this isolation can happen (indeed is happening) without any visibility or self-awareness of it. It becomes especially dangerous when it is used to alter opinions for financial or political gain, and we are now aware of abuses of influence in recent elections and referendums.
I believe that users should be given visibility of the data they are revealing for personalisation. This transparency enables self-observation (seeing your digital self as it is revealed to others), self-measurement (statistics, trends and actionable insights) and self-comparison (how you measure up relative to your peers). In principle this can be achieved using the existing right to data portability under GDPR, and it would be a useful first step in avoiding the Filter Bubble.
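As a thought experiment, the sketch below shows what this kind of self-observation could look like in practice: it summarises the interest signals found in a data-portability export. The file name, JSON structure and field names are assumptions made purely for illustration; they do not correspond to the export format of any real service.

```python
# A minimal sketch of "self-observation": summarise what a service appears to
# know about you from a data-portability export. The file name and JSON shape
# below are hypothetical -- real GDPR Article 20 exports vary by provider.
import json
from collections import Counter


def summarise_export(path: str) -> None:
    """Print simple statistics over a hypothetical personal-data export."""
    with open(path, encoding="utf-8") as f:
        export = json.load(f)

    # Assumed shape: {"interest_signals": [{"category": ..., "source": ...}, ...]}
    signals = export.get("interest_signals", [])
    by_category = Counter(s.get("category", "unknown") for s in signals)

    print(f"{len(signals)} interest signals on record")
    for category, count in by_category.most_common(10):
        print(f"  {category}: {count}")


if __name__ == "__main__":
    summarise_export("my_data_export.json")  # hypothetical export file
```

Even a crude summary like this moves the user from having no idea what is driving their personalised results to seeing, in aggregate, which signals are shaping them.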
The next step would be to give the user control over the degree of personalisation delivered by any service. The user regulates the cost-benefit of revealing personal information and sees how the quality of the content delivered is influenced by it. If someone wishes to live in a state of intellectual isolation, that should be a choice they have made, not a situation they are unaware of or that has been imposed upon them.
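To make the idea concrete, here is a minimal sketch of how such a personalisation dial might work, assuming a hypothetical service that exposes a relevance score and a personal-affinity score for each result. A weight of 0 reproduces non-personalised, DuckDuckGo-style ranking, while 1 ranks purely by the user's revealed profile; this is an illustration only, not any real engine's implementation.

```python
# A minimal sketch of user-controlled personalisation: the user chooses a weight
# between 0 (pure relevance, no personalisation) and 1 (fully personalised),
# and results are re-ranked by a blend they can inspect and adjust.
from dataclasses import dataclass


@dataclass
class Result:
    title: str
    relevance: float          # query-document relevance, 0..1 (stand-in score)
    personal_affinity: float  # match against the user's revealed profile, 0..1


def rank(results: list[Result], personalisation_weight: float) -> list[Result]:
    """Order results by a user-controlled blend of relevance and affinity."""
    w = max(0.0, min(1.0, personalisation_weight))
    return sorted(
        results,
        key=lambda r: (1 - w) * r.relevance + w * r.personal_affinity,
        reverse=True,
    )


results = [
    Result("Familiar viewpoint", relevance=0.6, personal_affinity=0.9),
    Result("Challenging viewpoint", relevance=0.7, personal_affinity=0.2),
]
print([r.title for r in rank(results, personalisation_weight=0.0)])  # relevance only
print([r.title for r in rank(results, personalisation_weight=1.0)])  # fully personalised
```

The point of exposing the weight is precisely that intellectual isolation becomes something the user dials in deliberately rather than something done to them invisibly.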