Popping the filter bubble

Machines are increasingly filtering our consumption of media. Information served by algorithms, rather than human beings, isolates people in a cultural or ideological bubble. This is the thesis advanced by Eli Pariser in his book The Filter Bubble: What the Internet Is Hiding from You (Penguin, March 2012). His TED Talk is worth watching to get the gist of his argument.

Blowing bubbles online

On social networks such as Facebook, Google+ and Twitter, we seek out people with a common outlook and values.

We're unlikely to follow people who don't share our outlook, or we'll quickly unfollow them if we do.

Online shops such as Amazon make recommendations based on purchases by like-minded shoppers rather than challenging our outlook.

Google personalises search results according to your network, search history and variables such as browser, device and location.

Even incognito browsing, which stops your search history being stored, doesn't produce identical results from one user to another because of local conditions such as location and device.

The filter bubble is exacerbated by Internet silos created by companies such as Facebook that isolate users and content.

Here Pariser finds support from the inventor of the web, Sir Tim Berners-Lee, among others. Ultimately this walled garden approach risks fragmenting the Internet.

Pariser believes that this shift to algorithm-driven media is dangerous for democracy and will ultimately polarise opinion.

He believes that filters close us off from new ideas, subjects and information. They stifle learning and development.

According to Pariser, instead of consuming a broad diet of media sources and conflicting viewpoints, we're increasingly fed content that reinforces our existing prejudices.

Whatever your views on human editing being usurped by algorithms, haven't consumers always exercised their own filters?

We have always formed relationships with people who share our values. It's human nature.

Google results may be personalised to my own prejudices, but I still get access to more information than would have been possible pre-Internet.

A Sun or USA Today reader is unlikely to read The Economist or the International Herald Tribune. Radio 1 listeners don't tune in to Radio 4.

Companies will need to be more transparent about how they apply algorithms. Changes to the Facebook newsfeed algorithm were met with howls of online protest from users in 2009.

Practical steps to counter the filter bubble

If you're concerned about the filter bubble there are some easy workarounds.

On social networks purposely follow people with a contrary point of view. Twitter is a good place to start.

Read widely by seeking out blogs or media outlets that provide a contrary point of view to your own.

I regularly cull my blog list, typically when people cease blogging, and seek out new ones.

Euan Semple recommends using a broad range of tools for different purposes. He tolerates noise as a penalty for not missing out on good stuff.

Get a grip of your Facebook network and newsfeed. You don't have to accept the content that the algorithm serves up.

Dial down the cats, segment the moaners and seek out the content that challenges you.

Pop your filter bubbles.
