phraktle

Personalization Expands Filter Bubbles

12 Jun 2011

The subject is Eli Pariser’s TED talk on Filter Bubbles. (If you’re not familiar with the topic, go ahead and watch it; I’ll wait 9 minutes.)

The talk, in the classic alarmist TED fashion, conjures a straw man: personalization leads to a world where people only see what they “want”, as determined by obscure algorithms devised by large for-profit corporations. After skimming Eli’s book, it’s clear that he makes some much better-considered points as well. However, most of these revolve around basic issues of privacy, data ownership, transparency, etc. - issues the EFF has been doing a decent job of explaining and representing. I am not here to argue with any of these points. However, the primary meme he’s broadcasting - that personalization traps people - is based on a naive view of how these methods work. I will argue that the opposite is true, and try to put things into perspective.

Let’s take a look at the basic structure of the problem first: there’s quite a lot of information out there, and not nearly enough time available to each of us individually. The standard-issue human mind is not yet capable of containing the universe all at once (apart from brief glimpses on acid).

Even the small part of the cosmos that comes knocking at our senses directly is too much to process, so drastic filtering and ranking are inevitable. You already exist in a robust filter bubble.

Ok, with that cleared up, how do we improve the human condition? Phase one: collect underpants. Or perhaps collect and organize the world’s information, external to our marvelous yet feeble brains (and please store a backup of our minds while you are at it :). Phase three: profit.

Hey, what’s phase two? Good question!

In an open society, no single narrative of reality can be considered correct. We should encourage the coexistence of different filtering and ranking approaches and allow people to discover and explore different perspectives. Traditionally, this exploration was mostly limited to one’s immediate physical and social neighborhood, expanding with improved means of transportation. Today that immediate environment still plays a larger formative role than the time you spend online: it is the primary source of deeper shared experiences. The horizon grew further with written and broadcast media, which connected a select few to larger masses.

Now the Internet has made knowledge accessible in an unprecedented way and enabled many-to-many connections. Due to the sheer amount of information, you are again inevitably in a filter bubble online. The exciting news is that in this new medium we can implement novel methods of exploration, going beyond the random and top-down approaches available in the physical world. We can use algorithms to expand our bubble, to point out new relationships based on new triggers, collective intelligence, and so on. The goal of personalization and recommendations is discovery, as Greg Linden also points out. They create serendipity. We are still in the early stages of research on personalization; we are not even doing a great job at filtering out utter garbage. Still, we can already do better than purely random or expert-only recommendations. There are many experiments going on out there to find useful new approaches for discovery, filtering and ranking. And that’s a good thing.
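To make the “algorithms can expand your bubble” point concrete, here is a minimal, hypothetical sketch of a recommender that reserves a fraction of its slots for exploration outside your predicted interests. The function name, the explore_ratio knob and the toy data are my own illustration - not how Google or Facebook actually do it, and real systems use far richer signals - but the structural idea of deliberately surfacing unfamiliar items is the point.

```python
import random

def recommend(user_scores, all_items, n=10, explore_ratio=0.3):
    """Blend personalized picks with exploratory ones (illustrative sketch).

    user_scores:   dict mapping item -> predicted relevance for this user
    all_items:     the full candidate pool, including items the model
                   has little or no signal for
    explore_ratio: fraction of slots reserved for items outside the
                   user's usual bubble
    """
    n_explore = max(1, int(n * explore_ratio))
    n_exploit = n - n_explore

    # Exploit: the items the personalization model ranks highest.
    exploit = sorted(user_scores, key=user_scores.get, reverse=True)[:n_exploit]

    # Explore: sample from everything else, so unfamiliar items
    # still get a chance to surface (the serendipity slots).
    pool = [item for item in all_items if item not in exploit]
    explore = random.sample(pool, min(n_explore, len(pool)))

    return exploit + explore

# Toy usage: a user whose model only knows they like tech articles.
scores = {"tech_article_1": 0.9, "tech_article_2": 0.8, "tech_article_3": 0.7}
catalog = list(scores) + ["poetry_blog", "physics_lecture", "travel_essay"]
print(recommend(scores, catalog, n=4, explore_ratio=0.5))
```

A knob like explore_ratio is exactly the kind of thing different services can tune differently, which is why the coexistence of competing filtering approaches matters more than banning personalization outright.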

The real harm is not personalization - it’s limiting the accessibility and control of information. We need to fight censorship and maintain net neutrality, people need to own their personal data, companies need to provide transparency about how they use it, and so on.

So, folks, keep your information diet healthy: be curious, travel a lot, talk to people - and check your Facebook sometimes, until something better inevitably replaces it. And don’t expect algorithms to commit random acts of poetic terrorism just yet.