Monday, January 16, 2012

The flip side of personalized web results

Google has recently launched a new search feature that integrates social networks, especially Google+, into the search results. Google is thus combining knowledge from the social graph with the web graph, two graphs that until now have largely been kept separate.

This is in itself a very interesting subject, but the initiative is also part of a larger trend on the web: bringing individually tailored content to users. Facebook has long sorted the content in its news feed, Amazon and YouTube give personalized product and video recommendations, and so on. This tendency is slowly changing "the web" into "your web" and "my web".

Eli Pariser, author of the book "The Filter Bubble", sees this tendency as a problem. See the video below (which is from before Google launched its new search improvement). He argues that if our web content gets sorted to fit the things we click on or +1 the most, we won't get a varied picture of the world, but instead stay stuck in our own little bubble. He further criticizes that most of this personalization happens without our knowledge, and without us having any way to switch it off.

Pariser compares the situation to when journalists began to exercise journalistic ethics, and proposes that ranking algorithms should likewise have a sense of social responsibility: ranking societally important results higher and helping to provide a broad picture of the world.

I think Pariser's concerns are highly relevant, and he raises some important issues. There is, however, more to the story. Sorting web content is inevitable, simply because of the amount available on the internet, and it is, as you know, the backbone of Google's success. People already read blogs and newspapers that fit their points of view or interests, and have done so for generations. So do Google, Facebook, etc. make this bubble effect worse? Do they have a responsibility to act as editors?
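To make the bubble mechanism concrete, here is a deliberately simplified toy sketch of click-based personalization. None of this reflects any real Google or Facebook algorithm; the topics, scores, and the 50/50 blending weight are invented purely for illustration. The point is just that once a personal-affinity term enters the ranking, results matching past clicks crowd out globally more relevant ones.

```python
# Toy illustration of click-history personalization (hypothetical,
# not any real search engine's algorithm). Each result carries a
# made-up global relevance score; we blend that with how often the
# user has clicked the result's topic before.

def personalized_rank(results, click_history, weight=0.5):
    """Sort results by a blend of global relevance and personal affinity."""
    def affinity(topic):
        # Fraction of past clicks that share this result's topic.
        if not click_history:
            return 0.0
        return sum(1 for t in click_history if t == topic) / len(click_history)

    return sorted(
        results,
        key=lambda r: (1 - weight) * r["relevance"] + weight * affinity(r["topic"]),
        reverse=True,
    )

results = [
    {"title": "Local election analysis", "topic": "politics", "relevance": 0.9},
    {"title": "Cat video compilation",   "topic": "cats",     "relevance": 0.6},
    {"title": "Climate policy debate",   "topic": "politics", "relevance": 0.8},
]

# A user who mostly clicks cat content sees cats pushed to the top,
# even though both political stories score higher globally.
ranked = personalized_rank(results, click_history=["cats", "cats", "cats", "politics"])
print([r["title"] for r in ranked])
# → ['Cat video compilation', 'Local election analysis', 'Climate policy debate']
```

Note how the bubble emerges from the feedback loop: clicking cat content raises cat affinity, which surfaces more cat content to click.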

If they choose to act as editors, we face a new problem. The newspaper market is diverse, with many players, each with its own interpretation of what matters to society. Together they create a varied media landscape. But since these web companies are such major, almost monopolistic players, we won't get that diversity. How do we ensure that, say, Google's choices are "the right ones", if that is at all possible? The answers aren't easy and mix ethics, philosophy, and computer science. Maybe someone in this class will help solve these issues in a few years…

-T. Bertelsen


  1. Good post. I agree with you that Google and Facebook cannot simply work as journalists, since personal relevance and diversity are both significant characteristics of a successful internet service. This also makes me think of the small world, which involves a tradeoff between local connections (personally relevant information) and global connections (breaking the bubbles!)... You will soon have a lecture on small worlds, so don't miss it!

  2. I really enjoyed reading this post ... It was really well done!

  3. I like your post, as well as the TED talk. It makes me wonder if people can be characterized not only by their network but by the structure of the global network. For example, you can be characterized as social if you have many weak ties, but that alone says nothing about whether you like to have your opinions challenged. If you are serving as a bridge between clusters on the other hand, then it might say something about the diversity of your life: whether it's work vs. home; pottery vs. sports; conservatives vs. liberals.

    I'll agree with Eli that personal control is an important feature (choosing who is at the top of your Facebook feed, or what results show up at the top of a Google search). But to present a different side: maybe people don't like to feel uncomfortable, to have their opinions challenged. Eli is assuming that public discourse is important; that open-mindedness is important; that what he thinks matters (human rights, etc.) is important. And that's just one view. (Are you open-minded about the right of others to be closed-minded?)

  4. I guess this issue would largely go away if Google just publicly disclosed the customization it performs behind the scenes, and gave users the ability to turn it on or off according to their preferences.