TED Conversations

Pavels Jelisejevs

This conversation is closed.

What do you think about personalized content services?


I've recently stumbled upon a TED talk by Eli Pariser titled "Beware online filter bubbles". He talks about how modern content delivery services try to deliver personalized, relevant information and filter out the rest, thus limiting the diversity of content we consume. He calls it a "filter bubble."

I find this topic very interesting because I'm currently developing a similar service myself - http://6feeds.com. It's a service that will recommend news based on your recent activity: what you've read, shared, etc., and help you discover new content. I started working on it because I noticed that the number of stories I have to skim through to find something interesting is just too large. I thought this process could be optimized with the help of modern technologies.
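To make "recommend based on recent activity" concrete, here is a minimal sketch of one way such a ranking could work: build a frequency profile of topic tags from the reading history and score candidate stories against it. All names and the tag-based representation are my own simplification, not the actual 6feeds implementation:

```python
from collections import Counter

def build_profile(read_stories):
    """Count how often each topic tag appears in the user's reading history."""
    profile = Counter()
    for story in read_stories:
        profile.update(story["tags"])
    return profile

def score(story, profile):
    """Rank a candidate story by how well its tags match the profile."""
    return sum(profile[tag] for tag in story["tags"])

# Hypothetical reading history.
history = [
    {"title": "CSS tricks", "tags": ["webdev", "css"]},
    {"title": "Scaling startups", "tags": ["business", "management"]},
    {"title": "Hiring engineers", "tags": ["business", "management"]},
]
profile = build_profile(history)

candidates = [
    {"title": "New JS framework", "tags": ["webdev", "javascript"]},
    {"title": "Marketing 101", "tags": ["business", "marketing"]},
]
ranked = sorted(candidates, key=lambda s: score(s, profile), reverse=True)
# With this history, the business story outranks the webdev one.
```

A real service would of course use richer signals than raw tag counts (recency, shares, clicks), but the principle is the same: the ranking follows the behavior, not a fixed declared interest list.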

I hadn't considered the "filter bubble" problem while working on 6feeds, probably because it was designed to let you fully control your subscription list and add any content you like. But when I think about it, I'm not even sure it's a problem. I think this is a case where the pros totally outweigh the cons. Using such systems doesn't directly limit what information we consume; rather, it helps us save time looking for quality content and leaves more time to learn something else.

What do you think? Is it a problem? Would you feel comfortable using such systems? And how can we address the "filter bubble" problem?



  • Jul 1 2012: I find Pariser's arguments compelling. There is a problem when others are choosing what we can learn. It's a problem if the filter has an interest in promoting certain information to us (for example, when airline reservation systems used to bury their competitors' listings so customers wouldn't see them) or when the filter guesses wrong about what we are interested in. An algorithm that channels what we see so that we see much of the same prevents us from entertaining new perspectives and also truncates the unique connections we could make if more seemingly disparate information floated our way. We get a misleading picture of how our world looks. That one can then dig further is no insurance that we will get as open a view as if no one had obscured certain ideas or content in favor of others in the first place.
    • Jul 1 2012: I think Pariser is missing one thing: such services don't always control what you learn; they follow your behavior. You are still in charge of what you read; you just have to keep doing what you were doing before, and they will assist you. And you still have the ability to turn your interests to something new, and they will catch up with that.

      Information being manipulated for someone's gain is hardly new and is not unique to such services. Every medium has reasons to promote certain content and ways to do it. I'm afraid that's not a problem filters can solve, but it's also not something they cause.
      • Jul 1 2012: If a service promotes your keeping "doing what you were doing before," it reinforces your staying in that box. As you become flooded with more and more of what is in that box, you can easily lose sight of what's not.
        • Jul 1 2012: On the other hand, I've noticed one curious thing.

          I'm a web developer. It's my job and one of my hobbies. So, when asked "what are you interested in?", I usually reply: coding, development, the internet, etc. Yet when I looked at my own reading statistics, I noticed that for some time I had mostly been reading about business, management, marketing and other corporate topics. Only then did I notice how my reading interests had shifted. In a way, I had been constraining myself in the box, and my algorithm knew more about my real interests than I did.
      • Jul 1 2012: Did your algorithm know more about your interests, or did it possibly steer your reading (without your even realizing it until afterwards) toward what you now find you have been reading most about?
        • Jul 1 2012: With experience, it's getting harder to find interesting content about web development on the web, so I filled my reading time with stories on related topics. The algorithm just followed the new trend.

          Certainly, if my algorithm highlights some story, it will draw my attention to it, but I'm not sure whether it can change my interests in the long run.
    • Jul 1 2012: You've mentioned the problem of connecting seemingly unrelated information. That's another problem filters can help us solve. For example, if I'm interested in cooking, would I read an article about physics? I doubt it. That's a long shot. But say a smart content recommendation service introduces me to molecular gastronomy first; that may later lead me to general physics. That sounds more likely.

      A clever content recommendation service would need to think a step ahead of the consumer and guide him to new content. It's ineffective to just offer random out-of-the-box content. Instead, we should offer content "on the border" of the box, so the consumer can choose his next step on his own. That's a problem I'm working on for 6feeds right now.
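One way to make "on the border of the box" concrete is to favor stories that partially overlap the user's current interests: enough overlap to be relevant, enough novelty to stretch them. The scoring rule below is purely my illustration of that idea, not the 6feeds algorithm:

```python
def border_score(story_tags, interest_tags):
    """Favor stories that mix familiar and unfamiliar tags.

    Fully familiar stories (overlap == 1.0) and fully alien ones
    (overlap == 0.0) score low; stories on the border score highest.
    """
    story_tags, interest_tags = set(story_tags), set(interest_tags)
    if not story_tags:
        return 0.0
    overlap = len(story_tags & interest_tags) / len(story_tags)
    # Peak at 50% overlap: relevant enough to click, novel enough to stretch.
    return 1.0 - abs(overlap - 0.5) * 2

# Hypothetical example mirroring the cooking-to-physics path above.
interests = {"cooking", "recipes"}
stories = {
    "Knife skills":         ["cooking", "recipes"],    # all familiar
    "Molecular gastronomy": ["cooking", "chemistry"],  # on the border
    "Quantum field theory": ["physics", "math"],       # all alien
}
best = max(stories, key=lambda title: border_score(stories[title], interests))
```

Under this rule the cook is shown molecular gastronomy rather than either another recipe or raw physics, which is exactly the stepping-stone behavior described above.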
