TED Conversations

Eli Pariser

Author/Organizer; author of The Filter Bubble (Penguin Press, May 2011)



LIVE TED Conversation: Join TED Speaker Eli Pariser

LIVE conversation with Eli Pariser, TED Speaker and author of The Filter Bubble, a fascinating look at the effects of online personalization.

The conversation will open at 12 noon (Eastern Standard Time) on May 14, 2011, with the question:

What should companies like Facebook and Google prioritize besides "relevance"?

ADMIN EDIT: Eli has requested that we keep the conversation and discussion open past the 1 hour mark. He will be checking in periodically and answering questions, and is looking forward to continuing a great TED Conversation!

  • May 14 2011: Great conversation, everyone -- I've got to duck out, but I'll be back to check in. This is exactly what I was hoping my talk would do -- start a conversation about algorithmic ethics and filtering that could lead to more informed people and more diverse information streams. Thanks!

    Oh, and if you're interested, I've got a book on the topic out this week: http://www.amazon.com/gp/product/1594203008?ie=UTF8&ref_=sr_1_1&qid=1305382795&sr=8-1&linkCode=shr&camp=213733&creative=393181&tag=thefilbub-20

    Look forward to continuing the conversation.
  • May 14 2011: Eli - thanks for provoking thought.

    To what extent do you feel that this 'filter bubble' is a symptom of a larger problem in societal organization? I find that, even in the physical world, people tend to find information which aligns with their existing beliefs (psychologists call it 'confirmation bias').

    People hang out with similar friends, read sympathetic newspapers/websites, etc. Isn't this pattern in social media simply a result of product designers giving people what they already want? I feel the solution may lie more in changing our values than in redesigning technology. While a new algorithm would certainly help, it may just dissuade users from using Facebook (as they don't WANT competing outlooks).

    Thoughts?
    • May 15 2011: Jay, the fact is, as Marshall McLuhan reminded us, values are more often shaped by a medium than by what we instil in it. This happens in part by way of intent, yet more often through the dimensions the medium opens up.

      So, for instance, on one hand it can have people more quickly informed about certain events, like what happened in Egypt, where the media served as a rallying point for those who thought it was time to force change. On the other hand, because of the sheer volume of information, our perceived levels of anxiety and fear get raised above what they were in the past. This is because we are presented with more things to fear, yet we are unfortunately still using personal filters that treat the input as local and immediate. So it's not so much that our values need to change as how we think about the things that shape them.

      “All media exist to invest our lives with artificial perceptions and arbitrary values”
      - Marshall McLuhan

      “Television brought the brutality of war into the comfort of the living room. Vietnam was lost in the living rooms of America--not on the battlefields of Vietnam.”
      - Marshall McLuhan

      “Anyone who tries to make a distinction between education and entertainment doesn't know the first thing about either.”
      - Marshall McLuhan

      “A point of view can be a dangerous luxury when substituted for insight and understanding.”
      - Marshall McLuhan

      http://thinkexist.com/quotation/all_media_exist_to_invest_our_lives_with/152850.html
      • May 16 2011: Phil: "However, there is a distinction to be made between what this is and the concern Eli Pariser is warning about, since censorship and persecution are there for all to see, while Pariser’s threat is hidden and therein subliminal."

        Phil, the point is to be aware of who can become the designers of the Hidden Filters. That is a global perspective on the management of the internet.

        Yes, Eli is talking about Google and Facebook, but I am pointing out that if the potential is that great, then when it comes to the management of the internet and those who manage it, you have to be aware that they, too, are the Gate Keepers.

        That is my question, about what is hidden from view.

        This is not only about the bubbles some are involved in; people can remain unaware of the bubble they themselves become by being wrapped in the larger bubble of understanding called the internet.

        Again, I would point back to what I said about emergence and the algorithm: your being, and what is attached to all that you can and will become, given the parameters of the internet and what it will allow.

        Phil: "However, it is also to ask, did you truly think that such a powerful media as this, was going to escape the same scrutiny, control and yes even manipulation that the others have faced throughout the ages;"

        You also missed the point that whoever owns the medium owns the message. :) The dangers are thus magnified by the question of who owns the internet. Who owns it, Phil?

        I am not unaware of the political potential that can be realized by using the internet and social media to help people become aware. But imagine if one were to say that this is not right, and so the political message I have is not appropriate according to the "Masters of the Internet"?

        Whoever owns broadcasting stations can have a political bent too, one that allows manipulation of their audience. You have to be aware of that too. :)
      • May 16 2011: White space is an important subject when it comes to frequencies and to those who would design their own equipment. Copyleft hardware development, apart from the big telecoms?

        Richard Stallman had mentioned the possibility, at a municipal level, of taking on the creation of the hardware. A choice other than the big telecoms seems to me the right thing to do for accessing knowledge without charge or discrimination.

        The universal library. Google might call it Google Books, but that has always been my point about access to information: access to the library.

        Best,
  • May 14 2011: There's not much more that can be said beyond what Eli Pariser has warned about. The only thing I would add is that much of the bad filtering results from the imprecision and/or laziness of the inquirer. So, for instance, when Eli talks about the different results each of his friends got when they queried "Egypt", I'm not so surprised. Google does give you a scrolled list of options when you first start to enter a query, with mine listing Egypt news, Egypt, Egypt protest, Egypt crisis, Egypt gods and Egypt riots. Further, one can refine the results, either by stating more clearly what you are looking for, or by proceeding with an advanced search, which can narrow it further, including the ability to use Boolean logical operators.

    This is just to point out that what one gets is not simply dependent upon the intentions of others, but also upon actions of our own; an old computer analogy for this being "garbage in, garbage out". So it comes down to what Marshall McLuhan reminded us: all media do is create a space of possibilities which never existed before, and it's up to each of us how it will be utilized, both collectively and individually.

    “a light bulb creates an environment by its mere presence”

    - Marshall McLuhan, "Understanding Media: The Extensions of Man" (page 6)
    • May 16 2011: The question is who has become the gatekeeper if government has extended its reach into the issue that I linked with regard to Michael Geist. I could not comment on your statement above, so I had to do it here.

      "To be clear: I do not believe that the Harper government is plotting to criminalize the Internet itself. Hey, Lawful Access started as Liberal legislation! But whoever wrote it, it’s a terrible and stupid piece of law, and one that would never have survived committee in one piece. But Stephen Harper has promised to ram this stuff through, and now he has the majority to do it. See: Will anonymity and hyperlinks be illegal in Canada?"-
      http://www2.macleans.ca/2011/05/10/will-anonymity-and-hyperlinks-be-illegal-in-canada/

      In essence, by definition then, is government considered a gatekeeper?

      http://youtu.be/1c4nc0dBzKY

      If you had been following the Usage Based Billing issue, the wish of big telecom, then who has become the Gate Keeper? The CRTC is the decision maker, yet strong opposition from the public has made it clear they do not want UBB.
  • May 14 2011: One has to admit that such filtering does make things more convenient for the individual. Surely as much as these companies have the responsibility to not over-filter what we see online, it is also up to us to perhaps sidestep these filters and explore beyond what is conveniently placed in front of us. Personally, I believe we can't depend on these businesses to help us out of ignorance! Raising awareness that what Google and Facebook show us isn't all there is to the world and letting people decide if they want to get out of their own filter bubble might just work better than persuading companies to think of the greater good. And your talk served just that purpose, if only for me! :)
    • May 14 2011: Thanks! Yeah, I agree -- it takes two to tango here, and our desire for personalized tools is part of the problem. It just worries me that most people don't see this happening at all -- at least when you go to FOX or MSNBC, you know what's being left out.
      • May 14 2011: So do you reckon the priority is to make these internet companies more humane, so to speak, or to focus on getting the masses to understand the possible problems of using these services?
        • May 14 2011: Well, I think the two go together. If consumers change what they're looking for in an information service, the companies will have more pressure to respond. And meanwhile, I think it's possible to directly call on these companies to pay more attention to this -- a lot of people who work there really *want* to be doing good, and if enough of us call on them to step it up, I think they will.
  • May 19 2011: Another case of "The road to hell is paved with good intentions"? I personally encountered automated filtering very recently when I moved - and was practically unable to find a shop from my old neighbourhood in Google.

    I would be personally very interested in a follow-up activity. As a start: a Google co-founder is a member of the TED Brain Trust - shouldn't it be possible to convey an "idea worth thinking about" from this TED conversation directly? Google's credibility was built on making transparent the difference between search results and sponsored ads. So in a way the automated filtering leads Google away from its own core principle - why wouldn't a company understand the risk of that? Two obvious points:
    1) If you filter, make it transparent ("this search has been filtered and personalized to be more relevant for you")
    2) Give the user the option to shut off ALL filters

    Beyond that, an automated algorithm requires strong assumptions about what increases the relevance of information. But only some users will end up in a happy pink bubble - for others the cage is both visible and annoying. So why not offer users the chance to influence the filters (and not with useless advanced search options for language and region)?

    For this suggestion, a very relevant concept to consider is the diverse roles we play in life - which change even within one day. No computer-based algorithm will ever detect on its own that five minutes ago I was looking up "Egypt" to book a business trip, but that I am now on my lunch break and want an update on the political situation.

    Filters that might be helpful if they are both visible and user controlled might include "I need..."
    - Background information
    - Shopping options
    - local answers
    - ....

    Advocating a "morally correctly biased" filter is just another form of censorship. But transparent, user-driven filtering might even make us a bit more aware of our choices and the hidden algorithms of our brain.
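
    One way to picture this kind of transparent, user-driven filtering is as a settings object the user can see and edit at any time -- a minimal sketch only, with hypothetical names, not any existing Google, Facebook or Yahoo control:

    ```typescript
    // Hypothetical sketch of visible, user-controlled filter settings along the
    // lines suggested above; the type and field names are invented, not a real API.
    type InfoNeed = "background" | "shopping" | "local" | "news";

    interface FilterPreferences {
      personalizationEnabled: boolean; // point 2: one switch to shut off ALL filters
      showFilterNotice: boolean;       // point 1: "this search has been personalized"
      need: InfoNeed;                  // the role the user is playing right now
    }

    // Same query ("Egypt"), but the user has flipped the need from booking a
    // business trip to following the political situation over lunch.
    const lunchBreak: FilterPreferences = {
      personalizationEnabled: true,
      showFilterNotice: true,
      need: "news",
    };

    // Fully unfiltered search, if such a neutral baseline exists at all.
    const unfiltered: FilterPreferences = {
      personalizationEnabled: false,
      showFilterNotice: false,
      need: "background",
    };

    console.log(lunchBreak, unfiltered);
    ```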
  • May 16 2011: Dear Eli

    I enjoyed your talk very much. It made me think that keeping this channel as objective as possible is partly our responsibility. This starts with the user being aware of the aspects of personalization (a useful tool) that compromise the objectivity of the information fed into our searches. When we are conscious of these issues, we have greater control over how we conduct our searches and respond to the information we receive.

    I would like to see Google (and other search engines) give us the option to enter these criteria at the moment we conduct the search. This way we can make sure that we will receive the type of information we are looking for at that particular moment.

    Yours truly
    PR
  • May 14 2011: Eli - I deeply appreciate your commentary at the end of your presentation regarding the internet being a place to connect to other thoughts, perspectives and ideas from around the world. When I first logged on, I remember feeling like an astronaut exploring the vast human knowledge base. Nowadays, I primarily get advertising for cars and an exploration of things all too mundane and familiar.

    To return to that sense of exploration, I believe the most important factor would be, as you stated plainly, to have transparency in the filtration process. Filters, in and of themselves, aren't bad - but merely tools to get product messages across to potential customers.

    In the real world, when I go shopping for something, I filter using an active process in my mind based on mood, temperament, budget, etc. I think the web would benefit from having filters that you click and un-click, similar to how you can manage what Netflix is showing you. That way, if you were in the mood to explore outside your bubble, you could easily do so, and if you just need to buy your car, you'd be able to do so immediately.

    In terms of other factors to filter for - significance (as defined by physical, biological, social and ecological impact on a large scale), innovation (cancer cures, scientific breakthroughs and business), international relevance (to help generate a global citizenship) and domestic relevance (to help build a sense of community). These are only my thoughts and suggestions - and I don't know that I wouldn't reorder their importance at some different moment in my life.
  • May 14 2011: One thing a user could do to curb this is to conduct some searches, or choose search results, that are far out of the norm. This could expand the scope of the parameters, although it may be treating the symptom, so to speak.

    Also, an interesting result of these filter bubbles is being able to identify the new 'borderless countries' that are forming.
  • May 14 2011: I don't think the problem is so much that these algorithms exist; it's that they are invisible and not adaptable by the user. If Google simply had a settings column running down the side of their main page, showing what information they are using to find the answers you're looking for, then this problem would basically be a moot point, at least with search engines.
  • May 14 2011: Several weeks ago on my Tumblr, I saw a post comparing the popularity of two tags on the site: #Libya and #LOL. The bars representing the posting rates were significantly in favor of #LOL. This occurred only a short while after the uprising and civil war were all over the news. However, even after such a major event, people quickly returned to their entertainment.

    I would like to present the idea that filter bubbles are not significantly at fault for the cultural 'sweet tooth' that is sweeping the internet. Rather, this is a culture that thrives on entertainment and immediate gratification more than it does on serious contemplation. The filter bubbles are simply taking advantage of this to please the public.
    In the lecture, you said that the filters also needed to show information that was "important, uncomfortable, challenging, and from other points of view". I completely agree. However, this type of information is what people -need-, not what people -want-. The up-and-coming generation craves immediate information because it's what we've been brought up on. True nutrition for the mind won't sell as well.

    I really enjoyed your presentation, Mr. Pariser. The world needs more people like you.
    • May 14 2011: Thanks for the kind words. I'm slightly more hopeful -- I think our long-term, aspirational selves are as aspirational as ever about being well-read, thoughtful citizens of the world. It's just that the Internet's making it easier than ever to indulge our short-term selves. The long-term ones need some help fighting the battle.
  • May 14 2011: I just watched the TED presentation - excellent job! You made a very clear request to the companies. What response have you received from Google, FB, HuffPost, WashPost, etc. to address the filter bubble you identified?
  • May 14 2011: I make a point of varying my sources in the information overflow myself, and agree it doesn't hurt at all to seek out more information. I will visit sites, and sign up for sites, with opposite views of no relevance to me all the time. It is always good to know others' opinions and to look at the varying views... For the average person, keeping up with everything needed to protect personal data is insane in the information-overflow era. I think it is up to the people, companies and government to implement a long-term plan for doing this. How to make it beneficial to companies and government is the big question. It can be expensive and cost both of them benefits....
  • May 14 2011: The problem with personalization is that it is not discovery oriented. It assumes that people want what they have already discovered, things that are similar to what they already know or feel comfortable with. It does not encourage growth or experimentation. It may be that in the short term people do buy what they are being presented with, but eventually the thrill will wear off and they will be out looking for the unexpected, that which makes us feel alive. So what is happening, through much of this technology, is that our interests, and therefore our purchases and experiences, are narrowing. And that ultimately reduces our tolerance for others, so on a social level it could be (and often is) a negative thing encouraging tribal thinking.
  • May 14 2011: Does Google apply the filter if I don't sign in? Is there a way to be anonymous so that the bubble or filter does not kick in? Or is it all tracked by IP address or something like that?
    • May 14 2011: Google has two levels of personalization -- one based on your web history and login information, and one based on the 57-or-so signals I mentioned in my talk that are available even if you're NOT logged in. Those include what kind of computer you're on, what browser you use, and your IP.

      You can turn off the first level, but you can't turn off the second -- to some degree, wherever you are, Google will be personalized for you.
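
      A minimal sketch of the kind of logged-out signals being described here. The exact set Google uses is not public, so the fields below are assumptions; a page script can read values like these with no cookies and no login, and the server sees the IP address when the request arrives:

      ```typescript
      // Illustrative only: a few browser-visible signals a site could read with
      // no login and no cookies, in the spirit of the "57 signals" mentioned in
      // the talk. The real set Google uses is not public; these are assumptions.
      interface AnonymousSignals {
        userAgent: string;      // browser and operating system
        language: string;       // preferred interface language
        timezoneOffset: number; // minutes from UTC, a hint at location
        screenSize: string;     // display resolution
      }

      function collectAnonymousSignals(): AnonymousSignals {
        return {
          userAgent: navigator.userAgent,
          language: navigator.language,
          timezoneOffset: new Date().getTimezoneOffset(),
          screenSize: `${screen.width}x${screen.height}`,
        };
      }

      // Even this short list, combined with the IP address seen server-side, is
      // enough to personalize results to some degree.
      console.log(collectAnonymousSignals());
      ```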
      • May 14 2011: Does this mean, then, that Google is assuming things about certain types of users? As in, if I use a Mac, I'm more likely to think X way than someone using a PC? Or if I use Firefox I'll have different needs/interests/beliefs than someone using IE or Chrome? If I'm in Brooklyn I'll have different political views than someone in Detroit... etc?
  • May 14 2011: Hi everybody -- I'm on, and looking forward to your questions/thoughts/provocations/critiques. Bombs away!
  • May 14 2011: I am curious as to whether there's anything we can do about the "filter bubble" as individuals. Can we take back control over our own information and what we are able to access?
    • May 14 2011: Hi Noelle --

      As far as turning off personalization goes, I've compiled some of the easiest ways here:
      http://www.thefilterbubble.com/10-things-you-can-do

      The challenge is, right now the people who are pushing automatic, invisible personalization have far better technology than the people trying to give consumers control of it. So, for example, even if you turn all of your cookies off and surf anonymously, it's still possible to track you by signals your individual computer gives off. And it's totally possible that sites like Google use those signals to adjust what you see.

      That's why I think the long-term solution requires action from these companies themselves, and possibly from the government.

      The other thing you can do, though, is vary your information routine. You'll never go wrong seeking out more diverse and challenging viewpoints.

      --Eli
      • May 14 2011: Haha I like the idea of looking up random things just to confuse my search engines. Sounds like a fun challenge. Thanks! :)
        • May 14 2011: My concern at this moment is how the "signals" coming from my computer are being interpreted. I live in a not-so-great neighborhood and use an outdated computer. Does that mean that I am going to be filtered differently than those in a higher tax bracket? The subtle implications of that are pretty scary. If I do turn off my personalization filter, then what assumptions are being made about me?
        • May 14 2011: Eli meant that even if you're using sites like Google/Bing with the highest anonymity a user can achieve, they can still match your IP/MAC address and browser with your activities, and will still provide you with some kind of personalization, although on a small scale, by building up metrics of your location (IP/MAC combination) and your surfing habits.

          Although, if you're already logged in, like checking Gmail or posting on Facebook, anonymity drops substantially, almost to zero, and then you'll get what >they< think is most relevant to you.

          Btw, living in a not-so-great neighborhood may have some effect, but not an 'outdated computer', as they can only see your browser, operating system, IP/MAC address and bandwidth (not the hardware in the majority of cases, unless you installed something).
      • May 14 2011: Whoa... Eli, in response to you saying:

        "That's why I think the long-term solution requires action from these companies themselves, and possibly from the government. "

        Hold on there... really think about what you're saying. In your video you cited "what the internet originally meant to you" as a place of equality, people having a voice, etc. Seeing something that you think needs to be solved, and hoping that the GOVT comes in to solve it is exactly the antithesis of the open spirit you do indeed yearn for.

        The answer is, in fact, in the hands of we the people. There are ways to use these algorithms in our favor, to aid us in expanding our horizons. Remember, Google is NOT the government. We DO in fact have the ability to respond to fix our own, personal situations.

        Lobbying, petitioning, etc, is the wrong solution in this case.
      • May 15 2011: Canadian government plans to outlaw internet linking
        Bill C-51: Investigative Powers for the 21st Century Act
        http://www.parl.gc.ca/About/Parliament/LegislativeSummaries/bills_ls.asp?Language=E&ls=c51&source=library_prb&Parl=40&Ses=3
        • May 15 2011: Plato, you speak of laws of government, suggesting that governments are the gatekeepers we should be more wary of as to what gets in and what does not. However, this is not a new concern, for as soon as someone made a painting on a cave wall, the tribe could decide to have it erased or the artist persecuted for drawing it. This, then, is not a problem relevant to the form of the media but, as always, to those who would have it serve them.

          That is, there is a distinct difference between persecution or book burning and the filters which Eli warns about, as the former are there for all to see, while the latter lie hidden and subliminal. I would say we have means to deal with your concerns (constitutions, supreme courts, etc.) and yet none for those which are Eli’s. So the bottom line is to recognize that as the media changes we must adapt to those changes, and to help guide us we need new prophets, with Eli Pariser being just one of the newest of many.

          "The medium, or process, of our time - electric technology - is reshaping and restructuring patterns of social interdependence and every aspect of our personal life. It is forcing us to reconsider and re-evaluate practically every thought, every action, and every institution formerly taken for granted. Everything is changing: you, your family, your education, your neighborhood, your job, your government, your relation to "the others." And they're changing dramatically."

          -Marshall McLuhan, The Medium is The Massage (1967)
  • May 20 2011: Well, they should make it clear to all of their software users that they are applying this "filter bubble" effect, and give users the freedom to turn it off when they want, because I really want to cancel this effect. Thank you.
  • May 18 2011: I suppose the good news is that, with the currently existing online filters, random information is filtered based upon my current lifestyle, demographics, etc.

    It’s not perfect, but at least with the internet, if I force the search parameters, I can access any information on any subject out there. By continuously forcing the search parameters I will ultimately shift my filter profile to provide me with information that is more relevant to my needs, regardless of what my demographic profile might indicate.

    I did become concerned when he brought up newspapers as a model for shaping internet filters in the future. Newspapers currently reflect the bias of the individual reporters, their editors and, ultimately, the owners of the paper who recruit, hire and fire these individuals. Whether they are writing an opinion piece on global warming or reporting a liquor store holdup, the writer inevitably inserts their bias into the story.

    The internet is the last place where people can go and dig up facts on issues in order to become fully informed. If you search for enough sources on any topic on the internet, you can eventually get to the real issues in order to make a decision.

    Eli’s opinion is that the current algorithmic filters in place on the internet have some folks getting only information ‘junk food’.

    In his opinion.

    Eli thinks that the filters should be modified so that folks get a balanced diet of socially relevant information. Who determines what’s socially relevant? Eli Pariser? Barack Obama? George W. Bush?

    I always get concerned whenever I hear someone pontificating on what’s best of anything for anyone. I know what’s best for me and I defer to Eli to determine what’s best for him. While his motives are admirable, if we let individuals begin to determine what information the rest of us can access, the law of unintended consequences will reign supreme and, ultimately, the internet will end up like every other form
  • May 16 2011: Can we start (or has someone already started) a petition to get this done? I would like my Facebook and my Google settings to include the checkbox preferences that you included in your presentation. Let me know if there already is one, so that I can sign it and promote it! Thanks. (I was thinking of a site like change.org or avaaz.org) - maybe someone could throw your presentation into a 1:30 info-graphic/kinetic text video, so that we can get it going (I would love to do it if you can send me those sweet graphics you used!) - I have a feeling many millennials, Gen Xers, and Boomers out there will get behind this movement!
  • May 16 2011: Hi Eli. I enjoyed your TED talk. My comment is about crystallization of diverse perspectives as a complement to local filtering:

    I think you presented some very important observations. I do not, however, completely agree with your conclusions. I believe another exigency for personal development and social responsibility is that the current universe of information filtering makes it more important than ever that one seek a diversity of perspectives, embrace the tension that such diversity often brings, strive for a balance of critically-resilient perspectives, and crystallize these different perspectives into a more complete understanding of our local engagements with the world in which we live. In other words, as long as we compare our own learning (e.g., filtered by various web-based sources of information) with that of other people who we know to be different, we will be just fine. In fact, we will thrive.

    Consider, for example, what you did when you asked your friends to do the same Google search? It certainly helped you discover and demonstrate how local (person specific) the filtering is and how different the results can be. I wonder if it also is the case that the two sets of search results together provide some insight that is greater than the sum of the two parts. Does it, for example, reveal to a person in one locality (particular collection of interests, not just local geography) how a person in another locality might view an event of common interest? Even if this is just a hypothesis (and if only constructed by a computer), isn't this interesting in its own right? If so, perhaps we should emphasize the importance of connecting diverse localities and coming to understand the differences in perspective as much as coming to understand how computers or advertisers think about our demographic.

    I think the world is much more interesting if viewed multi-locally (i.e., as diverse) instead of globally (i.e., as potentially flat or homogeneous). What do you think?
    • May 16 2011: I don't agree in the least with invisible filters; the internet is my free zone, away from corporate interests dictating "what" I want to see. If I want these filters, it should be my choice and I should indicate my preference. Neither Google, Yahoo nor Facebook knows why, or what I am thinking about, when I am doing a search. I wear many hats, and sometimes my searches relate to work, business, being a parent, my own interests, etc. I am a complex human being and what I do is not clear cut. So for Facebook or Google to even understand my "internet brain" is a task my husband has not been able to conquer in years. Filters are great tools in "my" hands. Facebook and Google are too "big" for me to trust. They have many other interests ($$$) pulling at them. My internet searches should be determined by me, and if filters are the new thing, Google, Facebook and Yahoo can do what they do best: build fantastic filters for the user. I do not believe we need to pave the way for corporate control of what we are exposed to on the internet. So far, special interests with deep pockets have inched themselves into the internet, invading the free zone. I have noticed that advertisements and commercials have tripled in the past two years. So no thank you, I don't need Google or Facebook to control content, because as I see it, they have become a "BIG" corporation catering to special interests worldwide, not just in the United States. I also think 20 years ahead, and ask what I would want for my grandchildren when they log onto the internet. What we accept silently today might be what we fight against in Congress tomorrow.
  • May 16 2011: Information should be free. That is why libraries, in the Age of Google, are so very precious.
  • May 15 2011: The transparency of filters and recommendations is always a challenge. However, would you, Eli et al., agree with a simple and pragmatic solution allowing the user to be made aware of a filter when it is applied, with the option to switch it on/off to get either the personalised results or the unbiased neutral version - if there is such a thing at all?
    YES / NO / MAYBE

    As a product designer, I am currently working on a discovery solution myself, and my point of view is that most content portals unfortunately do not offer any filters to personalise the service, which quite often results in information overload, with the knowledge you are looking for, and which is relevant to you as an individual, lost in the noise of streaming and dynamically changing content.
  • May 15 2011: Who are the Gate Keepers?

    http://www.michaelgeist.ca/content/view/5794/125/
  • May 15 2011: Hi Eli!

    Thanks for a very informative talk on TED, and also for your participation in the political game here in the USA that so greatly affects so many people around the globe and here at home. Regarding Facebook's and Google's responsibility to the public, it can only be held in place by us, the users of their systems. Of course it would be great if they kept us all informed of what they do and when; being Swedish and having grown up with strong consumer protection on everything given away or sold in the country, I know that helps everyone in the end. I think all of us living here in the US must get more active in citizenship and participation in "The American Dream" before it's all gone. So great to see a young guy like yourself taking a lead in the game of life. Best of luck, and I will follow you here and on Facebook.

    Kent
  • May 14 2011: I may be a little off on the specifics of this, but I think that, over time, "relevance" indicators will hopefully evolve to take into account more granular data. For example, HTML5 enables more possibilities for algorithmically tracking sub-elements in HTML pages (yes, I know there are privacy issues to consider here...). If those sub-elements are better tracked, it can improve the overall heuristic that informs what pages make it into your information bubble. That said, it seems that there could also be better ways to allow the user to proactively inform the filters via new web-based frameworks/standards that currently do not exist. For example, there should be a better way for users to give feedback about the link relevancy of certain elements of a page rather than the page as a whole. If HTML allowed this, it would be nice to "like" a single sentence of a page rather than a whole article, etc., and then have the like weighted based on some standard. Alas, we're not quite there yet where HTML/HTTP standards and ontology are concerned.
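
    A rough sketch of that element-level feedback idea -- the data-likeable attribute, the /api/element-feedback endpoint and the weighting are purely hypothetical, since no such HTML/HTTP standard exists:

    ```typescript
    // Hypothetical sketch of "liking" a single paragraph rather than a whole page;
    // the attribute and endpoint names are invented for illustration only.
    document.querySelectorAll<HTMLElement>("p[data-likeable]").forEach((el) => {
      el.addEventListener("dblclick", () => {
        const feedback = {
          pageUrl: location.href,
          elementId: el.id,                       // which sub-element was liked
          excerpt: el.textContent?.slice(0, 120), // a snippet for context
          likedAt: new Date().toISOString(),
        };
        // A relevance heuristic could weight this far more precisely than a
        // whole-page signal.
        void fetch("/api/element-feedback", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify(feedback),
        });
      });
    });
    ```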
  • May 14 2011: What should companies like Facebook and Google prioritize besides "relevance"?

    They should embrace the spirit of open source :-) and open service.
  • May 14 2011: I like the question (tough and interesting) and your talk.

    My answer is... it depends. It depends on what the subject matter being searched for is. For many topics, the answer most relevant to the user *should* be on top. For example, if I do a lot of searches regarding programming languages, searching for "lisp" should probably show me results regarding Lisp, the computer language, whereas if I were a speech pathologist, I would expect my results to relate to speech.

    That said, there are questions/queries that don't have clear-cut answers, or more importantly, they have conflicting answers. Primarily, I believe the two main topics in this regard are politics and religions (and related topics such as morality, ethical conduct, evolution vs creationism, etc.).

    For such searches, the order of the results should still be relevant to the user; however, in addition to the regular results, there should be a results area which directly lists opposing/conflicting/alternative beliefs and viewpoints on the same topic. Impartial/multi-source information (like Wikipedia) should also be prioritized regardless of the user's viewpoint, so that such a source is always on the first page of results (preferably in the top 3) when available. In theory, even if a Wikipedia article is inaccurate/biased, it would evolve and be corrected over time, unlike the vast majority of articles and blog posts on the internet.
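
    One way to sketch that mix -- the result shape and the "pin an impartial source into the top three" rule follow the proposal above and are not any search engine's actual ranking code:

    ```typescript
    // Hypothetical sketch: keep the personalized ranking, but promote one
    // impartial multi-source result (e.g. Wikipedia) into the top three and
    // surface opposing viewpoints in their own visible results area.
    interface Result {
      url: string;
      title: string;
      impartial: boolean; // e.g. Wikipedia or another multi-source reference
      opposing: boolean;  // tagged as an alternative/conflicting viewpoint
    }

    function assemblePage(ranked: Result[]): { main: Result[]; alternatives: Result[] } {
      const main = [...ranked];
      const impartialIdx = main.findIndex((r) => r.impartial);
      if (impartialIdx > 2) {
        const [impartial] = main.splice(impartialIdx, 1); // remove from its old slot
        main.splice(2, 0, impartial);                     // pin into the top three
      }
      const alternatives = main.filter((r) => r.opposing);
      return { main, alternatives };
    }
    ```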
  • May 14 2011: According to self-identity theory, a person is a combination of an individual-self and a collective-self. The entire western world is moving towards an individualization of the self. Young adults move out of their parents' homes, not because of lack of space but out of a desire to be on their own. There is an increasing number of pets kept as company instead of another human being. Maintenance of the collective-self of persons has remained a duty of the government and larger organizations. These were examples in the material world. Whereas, in your argument, the virtual world of information is isolating individuals in their own bubble.

    First, the development of the internet is following the pattern of western development. Second, even theoretically, one individual cannot be virtually present in many communities. For instance, if I got newsfeeds about Afghanistan/Pakistan, Egypt, the Japan nuclear crisis, NASA discoveries, the Euro football championship, NBA matches, cricket matches of Australia and the West Indies, Russian spy involvement in Georgia, Chinese growth, etc., what could I make out of them?

    I don't want to be locked in a room, but I don't want to be thrown in the space too.
  • May 14 2011: Great Talk!

    I just wrote a paper on interactive communal digital displays and mentioned "Filter Bubbles."