
Cutting through internet "filth" by applying Eric Berlow and Sean Gourley's algorithmic analysis.

Throughout their talk, I had a vision of using this mapping network as an updated search engine. Think of it, if you will, as a purely statistical approach to a highly biased system. YouTube, for example, relies on the user to apply tags to a video for it to come up under specific sections and topics, and that information often fails to capture the essence of the video. Using this model, I could navigate visually from one talk to another with greater speed, accuracy, and relevance, based on web-like interconnectedness. Since algorithmic analysis circumvents user-supplied titles and tags, we would be left with a better understanding of what is actually in the video, rather than relying on the uploader to tell us.
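The idea of sidestepping user tags with a statistical reading of the content itself could be sketched like this. Everything here is invented for illustration (the talk names, the toy "transcripts", and the choice of term-frequency vectors with cosine similarity as the measure); a real system would analyze full transcripts and far richer features:

```python
import math
from collections import Counter

# Hypothetical talks with toy "transcripts" standing in for real content.
talks = {
    "ecology": "food webs networks species ecosystems networks",
    "war":     "conflict insurgency data networks mathematics",
    "cooking": "recipes flavor kitchen ingredients taste",
}

def tf_vector(text):
    """Term-frequency vector of a transcript (bag of words)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def neighbors(title, k=2):
    """Rank the other talks by content similarity, not by user tags."""
    src = tf_vector(talks[title])
    scored = [(cosine(src, tf_vector(text)), other)
              for other, text in talks.items() if other != title]
    return [name for _, name in sorted(scored, reverse=True)[:k]]
```

With this, navigating from the ecology talk would surface the war talk first, because both discuss networks in their actual content, regardless of what either uploader chose as tags.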

Secondly, the amount of time I've spent trying to research something on the internet is horrendous. Most often, I'm handed bogus sites that are traps for my mind and my wallet. Imagine a search engine that bypasses the corporate "first suggestion" and instead performs a content valuation based on interconnectedness and idea proximity. Analyzing a website's content before actually traveling there might let us thread the needle past the "bad sites" and reach the open-source information we all know and love. Let me know if I've left gaps; this is my first conversation on TED.
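The "valuation based on interconnectedness" part could be sketched with a PageRank-style score over a link graph: sites that the rest of the web points to rise, and isolated traps sink, before anyone visits them. The site names and links below are invented, and real engines combine many more signals than this:

```python
# Toy link graph: which sites link to which. "spam-trap.biz" links out
# but nothing links back to it, so interconnectedness should rank it last.
links = {
    "open-notes.org": ["wiki.example", "data.example"],
    "wiki.example":   ["open-notes.org", "data.example"],
    "data.example":   ["wiki.example"],
    "spam-trap.biz":  ["wiki.example"],
}

def pagerank(graph, damping=0.85, iters=50):
    """Iteratively redistribute rank along links (power iteration)."""
    rank = {page: 1.0 / len(graph) for page in graph}
    for _ in range(iters):
        # Every page keeps a small base score, then inherits shares
        # of the rank of each page that links to it.
        new = {page: (1 - damping) / len(graph) for page in graph}
        for page, outs in graph.items():
            share = damping * rank[page] / len(outs)
            for out in outs:
                new[out] += share
        rank = new
    return rank
```

Here the self-promoting trap ends up with the lowest score no matter how it titles itself, which is the same spirit as circumventing user tags on videos: let the structure speak instead of the uploader.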

  • Sep 28 2013: I'm sure it's in a lot of other people's minds too after that amazing talk. Eric and Sean themselves have probably thought of this while they were developing the algorithms. One day it may be an intrinsic feature of search engines.