TED Conversations

Erik Richardson

Teacher, Richardson Ideaworks, Inc.


This conversation is closed.

Aren't transhumanists committing the Jurassic Park fallacy?

Given that even the smallest disruption or perturbation in a complex system can be amplified, and given that so many important aspects of the mind-body interaction in human medicine remain poorly understood, it seems that moving forward with the technological enhancement of human beings, ranging from putting computers inside us to putting us inside computers, is to court the same kind of disasters we always get when we tinker with things we don't yet understand.
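The "smallest perturbation can be amplified" premise is easy to demonstrate even in a toy system. The sketch below (my own illustration, not part of the original conversation) uses the logistic map, a textbook chaotic system, to show two trajectories that start a billionth apart and end up completely different:

```python
# Logistic map: a minimal illustration of sensitive dependence on
# initial conditions -- the amplification dynamic the question invokes.

def logistic(x, r=4.0):
    """One step of the logistic map: x -> r * x * (1 - x)."""
    return r * x * (1.0 - x)

def trajectory(x0, steps, r=4.0):
    """Iterate the map `steps` times, returning the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1], r))
    return xs

a = trajectory(0.400000000, 50)
b = trajectory(0.400000001, 50)  # perturbed by one part in a billion

# Early on, the two trajectories are indistinguishable; after enough
# iterations the tiny perturbation has been amplified to the scale of
# the system's state itself.
print(abs(a[5] - b[5]))
print(max(abs(x - y) for x, y in zip(a[30:], b[30:])))
```

The point carries over by analogy: in a system with this kind of sensitivity, "almost complete" knowledge of the starting state does not buy you long-term predictability.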



  • Apr 22 2011: I think this worry could be applied to any scientific or technological progress whatsoever, and given the amazing success of SCIENCE and TECH as they get more and more complicated, your worries are completely unfounded.

    Computers, more complicated over time? Yes. But we learn how to work with them, create them, and fix them at the same pace, so they have become a crucial part of everyday life and have taken over many tasks that people like you probably fretted about to begin with.

    Here are some amazingly important tasks done by technology that has gotten "more and more complicated" over time, off the top of my head:
    Commercial Jet Autopilot
    Power Plant Management systems
    Guided missiles
    Computer Security

    Is your concern that introducing the biological aspect will create this "less than 100%" control? By the time they figure out HOW this technology works and how to implement it (it won't be that soon), they will be a lot closer to 100% by necessity. In other words, for the Singularity to work, we will certainly have to be much closer to 100% knowledge of both the biological and the technical aspects.

    As they get closer to mapping the brain, the proteome, etc., I think the first opportunities for transhumanism will show themselves in the simplest areas, the ones we understand best, and we will work from there.
    • Apr 22 2011: Besides "guided missiles", which Einstein forewarned would be a product of science in war and which is not what science (even computer science) should be meant for... everything else is nicely put!

      Your last sentence is extremely clever.
      • Apr 22 2011: Yes, like guided missiles, some genies can never be put back in the bottle. How do you propose we stop governments from using your superhero technology (mentioned below) to build killing machines on a scale the likes of which have never been seen? Will our idealism about what science is 'supposed' to be used for save us?
      • May 8 2011: And is the previous approach of area bombing (with enormous civilian casualties when used in populated areas, just to remind you) somehow better?
    • Apr 22 2011: Interesting line of thought, but none of your examples involves the manipulation of existing, permutable, adaptable organic systems. The organic-systems aspect is exactly my point, and the rampant debacle of GMOs is, by contrast, a relevant sample case. Once we start attempting things like elective genetic selection/tinkering and efforts at cybernetic or nanotechnological enhancement, we are talking about a whole different scale.

      To be clear, though, I'm not arguing about whether we should eventually implement some of those adaptations or enhancements. I am arguing that we are nowhere near that point yet.

      The current contamination of the region around the nuclear power plant in Japan is another interesting case in point. We can 'think' we have adequate safety nets in place all we want. Sure, humongous earthquakes and tsunamis are rare, and lots of nuclear plants are online and NOT causing disasters. The point holds that in a sufficiently complex system, we can never account for all the factors, and if we get it wrong in an organic system, we could all be screwed. At least the nuclear plant is still a mechanical system.

      Shall we talk about how badly things would go for the "successful" world of science and technological infrastructure if something that even "acts" like an organic system - in this example the Stuxnet virus - got loose and started to spread, mutate, and cross-pollinate with other computer code?
