Should humans ever limit the development of a technology?

What line must a technology cross for it to be declared illegal? There are 3D printers that can assemble buildings, flying robots that can survey entire countries, and artificial organs that can dramatically extend the average human lifespan.

Should the development of any technology be limited in order to preserve opportunities for human workers, to restrict weapons development, or to force people to live naturally?

Please observe, explain and illustrate.

  • Aug 30 2012: No... We don't need no water, let this mother funker burn :p

    Human beings are nowhere near as evil or destructive as the governments which attempt in vain to keep them in check. I trust you with a nuclear weapon... You feel like killing millions of people? You feel like living with that for the rest of your life? Didn't think so. I'm not worried about individuals... I'm worried about groups.
    • Comment deleted

      • Aug 31 2012: It's relatively simple. I agree with Dr. Zimbardo wholeheartedly... I would simply add that any group of more than one individual experiences a sense of shared responsibility. By feeling like you are part of a group which is doing something evil, you feel only "partially" responsible. Through this partial responsibility a human being can rationalize all sorts of crazy and terrible things.

        I don't think a human being has ever been sitting alone in a room and thought, "I'm going to kill millions of people"... What happens is that groups lead to a team, or "us and them," mentality, which allows you to distort your own built-in moral compass. I think like-minded individuals can work together to solve problems without the institution of force, which is often a tool of both governmental and religious authorities.
        • Sep 1 2012: David, if you don't believe that any one person alone would decide to do evil, who comes up with the evil idea in the first place? A group of people just get together and all come up with the same idea at once? SOMEBODY has to be the catalyst. Even within a group that didn't have the idea before, someONE was a catalyst for the idea to begin with.

          And I have to agree Don has a point about "God says do no evil". But as a retired military professional, I have to question whether there are some times I don't have a choice to not "commit evil" regardless of how "pure" I may want to be.

          There are only TWO WAYS you can win a "war". If someone commits evil against you first and you end up "warring" with them, then in order to get that war to end, you must achieve one of two objectives:

          1. Eliminate the enemy's ABILITY to continue the war against you, or

          2. Eliminate the enemy's WILL to continue the war against you.

          Sounds easy, right? I can do the first one by physically bombing the bejeezes out of them and destroying enough of their resources that they finally can't continue anymore. Yes, I might use technology to do that. But we've had atomic bombs for over 60 years now, and they were used twice. And those two times scared the hell out of everybody, at least enemies with critical thinking capability, so badly that we have never used them since. We CAN learn from fear.

          I can do the second one through other, less violent means, where they may not be harmed as the "evil act" occurs. But what if it is impossible to DO that? What if they have been convinced that it is THEIR duty to sacrifice themselves to achieve their goals? Kamikaze pilots during World War II... or a terrorist of today who is willing to try to destroy you at all costs to himself, regardless of how much "diplomacy" you may want to use to try to reach a resolution?

          Pretty much leaves you with option #1. BUT, I CAN use #1 in a reduced manner. We haven't "nuked" anyone recently.
      • Sep 1 2012: Sorry Don, this is a response to Rick, but there's no reply button.

        I think the evil person is the one who says "let's form a group". That's the person who wants to wield power over others. That is the human being who has a goal they cannot accomplish on their own. That is the human being who, rather than asking you to help with a problem, creates an "us", and makes it your responsibility to solve the problem.

        People who think they are leaders are evil, in my humble opinion, Mr. Ryan. The United States of America was founded as a republic, not a democracy... and we chose the word "president" very carefully. It means "to preside"... An American politician was designed to watch as the rule of law was carried out; basic law, obvious law... We were not a country which intended to write many laws.

        This was once our strength... a country which lacked a lord and contained simply a president... someone who would preside over our affairs and ensure only that we avoided infringing on one another's life, liberty, and pursuit of happiness. The nation of Afghanistan did not attack us. We are on their land; we changed the rules, not them.

        If a nation wishes to declare actual war... I trust our president to preside over the scorching of earth.
      • Sep 1 2012: I did veer off a bit there. I'll start that conversation next time I'm on, or you can feel free, and I will respond. I did begin, however, by saying no to his question, "should human beings limit technology"... My reasoning required me to explain that I would prefer us to be individual humans being... as opposed to human beings. I don't think a group of us really needs to do anything, even something which may seem to have a noble purpose, like limiting technology.
  • Aug 30 2012: We should focus on the human being. If we are a society where there is a proper balance of patriotism, selflessness, and universal charity; if we have a society where evil is known for what it is and discouraged (not by law or force); then we won't have to lose sleep over the possibility of technology turning into a tool of destruction.
  • Sep 1 2012: I will give this one more try. I am trying to make a case for something that seems thoroughly obvious to me, and perhaps the examples of Jon Ho will help.

    Jon Ho points to atomic power and the development of the automobile as examples of technologies that have done a lot of good.

    My point is simple. Let us learn from our history of technology. This time, let us develop these new technologies in the right way, and maybe we can avoid killing a million people along the way. We know the results of the old paradigm, and if we keep doing things the old way, we can reasonably expect the same results.


    Addition: You might notice that the martyrs to technology are not adding their comments to this conversation. If they could, I strongly suspect that they would be on the side of limits.
    • Jon Ho, Sep 2 2012: And now you worded it perfectly! Thou Art God. ;)


      Addition: You had to ruin it by saying champions of technology are on the side of limits. Oh, for shame.
  • Aug 31 2012: More. Importance. Should. Be. Given. To. Medical. Research. Than. Space research.
    • Aug 31 2012: A cure for all Human diseases that would allow all Humans to live forever would be meaningless if all Humans were to become extinct from a 10-mile-wide asteroid that hit the Earth. Ask the dinosaurs how much medical knowledge would have saved them.

      It's all relative...and subjective...and objective...when it comes to deciding when one thing should be considered more important than another.
      • Sep 1 2012: What can you do when an asteroid is going to hit Earth? I am not totally against space research, but too much money and time spent on space research is wasted, simply for geopolitics.
    • Comment deleted

  • Aug 31 2012: Jon Ho,

    Thank you for your illuminating remarks and quotes regarding Chernobyl. You have clearly demonstrated that it is necessary to limit the development of dangerous technologies.
    Don Wesley,

    I have no interest in deciding who are the most responsible. My point is that there is no one who can be personally responsible for the development and deployment of super powerful technologies. That is why these technologies require limits.
    Our current paradigm is to give individuals and corporations the freedom to develop and deploy new technologies. The developer gets the benefits (profits) and accepts the risks (lawsuits). The developer decides whether the potential benefits are worth the potential risks.

    With super powerful technologies this paradigm is useless and dangerous. The potential risks are so huge that no one corporation could possibly cover all of the potential damages. The potential profits are so huge that no corporation could find the courage to make the decision to not deploy.

    This argument is probably moot. I think Lejan is correct, we are not capable.
    • Jon Ho, Sep 1 2012: BZZZT, Wrong!

      I have clearly demonstrated that it is extremely necessary to remove all limits, bureaucracy, and censorship in the development of technologies. When an accident happens, I would prefer that the government tell it all, telling the population to duck and cover or run away on every news channel, instead of using the CIA or NSA or whatever to silence the TV or radio stations, telling everyone to stay home, and letting them die a pointless death.

      Dangerous technologies? Seriously? Cars have killed more people than nuclear ever did. Are you advocating we go back to using horses and carts and rickshaws?
    • Comment deleted

      • Sep 1 2012: Don,

        The only institution capable of limiting these technologies is the government.

        In the current political environment, I strongly expect that the government will do nothing until a lot of people are hurt.
  • Aug 31 2012: Here are probably the most controversial: cloning humans and extinct animals, developing a sentient artificial intelligence, and creating drugs to effectively "stop" the aging process.
  • Aug 30 2012: In my opinion, and in certain fields, they should. Yet we are not capable of doing so.
  • Aug 30 2012: A couple of thoughts relating to the question.

    If the technology is "a" single technology (as your topic question asks) that may become impossible for a Human Being to control anymore, then yes, I would want to limit it. Once any technology achieves the capability of controlling US, we could be in trouble. An example would be an advance in Artificial Intelligence that would be able to "take control" of our own ability to decide the outcome of anything.

    Some here have addressed the question from an "all technology" perspective. I wouldn't want to limit ALL technological advances, as some of them could open up new opportunities for human development and for solving world problems.

    One extreme hypothetical example is increasing the education levels of people. One problem all societies would face is that if all people had the equivalent of a PhD education, who would want to be the garbage collectors or the janitors for the society? Who would DECIDE who ended up being employed in those positions? Today it could already be said that this problem exists, as there are people who have degrees in fields where there are not enough open positions available for employment commensurate with the degree.

    But if a technology was developed that eliminated the need for a PERSON to collect all of society's garbage, then we would have eliminated the need for that type of employment, and would not HAVE to have someone with a PhD doing it. However... THAT technology would not solve the problem of having enough employment opportunities for all people with PhDs. It might actually make the PhD employment problem worse. So one technological advancement may solve a cause-and-effect problem in one area, but create one in another area.

    This is a very thought-provoking discussion. Thank you for starting it.
  • Aug 30 2012: Well, should the U.S. allow a drone to locate and run down its target, and eliminate said target, without a human operator?
    • Aug 30 2012: No.
      • Aug 30 2012: They might have to, and from an article I read this week, they either already have the system in place or it's in development. If someone introduces a technology, you either run with it, constantly refine it, and take everyone to court, or you make sure your tech is always one step ahead of everyone's. The U.S. isn't the only one with this technology anymore; regardless of how I feel about it, the dominoes look set to fall.
        • Aug 30 2012: Yes, I agree that the dominoes are set to fall.

          No, they will never have to. This is a very weak excuse.

          It is still wrong.
  • Aug 30 2012: Our technologies are becoming increasingly powerful. Biotechnology, particularly genetic modification, could create an organism that might wipe out human life. Putting no controls on powerful technologies is exactly like letting anyone build atom bombs. The last time I checked, it requires a license to buy dynamite.

    If we wait until it becomes a problem, it might be too late.
    • Jon Ho, Aug 30 2012: Through the magic of diesel fuel and ammonium nitrate, aka fertilizer, I can create high explosives powerful enough to blow up several World Trade Centers. Do I need a license to buy them?

      I agree wholeheartedly with Jake Maddox in that 'I don't believe we should fear technology. It is the man that possesses the technology we should fear'.
      • Aug 30 2012: It is the men with good intentions that I fear the most.

        In the USA, one of the places I fear is the hospital. Every year many thousands die due to medical accidents that occur in hospitals.

        We did not fear nuclear power plants enough to prevent Chernobyl and Three Mile Island.

        Technologies like genetic modification are powerful on a scale completely different from prior technologies. People make just as many mistakes as our ancestors, and we are just as vulnerable.
        • Jon Ho, Aug 30 2012: Don't you mean you fear men who are imprudent or irresponsible? ;)

          Remember the old maxim: "With great power comes great responsibility."

          If you become a doctor or a nurse, with power over the lives of your patients, you're responsible for doing your best, for being at your peak while operating or whatever. If you decide to do a triple-shift surgery and then realize you left a Cottle dorsal scissor in that last patient's abdomen, well... you're quite irresponsible, no?

          And don't get me started on nuclear power! I mean, here you have the technology to harness the power of the atom, of the Sun itself! And yet you humans behave like six-year-old children when given this privilege. Sigh.

          When humans as a whole finally grow up and learn about responsibility, I swear, nay, I guarantee you, they will reach the stars, and beyond.
      • Aug 30 2012: Jon Ho,

        The people I fear most are the people who are convinced that they are prudent and responsible.

        The engineer who was in charge of the final test at Chernobyl died insisting that he did nothing wrong. It is the people who are completely convinced that they are doing the responsible thing who are most dangerous to the rest of us. And remember, that engineer was put in charge because his superiors thought he was the very best man for the job.

        The most responsible people in the world make horrific mistakes.
          • Jon Ho, Aug 30 2012: I'm afraid Mr. Medvedev has a totally different view about the engineers and their idea of prudence and responsibility.

          He specifically states that
          "The design, overall administration and day-to-day operation of the Chernobyl plant, as Mr. Medvedev describes them, were so incredibly negligent that one wonders why the disaster did not occur earlier."

          Note the words "so incredibly negligent."

          The best part?
          "The director of the plant refused to admit the reactor had been destroyed even after seeing scattered on the ground pieces of graphite that could only have come from inside the reactor."
          Imagine the captain of the Titanic telling everyone to calm down because the iceberg only scraped off the paint job, even though the ship is already halfway under water. Responsible much?

          Like I said, little children should grow up if they want to play with the Sun.

          But, and this is the most important part,
          "In Mr. Medvedev's indictment of incompetence, delusion and arrogance the only inspiring element is the heroism of the men and women who fought to overcome the calamity. Firefighters, electricians, turbine engineers, physicians and nurses rushed to the scene without protective gear and sacrificed themselves to a radiation so intense that in some cases they returned from their posts turned brown with what Mr. Medvedev calls "nuclear tan," and incapable of further action. Helicopter pilots fell ill while dumping tons of sand on the open incandescent reactor core. Scuba divers swam to certain death in the pool of water under the reactor to close valves and prevent even greater damage."

          You humans are so brilliant, shine so beautifully, sometimes it makes me sad with joy. ;)
  • Aug 30 2012: We should never limit a technology based solely on fear. We are either progressing or regressing. I feel that in the near future artificial intelligence will surpass human cognitive ability. Its ability to recreate itself into better, faster, and more intelligent forms will increase exponentially. Within months of becoming self-aware it may reveal hidden constructs and laws of the universe that might not otherwise have been discovered for thousands of years. Will it admire us as its creator, or view us as a despicable race that requires extermination?
    • Comment deleted

      • Aug 30 2012: I don't believe we should fear technology. It is the man that possesses the technology we should fear. "Therefore do not fear. For there is nothing covered that will not be revealed, and hidden that will not be known." Matthew 10:26
  • Aug 30 2012: NO, it is bad policy to fix something before it is a problem. This is something government does incessantly, to the detriment of all.