
Is there really a difference between a robot and soldier making a kill?

Special Forces are precise weapons to kill. Soldiers receive orders from someone else on the kill and they carry out the mission. Wouldn't you agree that a human soldier in this sense is like a robot?

  •
    Nov 18 2013: After writing the third paragraph, I realized I am less sure of my answer to your question. While a human killing a human has total awareness of the situation and still chooses to kill that other person, a programmer (or other person working on a robot that kills another human) distances him/herself from the murder and can live a relatively normal life even while the robot is killing others. I suppose if someone must be murdered, I'd rather the person at fault know about it completely, as a sort of punishment. So, I guess I'd rather have a human soldier kill someone than a robot, if someone had to be killed.
  •
    Nov 18 2013: I would say there is a small difference, and I think that if someone has to die, I'd rather they die by the "hands" of a robot than by another human's. Before I go into why, I'd just like to say that I think the murder of another human is one of the least moral actions one can commit (mercy killing is something of a gray area here). Now, before anyone says something along the lines of that bumper sticker, "If you don't stand behind our troops, feel free to stand in front of them", I have a deep respect for the military. They do a lot more than go to war, and many of my family members served in the military and saw no combat.

    A person intentionally killing another person carries more implications for me than a robot killing a human. It implies that the person had a choice before killing someone and chose to kill that other person. It also suggests that the person had to live with the fact that he had killed someone. I'm not sure if everyone feels regret, but I'd predict that such a person would generally be depressed, or at least confused, from that time on.

    However, when a robot kills a human, it means that a person has died and implies nothing about the robot. The real problem I have with this is why we are programming a robot to ever have to make a decision about killing someone. That should never have been programmed as an option. Responsibility for that person's death ultimately resides with the programmer and anyone who knowingly took part in making a robot that would kill humans.
  • Nov 17 2013: A robot making a kill and a human making a kill, I believe, are different. Robots and computers are given commands to follow and execute absolutely. These machines will execute those commands, if you will, without question. If you want a computer to somehow question a command thrown at it, you have to literally program variables for it to calculate or consider, but it will push through the programming to absolutely execute the command. Basically, think about it in terms of being simply black and white, yes or no, execute or don't execute the command, without thinking about the gray areas in the middle. Gray areas meaning the situations, stipulations, and ethical or moral considerations about the command given which require deep thought about the process.

    Humans, however, are more complex than that and can fathom the gray areas. When you come across situations in the real world, humans will consider the situation, stipulations, ethical issues, and moral issues, and go into deep thought about what should really be done. Humans can even draw on their emotions about the command, which can factor into how it should be done, for good or bad. Maybe someday in the future computers and robots will be able to operate on this level, but only time will tell. In my opinion, in our time right now, this cannot be done yet.
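The "black and white" execution described in the comment above can be sketched in a few lines of code. This is purely illustrative: the names and thresholds (`should_execute`, `threat_level`, `collateral_risk`) are hypothetical, not any real system's API. The point is that the machine's only "questioning" is whatever conditions were programmed in advance, and the outcome is absolute either way.

```python
# A minimal sketch of rule-based "questioning": the machine checks only
# the variables it was programmed to consider, then acts absolutely.
# All names and thresholds here are hypothetical illustrations.

def should_execute(threat_level: float, collateral_risk: float) -> bool:
    # The "gray area" is reduced to hard-coded thresholds: yes or no.
    return threat_level > 0.8 and collateral_risk < 0.2

def execute_command(threat_level: float, collateral_risk: float) -> str:
    if should_execute(threat_level, collateral_risk):
        return "execute"   # no hesitation, no further deliberation
    return "abort"         # equally absolute in the other direction

print(execute_command(0.9, 0.1))  # prints: execute
print(execute_command(0.9, 0.5))  # prints: abort
```

However many conditions are added, the structure stays binary; there is no mechanism by which the machine weighs considerations it was never given.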
  • Nov 16 2013: What "we" do to soldiers is a crime, and to make it even worse, we punish them for "our" crime. To expect them to come out the other end of war anything but crazy (OK Nadav, how about "a little crazy") is pure nonsense. Everyone without exception comes away disturbed, and the second crime is that we place them right back into society without any adjustments or tools to compensate for the disturbance.
    It is not the soldier who is doing wrong; they are just doing the best they can with the impossible task they have been given. What is the impossible task? Saving lives by killing people.

    How do we get out of this endless cycle of destruction? Education.
    • Nov 16 2013: Actually, there are plenty of people who fight in wars and come out just fine. I know quite a few myself.

      I'm not saying post traumatic stress isn't an issue, nor is readjusting to civilian life trivial, but claiming all the veterans are mentally ill is just as bad as the other extreme of ignoring the problem altogether.
      • Nov 16 2013: We are all mentally ill in my estimation, some more so than others. It comes from bad nutrition and progressively worse genes over the span of most of our existence on this planet. I am not talking about lock-them-up insane; I am just saying they have been stressed to the breaking point, and many beyond it. Then they are released back into society with no tools or training to adapt to their new environment, so they usually seek out highly stressful jobs to match the habits they have learned and practiced for so long, except the rules have changed, causing even more stress. It is a problem that we are aware of and have only recently done anything about. For years the method was to provide them all the drugs they wanted and wait for them to kill themselves or others, or die. I am not talking about office soldiers fighting paperwork; I am talking about combat.

        Here is my suggestion: No matter how many years they have spent in combat the government should allow them just as many in therapy and education or more when they come out. Then they should be evaluated and if they need more, they get it until they are well again. It is not the least we can do, it is our moral obligation.
        "If you broke it, you fix it"
  •
    Nov 16 2013: I suspect it makes no difference to the one who is killed. Although I've heard stories from men who were involved in hand-to-hand combat, and facing a person you are killing, or one who is trying to kill you, seems to leave an impression. I have intimately known two men who were in special forces, and neither one would talk about their experiences, other than to say they killed too many people. Many years after the fact (they were both in Vietnam), one lives what seems to be a balanced, healthy life, and the other is now in a mental institution. The impression left by this kind of job seems to vary with different people.

    Is a human soldier like a robot?

    I believe all humans have the ability to think and feel, which I do not believe robots have.
    • Nov 17 2013: I believe robots have what you give them, which is not a whole lot different from humans. Watching Watson beat the pants off our smartest Jeopardy players was fun, but even a grand master chess player does not stand a chance against a computer. I am sure Watson was quite gracious after winning, and that was all programmed into his repertoire. It is hard to imagine, after going to so much trouble to make a robot walk, talk, and work with humans, that feelings, emotions, and other human traits would not also be considered. The idea is for robots to help mankind, not just kick butt and take names.
      Thinking for robots comes in the form of algorithms and I am not sure we do not operate in much the same way.
      •
        Nov 17 2013: I think/feel that humans are different from robots, Keith. Robots are totally programmed by humans, whereas humans are taught/programmed and, as multi-sensory/multi-dimensional beings, still have the ability to think and feel for themselves. I agree with William, who says..."the sole purpose of elite training is to narrow the focus of the warrior into a tunnel where only the objective matters...".

        When the warrior/soldier is doing his/her job, that is totally where the focus is, and often their life and well-being depends on it. When the warrior/soldier is in a different "scene"....with his/her family for example, the focus is different.

        When a person genuinely learns to focus thoughts, feelings, decisions, choices, and energy all toward one objective, they/we are sometimes better able to change that focus in another direction. Therefore, the soldier/warrior is not necessarily ALWAYS focused in the same direction.....make any sense?
        • Nov 17 2013: You are right: humans are no longer fit enough to fight in the new wars against machines. Machines are faster, smarter, and more efficient. Today wars are mostly fought with machines, but in the next twenty years wars will be fought only with machines. They are building them now; that's why the military is practically giving away all its old equipment to police departments. Computer programmers will fight the next war from a console. Eventually machines will figure out they really don't need us at all. Oh sure, they will keep us around for a while as entertainment, but then they will get bored with us and we will become extinct. They are already building self-replicating robots and growing computers from viruses. Our military has been turning into one large computer network for years.
      •
        Nov 17 2013: Keith,
        I don't think I wrote anything about humans no longer fit enough to fight in the new wars against machines. There may be some truth in the rest of your comment....however....the topic question is...
        "Is there really a difference between a robot and soldier making a kill?"
        • Nov 18 2013: "soldier/warrior is not necessarily ALWAYS focused in the same direction" This is what you said, and it is the truth and the problem at the same time. Today's wars are fought at night with laser-guided weapons and military intelligence from thousands of miles away from the battlefield. All the major powers have them, and no country with that kind of technology will ever start another war with a country that has the same technology, for the same reason there are no nuclear, chemical, or gas wars (the devastation would ruin the corporate economy, and corporations make those decisions in all major-power countries - that is the one good reason to let corporations run countries). That being said, however, the boys and their toys need hands-on, blood-and-guts practice, so they pick on smaller countries that cannot defend themselves, and when they are through destroying them they rebuild them, which helps their economy. As for soldiers, we only use ours to move equipment around; hand-to-hand combat is fought by professionals like Halliburton who do not have to follow any rules. They just come in, kill everybody in sight, and disappear, leaving our troops to clean up the mess and answer questions. I do not like to be so blunt, but it is the real world that smaller countries live in, and also why they hate Americans, who live in a totally separate reality and mostly have no idea what we do to other countries or why.
          So I ask you, does it really make a difference "where" the bullet came from? Not to the recipient or their goat-herding family who were just caught in the crossfire. All they know is that half an hour ago they were a relatively happy family, and now the Americans have slaughtered his wife and two small children because they live two doors down from their intended target.
      •
        Nov 18 2013: Keith,
        My very first statement in this conversation is...
        "I suspect it makes no difference to the one who is killed"

        The topic question is...
        "Is there really a difference between a robot and soldier making a kill?"
        • Nov 18 2013: I know; we are a lot more the same than different. That was my first thought also; my second was to wonder what inspired the question, and I thought it can't be coming from someone who has been anywhere near a war. Some of the questions in here are so ridiculous that I just ignore them, considering the source, and I always investigate the source if I can so as to better understand the meaning of the question.
    • Nov 22 2013: I'm a little more worried about the "thinking & feeling" bit lately. Seems the schools are teaching less of "how" to think, and who knows who is pouring more drugs into them; (I keep remembering one girl saying that the feelings of love, (for her mother?), were "disappearing!" I think that was a Ty C. Colbert or Peter R. Breggin book.) We're also seeing more and more corporations become more powerful than nations, which brings commercials & branding into the question, which I'll touch on here:
      PS, I tried T-mailing you twice. Is the system down?

      To answer the question now, "No," as long as the orders are Unconstitutional, or would violate International agreements, then the soldier has a duty to say no. I've read that it is actually hard to get people to kill - that many early wars were rife with intentionally-misfired shots.
      I do agree that soldiers are like robots, in that their training is designed to break them down with brainwashing techniques (& propaganda growing-up - using chart-topping, hard-hitting songs, avoiding logical debate). (I can appreciate propelling soldiers to ever-higher goals, but to break them first seems suspiciously suspicious.)
      Also, I'd rather have an army of human soldiers who have a home and life to get back to. I see nothing good in any standing army; especially robotic ones that don't even sleep.
      •
        Nov 22 2013: Hi Steve,
        I've been receiving and sending e-mails through the TED system, so it seems to be working. I'll send you a test T-mail right now...I got a notice that it was sent.....

        I agree that soldiers are programmed to a certain extent, and I also believe that as humans, they still have the ability to think and feel for themselves. Perhaps the thinking/feeling ability is turned off/tuned out when they are in combat.
  • Nov 18 2013: To answer the question directly, in my humble opinion I will say yes, the soldier is a robot. If the question is indirectly involving the morality of killing with a drone or with a human being trapped in the system, I would say that the morality lies not in the tool used, but in the purpose of that killing.

    Go back in time to WWI, the war of trenches. Immensely destructive of both men and property. Then go forward to the day of the armistice, where you find soldiers on both sides embracing their liberation and shedding their artificial hatred, some embracing the other side. No single soldier on either side could be blamed with immoral actions. The basis for the war itself should be considered immoral. Had drones been available then, it would not have made any difference in where to place the blame.

    Same with the Vietnam war. Once it was over, all survivors went on carrying on with their lives, and Americans now visit Hanoi as tourists, with no particular hatred for anyone. The war itself was immoral.

    And I could go on with endless examples.

    But, back to the question. The drone is definitely a better tool. Today's conflicts are transcending borders. The times of trench warfare are gone. The enemy takes refuge in countries of which they are not even citizens. The drone is at present the only answer to their aggression.
  •
    Nov 17 2013: A soldier is, in a sense, an automaton whilst in combat, but he is still a human with human sensibilities. Those sensibilities are what make war 'fair'.

    Killing people from the comfort of a cosy robot control room would be similar to a combat video game. This environment would be several steps removed from any sense of reality involved with taking people's actual lives. There would be less reason to question one's morality, if the enemy is just a faceless two-dimensional blur on a computer monitor.

    Also a soldier making a kill by controlling a robot is probably less likely to suffer PTSD, post combat. While that may sound like an advantage, isn't it also true that PTSD is a barometer of personal morality in warfare? Is PTSD a kind of 'rite of passage' from the automaton mentality of face-to-face combat, back to the morality of civilian life?

    If all sense of morality and reality is removed, then warfare will become just a grotesque game for the nation that can afford remotely-controlled killer robots - but very, very real, and a serious offence against the collective morality of those who can't afford it.

    What can the losing nation possibly do against such unfairness? Resorting to terrorism would be one way...
    • Nov 17 2013: First off, if you're fighting fair, you're doing it wrong. You should be grasping for every possible advantage--fighting fair is how chivalrous idiots get themselves killed.

      Second, it should be noted that fighting via remote isn't very much like a video game. The biggest difference is probably that it's no fun at all (long shifts consisting mostly of tedium are the order of the day), and the second is that the operator is fully aware that he's looking through a camera at the real world.

      Finally, PTSD isn't a measure of morality; it's a psychological disorder that results from trauma. Any actual connection to the sufferer's morality is questionable at best--the best and worst men can suffer from it, and a lucky few are inexplicably immunized to it (without being sociopaths, usually).
      Also, recent studies show that drone pilots seem to suffer from PTSD about as often and about as badly as regular combat pilots doing similar jobs (only in the aircraft). They're perfectly aware that what they're doing isn't a video game, and it shows. Fear of death isn't the only thing that traumatizes a soldier; killing and witnessing the horrors of war typically have a more profound effect.
      •
        Nov 17 2013: Absolutely, Nadav. The sneakiest and dirtiest fighter invariably has the high ground in any conflict, a scenario we see played out time and time again in politics. But I would go even further and say that fairness is a myth in any context, a concept perpetrated by those who cannot cope with reality.
        •
          Nov 18 2013: So as a general point, being 'sneaky and dirty' is a handy blueprint to shape one's success in life, is it?

          Fighting dirty is superior to fairness?

          What kind of society do you hope for?

          Can you please define what you consider 'reality' to be, William?
      •
        Nov 17 2013: It's the "grasping of every possible advantage" of the underdogs that gives rise to the drawn-out insidiousness of terrorism long after the conflict has ended in a war that is technologically one-sided. If they have the burning desire to defend the paltry remnants of what they've got left, then terrorism is probably one of the last desperate blows remaining.

        Do I understand you correctly that you are happy that a poor soldier is enduring the indignity of "tedium" in a warm control centre, while someone at the other end is getting blown to bits trying to defend their family and their culture against overwhelming odds?

        And I'm only too aware of what PTSD is thanks, and what you omitted to read in my post is that I said it is a possible barometer of morality in war - not a measure. Quite different. Many of the former servicemen I've met say that their problems in readjusting to civilian life lie in the sense of unfairness in killing and injuring poorly defended people with vastly superior weapons and training. That kind of guilt is hard-wired to innate morality, natural justice and fairness.

        I have yet to be convinced that an electronic two dimensional representation of reality happening thousands of miles away is as traumatic as proximal experiences of explosions and bloodshed "on the ground". If you have any researched evidence of such comparisons, then I'd be very interested to see it.
        • Nov 18 2013: "Do I understand you correctly that you are happy that a poor soldier is enduring the indignity of "tedium" in a warm control centre, while someone at the other end is getting blown to bits trying to defend their family and their culture against overwhelming odds?"

          Yes, very much so, and let me explain why.
          In the best of circumstances, war is a lopsided event. One side is much stronger than the other, crushes it swiftly and efficiently, and then the war can be over. When things are nice and "fair", wars drag on for years on end, as neither side can gain a significant advantage, yet neither will pull out, due to a combination of the sunk cost fallacy, the risk of losing face, and the original objective of the war probably not having been accomplished yet.

          As for trauma to the soldiers themselves, drone and aircraft pilots have similar rates of it. This makes sense, as a pilot isn't exactly "on the ground". The air force has a long history of being the most "detached" of the service branches, flying high over battlefields without setting a foot there. The traumatic difference between a pilot and a drone pilot is that one isn't afraid of being shot down, that's it. The same is probably true for naval unmanned systems.
          Whether that will translate for remote controlled ground forces on the other hand, is harder to say. Ground warfare is by its nature the messiest of the three, and physical removal from it may have a more significant psychological effect.

          Drones also make terrorists and guerrilla groups much less effective, as those groups rely on inflicting enough casualties to cause political pressure. With drones, you can fight them without raising the body count.
          Terrorists find the tactic of drones just as bad as NATO forces find IEDs, for many of the same reasons. You can't fight back, which only serves to make it a better weapon. Drones are if anything worse--the buggers actively seek you out. It helps break the enemy's resolve.
  •
    Nov 17 2013: If you succeed, you and your commander will be honored or become notorious. But a robot doesn't have this kind of feeling; its operator or inventor will be the one honored or made notorious.
  • Nov 17 2013: Yes, but there are similarities. Sometimes the discussion is more important than the question.
  •
    Nov 16 2013: I absolutely agree. The sole purpose of elite training is to narrow the focus of the warrior into a tunnel where only the objective matters, and orders - or the lack of them - will dictate the focus and level of carnage that ensues. The programming of a robot would be almost identical, albeit without the same political cost of losing a human soldier in the process. Of course, there is also the other very human drawback of emotions and empathy clouding the warrior's judgement and screwing up the objective, which a robot - at least the primitive ones we employ today - would be free of. Nor do we have to worry about PTSD in robots, yet.
  •
    Nov 16 2013: It would seem there is not a way to make an unwholesome idea wholesome. It is what it is, I'd look for a wholesome idea. Killing is just not something I see as wholesome. There are no winners in that game.
    Both robot and man follow instructions, that is why I look for a Higher Authority for my instructions.
  •
    Nov 16 2013: There's an interesting dichotomy in this thread. On the one hand, Poch sees pitting robots against human soldiers as 'cheating'. On the other, Nadav sees replacing humans with robots as a good thing because it reduces human suffering.

    Rationally, it's hard to disagree with Nadav. But emotionally I lean towards Poch. It does seem rather ungentlemanly to go into battle trying to kill enemy soldiers without incurring the same risk yourself.

    Where it gets really interesting is when both sides replace humans with robots, because that shakes the very foundations of warfare. After all, why do wars exist? To try to force an opponent into submission by killing more enemy soldiers than you lose yourself, or by causing other forms of human suffering. If only robots are fighting, there is no loss of life or suffering, so why should one side ever surrender? I suppose a war would then become more of a technical/economic show of power, with one side losing when its factories are unable to produce enough robots to compensate for losses in battle.

    In fact, what would be the point of starting a war if both sides just send in robots?

    Utopian, perhaps, but maybe that will be the point at which humanity finally realises the idiocy of any form of war and disbands all its armies.....
    • Nov 16 2013: Cheating is good. If you're fighting fair, you're doing it wrong--that's how chivalrous idiots get themselves killed. When lives are on the line, anything less than pushing for every single possible advantage is nothing short of criminal neglect.
      I've seen soldiers get court-martialed for as much, in fact.

      As for a war between two armies of robots ending, no trouble at all. Once military opposition is wiped out, and the enemy still doesn't surrender, start targeting civilian infrastructure. Bridges, power plants, that sort of thing. Going after population centers is ill advised though, as that typically only strengthens the enemy's resolve.
      War is already a technological/economical display of power. With robots on both sides (or even one side), fewer people will die for it. Victory is about breaking the enemy's will to fight; actually breaking the enemy isn't always necessary.

      My problem with robotic warfare is less about the morality of it. It's the practicality of applying it against an enemy with proper electronic warfare capabilities (which the recent insurgents and terrorist groups targeted by drones all lack).
      This is the real reason we won't make the transition to completely robotic forces, even when it becomes technologically feasible. Though we may end up in a situation where robots do most of the heavy lifting, and human forces are there in a supporting role (as opposed to the exact opposite we're starting to have today).
  • Nov 16 2013: No on both counts.

    From the perspective of the victim, there is no difference who or what kills them.

    A human soldier is much more complex than a machine. They are capable of real time independent thought, of interpretation of new sensory input, and of putting what they see in context with the rest of the situation.
  •
    Nov 16 2013: The same in what sense? In the sense that someone has been killed no, there is no difference. In the sense that you do not have to risk one of your own soldiers, then yes, there is a difference.
  •
    Nov 16 2013: well, at any point a soldier could give up the mission and walk away, whereas a robot could not.

    Why are you asking the question, Cecil?
  •
    Nov 16 2013: if a soldier is robot-like, i.e. does not have conscience, i certainly don't want that person to live in my neighborhood.
  •
    Nov 16 2013: When robots are allowed to kill human soldiers, that's 'cheating'. Robots are harder
    to destroy than men.
    Robots are still different when we talk of 'programming'. Human soldiers have
    a conscience and will decide on killing on a case-by-case basis. Many even commit
    suicide because their conscience can't carry their 'kills' anymore.
  • Nov 16 2013: There is a difference in accountability whenever something doesn't go the way it's supposed to, which tends to happen a lot in wartime. Humans are only semi-programmable, and their legal standing is much better defined.

    I'm still in favor of it however. Sending a robot to fight and die in the cold and mud means you don't have to send some poor human to do the same.
    A greater practical concern is enemy electronic warfare on the robot's effectiveness.
  •
    Nov 16 2013: Your question and your explanation are at odds with each other.

    One is comparing a human and a machine... the other is asking if a human can be programmed in the same manner as a machine.