TED Conversations

Miguel Alvarez

Colegio de Pedagogia UNAM

This conversation is closed.

Idea: Social networking and robotics programming to help children and adolescents develop moral criteria.

Since 1998 I have been working with robotics (Lego Mindstorms robots) to organize educational experiences that help children develop moral criteria, as Piaget and Kohlberg suggest. The idea is very simple: in every class, students are challenged to build a robotic solution for a handicapped person, and they must think through the solution empathetically, from the point of view of the user of their technical solution. After many years of using this approach in our Technology class I have results, and it is interesting how students look for solutions using moral criteria, thinking about the solution from the handicapped person's perspective.
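To make the class challenge concrete, here is a minimal sketch, in plain Python rather than the Mindstorms environment, of the kind of empathy-driven design decision students face: choosing how a device should signal obstacles to a visually impaired user. The function name, thresholds, and feedback labels are hypothetical, not part of the original course or any Mindstorms API.

```python
# Illustrative sketch (not from the original course): a simulated
# distance-sensor behavior designed around the user's needs, with
# discreet, non-audible feedback. Names and thresholds are assumptions.

def obstacle_alert(distance_cm, warn_at_cm=80, stop_at_cm=30):
    """Map a sensor reading to feedback chosen from the user's perspective:
    a gentle early warning, and an urgent signal only when truly close."""
    if distance_cm <= stop_at_cm:
        return "vibrate: continuous"   # urgent cue: obstacle very close
    if distance_cm <= warn_at_cm:
        return "vibrate: pulsed"       # early, discreet warning
    return "silent"                    # no alert: avoid overwhelming the user

# Simulated walk toward a wall
for reading in (150, 90, 60, 25):
    print(reading, "->", obstacle_alert(reading))
```

The point of the exercise is less the code than the design choices: vibration instead of sound, and silence by default, both follow from putting oneself in the user's situation first.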

The second idea is to use a private social network to help students apply moral criteria to self-regulate their behavior on a social network. Using Ning as a private social network, we invited students to develop team solutions using technology. The issue was how they related to each other using empathy. We then reviewed the communications in the forums. Many times we found comments that dismissed other students' opinions on cultural, economic, and even racial grounds; the teachers could then step in to moderate the discussion and guide students based on empathy and the other moral criteria being applied.

Both ideas were used in class between 1998 and 2011 and were published in the International Review of Information Ethics under the tag "Teaching Information Ethics". They can be consulted at
http://www.bvce.org/SociedadeInformacaoResultado.asp?Consulta=Primeira

  • Nov 13 2012: Miguel, I agree. I am a fan of Lawrence Kohlberg's stages of moral development.
  • Nov 10 2012: I think of morality as a long-term investment in love, but without the pain of losing the other or wondering if the other is cheating on you. But there is always the need to recreate the other, within a context of non-professional creative care. Empathy is visible mostly to the other, living within the norms of a particular culture, faith, and traditions. Ethics is both personal and collective, feeding into the larger debate of nurture versus nature. How we interpret morals, and what these morals are, affects the way we relate to each other. This relationship is the most important recipe for peaceful co-existence in every society.

    These categories of morals and ethics set humans apart from non-humans, transcending global human society and interconnecting our understanding as we grow to know what we need to hate or love. Although these binaries are clear, they are not always visible; therefore teaching morals/ethics as a value-based commodity in any context increases my interest in how we teach those values, because boy! it's critical how we see the outcome, and also how do we measure this outcome?

    I am concerned about and interested in a few things: one, do computers teach or think for us now? Two, are we shaping the way computers/technology affect our moral values? Three, who are the victims of morality?
    • Nov 12 2012: When I read your comments I immediately think of I, Robot, the novel. I was young when I found this story in my school library. It was amazing to read about the laws of robotics. When we put students to work on robot programming we face the challenge of ethics, because we can cover human needs with technology, but not in just any way. We need to put the solutions in the context of the person's needs: privacy, respect, intimacy, and all the elements that let the person be him/herself.

      My experience is that if you bring technology into the classroom, you cannot avoid the ethical issues involved. If you are a teacher, you must be a change agent in ethical matters too.
  • Nov 28 2012: Take a look:

    Risk of robot uprising wiping out human race to be studied http://www.bbc.co.uk/news/technology-20501091
    Nov 13 2012: More and more Americans, and others around the world, are becoming mental robots.
    They are being made into artificial intelligence.
    They believe what they are told to believe, think what they are told to think,
    say what they are told to say, and do what they are told to do.
    They also don't believe what they are told not to believe, don't think what they
    are told not to think, don't say what they are told not to say, and don't do
    what they are told not to do. Thus, they are not able to make a choice of their own or come to their own decision.

    This is precisely what many want to do with children (and have been doing), claiming that children cannot make choices.
    What B.S.!!
    Of course they can't if they are programmed ahead of time to make a choice that is pre-determined by someone else.

    They will always be making the choice of that someone else, who wishes to play god and destroy the spirit of another child, by claiming,(Oh, this is so rich, it is sinful and evil), that "it is for the good of the child!!!"

    Sorry, I cannot approve of your idea unless you let me be the one to determine your kids' morals and the morals of other kids as well.
    Unfortunately, too many wonderful children are already being trained by robots. They are known as their parents!
    And the government, in various forms, is involved as well.
    Morals. The TSA strip searching and plain searching children at airports is simply their way of training young kids to get used to being in a Fascist country and growing up that way, with no rights whatsoever. Those are not morals. Those are truly evil intentions and evil human beings. And their parents, right before them, do nothing, as they are told what to do or not do. Say or not say. Think or not think. Believe or not believe. Don't you dare. We will train your kids to be afraid and not to react or act.

    Stay out of it. You have no rights.
    • Nov 13 2012: The idea works the inverse way. If you program a robot, it works like a mirror of your thoughts and your vision of the world. The procedures work like a mirror of your moral structure. I am repelled by those TV shows where people laugh at others' accidents. It is a kind of repulsive pedagogy of avoiding empathy with others. But if you invite or challenge a student to develop a piece of software for a machine that is supposed to help others, the student must be able to put himself in the other's situation, analyze it, and develop a solution in/for the OTHER's needs.

      I am not defending robotics at all; I have been using the experience with technology to contribute to the development of reasoning skills for 25 years. Social networking, programming robots, and many other activities are useful if you have the ability to develop learning environments with very rich educational experiences. Papert and Resnick from MIT, with LogoWriter and Scratch, are doing a wonderful job in this matter, and Latin America has wonderful experiences.
  • Nov 13 2012: If you want to teach morals, it's a good idea; but teaching compassion with robots, I don't think so.
    • Nov 13 2012: Well, you can teach robotics aseptically, avoiding ethical issues. But developing moral criteria is not teaching morals. Developing moral reasoning is the way to help students become good citizens. I prefer to use every school and informal occasion to help students become autonomous, ethically competent, reflective persons.
  • Nov 12 2012: Yes, I agree. We never teach morals. We contribute to developing the reasoning skills that let students use their moral criteria. We base our work on Piaget's and Kohlberg's texts about the development of moral criteria. Robotics is used to set up a scenario where students think like programmers (actually, they develop a solution and must be empathetic with human needs before thinking like technicians). This technical challenge serves as a useful environment for developing consciousness of others' needs and being empathetic with them.
  • Nov 10 2012: I am horrified at the thought of any formal educational institution teaching morals. I live in the USA, and Christian morals are so cruel and violent that we should protect our children from them. Who is going to decide which morals are going to be taught, and what are you going to do with the parents who object?
  • Nov 9 2012: Thanks. I will find out more about the FIRST robotics program. In our school we tried Lego Mindstorms and Crickets from MIT's Mitch Resnick, and the investment was moderately accessible for students (they must pay 60 dollars on a monthly basis, and then they can access a Mac computer, the Internet, labs, and Vernier probes too).
    • Nov 10 2012: Two reasons I like robotics and engineering design in schools where people can afford it: building things together can be a neglected but highly useful and applicable skill set, not only in the workplace but in daily life; and these activities seem to be very effective in motivating students who simply are not drawn to the abstract to apply themselves to school work.
  • Nov 9 2012: Regardless of the classroom subject, whether Language Arts, Social Studies, Math, or the less common school subjects of robotics and engineering design, it is easy and developmentally useful to build in a service dimension that allows students to see how work in that subject can be used in the public interest.

    I agree with you that students typically love seeing how these school subjects can be used to start to address human problems, and that it gives them practice in seeing themselves as part of future solutions.

    Collaboration itself is a big focus in schools, because everyone recognizes its importance as a life skill.

    Congratulations to you on the successes of your robot projects. I am familiar with the FIRST robotics program. My younger daughter had hoped to participate as a high schooler, but the expense was prohibitive.