TED Conversations

Colegio de Pedagogia UNAM

This conversation is closed.

Idea: Social networking and robotics programming to help children and adolescents develop moral criteria.

Since 1998 I have been working with robotics (specifically Lego Mindstorms robots) to organize educational experiences that help children develop moral criteria, as Piaget and Kohlberg suggest. The idea is very simple: in every class, students face the challenge of building a robotic solution for a handicapped person, and then they must think about the solution while being empathetic with the user of their technical work. After many years of using this approach in our Technology class, I have results, and it is interesting how students look for solutions using moral criteria, thinking the solution through from the handicapped user's point of view.
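
To make the kind of class challenge concrete, here is a minimal sketch of what one such solution could look like: a hypothetical obstacle-warning aid for a visually impaired user. This is only an illustration, not one of the actual student projects; it assumes the ev3dev2 Python library on a current Mindstorms EV3 brick, whereas the classes described here began with earlier Mindstorms generations and their graphical tools.

    #!/usr/bin/env python3
    # Illustrative sketch (not an actual class project): an obstacle-warning
    # aid for a visually impaired user, assuming the ev3dev2 Python library.
    from time import sleep

    from ev3dev2.sensor.lego import UltrasonicSensor
    from ev3dev2.sound import Sound

    ALERT_DISTANCE_CM = 50  # warn about obstacles closer than this

    distance_sensor = UltrasonicSensor()  # ultrasonic sensor facing forward
    speaker = Sound()

    while True:
        if distance_sensor.distance_centimeters < ALERT_DISTANCE_CM:
            # A short, quiet beep: the design discussion in class centers on
            # the user's dignity, so no loud alarms that draw attention.
            speaker.beep()
        sleep(0.2)

The empathy work happens around a program like this: students must decide, from the user's perspective, how intrusive the alert may be and what the device should never reveal about its user.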

The second idea is to use a private social network to help students apply moral criteria to self-regulate their behavior on a social network. Using Ning as a private social network, we invited students to develop team solutions using technology. The issue was how they related to each other using empathy. We then reviewed the communications posted in the forums. Many times we found comments that dismissed other students' opinions on cultural, economic, and even racial grounds; the teachers could then step in to moderate the discussion and guide the students based on empathy and the other moral criteria applied.

Both ideas were used in class between 1998 and 2011 and were published in the International Review of Information Ethics under the tag "Teaching Information Ethics". They can be consulted at
http://www.bvce.org/SociedadeInformacaoResultado.asp?Consulta=Primeira



  • Nov 10 2012: I think of morality as a long-term investment in love, but without the pain of losing the other or wondering if the other is cheating on you. But there is always the need to recreate the other, within a context of non-professional creative care. Empathy is visible mostly to the other, living within the norms of a particular culture, faith, and traditions. Ethics is both personal and collective, feeding into the larger debate of nurture versus nature. How we interpret morals, and what these morals are, affects the way we relate to each other. This relationship is the most important recipe for peaceful co-existence in every society.

    These categories of morals and ethics set humans apart from non-humans, transcending global human society and interconnecting our understanding as we grow to know or understand what we need to hate or love. Although these binaries are clear, they are not always visible; therefore, teaching MORALS/ETHICS as a value-based commodity in any context increases my interest in how we teach those values, because, boy, it is critical how we see the outcome, and also how do we measure this outcome?

    I am concerned about, and interested in, a few things: One, do computers teach or think for us now? Two, are we shaping the way computers and technology affect our moral values? Three, who are the victims of morality?
    • Nov 12 2012: When I read your comments, I immediately think of I, Robot, the novel. I was young when I found this story in my school library. It was amazing to read about the Three Laws of Robotics. When we put the students to work on robot programming, we face the challenge of ethics, because we can cover human needs with technology, but not in just any way. We need to put the solutions in the context of the person's needs, privacy, respect, intimacy, and all the elements that let the person be him/herself.

      My experience is that if you bring technology into the classroom, you cannot avoid the ethical issues involved. If you are a teacher, you must be an agent of change in ethical matters too.
