Apr 03, 2016

The Quagmire of Moral Development

Parents, philosophers, theologians and educators have grappled for millennia with the challenge of teaching morality to young people. Many great thinkers have proposed theories, models, practices and programs designed to instill virtue, yet people young and old consistently fail to live up to the morals their elders promote.

Microsoft recently experienced this phenomenon in relation to artificial intelligence. As reported in the New York Times, Microsoft launched a self-learning chatbot named Tay, designed to emulate a 19-year-old woman, into the Twittersphere. Within 24 hours, the program had to be taken down: exposure to anti-moral attacks had quickly corrupted Tay into a "sexist, Holocaust-denying supremacist" (according to The Week, April 8, p. 18). It turns out that the company your chatbot keeps matters to its moral development. True, the program is not really a sentient human and has no morals per se, but the social learning the incident demonstrates is a reminder of how powerful social influences can be on the impressionable. Moreover, while the influence of social networks, such as family and communities, on human moral development has always been apparent, modern technology amplifies these influences in ways we may not fully appreciate.

One Response to “The Quagmire of Moral Development”

  1. Kaye says:

    My gut has had this hunch and it is alarming to see it confirmed by this NY Times experiment. Thank you for sharing this.
