How creeped out? Just look at this picture:
She says in the article that her research into objects designed to encourage us to form nurturing emotional bonds with them -- Tamagotchis, dolls, etc. -- "gave me the chills." She also admits, "I have finally met technology that upsets and concerns me."
Here's why, from the MIT news office release:
One of Turkle's concerns was triggered by the effect of a sophisticated interactive doll, Hasbro's "My Real Baby," and of the Paro seals on the elderly. She left a few "My Real Baby" dolls (which were not a big retail hit with children) in a local nursing home, and when she returned later, she found that the staff had bought 25 of them because of the soothing effect on the residents.

There are a lot of disturbing implications here: Human empathy is easy to simulate because it's mainly an illusion created by looking at someone with a thoughtful expression. Not only that, when we seek empathy from others, we're content with the illusion because we wouldn't be able to distinguish it from the other person actually understanding us anyway. Psychics seem to work this way, bouncing back things you tell them in a way that allows you to feel as though something magical has happened, some secret insight has been revealed. Horoscopes, too -- they seem insightful because almost any generality can apply to our lives, but because we are so fixated on our singularity, the advice seems shockingly particular and oracular to us. We don't want new understanding; we want the understanding we already have made strange and confirmed simultaneously. We want to be able to project and then recognize ourselves, thereby extending our scope, universalizing how we feel.
"The only one who's not happy here is the sociologist," said Turkle, raising her hand.
That soothing response was based on a sham, she believes. "What can something that does not have a life cycle know about your death, or about your pain?"
She cited the case of a 72-year-old woman who, because she is sad, says she sees that her robotic toy is also sad. "What are we to make of this relationship when it happened between a depressed woman and a robot?" Turkle asked.
Also, it suggests the process of nurturing is less a matter of communication than it is a technical operation, a set of objective conditions that can be met by any means, human or nonhuman. And it's not unusual to nurture something incapable of feelings. When we nurture, the object of our nurturing can be an infant, a houseplant, or a book collection (which I invest with loving care by occasionally attempting to alphabetize the books or group them by subject) -- anything that can elicit the appropriate forms of behavior and gestures. In other words, nurturing is a reflexive gesture rather than an altruistic one. It seems plausible that the effort our culture spends investing material goods with emotional qualities abets this process, reinforcing the idea that other people need not be present to complete circuits of emotional experience. Other people's feelings, after all, are inconvenient and inefficient to deal with.
There's more on robotic love (via Mind Hacks) here, in this interview from the Boston Globe with Marvin Minsky, an AI researcher whose most recent book argues that emotions are just another way of thinking through a problem and thus can probably be emulated by machines. This suggests that humans themselves are machines as well:
We don't like to think of ourselves as machines because this evokes an outdated image of a clunky, mechanical, lifeless thing. We prefer the idea that inside ourselves is some sort of spirit, essence, or soul that wants and feels and thinks for us. However, your laptop computer has billions of parts, and it would be ridiculous to attribute all its abilities to some spirit inside its battery. And a human brain is far more complex than is any computer today.

A fairly radical materialist viewpoint that dispenses with the mind/body problem -- mind is the product of the brain's processing power. (This, by the way, is how I think Battlestar Galactica ultimately ends; we find out eventually that the humans on the show never did reach Earth, but the Cylons did and we are their descendants.)
Minsky also suggests that love occurs through subtraction rather than addition:
There's short-term infatuation, where someone gets strongly attracted to someone else, and that's probably very often a turning-off of certain things rather than something extra: It's a mental state where you remove your criticism. So to say someone is beautiful is not necessarily positive, it may be something happening so you can't see anything wrong with this person.

So if you put that together with the elderly people and their robot babies, it seems that products could be designed (or are designed, or are advertised as such when they are made to be lovable) to induce this kind of forgetting, to propel this kind of screening, or turning-off, which makes the shortcomings of others (or things) and the outside world in general less recognizable, less present, and at the same time keeps the focus more securely on ourselves, the true object of our affections.