If you kick a robotic dog, is it wrong?

When pet Lila wasn't getting as much playtime as the other two animals in her Plymouth, Mass., home, owner Genie Boutchia felt guilty. Then when a potential new owner came calling with $850 in hand, Ms. Boutchia felt even guiltier. She changed her mind and deemed Lila not for sale.

Such feelings of moral responsibility might seem normal, even admirable, in a dog owner. But Lila is not a real dog. She's a robot.

And like the tens of thousands of robotic pets in homes from Houston to Hong Kong, she's provoking fresh questions about who deserves moral treatment and respect.

How should people treat creatures that seem ever more emotional with each advance in robotic technology, but that really have no feelings?

"Intellectually, you realize they don't have feelings, but you do imbue them with personality over time, so you are protective of them," Boutchia says. "You feel guilty when you play with the other two dogs [which, as newer models, are more apparently emotive], even though you know Lila could care less."

Trouble is, Lila seems to care, and her newer kin seem to care even more.

Sony Corp. has brought its latest robotic engineering to bear on the new Aibo ERS-7, which at $1,599 promises to have six emotions: happiness, anger, fear, sadness, surprise, and discontent. Pat one on the head, and it becomes happy enough to do tricks. Whack its nose, and it not only appears hurt but also learns not to repeat the behavior.

Aibo's "feelings" appear real enough that researchers at the University of Washington felt compelled to explain in a study that, contrary to Sony's claim, Aibo does not have any true emotions.

If Aibo did have true emotions and self-awareness, philosophers generally agree, then it would require humane treatment.

But as machines, robotic pets with sad eyes can nevertheless be legitimately neglected - a fact that troubles some people, while others welcome it for both its practicality and its moral significance.

Support from PETA

Among those celebrating the ability to neglect a pet without consequence is a national animal rights group, People for the Ethical Treatment of Animals (PETA).

"The turn toward having robotic animals in place of real animals is a step in the right direction," says PETA spokeswoman Lisa Lange. "It shows a person's recognition that they aren't up to the commitment of caring for a real animal. Practically speaking, from PETA's perspective, it really doesn't matter what you do to a tin object."

A trend that might be good for animals, however, might not be good for those who benefit most from relationships with animals, according to Peter Kahn, a psychology professor at the University of Washington who has studied Aibo's effect on preschoolers at the university's Center for Mind, Brain and Learning.

"Children need rich interactions with real, sentient others, both human and animals," Professor Kahn says.

"If we replace that, I think we're impoverishing our children. These relationships [with robotic pets] aren't going to be fully moral. They'll be partially moral, which is not as good as a real relationship with a real animal whose needs teach children that their own desires don't always come first."

Robot rights?

Despite their attachments, most Aibo owners don't seem to grant their pets much moral standing, according to sociological research done by Kahn and his colleague, Prof. Batya Friedman.

Fewer than 10 percent of owners who openly shared their feelings in an online chatroom indicated a belief that their pet had rights or could be blamed for misdoings. Owners always seem to remember, whether on a conscious or subconscious level, that their pets - beloved as they are - remain machines.

Nevertheless, some are more comfortable than others with treating them as mere machinery.

For Peter Danielson, director of the Center for Applied Ethics at the University of British Columbia, the danger lies not in regarding mechanical pets too lowly but rather too highly. They are, after all, not much different from handheld computers.

"If someone puts a [robotic] kitten in a microwave, it's not horrible. It's foolish, but it's not morally forbidden," Dr. Danielson says.

"It seems to me there's a whole ethics of fiction and toys that we're thinking through. Are you telling me I ought to treat something that looks like a kitten, but is actually a piece of plastic, better than I treat a pig, which is actually a sentient and intelligent being? You're building a taboo system that gets further and further from the actual value. And every time you do that, you lose your moral authority with your technologically inclined kid who says, 'That's stupid.' "

Forming bad habits

Yet for those who ascribe emotions to a robotic pet, and who learn nonetheless that they can be neglected or abused without consequence, a dangerous habit with larger implications could be in the works, according to James Hughes, a sociologist at Trinity College in Hartford, Conn., and secretary of the World Transhumanist Association.

"The concern is that treating an 'animal' as a lesser being could bleed over to treating certain humans as lesser beings," Dr. Hughes says.

Consider a human being in a vegetative state with no greater self-awareness or deep desires than a robotic pet, he says.

If Aibo deserves no respect because it lacks self-awareness and deep desires, he asks, then on what basis would a vegetative human being deserve any measure of respect?

Even if "abusing" a robot doesn't actually hurt the object, the act of destruction still shows a lack of respect for property, he notes.

"We have to become more sophisticated in the distinctions we make in order to ensure we make them responsibly," Hughes says.
