2.10.2010

An Empathy Problem

I just read Philip K. Dick's Do Androids Dream of Electric Sheep for either the second or third time, and I'm beginning to articulate for myself the challenges it raises to the value of empathy. Empathy makes it possible for humans in the novel to remain hopeful and productive on a post-apocalyptic and decaying Earth. Humans distribute their pain (which lessens it) and also share in each other's happiness (which enlivens everyone). As far as this goes, empathy appears as an unqualified good, a non-zero-sum game; and in the novel this empathy consists literally of people sharing their emotions through an "empathy box" – a machine that allows collective participation in a religious allegory.

Furthermore, empathy appears to be valuable not just for humans, but for animals, too. In the post-apocalyptic environment, with hundreds or thousands of common species very recently extinct, every animal left alive is the object of human veneration and empathy. P.K. Dick doesn't literalize this empathy, but he underscores its importance by making its presence a basic method for distinguishing authentic humans from the advanced artificial humanoid servants – androids – who occasionally kill their masters and try to pass as human. Bounty hunters, like the novel's hero, Rick Deckard, ask subjects to imagine and respond to scenarios that involve obvious or implicit harm to animals, and androids always fail to react with the appropriate horror. But the tests are obviously culturally coded: you and I would probably fail, too.

From here things become more complicated, because Deckard, whose job is to "retire" escaped androids, empathizes with some of them; and Deckard detests another bounty hunter, named Resch, for his lack of empathy towards the androids. In fact Deckard becomes convinced by Resch's callousness that Resch is himself an android, and when Resch tests out as human, Deckard is dismayed.

Parts of this novel simply don't cohere, as is common in P.K. Dick's oeuvre. But a generous reading of the novel has it asking this question: Can one reconcile the glorious expansiveness of empathy with making the distinctions upon which one's survival may rest? It seems to me that without a machine that permits a heterogeneous population literally to pool its emotions, people in fact empathize only with other, recognizably similar beings. Empathy cannot make a bridge to what is alien or radically different, and in fact it can exacerbate feelings of alienation. When empathy is widespread, anyone left out of the loop becomes suspect, potentially the object of hatred or intense fear. If empathy is what makes "us" human, dare we embrace that humanity? Do we even have the alternative – to know the android, too?

1 comment:

  1. Hi Ezra,

    You might like to look at the new (self-described) manifesto You Are Not a Gadget by Jaron Lanier. (Certainly I'd like for you to look into it, for the purpose of getting your reactions.) Lanier also raises questions and suggestions about empathy, including the suggestion that too large an empathy circle may be as destructive, with the same effects, as one too small.

    Empathy is indeed about a recognition of similarity. That recognition can be given as a gift--but is there anything other than your name that you can give without losing? You say Dick affirms this by showing empathy as a "non-zero-sum game." You also seem to say that's far from the whole story.

    "You can't make an omelet without breaking eggs" is meant to sound harsh. It's also meant to give people confidence to live. If you don't want to break the eggs, you could just starve yourself to death: that's not even culturally unacceptable everywhere. But that much empathy as a norm doesn't sustain a society, as it does not sustain an individual.
