Do Non-Vegans Dream of Electric Puppies?
By Sherry F. Colb
Every year for the past decade, Cornell University has run a New Student Reading Project. The Project involves choosing a book for the incoming undergraduate class and Cornell community to read over the summer. The Greater Ithaca community also participates, with support from the Tompkins County Public Library. Then, when Fall arrives, faculty from different departments present lectures and facilitate small-group discussions of the book.
This year, Cornell selected the Philip K. Dick novel, Do Androids Dream of Electric Sheep?, on which the Ridley Scott movie, Blade Runner, is (quite loosely) based. Michael Dorf and I decided to participate in the Project and serve as faculty facilitators for one of the small groups of freshmen discussing the book. Our discussion – which took place on Monday – was quite interesting and gave students an opportunity to enjoy a seminar-like exchange with one another on a familiar topic. Because Michael and I were very nondirective in our role as facilitators, however, the discussion ended up mostly neglecting what I view as perhaps the most salient theme in the book. I thought I'd take this opportunity to elaborate on it.
Androids is a brilliant science fiction treatment of a variety of philosophical questions, including the following: What quality or capacity makes human beings distinctively human? The book is set in post-apocalyptic California, where few people continue to live, most having left Planet Earth to colonize Mars. All of the humans still living on Earth (because of their jobs or because of physical or mental deterioration that disqualifies them from emigrating) belong to a religion called Mercerism, dedicated to empathy and compassion.
To tempt people to leave Earth for Mars, the government has provided free android slaves (marketed with references to the pre-Civil War plantation South) to each emigrant. Androids are extremely convincing replicas of humans, and they feel pain and emotion much in the way that humans do, with one exception: androids do not experience empathy, for other androids, for humans, or for nonhuman animals.
Some androids, unhappy with their slave status on Mars, have killed their human masters there and escaped to Earth. Unwelcome on this planet, the androids are treated as illegal aliens, whom the government hires bounty hunters, including the novel's protagonist, Rick Deckard, to “retire” (i.e., kill). The bounty hunter faces a challenge, however, each time he wishes to retire an android: he must make sure that his target is truly an android, as it is a crime to kill a human being (or, indeed, a nonhuman animal).
To meet this challenge, Rick administers the “Voigt-Kampff” Empathy Test before retiring anyone. The test – involving a sort of polygraph machine – measures the target’s emotional and physiological responses to hearing the bounty hunter describe “a morally shocking stimulus.” If the subject experiences a “shame” or “blushing” reaction within a short enough time, then Rick knows he is dealing with a human; if not, he has an android, whom he can then proceed to “retire.”
These are some of the scenarios in the test: (1) “You are given a calf-skin wallet on your birthday”; (2) “In a magazine, you come across a … picture of a nude girl … lying facedown on a large and beautiful bearskin rug”; (3) You are reading a novel from before the war, and one of the characters at a restaurant “orders lobster, and the chef drops the lobster into the tub of boiling water while the characters watch”; (4) “You rent a mountain cabin … and above the fireplace a deer’s head has been mounted, a full stag with developed horns”; and (5) You watch an old movie in which people attend a banquet where “[t]he entrée consists of boiled dog, stuffed with rice.”
Quite strikingly, each of these scenarios involves a person consuming the products of animal suffering and slaughter. With the exception of the lobster, moreover, none of the people involved in the scenes is directly inflicting the suffering. What distinguishes androids from real humans, in other words, is the experience of apathy rather than empathy when confronting a scene in which animals have been killed and commodified for human consumption.
Though the human society of Androids has ostensibly embraced an ethic of ahimsa – or nonviolence – toward animals, it continues to practice pet ownership. A catalogue listing prices for various animals (many of them endangered, most extinct and therefore only theoretically available for a price) is circulated regularly, and humans on Earth express their “empathy” by purchasing one of the caged animals, taking care of him or her, and keeping the animal on display for the neighbors. Because many people cannot afford the price of a real animal, a business providing fake animals develops, although the fake animals appear not to have real feelings, so they are not truly a nonhuman analogue to the androids.
Just as human bounty hunters kill androids (in a striking failure of empathy, the capacity that supposedly distinguishes them from their quarry), the humans in Androids generally support commodification and captivity for the animals who remain on Earth, by purchasing them as empathy commodities. This seems most naturally analogous to the contemporary practice of buying cats and dogs (and other animals) who have been bred in captivity for sale, either from breeders or from pet stores. People purchase their empathy commodities, for whom they often feel genuine affection, and thereby support the cruel practices by which the animals are bred and made available, including the repeated anguish visited on mother dogs and cats whose babies are all taken away from them for sale. In the same way, people purchase beef, pork, dairy, and eggs, and thereby support the torture and slaughter of billions of animals morally indistinguishable from our more favored empathy commodities. None of these purchases is necessary, and all contribute to immeasurable suffering.
A question that arose for both Michael and me when we were facilitating the undergraduates’ discussion of Androids is why most people who read the book do not seem to notice the vegan/animal rights messages. Most (especially faculty) seem to want to focus almost exclusively on the commentary about technology, machines, the singularity, and the line between the authentic (humans, animals, moods) and the artificial. The only animal-related lecture presented this week appeared to be about fake pets rather than about the moral questions Dick poses about the consumption of animal products.
Michael articulated an interesting theory of why this is so. Like fantasy fiction in totalitarian regimes, written allegorically to avoid alerting the censors to a subversive message, science fiction in relatively free societies must elude the internal censors that we all have. By situating Androids in a post-apocalyptic world, Dick makes it possible for us to be critical of the androids for failing to feel empathy for animals, even as we miss the fact that it is we who are failing to live up to our distinguishing feature in this regard, in the real world. Sadly, however, since no one at Cornell appears to be talking about the animal rights themes of Androids, Dick’s message about the consumption of animals – however clear and straightforward – has failed to get by the internal censors here. I hope that this post will serve as a corrective for that sad (and likely unintended) success of our internal censors.