Artificial Emotion and Sensations

Peter Peng '08, English 65, The Cyborg Self, Brown University (Spring 2005)

William Gibson's sensational cyberpunk novel Neuromancer pits two apparently opposite forces, humans and artificial intelligences, against one another. Strangely, though, the protagonist Case is constantly trying to understand the AI while the AI is constantly trying to understand humans. The most dramatic argument for a difference between humans and AI is sentience, generally defined as having consciousness or emotions. Throughout his novel, Gibson frequently pauses to describe his characters' emotions, and what is most interesting is how those emotions are elicited. When Case walks into Emergency, a small bar, he begins to question his own emotions.

He bought a mug of Carlsberg and found a place against the wall. Closing his eyes, he felt for the knot of rage, the pure small coal of his anger. It was there still. Where had it come from? He remembered feeling only a kind of bafflement at his maiming in Memphis, nothing at all when he'd killed to defend his dealing interests in Night City, and a slack sickness and loathing after Linda's death under the inflated dome. But no anger. Small and far away, on the mind's screen, a semblance of Deane struck a semblance of an office wall in an explosion of brains and blood. He knew then: the rage had come in the arcade, when Wintermute rescinded the simstim ghost of Linda Lee, yanking away the simple animal promise of food, warmth, a place to sleep. But he hadn't become aware of it until his exchange with the holo-construct of Lonny Zone.

It was a strange thing. He couldn't take its measure.

"Numb," he said. He'd been numb a long time, years. All his nights down Ninsei, his nights with Linda, numb in bed and numb at the cold sweating center of every drug deal. But now he'd found this warm thing, this chip of murder. Meat, some part of him said. It's the meat talking, ignore it. [Gibson 146]

These emotions are so affected that it becomes hard to claim they are the defining attributes that make someone or something human. As the scene progresses, Cath shows up to inject Case with a drug, and Case's reaction is striking.

The anger was expanding, relentless, exponential, riding out behind the betaphenethylamine rush like a carrier wave, a seismic fluid, rich and corrosive. His erection was a bar of lead. The faces around them in Emergency were painted doll things, the pink and white of mouth parts moving, moving, words emerging like discrete balloons of sound. He looked at Cath and saw each pore in the tanned skin, eyes flat as dumb glass, a tint of dead metal, a faint bloating, the most minute asymmetries of breast and collarbone, the -- something flared white behind his eyes. [Gibson 146-147]

Discussion Questions

1. Why does Case become angry only when he thinks of Wintermute rescinding the simstim ghost of Linda Lee? Why does he feel no anger during all those other events, when they were clearly provoking enough to drive him to kill? What does this say about Case's humanity?

2. People who are cruel or emotionless are often called inhuman or described as inhumane. Can one person be more human than another? Are there degrees of humanness, or is it a black-and-white distinction (for example, between humans and AI)?

3. Cath had hoped to seduce Case with the drug, but what goes wrong with her plan, and why does it provoke Case's anger? Is it a drug-induced anger, or a true human emotion caused by Cath's invasive action? What constitutes true emotion in the first place?

4. If an AI can be made to react in a way that displays anger or other emotions, would that mean it is sentient? Or would it actually have to feel the emotion internally and not just create an outward display of emotion? What does it mean for humans to actually feel emotion? After all, aren't human emotions determined by the arrangement of neurons and chemicals in the body? Is that not just as artificial?

References

Gibson, William. Neuromancer. New York: Ace Books, 1984.

