Reality and Humanity in Neuromancer

Adam Mazer '08, English 65, The Cyborg Self, Brown University (Spring 2005)

In Neuromancer, William Gibson plays with the concepts of reality and humanity. Most interestingly, Gibson uses the idea of artificial intelligence, or AI, to explore the concept of human consciousness and sentience in a digitized age. Possibly the most striking example occurs in Case's conversation with Dixie.

"Wait a sec." Case said. "Are you sentient, or not?"

"Well, it feels like I am, kid, but I'm really just a bunch of ROM. It's one of them, ah, philosophical questions, I guess. . . ." The ugly laughter sensation rattled down Case's spine. "But I ain't likely to write you no poem, if you follow me. Your AI, it just might. But it ain't no way human."

"So you figure we can't get on to its motive?"

"It own itself?"

"Swiss citizen, but T-A own the basic software and the mainframe."

"That's a good one," the construct said. "Like, I own your brain and what you know, but your thoughts have Swiss citizenship. Sure. Lotsa luck, AI." [pp. 127-28]

Questions

1. Could an artificial intelligence ever be described as "sentient"? Why or why not?

2. Is true artificial intelligence technologically possible? How do current AIs compare with those presented in the novel?

3. Assuming true AI existed, what should the ethics of that situation be? For example, how should ownership be handled?

4. Should we go further in attempting to actually create true AI, or is that an idea best left to novels and film? Why or why not?

References

Gibson, William. Neuromancer. New York: Ace Books, 1984.


Last modified 24 February 2005