Random House: Bringing You the Best in Fiction, Nonfiction, and Children's Books
Newletters and Alerts

Buy now from Random House

  • I, Robot
  • Written by Isaac Asimov
  • Format: Trade Paperback | ISBN: 9780553382563
  • Our Price: $15.00
  • Quantity:
See more online stores - I, Robot

Buy now from Random House

  • I, Robot
  • Written by Isaac Asimov
  • Format: Paperback | ISBN: 9780553294385
  • Our Price: $7.99
  • Quantity:
See more online stores - I, Robot

Buy now from Random House

  • I, Robot
  • Written by Isaac Asimov
  • Format: eBook | ISBN: 9780553900330
  • Our Price: $7.99
  • Quantity:
See more online stores - I, Robot

Buy now from Random House

  • I, Robot
  • Written by Isaac Asimov
    Read by Scott Brick
  • Format: Unabridged Audiobook Download | ISBN: 9780739312711
  • Our Price: $13.75
  • Quantity:
See more online stores - I, Robot

I, Robot

    Select a Format:
  • Book
  • eBook
  • Audiobook

Written by Isaac AsimovAuthor Alerts:  Random House will alert you to new works by Isaac Asimov


List Price: $7.99


On Sale: June 01, 2004
Pages: 272 | ISBN: 978-0-553-90033-0
Published by : Spectra Ballantine Group

Audio Editions

Read by Scott Brick
On Sale: June 01, 2004
ISBN: 978-0-7393-1271-1
More Info...
Listen to an excerpt
Visit RANDOM HOUSE AUDIO to learn more about audiobooks.

I, Robot Cover

Share & Shelve:

  • Add This - I, Robot
  • Email this page - I, Robot
  • Print this page - I, Robot
Tags for this book (powered by Library Thing)
science fiction (842) fiction (525) robots (295)
» see more tags
> Link to Audio Clip
Related Links


The three laws of Robotics:
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm
2) A robot must obey orders givein to it by human beings except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

With these three, simple directives, Isaac Asimov changed our perception of robots forever when he formulated the laws governing their behavior. In I, Robot, Asimov chronicles the development of the robot through a series of interlinked stories: from its primitive origins in the present to its ultimate perfection in the not-so-distant future--a future in which humanity itself may be rendered obsolete.

Here are stories of robots gone mad, of mind-read robots, and robots with a sense of humor. Of robot politicians, and robots who secretly run the world--all told with the dramatic blend of science fact and science fiction that has become Asmiov's trademark.



"Ninety-eight—ninety-nine—one hundred." Gloria withdrew her chubby little forearm from before her eyes and stood for a moment, wrinkling her nose and blinking in the sunlight. Then, trying to watch in all directions at once, she withdrew a few cautious steps from the tree against which she had been leaning.

She craned her neck to investigate the possibilities of a clump of bushes to the right and then withdrew farther to obtain a better angle for viewing its dark recesses. The quiet was profound except for the incessant buzzing of insects and the occasional chirrup of some hardy bird, braving the midday sun.

Gloria pouted, "I bet he went inside the house, and I've told him a million times that that's not fair."

With tiny lips pressed together tightly and a severe frown crinkling her forehead, she moved determinedly toward the two-story building up past the driveway.

Too late she heard the rustling sound behind her, followed by the distinctive and rhythmic clump-clump of Robbie's metal feet. She whirled about to see her triumphing companion emerge from hiding and make for the home-tree at full speed.

Gloria shrieked in dismay. "Wait, Robbie! That wasn't fair, Robbie! You promised you wouldn't run until I found you." Her little feet could make no headway at all against Robbie's giant strides. Then, within ten feet of the goal, Robbie's pace slowed suddenly to the merest of crawls, and Gloria, with one final burst of wild speed, dashed pantingly past him to touch the welcome bark of home-tree first.

Gleefully, she turned on the faithful Robbie, and with the basest of ingratitude, rewarded him for his sacrifice by taunting him cruelly for a lack of running ability.

"Robbie can't run," she shouted at the top of her eight-year-old voice. "I can beat him any day. I can beat him any day." She chanted the words in a shrill rhythm.

Robbie didn't answer, of course—not in words. He pantomimed running instead, inching away until Gloria found herself running after him as he dodged her narrowly, forcing her to veer in helpless circles, little arms outstretched and fanning at the air.

"Robbie," she squealed, "stand still!"—And the laughter was forced out of her in breathless jerks.

—Until he turned suddenly and caught her up, whirling her round, so that for her the world fell away for a moment with a blue emptiness beneath, and green trees stretching hungrily downward toward the void. Then she was down in the grass again, leaning against Robbie's leg and still holding a hard, metal finger.

After a while, her breath returned. She pushed uselessly at her disheveled hair in vague imitation of one of her mother's gestures and twisted to see if her dress were torn.

She slapped her hand against Robbie's torso, "Bad boy! I'll spank you!"

And Robbie cowered, holding his hands over his face so that she had to add, "No, I won't, Robbie. I won't spank you. But anyway, it's my turn to hide now because you've got longer legs and you promised not to run till I found you."

Robbie nodded his head—a small parallelepiped with rounded edges and corners attached to a similar but much larger parallelepiped that served as torso by means of a short, flexible stalk—and obediently faced the tree. A thin, metal film descended over his glowing eyes and from within his body came a steady, resonant ticking.

"Don't peek now—and don't skip any numbers," warned Gloria, and scurried for cover.

With unvarying regularity, seconds were ticked off, and at the hundredth, up went the eyelids, and the glowing red of Robbie's eyes swept the prospect. They rested for a moment on a bit of colorful gingham that protruded from behind a boulder. He advanced a few steps and convinced himself that it was Gloria who squatted behind it.

Slowly, remaining always between Gloria and home-tree, he advanced on the hiding place, and when Gloria was plainly in sight and could no longer even theorize to herself that she was not seen, he extended one arm toward her, slapping the other against his leg so that it rang again. Gloria emerged sulkily.

"You peeked!" she exclaimed, with gross unfairness. "Besides I'm tired of playing hide-and-seek. I want a ride."

But Robbie was hurt at the unjust accusation, so he seated himself carefully and shook his head ponderously from side to side.

Gloria changed her tone to one of gentle coaxing immediately, "Come on, Robbie. I didn't mean it about the peeking. Give me a ride."

Robbie was not to be won over so easily, though. He gazed stubbornly at the sky, and shook his head even more emphatically.

"Please, Robbie, please give me a ride." She encircled his neck with rosy arms and hugged tightly. Then, changing moods in a moment, she moved away. "If you don't, I'm going to cry," and her face twisted appallingly in preparation.

Hard-hearted Robbie paid scant attention to this dreadful possibility, and shook his head a third time. Gloria found it necessary to play her trump card.

"If you don't," she exclaimed warmly, "I won't tell you any more stories, that's all. Not one—"

Robbie gave in immediately and unconditionally before this ultimatum, nodding his head vigorously until the metal of his neck hummed. Carefully, he raised the little girl and placed her on his broad, flat shoulders.

Gloria's threatened tears vanished immediately and she crowed with delight. Robbie's metal skin, kept at a constant temperature of seventy by the high resistance coils within, felt nice and comfortable, while the beautifully loud sound her heels made as they bumped rhythmically against his chest was enchanting.

"You're an air-coaster, Robbie, you're a big, silver air-coaster. Hold out your arms straight. —You got to, Robbie, if you're going to be an air-coaster."

The logic was irrefutable. Robbie's arms were wings catching the air currents and he was a silver 'coaster.

Gloria twisted the robot's head and leaned to the right. He banked sharply. Gloria equipped the 'coaster with a motor that went "Br-r-r" and then with weapons that went "Powie" and "Sh-sh-shshsh." Pirates were giving chase and the ship's blasters were coming into play. The pirates dropped in a steady rain.

"Got another one. —Two more," she cried.

Then "Faster, men," Gloria said pompously, "we're running out of ammunition." She aimed over her shoulder with undaunted courage and Robbie was a blunt-nosed spaceship zooming through the void at maximum acceleration.

Clear across the field he sped, to the patch of tall grass on the other side, where he stopped with a suddenness that evoked a shriek from his flushed rider, and then tumbled her onto the soft, green carpet.

Gloria gasped and panted, and gave voice to intermittent whispered exclamations of "That was nice!"

Robbie waited until she had caught her breath and then pulled gently at a lock of hair.

"You want something?" said Gloria, eyes wide in an apparently artless complexity that fooled her huge "nursemaid" not at all. He pulled the curl harder.

"Oh, I know. You want a story."

Robbie nodded rapidly.

"Which one?"

Robbie made a semi-circle in the air with one finger.

The little girl protested, "Again? I've told you Cinderella a million times. Aren't you tired of it? —It's for babies."

Another semi-circle.

"Oh, well," Gloria composed herself, ran over the details of the tale in her mind (together with her own elaborations, of which she had several) and began:

"Are you ready? Well—once upon a time there was a beautiful little girl whose name was Ella. And she had a terribly cruel step-mother and two very ugly and very cruel step-sisters and—"

Gloria was reaching the very climax of the tale—midnight was striking and everything was changing back to the shabby originals lickety-split, while Robbie listened tensely with burning eyes—when the interruption came.


It was the high-pitched sound of a woman who has been calling not once, but several times; and had the nervous tone of one in whom anxiety was beginning to overcome impatience.

"Mamma's calling me," said Gloria, not quite happily. "You'd better carry me back to the house, Robbie."

Robbie obeyed with alacrity for somehow there was that in him which judged it best to obey Mrs. Weston, without as much as a scrap of hesitation. Gloria's father was rarely home in the daytime except on Sunday—today, for instance—and when he was, he proved a genial and understanding person. Gloria's mother, however, was a source of uneasiness to Robbie and there was always the impulse to sneak away from her sight.

Mrs. Weston caught sight of them the minute they rose above the masking tufts of long grass and retired inside the house to wait.

"I've shouted myself hoarse, Gloria," she said, severely. "Where were you?"

"I was with Robbie," quavered Gloria. "I was telling him Cinderella, and I forgot it was dinner-time."

"Well, it's a pity Robbie forgot, too." Then, as if that reminded her of the robot's presence, she whirled upon him. "You may go, Robbie. She doesn't need you now." Then, brutally, "And don't come back till I call you."

Robbie turned to go, but hesitated as Gloria cried out in his defense, "Wait, Mamma, you got to let him stay. I didn't finish Cinderella for him. I said I would tell him Cinderella and I'm not finished."


"Honest and truly, Mamma, he'll stay so quiet, you won't even know he's here. He can sit on the chair in the corner, and he won't say a word,—I mean he won't do anything. Will you, Robbie?"

Robbie, appealed to, nodded his massive head up and down once.

"Gloria, if you don't stop this at once, you shan't see Robbie for a whole week."

The girl's eyes fell, "All right! But Cinderella is his favorite story and I didn't finish it. —And he likes it so much."

The robot left with a disconsolate step and Gloria choked back a sob.

George Weston was comfortable. It was a habit of his to be comfortable on Sunday afternoons. A good, hearty dinner below the hatches; a nice, soft, dilapidated couch on which to sprawl; a copy of the Times; slippered feet and shirtless chest;—how could anyone help but be comfortable?

He wasn't pleased, therefore, when his wife walked in. After ten years of married life, he still was so unutterably foolish as to love her, and there was no question that he was always glad to see her—still Sunday afternoons just after dinner were sacred to him and his idea of solid comfort was to be left in utter solitude for two or three hours. Consequently, he fixed his eye firmly upon the latest reports of the Lefebre-Yoshida expedition to Mars (this one was to take off from Lunar Base and might actually succeed) and pretended she wasn't there.

Mrs. Weston waited patiently for two minutes, then impatiently for two more, and finally broke the silence.



"George, I say! Will you put down that paper and look at me?"

The paper rustled to the floor and Weston turned a weary face toward his wife, "What is it, dear?"

"You know what it is, George. It's Gloria and that terrible machine."

"What terrible machine?"

"Now don't pretend you don't know what I'm talking about. It's that robot Gloria calls Robbie. He doesn't leave her for a moment."

"Well, why should he? He's not supposed to. And he certainly isn't a terrible machine. He's the best darn robot money can buy and I'm damned sure he set me back half a year's income. He's worth it, though—darn sight cleverer than half my office staff."

He made a move to pick up the paper again, but his wife was quicker and snatched it away.

"You listen to me, George. I won't have my daughter entrusted to a machine—and I don't care how clever it is. It has no soul, and no one knows what it may be thinking. A child just isn't made to be guarded by a thing of metal."

Weston frowned, "When did you decide this? He's been with Gloria two years now and I haven't seen you worry till now."

"It was different at first. It was a novelty; it took a load off me, and—and it was a fashionable thing to do. But now I don't know. The neighbors—"

"Well, what have the neighbors to do with it. Now, look. A robot is infinitely more to be trusted than a human nursemaid. Robbie was constructed for only one purpose really—to be the companion of a little child. His entire 'mentality' has been created for the purpose. He just can't help being faithful and loving and kind. He's a machine—made so. That's more than you can say for humans."

"But something might go wrong. Some—some—" Mrs. Weston was a bit hazy about the insides of a robot, "some little jigger will come loose and the awful thing will go berserk and—and—" She couldn't bring herself to complete the quite obvious thought.

"Nonsense," Weston denied, with an involuntary nervous shiver. "That's completely ridiculous. We had a long discussion at the time we bought Robbie about the First Law of Robotics. You know that it is impossible for a robot to harm a human being; that long before enough can go wrong to alter that First Law, a robot would be completely inoperable. It's a mathematical impossibility. Besides I have an engineer from U.S. Robots here twice a year to give the poor gadget a complete overhaul. Why, there's no more chance of anything at all going wrong with Robbie than there is of you or I suddenly going looney—considerably less, in fact. Besides, how are you going to take him away from Gloria?"

He made another futile stab at the paper and his wife tossed it angrily into the next room.

"That's just it, George! She won't play with anyone else. There are dozens of little boys and girls that she should make friends with, but she won't. She won't go near them unless I make her. That's no way for a little girl to grow up. You want her to be normal, don't you? You want her to be able to take her part in society."

"You're jumping at shadows, Grace. Pretend Robbie's a dog. I've seen hundreds of children who would rather have their dog than their father."

"A dog is different, George. We must get rid of that horrible thing. You can sell it back to the company. I've asked, and you can."

"You've asked? Now look here, Grace, let's not go off the deep end. We're keeping the robot until Gloria is older and I don't want the subject brought up again." And with that he walked out of the room in a huff.

Mrs. Weston met her husband at the door two evenings later. "You'll have to listen to this, George. There's bad feeling in the village."
Isaac Asimov

About Isaac Asimov

Isaac Asimov - I, Robot
Isaac Asimov began his Foundation Series at the age of twenty-one, not realizing that it woudl one day be considered a conerstone of science fiction. During his legendary career, Asimov penned over 470 books on subjects ranging from science to Shakespeare to histroy, though he was most loved for his award-winning science fiction sagas, whcih include the Robot, Empire, and Foundation series. Named a Grand Master of Science Fiction by the Science Fiction and Fantasy Writers of America, Asimov entertained and educated readers of all ages for close to five decades. He died, at the age of seventy-two, in April 1992.
Reader's Guide|About the Book|Author Biography|Discussion Questions|Teachers Guide

About the Book

Isaac Asimov’s Robot series and Foundation series comprise some of the greatest classics in their genre. They probe the questions of technology and destiny, war and politics that have captured readers’ imaginations for generations.

I, Robot, the first and most widely read book in Asimov’s Robot series, is a collection of nine stories that forever changed the world’s perception of artificial intelligence. Here are stories of sensitive robots, robots gone mad, mind-reading robots, prankster robots, and closeted robots that secretly dominate politics. Chronicling the robot’s development from primitive prototype to ultimate perfection, I, Robot blends scientific fact with science fiction in Asimov’s provocative style.

Foundation, Foundation and Empire, and Second Foundation tell the story of Hari Seldon, a brilliant visionary who synthesized history, psychology, and mathematical probability to shape a bold commandment for the future and steer humanity through a series of brutal eras. Following the collapse of a Galactic Empire, Hari gathered together the top scientists and scholars on a bleak planet at the very edge of the Galaxy in order to preserve the accumulated knowledge of mankind. He called his sanctuary the Foundation and designed it to withstand a dark age of ignorance, barbarism, and warfare that would last for the next thirty thousand years. But not even Hari could have predicted the intense barbarism lurking in space, or the birth of an extraordinary creature whose mutant intelligence would destroy all that Hari held dear.

The questions, discussion topics, and author biography that follow are intended to enhance your reading of these four classics written by one of the most widely recognized fiction authors of our time.

I, Robot
Isaac Asimov
0-553-29438-5 (paperback)
0-553-80370-0 (hardcover)

Isaac Asimov
0-553-29335-4 (paperback)
0-553-80371-9 (hardcover)

Foundation and Empire
Isaac Asimov
0-553-29337-0 (paperback)
0-553-80372-7 (hardcover)

Second Foundation
Isaac Asimov
0-553-29336-2 (paperback)
0-553-80373-5 (hardcover)

About the Guide

I, Robot
Isaac Asimov
0-553-29438-5 (paperback)
0-553-80370-0 (hardcover)

Isaac Asimov
0-553-29335-4 (paperback)
0-553-80371-9 (hardcover)

Foundation and Empire
Isaac Asimov
0-553-29337-0 (paperback)
0-553-80372-7 (hardcover)

Second Foundation
Isaac Asimov
0-553-29336-2 (paperback)
0-553-80373-5 (hardcover)

About the Author

Isaac Asimov began his Foundation series at the age of twenty-one, not realizing that it would one day be considered a cornerstone of science fiction. Including such award-winning science fiction sagas as the Robot, Empire, and Foundation series, Asimov is the author of more than 470 books and was named a Grand Master of Science Fiction by the Science Fiction and Fantasy Writer of America. He died, at the age of seventy-two, in April 1992.

Discussion Guides

1. Do Asimov’s now-famous Three Laws of Robotics mirror humanity’s ethics code in any way? Whose orders are human beings required to obey? Do our definitions of “harm” ever lead to the same confounding dilemmas experienced in I, Robot?

2. Why was Gloria’s mother unable to accept Robbie as an excellent nursemaid? Was Robbie premonitory on Asimov’s part—a prediction that children in the twenty-first century might form intense emotional attachments to electronics?

3. Cutie (QT) questions his origins and finds it impossible to believe that a human created him. In what ways did Powell and Donovan reinforce this belief?

4. Does the case of Stephen Byerley indicate that robots might make better politicians? Would this only hold true if, as the novel envisions, nations dissolve into massive world regions?

5. What is the ultimate commodity produced by U.S. Robot & Mechanical Men, Inc.? Does our global workforce follow this model in any way? Were humor and compassion inevitable traits in the robots? Do these traits interfere with productivity in the world of I, Robot?

6. In the book’s closing lines, Dr. Susan Calvin tells the narrator, “You will see what comes next,” as robots stand between mankind and destruction. How did her career lead up to such a precarious conclusion?

7. I, Robot has been turned into a major motion picture starring Will Smith. How does the movie compare with your book-reading experience? What do you think of the adjustments made and liberties taken when converting this collection of stories to one seamless film adaptation?

8. Foundation opens with the perspective of Gaal Dornick, “a country boy who had never seen Trantor before.” What is the effect of opening the novel with Gaal’s observations? Why did Hari Seldon extend such an invitation to Gaal?

9. In the trial portrayed in chapter 6, the Commission’s Advocate repeatedly rejects Hari’s deductions regarding the future. What has made Hari a target for exile? Why are his projections—supported by seemingly irrefutable logic and mathematics—so easily dismissed by his accusers?

10. Part 3 of Foundation begins with an entry from the Encyclopedia Galactica that reads, “Undoubtedly the most interesting aspect of the history of the four Kingdoms involves the strange society forced temporarily upon it during the administration of Salvor Hardin.” In what ways does Hardin distinguish himself from the other rulers described in the novel? What conditions fostered his rise to power?

11. The Foundation is intended in some ways as a kind of religious center. What are its doctrines? Can a religion of science fail?

12. Discuss the novel’s references to energy—in this case, nuclear power—in relation to political and economic supremacy. What other forces drive the novel’s hierarchies of dominance? How does the role of the Traders evolve in the novel’s closing chapters?

13. What were the root causes of the Foundation’s fall? Could its demise have been avoided, even after war had begun?

14. As Lord of the Universe, is Cleon II naïve or perceptive? In what ways do his sensibilities affect his fate?

15. What, ultimately, is the source of the Mule’s power to perform Conversions in Foundation and Empire? What role did psychology play in his own origins?

16. Do the Independent Trading Worlds accurately perceive their vulnerabilities? In contrast, what perpetuated Neotrantor’s survival?

17. Bayta’s final conversation with the Mule explains his moniker as well as his perceptions of how power is perpetuated. What does this dialogue indicate about gender roles in the realm of the Second Foundation, and about the possibility of democracy?

18. Discuss the spectrum of characters affected by the Mule in Second Foundation’s five opening interludes. In what ways do the Mule’s tactics vary?

19. In what ways does Bail Channis’s personality reflect a cultural shift from the previous Foundation novels?

20. Near the beginning of the fifteenth chapter, Arcadia is described as “dressed in borrowed clothes, standing on a borrowed planet in a borrowed situation of what seemed even to be a borrowed life.” In what ways is she both an unlikely and an ideal savior?

21. Scholarship such as the Encyclopedia project represented Hari’s belief in the power of learning (and even the power of the mind itself, in the form of neural microcurrents). To what extent is a civilization’s success measured by the survival of its knowledge?

22. The final chapter of Second Foundation offers a thoughtful coda to the novel. What is the “true” question to that chapter’s “answer that was true?”

23. If Hari Seldon’s equations were applied to Earth’s societies, what might the results be?

24. What connotations and root words were you able to derive from the character names and geographic locations featured in the series?

25. How does the series evolve as a whole? What overarching narrative is propelled by the events that occur within the individual books?

26. Isaac Asimov wrote these three books very early in his career, during the 1950s—an era marked by the Cold War, McCarthyism, and the early stages of the space race. How might the events of this period have shaped the Foundation storyline?

27. In what sense does the trilogy offer a cautionary tale for contemporary leaders in politics, science, and the humanities?

Teacher's Guide


I, ROBOT turns the world of science fiction literature on its head. Rather than telling the typical tale of a humanoid machine run amok (e.g., Terminator), SFWA Grand Master Isaac Asimov asks readers to imagine a world where robots protect us from our own worst nature. Beginning with a simple story about the relationship between a little girl and a limited-function robot, I Robot moves on to explore, in subsequent stories, increasingly sophisticated thoughts, questions, and moral complexities. In the process the book reveals Asimov’s overarching vision of a future that entangles inextricably the humans and the machines.

The stories grew from Asimov’s opinion that anyone smart enough to create robots would be smart enough to make sure that those robots wouldn’t attack their makers. Conceived by Asimov as the Three Laws of Robotics–essential laws built into the robots’ inner workings–these Laws freed science fiction writers to develop robots as characters instead of portraying them as monstrous things. I, ROBOT hints loudly that robots are a “better breed” than humans and though they were created to serve, they will inevitably become the masters.

At first Asimov had trouble getting “Robbie,” the opening story in I, ROBOT, published. But throughout the 1940s the subsequent tales appeared regularly in pulp science fiction magazines. In 1949, Asimov gathered the stories into a book that he wanted to call Mind and Iron. (The publisher prevailed with I, ROBOT.) The collection has enjoyed great success through the years and is offered by Spectra in a compact, affordable, and eminently readable edition for you–or your robot.

Plot Summary
“Introduction” 2057: Earth. An unnamed reporter for “Interplanetary Press” prepares to interview Susan Calvin, a seventy-five-year-old “Robopsychologist” who works for U.S. Robot and Mechanical Men, Inc., (U.S. Robots). Accused of being emotionless like a robot, Calvin argues that robots are more than mechanical parts, “They’re a cleaner better breed than we [humans] are.” Calvin reminisces about early opposition to robots from labor unions (that were worried about competition) and religious groups (that were worried about sacrilege). Against these anti-robot arguments she holds the memory of an early robot model named Robbie, which was sold in 1996 as a nursemaid for a little girl. Calvin begins to tell the story.

“Robbie” 1998: Earth. Robbie the Robot plays hide-and-go-seek outdoors with his charge, nine-year-old Gloria Weston. Robbie lets her win. Gloria is a demanding but charming girl who loves Robbie. At Robbie’s gestured urging (he cannot speak), she begins to tell him his favorite story, Cinderella, but her mother, Grace, calls them to come inside. Grace does not trust Robbie with her daughter and badgers her husband, George, to get rid of him until George finally gives in. Gloria is heartbroken when her parents take away Robbie. In an attempt to distract her, her parents decide to take Gloria on a trip to New York, hoping the excitement of the city will take her mind off Robbie. While in New York they tour the U.S. Robots factory. Gloria spies Robbie, one of “the robots creating more robots.” She runs toward him–right into the path of a moving tractor. Before anyone else has time to react, Robbie snatches Gloria out of harm’s way. Because Robbie has saved Gloria’s life, her mother grudgingly allows the robot to return to the family. At this point the frame story (Susan Calvin talking to the reporter) resumes. Calvin tells us that robots were banned from Earth between 2003 and 2007. To ensure the company’s survival, U.S. Robots started developing mining models for other planets. Calvin recalls two troubleshooters, Mike Donovan and Gregory Powell, who worked with the experimental designs in “the teens.”

“Runaround” 2015: Mercury. Gregory Powell and Mike Donovan have sent robot SPD 13, “Speedy,” on a quest for selenium, a necessary ingredient for their life support machinery. Selenium is somewhat dangerous to Speedy, and when the robot doesn’t return, Powell and Donovan decide they must go retrieve him from the surface. They find Speedy circling the selenium pool and gibbering. Speedy has gone crazy because two of the fundamental laws of robotics have come into conflict. Powell ordered Speedy to get the selenium (Second Law: always obey human orders). But since Powell wasn’t very insistent, the Second Law didn’t quite overwhelm the Third Law (self-protection). Caught between conflicting directives, Speedy hovers around the selenium pool, not quite able to get close enough to harm himself, but not able to leave the site because he has been ordered to go to the pool. To remove Speedy’s conflict, Powell walks towards Speedy, purposely going too far from safety for him to be able to return without Speedy’s help. Speedy sees him, causing the First Law to kick in (do not harm or allow a human to come to harm through inaction). Speedy saves Powell and they send the robot back for the selenium, this time installing in Speedy Second Law orders firm enough to counteract any Third Law thoughts of self-preservation. Speedy returns with the selenium and the pair anticipate their next work assignment at the space stations.

“Reason” 2015: Space Station. QT-1 (Cutie), a new model, refuses to believe that inferior humans created superior robots. Cutie decides that an Energy Converter has created robots, and that he is its Prophet. Cutie’s religion spreads to the other robots: they obey Cutie, but not Powell or Donovan. Meanwhile, a potentially dangerous electron storm is approaching. Widespread destruction on earth could occur if the storm is able to throw out of focus the energy beam sent from the station to earth. Cutie will not let Powell and Donovan make adjustments. To Cutie, humans are obsolete. Donovan and Powell try to argue with Cutie, but all their attempts fail. They then try to prove to Cutie that humans build robots by building a robot themselves. Cutie argues that the pair only assembled the robot; they did not create it. The electron storm comes and, luckily, Cutie keeps the beam focused because he believes he serves the Converter by keeping its instrumentation in balance (i.e., in focus). Powell points out that the Second Law (always obey human orders) requires QT to obey. No matter what robots believe to be the ultimate source of command, they will still do their duties.

“Catch That Rabbit” 2016: Asteroid. Donovan and Powell are sent to an asteroid to test a mining robot, DV-5 (Dave), that controls six subrobots. Dave has a problem: sometimes, for no apparent reason, the robot stops mining and starts marching his subrobots in drills. Though upset at his own behavior, Dave can’t explain why he is doing this. Donovan and Powell interview a subrobot, but that is like asking a “finger” why the hand does what it does. They figure out that situations requiring personal initiative (e.g., emergencies) cause the problem. But they can’t understand why, because Dave always resumes his correct duties when they show up. They can’t take Dave apart to test him: all the circuits are intertwined, which means that testing them in isolation won’t help. They decide to create an emergency without Dave knowing and then watch what happens. They cause a cave-in on themselves, but Dave goes marching off with his subrobots, leaving them trapped. Powell then shoots one of the subrobots, and Dave comes back to rescue them. Powell figures out that the six subrobots needed more command attention during an emergency, which put too much stress on “a special co-ordinating circuit.” With one fewer robot to attend to, Dave can handle emergencies. At the end of this tale the frame story resumes with Susan Calvin. The reporter asks her if a robot had ever “gone wrong” on her watch. She hesitates, but then admits that this did happen once, and launches into the story of Herbie.

“Liar!” 2021: Earth. U.S. Robots accidentally creates RB-34 (Herbie), a robot that can read minds. Susan Calvin, Alfred Lanning, Milton Ashe, and Peter Bogert are assigned to find out how the mind reading has changed the robot: “what harm it has done to his ordinary RB [robot] properties.” Herbie has figured out that Calvin loves Ashe but does not feel worthy of being loved in return. Herbie assures her that Ashe loves her and that a woman Ashe brought to visit was only a cousin. When Bogert consults Herbie, the robot tells him that the director, Lanning, has retired and has put Bogert in charge. When Lanning questions some of Bogert’s calculations, Bogert informs an incredulous Lanning that he is no longer the boss. Later, during a conversation with Ashe, Calvin finds out that the girl isn’t Ashe’s cousin but his fiancée. Calvin realizes that Herbie has been lying to them because it was following the First Law of Robotics (do not harm a human). By telling each person what s/he wanted to hear, the robot was trying not to hurt the humans. Calvin asks the robot what went wrong in its assembly that made it able to read minds. This throws the robot into an impossible conflict. The robot can’t answer because it thinks it will make Lanning and Ashe feel bad to know that a robot figured out something that the scientists couldn’t. On the other hand, it must answer because Lanning and Ashe want to know the answer (Second Law: obey human commands). Since either action will cause harm to humans, the robot collapses. The frame story resumes again with Calvin sitting behind her desk with her face “white and cold.”

“Little Lost Robot” 2029: Hyper Base. Susan Calvin and Peter Bogert are called to Hyper Base to identify one missing NS-2 (Nestor) robot out of a fleet of sixty-three seemingly identical models. The missing robot is identical to all the others except that its positronic brain is not fully wired with the entire First Law of Robotics: it may not harm a human, but nothing compels it to prevent harm through inaction. It turns out that a worker, Gerald Black, got annoyed with the robot and told it, “Go lose yourself.” Obeying the order, Nestor 10 made itself indistinguishable from the other robots. To flush out Nestor 10, Calvin arranges to have all the robots see a rock drop toward a human (the rock is deflected at the last second). She measures the reaction time of the robots as they rush to protect the human, reasoning that the robot that is not wired with a complete First Law will react differently. Her reasoning is wrong: they all react the same way. She tries another strategy. She tells the robots that they will be electrocuted if they move towards the human. Calvin reasons that only the robot with a weak First Law won’t move, because the Third Law, self-preservation, will equal, not override, the First Law. But Nestor 10 had pointed out to the other robots earlier that if they were going to die, they wouldn’t be able to save the person anyway, so it was better not to move and live to save someone else another day. Finally, Calvin arranges a third test to flush out Nestor 10. Only Nestor 10 can tell the difference between harmless and harmful radiation. When the Nestor robots are all told that harmful radiation will lie between them and the person in danger, all but one–Nestor 10–remain seated when the rock falls. Nestor 10 moves because he can see that the radiation is not dangerous. He tries to attack Calvin because she has found him out and the Second Law (obey orders) outweighs the weak First Law (don’t harm people). But the room is flooded with gamma radiation, which destroys robot brains, and Nestor 10 is stopped before he can reach her.

“Escape!” 2030: Earth. A competing robot company, Consolidated Robots, asks U.S. Robots to solve a problem that fried their own “Super-Thinker.” The problem is how to build a hyperspace drive for humans. Susan Calvin suspects that Consolidated ran into trouble because building the hyperspace drive involves harm to humans, bringing the First Law (do not harm humans) and the Second Law (obey human orders) into conflict. U.S. Robots’ own super-thinker, “The Brain,” however, is equipped with a personality. Calvin thinks that The Brain will be able to handle the dilemma because having a personality–emotional circuitry–makes it more “resilient.” But when the scientists feed the problem to The Brain, it doesn’t even acknowledge the existence of a problem, and promises to build the ship. The story jumps ahead two months to Powell and Donovan inspecting the ship. While they are aboard, the ship takes off, and as it makes an interstellar jump, each man has a near-death experience. The men return from beyond the galaxy, and Calvin learns that, during their time in hyperspace, the two were technically dead (matter turns to energy at light speed). Why was The Brain able to build the ship if it caused human death? It turns out that Calvin had adjusted The Brain’s controls to play down the significance of death for the robot. Since death on the ship was temporary, The Brain, unlike Consolidated’s “Super-Thinker,” was able to ignore the harm aspect (First Law) of the order and build the ship (Second Law).

“Evidence” 2032. In the frame story, Calvin discusses how Earth’s political structure changed from individual nations to large “regions.” She recalls a man, Stephen Byerley, who ran for mayor. The story begins with Francis Quinn, a politician, trying to convince Lanning, Director of U.S. Robots, to keep Byerley from political office because Byerley is a robot. Byerley denies this, but lets Quinn base his campaign on testing whether or not he is a robot. Byerley returns home and tells John, an old, crippled man who lives with him and whom he calls “teacher,” about Quinn’s strategy. Once informed that Byerley might be a robot, Fundamentalists begin huge protests outside Byerley’s home. He goes outside to talk to them, and a man challenges Byerley to hit him. Byerley obliges, and Calvin pronounces him a human, because the First Law (do not harm a human) would have stopped him if he were a robot. Later, Calvin reveals to Byerley that she suspects he really is a robot. She recalls that a biophysicist named Byerley was horribly crippled in an accident. Calvin theorizes that the real Byerley is actually the old cripple, “John,” and that he built a new body around a positronic brain he’d acquired. Byerley doesn’t confess but does admit that he spread the rumor that if he were really a robot, he couldn’t hit a human being. Calvin suggests to him that the human Byerley hit wasn’t really a human but another robot, which let him avoid any conflict with the First Law. She admits later that she doesn’t know whether or not he really was human.

“The Evitable Conflict” 2052: Earth. Earlier parts of the frame story hint that robots have evolved into dominating influences in human life as “Machines.” These Machines are vast systems of calculating circuits that manage the human economy to minimize the harm that humans cause themselves. Earth has “no unemployment, no overproduction or shortages. Waste and famine are words in history books. And so the question of ownership of the means of production becomes obsolescent.” Byerley, now “World Co-ordinator,” worries because production is not precise, which means that the Machines may be malfunctioning. Byerley interviews the Vice-Coordinators for each of Earth’s four regions (Eastern, Tropic, European, and Northern), but each one tries to downplay the production problems. Byerley guesses that behind it all is the “Society for Humanity,” a small group of powerful men “who feel themselves strong enough to decide for themselves what is best for them, and not just to be told [by robots] what is best for others.” But Calvin assures Byerley that the Machines allow the Society’s plots just enough room to create the turmoil that will destroy the Society itself. Given that even the cleverest attempts to overthrow the Machines only provide more data for the Machines to consider, large-scale disruptions (wars, economic turmoil, etc.) will be avoidable–evitable. “Only the Machines, from now on, are inevitable!”


Isaac Asimov (January 2, 1920 — April 6, 1992)

Isaac Asimov was born in Russia but his family moved to New York when he was three years old. A self-proclaimed “child prodigy,” he could read before first grade and had an almost perfect memory. Asimov credits his early intellectual development to public libraries: “My real education, the superstructure, the details, the true architecture, I got out of the public library. For an impoverished child . . . the library was the open door to wonder and achievement.”

Asimov became fascinated with pulp science fiction magazines, and by age eleven he had begun to write, imitating them in subject matter and style. By eighteen, he had sold his first story, and by twenty-one he had published “Robbie”–the first tale in the series gathered here–after several rejections. Over the next few years, he continued to test the Three Laws of Robotics in a series of robot stories.

In 1941, inspired by Gibbon’s Decline and Fall of the Roman Empire, Asimov imagined writing about the rise and fall of future civilizations as if he were a historian looking back. He called such writing future-historical. The first story, “Foundation,” grew into a series that rivaled his robot stories for fame and influence. By 1949, the two series (and other writings) had cemented Asimov’s reputation as one of “The Big Three,” along with groundbreaking science fiction authors Arthur C. Clarke and Robert Heinlein.

Even with early success, Asimov could not afford to be a full-time writer until 1958. Meanwhile, he continued to write while earning his B.S. (1939), M.A. (1941), and Ph.D. (1948) in chemistry from Columbia University. He taught at Boston University from 1949 to 1958 and remained a faculty member throughout his life.

The “science” part of Asimov’s science fiction shows in his early commitment to making the science in his stories realistic (or at least plausible). In addition, throughout the 1960s and 1970s he wrote primarily nonfiction science works covering a dazzling range of subjects, including astronomy, earth science, physics, and biology.

Calling Asimov a prolific writer would be an understatement. Through the years he tried his hand at literary criticism (from Shakespeare to Gilbert and Sullivan), humor (mainly limericks), children’s literature, autobiography, and editing. In addition, he managed to find time to write histories of Europe, North America, Greece, Egypt, England, and Earth. He wrote more than 1,600 essays and published at least 450 books. Famously, Asimov had at least one book published in each of the ten major Dewey Decimal library classifications. As he said in an interview, “I wrote everything I could think of.”

Asimov won every major science fiction award. He won seven Hugo Awards, the first in 1963 and the last, awarded posthumously, in 1995. He was also honored with two Nebula Awards. In 1986, the Science Fiction Writers of America named him a Grand Master, and eleven years later he was inducted into the Science Fiction and Fantasy Hall of Fame.

Asimov died in 1992, but his work lives on through new generations of readers, writers, and scientists. Rather than being outdated, his writing has proved prophetic. Reading his stories about robots in 1950, we would have thought that his reach exceeded his grasp. As advances in robotics and brain imaging bring the idea of a human-like robot closer, we recognize that we just might one day see Robbie tending to our own children.


1) Why are there no female robots in the stories?

2) The two groups consistently opposed to robots are labor unions and religious fundamentalists. Why did Asimov single out these groups to be threatened by robots? What do the groups share that makes them hostile towards robots? What other groups might not welcome robots into our world? Why? What groups would be happiest to see robots develop? Why?

3) What do Mike Donovan and Gregory Powell look like? Without letting them look at their books, have your students describe the two in as much detail as possible. What color are their eyes? How tall is each one? What race are they? Then ask your students to support their descriptions with citations from the book. Asimov gives only one physical detail about the two (in “Catch That Rabbit”): Donovan has red hair.

4) In teams or individually, have your students check Asimov’s science. For example, when he claims that sound can’t travel on an airless asteroid (“Catch That Rabbit”), is he right? Why or why not? You can adjust this topic to your students’ level. For more exotic questions, they may try to figure out how fooling around with hyperspace might blow a hole in “the normal space-time fabric” (“Little Lost Robot”).

5) Why does Robbie always want Gloria to tell him the story of Cinderella? If Robbie is a modern Cinderella, who are the wicked stepsisters, the fairy godmother, and the prince?

6) Early in “Liar!” RB-34 says that fiction helps him understand people better than science does. For him, “Science is just a mass of data plastered together by make-shift theory.” In what ways do your students agree with RB-34? How can a novel tell you more about humans than an anthropological textbook, or physics? What facts do novels leave out that science books include?

7) Asimov tended to be a pessimist about humanity. Late in life, he allowed the possibility that humans might improve in the future. “But still,” he said, “people tend to do things that harm Humanity.” Do your students agree or disagree with Asimov’s pessimism? Why or why not? What, for Asimov, makes robots “a better breed” than us? What, for your students, makes us a better breed than robots?

8) The stories often use scientific-sounding terms without explanation (e.g., “hyper-imaginary,” “Mitchell Translation Equation,” “positronic brain,” “hyperatomic travel,” “Planar reactions,” “etheric physics,” etc.). Have your students pick a few of them and invent stories–consistent with their context in I, ROBOT–that explain what these terms are and how they work.

9) Why do robots call humans “master” while humans generically refer to the robots as “boy”?

10) In “Evidence,” Asimov writes that “the three Laws of Robotics are the essential guiding principles of a good many of the world’s ethical systems.” First, lead your class in a discussion of the concept of an ethical system. What is it? Can your students identify some? Then encourage your students to find out how many of the three Laws are represented in the systems they have identified, and in what order. Does the U.S. legal system, for example, require obedience over self-preservation? If so, why? If not, why not?

11) Through the 1940s, Asimov published the stories in I, ROBOT individually. In 1950, he collected them and published them together as a book. What clues can your students find in the book that show that the stories have been joined together? Where are the seams? What techniques did Asimov use to make the stories seem like one whole book? Where does this reweaving work well? Poorly? Why does it work in some places but not others?

12) This book begins with a story about a robot that is dominated by a little child (“Robbie”) and ends with a story in which robots control every facet of human life (“The Evitable Conflict”). How likely do your students find a situation in which humans would give up control of their world to machines? Would we give up the ability to own things? To determine our own movements? To what degree do they think we already have? What signs are there that our lives already have become controlled by machines? That we control our machines?

13) Asimov admits in his Memoirs that, in his early writing, he was most comfortable with European-American characters. What signs of discomfort can your students detect when he writes non-European characters like Ching Hso-lin or Lincoln Ngoma (“The Evitable Conflict”)? Put another way, would Asimov have written any differently if Hso-lin (or others) had been Powell or Donovan? For example, would he have noted that Powell spoke in “precise English” as with Hso-lin, or that Donovan’s English was “colloquial and mouth-filling,” as with Ngoma?

14) Although most readers focus on the Three Laws of Robotics as the animating principle for the robot stories, there is another factor at work: emotional attachment. Asimov said, “Back in 1939, I realized robots were lovable.” What is lovable about the various robots in the stories? Which one was the most lovable? Why? Which was least lovable? Why? How does Asimov manage to make a hunk of metal lovable (or unlovable)?

15) How would the collection have changed if it were titled Mind and Iron (as Asimov originally wanted to call it)? What does the title I, ROBOT communicate that the title Mind and Iron doesn’t? Similarly, how would the first story change if it were titled “Strange Playfellow” instead of “Robbie”? What does “Strange Playfellow” set up that “Robbie” doesn’t? Come up with other titles that Asimov might have considered for the individual stories and the whole collection.

16) I, ROBOT has been turned into a major motion picture starring Will Smith. How does the movie compare with your book-reading experience? What do you think of the adjustments made and liberties taken when converting this collection of stories to one seamless film adaptation?


1) Have your students invent their own philosophical puzzle involving the Three Laws of Robotics using Asimov’s human characters, but new robots. They might, for example, imagine a story in which Powell and Donovan meet the “Star Wars” character R2-D2, who is pulled in three directions by an order to destroy himself, the knowledge that destroying himself will kill a human, and the knowledge that not destroying himself will kill another human.

2) Asimov was deathly afraid of flying, but many of his stories involve flight across the earth, to other planets, and to distant galaxies. Have your students choose something they fear, and encourage them to write a science fiction story that involves that fear indirectly. For example, if someone is afraid of heights, s/he might write about a society that lives in the treetops. If someone is afraid of spiders, s/he might write about a society based on the pattern of a spider’s web. After they write the story, have them consider how fear factored into their composition. Did they tend to write less about what they were afraid of? More? Did they write about their fear less directly? Return to Asimov’s stories and see if you can identify the marks of fear when he writes about flight. (He had other fears as well, which students might look for in a biography or in his memoirs.)

3) Let your students pick a story by another author that involves robots and compare it to Asimov’s. What similar concerns do they have? How human are the robots? What contrasts do they find between themes that interest Asimov and the other author?

4) What role do machines play in our lives today? Have your students keep a journal that lists every machine that helps them live their lives. A list might start, for example, with the alarm clock that wakes them up, the refrigerator that keeps the milk cold, the water heater that keeps the water hot, the computer that transmits email and stores their homework, the vehicle that drives them to school, the phones that deliver messages and pictures, and so on. What would life be like without these machines? In discussion, or writing, have them imagine a world where, one by one, all these machines vanish. How would we eat, communicate, travel, etc.? Turning to the past, have your students research the history of a machine that has become indispensable to us today. What did people do before a particular machine was invented (e.g., clocks)? What changes happened when the machine was invented? Perfected? Turning toward the future, ask your students to think of machines that have yet to be invented. What things will become necessary to future generations that we do not have?


A Note on Inanimacy

We tend to understand things as if they were people. In literature, this is called personification. James McIntyre, for example, paid homage to a huge block of cheese in a poem that begins, “We have seen thee, queen of cheese, / Lying quietly at your ease, / Gently fanned by evening breeze, / Thy fair form no flies dare seize.”

Although a living cheese seems silly, we frequently animate the world around us. We curse rocks on which we stub our toes. We explain our hopes to stuffed animals. And we spend countless hours trying to outwit video games that are nothing more than shifting patterns of light, like sunshine dappled through waving leaves.

The stories that make up our cultural heritage often return to the theme of transforming “stuff” into “life.” The Jewish legend of the golem describes a being made out of clay and created to serve that becomes sad that it isn’t like other children. From Italy comes the story of Pinocchio, a wooden puppet who longs to be a real boy. The ancient Greeks offered us Galatea, carved from ivory by the sculptor Pygmalion, who began as a statue and ended as a living woman. And who can forget that horrible little doll Chucky from the movies! Similarly, stories from Africa, China, India, Australia, and other parts of the world populate our imaginations with talking dogs, unreasonable dragons, and spirits that appear as one thing only to slip into a more comfortable avatar later.

Your students themselves may be heirs to some stories in this tradition. Invite them to tell similar stories that they know and/or ask them to query their families about stories like this. Alongside this, they can explore the library (Asimov’s intellectual foundation) for books on mythology, folk tales, and fairy tales. What do the animation stories share? Where do they differ? How might these similarities and differences result from the particular time/culture where the stories originated?

Given the general human habit of treating things like people, Asimov’s idea that robots would look like humans is reasonable. In the real world, however, that seems not to be the case. As we look at the machine servants around us, few of them look remotely human. (Cell phones, for example, do not look like ears and mouths.) Even Honda’s humanoid robot, named–appropriately enough–ASIMO, looks more like something a child built out of LEGO blocks than the product of some of the best minds in robotics.

As machines become more powerful and necessary in our lives, we seem to want to make them invisible–to forget them. Even Apple’s strikingly colorful counter-designs are balanced by organic, rounded forms: there are more curves in nature than straight lines. The push in design is always towards the smaller, the more discreet. It’s as if by making the technology that dominates our lives less apparent, we can pretend that we are still in control–not such a different arrangement from what Asimov imagined.

Explore with your students other design possibilities for technological devices. What things could be bigger? Smaller? Different colors and shapes? Why do particular machines look the way they do? How could they look differently? What would happen, for example, if a calculator were round and bounced? Encourage your students to have fun. They may uncover something groundbreaking. As Asimov noted, “The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka’ (I’ve found it), but ‘That’s funny…’”


Darryl Stephens is a Ph.D. candidate in English Literature at U.C. Berkeley and received his undergraduate degree from the same institution in English and Linguistics. He is currently studying how the brain realizes literature.

  • I, Robot by Isaac Asimov
  • April 29, 2008
  • Fiction - Science Fiction
  • Spectra
  • $15.00
  • 9780553382563
