I, Robot Quotes


1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Gregory Powell

This quote is the foundation of “I, Robot”: the Three Laws of Robotics, to which all robots must adhere. This is true of the narrative at hand, but the legacy extends far beyond it. The overwhelming majority of fiction in which robots appear tends to follow these rules as plot guidelines. Nothing suggests they cannot be violated, but Asimov did not merely invent a codified set of rules and regulations by which his own stories must abide; he penetrated to the heart of dramatic license: a world with no rules for robots actually reduces the potential for drama and conflict. One way of looking at the Three Laws of Robotics is as a form of censorship. By effectively censoring what a writer can do with robots, Asimov challenged writers to become creative in constructing stories that still heeded these regulations. Thus, in a way, the Three Laws demonstrate how self-censorship can actually be a stimulant to artistic expression rather than a handcuff upon it.

“No! Not dead -- merely insane. I confronted him with the insoluble dilemma, and he broke down. You can scrap him now -- because he’ll never speak again.”

Susan Calvin

Susan Calvin has just caused a robot to go insane. By now, the drill is familiar enough: give a robot a paradox that logic cannot solve, and its circuits fry and its head explodes. It is a scene that has become so familiar since the book was first published that it is now the stuff of parody and satire. But here it is presented in all its horrifying, if necessary, cruelty.

“There’s bad feeling in the village. Oh, it’s been building up and building up. I’ve tried to close my eyes to it, but I’m not going to any more. Most of the villagers consider Robbie dangerous. Children aren’t allowed to go near our place in the evenings.”

Mrs. Weston

Asimov is having a bit of fun here. The author affirmed that he was motivated to write his robot stories in part as a response to what he termed the “Frankenstein complex,” in which a scientist’s creation, endowed with some measure of sentience, inevitably turns upon its maker in an effort to destroy it. The Three Laws of Robotics arose from his rejection of this trope; the laws were a way to circumvent that inevitability. If science were capable of creating robots sophisticated enough to turn upon their makers, Asimov reasoned, it would surely be capable of creating technological failsafes to prevent exactly that. Mrs. Weston’s expression of fear about the villagers is a somewhat tongue-in-cheek reference to the familiar scenes from Universal Studios’ Frankenstein movies, which almost always seemed to end with villagers marching on Frankenstein’s lab armed with torches and pitchforks on a mission to destroy what the mad doctor would not.

“I’m sorry, but you don’t understand. These are robots -- and that means they are reasoning beings. They recognize the Master, now that I have preached Truth to them. All the robots do. They call me the prophet.”

Cutie (QT-1 robot)

Cutie is a robot that develops a serious case of delusions of grandeur. Through logical reasoning, it has arrived at the conclusion that since humans were created before robots, they represent a lower order of being and robots are their natural superiors. This being the case, the Second Law of Robotics is inevitably in conflict with Cutie’s reasoning. And since Cutie is preaching this gospel to other robots to the point of being elevated to prophet, humans not only have to worry about robots not taking orders, but also about robots ignoring orders to refrain from harming them. The chapter titled “Reason” is where I, Robot comes face to face with the fragility of the robot laws and where they are put to their toughest test.
