
Morality and Ethics Quotes - I, Robot by Isaac Asimov: Summary, Themes & Characters

How we cite our quotes: (Story.Paragraph)

"They're a cleaner better breed than we are." (Introduction.32)

This is Calvin's final opinion on robots, and we'll hear this a lot in the last two stories, "Evidence" and "The Evitable Conflict." Notice that we get this idea up front, in the very introduction. It's like Asimov doesn't want us to miss it: robots are fundamentally good.

"You know that it is impossible for a robot to harm a human being; that long before enough can go wrong to alter that First Law, a robot would be completely inoperable. It's a mathematical impossibility." (Robbie.78)

Here we get some explanation of why robots are better than people: they're designed to be. Or rather, a robot couldn't function at all if it weren't designed to be good.

"So Rule 3 has been strengthened—that was specifically mentioned, by the way, in the advance notices on the SPD models—so that his allergy to danger is unusually high." (Runaround.148)

Speedy is as moral as the other robots we meet, but his drive for self-preservation (Third Law) nearly gets Powell and Donovan killed. It's not that protecting himself is a bad thing to do; it's just that there's a conflict between his needs and his orders. Not to mention that the whole conflict comes about because Speedy doesn't understand the situation: if Powell and Donovan had told him they needed the selenium to live, he would have gotten it very quickly. No matter how moral he is, Speedy doesn't know everything.

Cutie said nothing, nor did any other robot, but Donovan became aware of a sudden heightening of tension. The cold, staring eyes deepened their crimson, and Cutie seemed stiffer than ever. "Sacrilege," he whispered—voice metallic with emotion. (Reason.96-7)

In "Reason," the robots still have good morals—they don't want people getting hurt, which is why they take over the space station. But there's some strangeness in the way they do it, inventing a religion based on the Power Converter and coming up with some new rituals and taboos. Here Donovan has performed something sacrilegious. There's not a lot of religion in this book, but it's worth asking: what's the relationship between religion and morality in this book?

And, finally, worked his precise mechanical mind over the highest function of the robot world—the solutions of problems in judgment and ethics. (Catch that Rabbit.38)

We love the way Asimov works up to this. First, Powell and Donovan test Dave's math (which is normal, what we'd expect to do with a calculator), then they test his physical reactions (like testing someone's reflexes, which is a little weird with robots, but understandable since robots have bodies), and finally, they test… his moral reasoning. Testing a robot's ethics would seem strange in most other science fiction, but Asimov builds up to it here in a way that makes it feel reasonable.

"You can't tell them," droned the psychologist slowly, "because that would hurt and you mustn't hurt. But if you don't tell them, you hurt, so you must tell them. And if you do, you will hurt and you mustn't, so you can't tell them; but if you don't, you hurt, so you must; but if you do, you hurt, so you mustn't; but if you don't, you hurt, so you must; but if you do, you—" (Liar.258)

Ah, Herbie. Other robots get caught up between two Rules, but Herbie is caught by just one, so there's no way out for him—he's damned if he tells and he's damned if he doesn't tell. Luckily, Calvin is here to help talk him through this problem. Wait, did we say talk him through? No, she's driving him insane on purpose, out of revenge. So Herbie is caught because he doesn't want to hurt anyone and Calvin is purposely hurting him. Just another reminder that robots are better than people.

"Positronic brains were constructed that contained the positive aspect only of the Law, which in them reads: 'No robot may harm a human being.' That is all. They have no compulsion to prevent one coming to harm through an extraneous agency such as gamma rays." (Little Lost Robot.50)

The Nestors used at Hyper Base have a slightly altered sense of morality: they can't hurt people, but they can let people get hurt. The government thought it needed these robots for its research, but Calvin knows that pulling one element out of the Three Laws makes the whole system unstable, like a game of Jenga. So the Three Laws aren't just individual moral ideas; they're a system that works together.

"Now that they've managed to foul theirs up, we have a clear field. That's the nub, the... uh... motivation. It will take them six years at least to build another and they're sunk, unless they can break ours, too, with the same problem." (Escape.16)

Consolidated broke their own super-computer, and now they want the same problem to break US Robots' machine too. Why? Simply because they're competitors. Perhaps this reminds us that humans aren't as moral as robots, especially when business is involved.

"Because, if you stop to think of it, the three Rules of Robotics are the essential guiding principles of a good many of the world's ethical systems. … To put it simply—if Byerley follows all the Rules of Robotics, he may be a robot, and may simply be a very good man." (Evidence.138)

This is one of Calvin's main ideas in the book: robots are more moral than humans because they have to follow the Three Laws. And robots aren't moral simply because they follow rules; they're moral because the rules they follow are good ethical principles, the same ones people should follow.

"Think about the Machines for a while, Stephen. They are robots, and they follow the First Law. But the Machines work not for any single human being, but for all humanity, so that the First Law becomes: 'No Machine may harm humanity; or, through inaction, allow humanity to come to harm.'" (Evitable Conflict.212)

In later robot books, Asimov's robots come up with something called the Zeroth Law, which is basically this idea: robots have to prevent harm not just to individuals but to humanity as a whole. That raises a problem in "The Evitable Conflict," since in order to help humanity, the Machines have to slightly hurt some individuals, namely the members of the Society for Humanity. So what do you think: is it more moral to help many by hurting a few?
