Isaac Asimov
A major premise of the book is that robots must limit their behaviors to prevent harm to humans. To that end, the author posits the Three Laws of Robotics, under which all robotic devices must operate. These Laws create a form of ethics for machines, a system derived from human mores that nonetheless holds itself to a higher standard than the older ethical systems it draws on.
All robots are designed, and their positronic minds are imbued, with unyielding respect for the Three Laws of Robotics. These Laws are meant to channel robot behavior along pathways that are safe for humans, never wasteful, and always useful.
The First Law states that “a robot may not injure a human being, or, through inaction, allow a human being to come to harm” (37). Robots are very strong and extremely intelligent, and their activity might easily cause damage; thus, the most important rule they must follow is to act in such a way that no one is injured or killed.
The Second Law says that “a robot must obey the orders given it by human beings except where such orders would conflict with the First Law” (37). This is the only positive law among the three: It assigns robots their duties and removes the option of disobedience or of doing anything other than what humans ask of them. The sole exception is an order that would cause harm, which a robot must refuse; this prevents robots from becoming weapons.
The Third Law commands that “a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws” (37). Robots tend to be expensive, and preserving their usefulness, though the least important rule, is still necessary, lest robots constantly fulfill orders by damaging or sacrificing themselves. Thus, if saving an individual is impossible, and trying to save that person results in the destruction of the robot, the robot may stand down so that it may continue to protect others.
Dr. Calvin remarks that the Three Laws encapsulate what is best in people: “the three Rules of Robotics are the essential guiding principles of a good many of the world’s ethical systems” (182). Protecting others, performing one’s duties, and protecting oneself are virtues worthy of anyone. A robot’s behavior is nearly always exemplary, a far cry from how most people behave. Only the kindest, most considerate people come close to the level of dedication to humans that the story’s robots live by every day.
It is not easy being mechanical. Robots absolutely must obey the Three Laws—their positronic brains are simply incapable of doing otherwise—but this sometimes leads to perplexing dilemmas for the machines. Their solutions to such binds often show ingenuity but sometimes cause even more trouble. These problems demonstrate that neither the Three Laws nor any ethical system wired into robots can be foolproof.
Robot Herbie has the unexpected ability to read minds. Because his instructions require that he not cause harm, which for a mind reader includes hurting people’s feelings, he tells people what they want to hear instead of the truth. This has a dangerous side effect: people need the truth to avoid critical mistakes at work and at home. Caught between not causing harm and his orders to provide accurate information, Herbie breaks down.
Sometimes, simply following orders can cause a robot to malfunction. Dave is a robot in charge of a squad of mining robots. He must conduct mining operations productively, but during situations involving sudden danger, such as a tunnel collapse, the calculations Dave must make to manage all six robots become enormously complex. Asteroid mining is inherently dangerous, and Dave resolves the conflict by blacking out and unconsciously ordering his robots to do safe activities like marching around. This glitch is solved by human technicians, who simply remove one of Dave’s six mining robots, which simplifies Dave’s calculations and permits him to carry on effectively.
Nestor is a robot designed without the First Law rule against neglecting people in danger. This makes him useful in situations where people deliberately put themselves into risky positions during scientific experiments. It also makes the First Law secondary to the Second Law, so that Nestor, when ordered to hide, does so with such determination that he is willing to kill to stay hidden. Humans learn from this that any change in the Three Laws can cause a disaster.
No set of robotic behavior laws can perfectly protect robots and people from trouble. Awkward or unsafe situations will arise. Sometimes the robots find solutions, as the robotic computing machine The Brain did when inventing the first hyperspace drive: He saw that the motor would kill the passengers, but he forged ahead after discovering that the dead humans would return to life unharmed, so the First Law was never truly violated.
The elegance of that solution merely points out that most such dilemmas do not end well, and each one forces designers to adjust the robots’ brains. It is a never-ending process.
Robots exist to serve humankind, and the makers and users of robots believe they are in command of the machines. It is possible, though, that the machines, as part of their mission to prevent harm—including not hurting people’s feelings—simply let their owners think they are in charge.
People who work with robots typically talk down to them, symbolically putting them in their place. They address a mechanical man as “boy,” an insult to anyone else but accepted by the always-courteous robots. Keeping this social caste system in place gives humans a sense that they are on top of the situation and in control of the robots.
The anti-robot people, the Fundamentalists, organize a group called the Society for Humanity that resists robotic participation in human affairs. They get laws passed that forbid robot citizenship, ownership of property, or residency on Earth and other occupied planets in the solar system. The Society confines robots to space stations and mining operations on remote moons and planets.
The robots perform their work ably, and the results—plentiful ores and abundant energy—greatly improve life on Earth and elsewhere. Thus, even in exile, robots prove useful to humans.
On Earth, a special type of robot called a Machine, essentially a stationary computer, is built for each of the four main regions of Earth, plus one for the World Co-ordinator and one at the main office of the US Robots company. The four regional Machines manage the world economy so perfectly that any perturbation in it is quickly repaired. Members of the Society, opposed to the Machines and to the replacement of human planners and managers with mechanical brains, try to sabotage the Machines by disregarding some of the instructions the Machines issue. This causes slowdowns and other dislocations in economic activity. The World Co-ordinator, Stephen Byerley—himself possibly a secret robot—wants to wage a campaign against the Society.
Dr. Calvin tells Byerley that the Machines are so intelligent that they already know about the sabotage and have quietly made small adjustments to account for it. The dislocations would be far worse otherwise; what is visible is the outcome of an attack met by a peaceful defense. The problem is already solved.
Dr. Calvin suggests that, by now, the Machines are the ones in charge. Working constantly on behalf of humans, “the Machines understand them; and no one can stop them.” She adds that “for all time, all conflicts are finally evitable. Only the Machines, from now on, are inevitable!” (224).
Thus, it does not matter how mechanical humanoids are addressed, and it does not matter that humans believe they are in charge: The robots are running things, they do so strictly for the benefit of humanity, and they are far too intelligent ever to be outmaneuvered. From here onward, people are just along for the ride.