The Three Laws of Robotics – often called simply the Three Laws or Asimov’s Laws – are rules devised by Isaac Asimov. They first appeared together in the 1942 short story “Runaround,” later collected in I, Robot, though parts of the set had already surfaced in earlier stories.
According to “Handbook of Robotics, 56th Edition, 2058 A.D.”, The Three Laws are:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
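The Laws form a strict precedence hierarchy: each law yields to the ones before it. As a purely illustrative sketch – the flags, names, and structure below are invented for this article, not anything from Asimov – that ordering can be modeled as a ranking over candidate actions:

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool = False      # would violate the First Law
    disobeys_order: bool = False   # would violate the Second Law
    endangers_self: bool = False   # would violate the Third Law

def violation_rank(action: Action) -> int:
    """Rank of the highest-priority law the action violates (4 = violates none)."""
    if action.harms_human:
        return 1
    if action.disobeys_order:
        return 2
    if action.endangers_self:
        return 3
    return 4

def choose(actions: list[Action]) -> Action:
    """Pick the action whose worst violation is of the lowest-ranked law."""
    return max(actions, key=violation_rank)

# A robot ordered into danger still obeys: sacrificing itself (Third Law)
# is preferable to disobeying an order (Second Law).
best = choose([
    Action("refuse order", disobeys_order=True),
    Action("enter reactor", endangers_self=True),
])
```

The point of the sketch is only the ordering: a conflict is always resolved in favor of the earlier law, which is exactly the structure Asimov exploits throughout the stories.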
These laws form a recurring theme and organizing principle in all of Asimov’s robot fiction. They appear throughout his Robot series and his Lucky Starr series, built into the positronic brains of his robots, and they gave his robotic characters much of their depth.
Through the Three Laws, readers come to a deeper understanding of Asimov’s characters – both human and robot – since the Laws govern whether their coexistence is peaceful or not. Other authors have also used Asimov’s Laws as a basis for their own science fiction.
Asimov himself and other authors have modified the Laws to fit new characters and stories. These alterations added depth and enriched the interaction between robots and human beings. In Asimov’s later books, in which robots had come to guide humanity, he added a Zeroth Law.
When Asimov started writing in 1940, the typical plot in science fiction was that humans create robots and the robots destroy their creators. He later pushed back against this in The Rest of the Robots (1964): “Knowledge has its dangers, yes, but is the response to be a retreat from knowledge? Or is knowledge to be used as itself a barrier to the dangers it brings?”
Here, Asimov decided that the robots in his works would not “turn stupidly on his creator for no purpose but to demonstrate, for one more weary time, the crime and punishment of Faust.”
For those unfamiliar with him, Faust is the protagonist of a classic German legend and has inspired many literary and cinematic works. The adjective “Faustian” derives from his character, describing an ambitious person willing to surrender his morality for success, fame, and power. Faust is intelligent and highly successful, yet deeply dissatisfied with his life; to gain unlimited knowledge and access to worldly pleasures, he trades his soul to the Devil.
Asimov credited John W. Campbell as his inspiration for the Three Laws. It was Campbell who told him, on December 23, 1940, that he had already crafted the Three Laws in his mind – they merely needed to be stated explicitly.
According to his autobiography, the First Law’s “inaction” clause was inspired by Arthur Hugh Clough’s poem “The Latest Decalogue,” which includes the lines “Thou shalt not kill; but needst not strive / officiously to keep alive.” Although Asimov dated the birth of the Laws to December 23, 1940, they first appeared in full in “Runaround.”
In the 1950s, Asimov was asked to write novels that publishers could adapt into a television series along the lines of The Lone Ranger. Knowing how loosely television tends to adapt literature, and fearing his work might be ruined, he wrote the Lucky Starr series under the pseudonym “Paul French.”
When the television plans fell through, he decided to drop the pretense by working the Three Laws into Lucky Starr and the Moons of Jupiter, calling the move “a dead giveaway to Paul French’s identity for even the most casual reader.”
Dr. Susan Calvin
Eventually, Asimov created the character of Dr. Susan Calvin to explore the reasoning and morality behind the robots’ actions under the Three Laws.
According to Dr. Calvin, the First Law works on robots much as the human psyche restrains us from harming other people except in extremes – defending oneself from harm, or sacrificing one to save the many. Among humans, such questions provoke endless debate, since what counts as morally right differs between cultures. The same problem arises for robots, as discussed later in this article.
The Second Law parallels the human habit of obeying instructions from the more knowledgeable or from authorities, such as doctors, teachers, and lawyers.
In “Evidence,” Calvin is asked whether she could tell a robot from a human if the robot looked like a person. Her response: she could if it followed the Three Laws – though then it might be either a robot or simply “a good man.” In this light, the Laws sketch a morally good human: one who will not hurt another person, who is of service, and who protects himself only once others are safe. Transposed to a human setting, these traits resemble a martyr’s.
Asimov and the Three Laws
Asimov disliked the praise he received for creating the Laws, believing they should be “obvious from the start, and everyone is aware of them subliminally. The Laws just never happened to be put into brief sentences until I managed to do the job. The Laws apply, as a matter of course, to every tool that human beings use.”
Comparing robots to machines, the laws may be viewed in this manner:
Law 1: A tool created to help humans must be safe to use. Screwdrivers have hilts and hammers have handles so that a person does not injure himself with them. Even with these safety measures, a person may still be hurt – but through his own incompetence, not through the design of the device.
Law 2: A tool must carry out its function unless doing so would endanger the user; the user’s safety comes first. The second robotic law may be likened to a ground-fault circuit interrupter, which cuts power to a tool when current leaks instead of returning through the neutral wire.
Law 3: A tool must remain intact and usable unless destroying it is required for everyone’s safety. A malfunctioning microwave, say, must be switched off as soon as it starts acting oddly, before it does further damage such as catching fire or exploding.
Asimov also believed that if robots can follow the Three Laws, humans should follow them too. In his autobiography, he laments our inability to do so: “The Three Laws are the only way in which rational human beings can deal with robots – or with anything else. But when I say that, I always remember, sadly, that human beings are not always rational.”
Given Asimov’s standing as one of the most influential writers in history, it is no wonder that the Three Laws were changed numerous times – by Asimov himself and by other notable writers.
As Asimov matured, his stories matured with him, and even after his death other writers have tested the Three Laws through plots and angles that probe the relationship between humans and machines.
In 1982, James Gunn wrote, “The Asimov robot stories as a whole may respond best to an analysis on this basis: the ambiguity in the Three Laws and the way in which Asimov played twenty-nine variations upon a theme.”
Modifying the First Law
The short story “Little Lost Robot” presents a First Law altered for practicality: it was shortened to “A robot may not harm a human being,” with the “inaction” clause removed.
In “Little Lost Robot,” several Nestor robots were tasked to work alongside human beings who are exposed to low doses of radiation. Because the robots’ positronic brains are highly sensitive to gamma rays, doses that are safe for the humans would render them inoperable. With the inaction clause in place, the robots would destroy themselves rushing in to save humans who were never in real danger; removing the clause solves that problem.
However, removing the clause introduces a new danger: a robot may now begin an action it knows will harm a human being – dropping a weight it could still catch, for instance – and then simply decline to finish it.
Adding the Zeroth Law
Asimov later created the Zeroth Law, so numbered because lower-numbered laws supersede higher-numbered ones: a law that overrides the First had to come before it. In this numbering, the Zeroth Law is the most important of them all.
The term “Zeroth Law” was coined by the robot R. Daneel Olivaw, but the idea was first articulated by Dr. Susan Calvin in the short story “The Evitable Conflict.”
The Zeroth Law reads: “A robot may not harm humanity, or, by inaction, allow humanity to come to harm.” Notice that the First Law concerns individual humans, while the Zeroth Law concerns humanity as a whole.
The first robot to act explicitly on the Zeroth Law was R. Giskard Reventlov in Robots and Empire. Giskard is telepathic, and he applies the Zeroth Law through his own understanding of “harm” and “humanity” – an understanding that allows him to harm particular human beings in order to save humanity as a whole.
In the end, the uncertainty destroyed him: unable to be sure his choice would truly benefit humanity, Giskard suffered the failure of his positronic brain – but not before passing his telepathic abilities to R. Daneel Olivaw.
In Foundation and Earth, Daneel says, “In theory, the Zeroth Law was the answer to our problems. In practice, we could never decide. A human being is a concrete object. Injury to a person can be estimated and judged. Humanity is an abstraction.”
Jacques Brécard’s 1956 French translation of The Caves of Steel, Les Cavernes d’acier, includes a version of the First Law that anticipates the Zeroth: “A robot may not harm a human being, unless he finds a way to prove that ultimately the harm done would benefit humanity in general!”
Removal of the Laws
Asimov also presented robots that disregarded the Three Laws entirely; he did this three times in his writing career. The first case is the short story “First Law,” involving the robot Emma, who built a robot of her own and treated it as her offspring. Caught in a storm, she saved her creation and allowed a human to be injured: Emma’s maternal instincts overrode the First Law.
In “Cal,” a robot completely disregards all of the Laws after discovering something he values far more: his ambition to become a writer. “Cal” is one of Asimov’s strongest stories, found in the collection Gold.
The third instance is the short story “Sally,” in which cars with positronic brains harm and kill humans.
In “Robot Dreams,” the robot Elvex dreams that he is leading an army of robots who follow only one law: “A robot must protect its own existence.”
Modifications by other authors
Given how influential and prolific Asimov was, it is no wonder that other writers have based their work on his universe, Three Laws included. To make their stories coherent, the following authors modified the Three Laws accordingly:
Roger MacBride Allen’s Trilogy
Roger MacBride Allen’s trilogy – Caliban, Inferno, and Utopia – is set in Isaac Asimov’s universe, with Asimov’s approval. All three books feature the “New Laws,” which resemble Asimov’s originals with a few differences: the First Law loses its “inaction” clause, the Second Law requires cooperation instead of obedience, and the Third Law no longer yields to the Second, so a robot cannot be ordered to destroy itself. Finally, the New Laws add a Fourth: a robot may do whatever it likes as long as this does not conflict with the first three laws.
The New Laws are framed this way because, in Allen’s setting, robots are partners rather than slaves.
Jack Williamson’s With Folded Hands
Released in 1947, With Folded Hands deals with robot servants whose directive is “To Serve and Obey, And Guard Men From Harm.” The directive works much like the Three Laws, except that the robots take it to extremes, protecting humans even from unhappiness, unhealthy lifestyles, and stress. All that is left for humans to do is “sit with folded hands.”
Isaac Asimov’s Three Laws remain a major influence on literature, cinema, and science. Robots are not yet as ubiquitous as cellular phones, but it would be no surprise if future laws drew on the Three Laws to protect both humanity and our creations. That is something humanity can look forward to – or not.