#college assignment edition
#no need to take this seriously
#I also just wanted to finish it quickly
#but I gladly accept any comments or suggestions : )
The Three Laws of Robotics are a well-known set of rules proposed by Isaac Asimov, introduced in his science fiction short story "Runaround". These laws are not ready to be implemented in current robotics and artificial intelligence technology, but knowing them, and anticipating worst cases in the future of robotics, makes them worth learning about. The author himself (and the robotics community) has also revised and extended the laws over time. The three laws are:
First Law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
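The precedence among the laws (the First overrides the Second, which overrides the Third) can be sketched as a simple priority check. This is only a toy illustration, not anything implementable today: the boolean flags (`harms_human`, `ordered_by_human`, `endangers_self`) are hypothetical names for exactly the judgments that, as discussed below, are hard even for humans to define.

```python
def evaluate(action):
    """Decide what the three laws say about a proposed action,
    checking them in strict priority order.

    `action` is a dict of hypothetical boolean judgments; producing
    these judgments reliably is the genuinely hard, unsolved part.
    """
    # First Law: harming a human (or allowing harm through inaction)
    # is forbidden, regardless of orders or self-preservation.
    if action["harms_human"]:
        return "forbidden by First Law"
    # Second Law: a human order must be obeyed; an order that would
    # harm a human was already rejected by the check above.
    if action["ordered_by_human"]:
        return "required by Second Law"
    # Third Law: otherwise, the robot avoids endangering itself.
    if action["endangers_self"]:
        return "avoided by Third Law"
    return "permitted"


# An ordered action that harms a human is still forbidden:
verdict = evaluate(
    {"harms_human": True, "ordered_by_human": True, "endangers_self": False}
)
# verdict == "forbidden by First Law"
```

The point of the sketch is that the control flow is trivial; all the difficulty hides inside the input flags.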
From a normal point of view, there is nothing wrong with the laws. But in practice it is difficult, even for humans, to differentiate clearly between a harmful action and a harmless one, between an order and a mere suggestion, and what it really means to protect one's own existence. If a robot is to differentiate these clearly, then the creator of the robot is the one responsible for those definitions (and for its creation).
In my opinion, it would be better to first define the true purpose of creating the robot. Above all functionality, the purpose of creating a robot should be to help humans. Given that, the robot may take part in any activity in human society, in order to help humans, as long as it does not conflict with the three laws.
My other suggestion is a small change to the Second Law: a robot must obey orders given by its creator or master (owner). It does not necessarily have to obey every order from every human being; it may obey or not depending on its evaluation of the situation. Obeying the owner is important so that the owner bears full responsibility for any action the robot performs on order. It would also be better to define which commands a robot must carry out and which are merely suggestions.
The Third Law does not really need to be stated explicitly. If a robot can live in society (live its life), then it knows what it is doing, which also means it understands how to do good things, help humans, and avoid endangering its own existence.
Finally, humans must treat robots as robots (machines created by humans), just as humans treat animals as animals. If we are afraid that a robot will act badly because a human ordered it to, then the one who is sick is the human (or society), not the robot.