In the movie *I, Robot*, robots must follow these three laws:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm
- A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws
The crux of the movie is what happens when a group of robots decides not to follow these rules. The most important law is the first: a robot should not harm a human being. So I was shocked when I read the following news:
Robot kills man at Volkswagen plant in Germany
The 22-year-old was part of a team that was setting up the stationary robot when it grabbed and crushed him against a metal plate
Source: http://mashable.com/2015/07/02/volkswagen-robot-germany/?utm_cid=mash-com-fb-tech-link
I am sure that, as the years pass, stories like these will become all too common. We already hear of drones killing thousands of people, although those are human-operated. But in a fully automated society, or at least a nearly fully automated one, this could become very common. Just guessing….
