Teaching technology human behaviour


Years ago, designing a set of rules for artificial intelligence seemed unthinkable. Now we have Asimov's three basic laws of robotics: 1) A robot may not injure a human being or, through inaction, allow a human being to come to harm. 2) A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law. 3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

These rules essentially say that robots must protect humans no matter what. The problem arises when a robot has to decide which human will survive.

Teaching morality to artificial intelligence is one of the big issues of our time. We are very close to fully integrating self-driving cars into society, so we have to anticipate that accidents will occur. In most cases these accidents will be caused by humans, but the technology itself can also fail.

For a vehicle to function correctly without human intervention, it needs sensors to interpret its surroundings. But how do they work? What function does each of them fulfil? To find out more about how these sensors work, click here.
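
As a rough illustration, here is a minimal Python sketch of how a vehicle might combine several sensor readings into a single "brake or not" decision. The sensor names, distances and thresholds are made up for the example; a real car uses far more sophisticated perception and sensor fusion.

```python
# A minimal sketch (not a real vehicle API): combining readings from
# hypothetical lidar and camera sensors to decide whether to brake.
from dataclasses import dataclass

@dataclass
class SensorReading:
    source: str          # e.g. "lidar", "camera", "radar"
    distance_m: float    # distance to the nearest detected object, in metres
    confidence: float    # 0.0 - 1.0, how sure the sensor is about the detection

def should_brake(readings: list[SensorReading],
                 safe_distance_m: float = 10.0,
                 min_confidence: float = 0.6) -> bool:
    """Return True if any trusted sensor reports an object closer than the safe distance."""
    for r in readings:
        if r.confidence >= min_confidence and r.distance_m < safe_distance_m:
            return True
    return False

# Example: the camera is unsure, but the lidar clearly sees something 4 m ahead.
readings = [
    SensorReading("camera", distance_m=4.2, confidence=0.4),
    SensorReading("lidar", distance_m=4.0, confidence=0.9),
]
print(should_brake(readings))  # True
```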

We have developed the following scenarios in order to evaluate what kind of morality will have to be implemented in self-driving cars:


As you can see, it is not possible to develop a single morality that can be applied globally to every technology on the planet. Answers vary depending on the country, culture and lifestyle of the people who respond. Deciding how artificial intelligence should behave is one of the biggest challenges of the future.
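
To make the problem concrete, one of these dilemmas could be encoded as data, with the decision rule kept deliberately swappable, since, as the results suggest, different cultures would choose different rules. The scenario, policies and numbers below are purely illustrative assumptions, not part of any real system.

```python
# Illustrative only: a hypothetical dilemma where the car must choose between
# two outcomes, and the rule used to choose is swappable because no single
# "global" morality exists.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Outcome:
    description: str
    pedestrians_harmed: int
    passengers_harmed: int

def minimise_total_harm(a: Outcome, b: Outcome) -> Outcome:
    # One possible policy: choose the outcome with the fewest people harmed overall.
    total = lambda o: o.pedestrians_harmed + o.passengers_harmed
    return a if total(a) <= total(b) else b

def protect_passengers_first(a: Outcome, b: Outcome) -> Outcome:
    # A different policy: always prefer the outcome with fewer passengers harmed.
    return a if a.passengers_harmed <= b.passengers_harmed else b

def decide(a: Outcome, b: Outcome,
           policy: Callable[[Outcome, Outcome], Outcome]) -> Outcome:
    return policy(a, b)

swerve = Outcome("swerve into the barrier", pedestrians_harmed=0, passengers_harmed=1)
stay   = Outcome("stay in the lane",        pedestrians_harmed=2, passengers_harmed=0)

print(decide(swerve, stay, minimise_total_harm).description)       # swerve into the barrier
print(decide(swerve, stay, protect_passengers_first).description)  # stay in the lane
```

Swapping the policy changes the outcome, which is exactly the point: the hard part is not coding the decision, but agreeing on which rule to code.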

Comments

  1. Very interesting topic! While filling out the form, it was hard to decide who got to survive and who had to face death.

  2. The moral issue of artificial intelligence has always been a problem. This questionnaire really took me a lot of time to make up my final decision. Interesting research!

  3. Interesting topic! I think the issue of morality doesn't only depend on the country, culture or lifestyle of the people. Actually, nobody can say which definition of morality is definitively right or wrong, because life can't be judged from any single perspective, whether by people or by AI.
