Can Robots Be Held Liable?
Question: when a robot hurts someone, who answers for it? Surely not the robot itself. While it’s entertaining to imagine a bot getting cross-examined in court, the disincentives of the justice system probably wouldn’t motivate a bad bot to do good.
Jokes aside, the question of liability is an interesting one. When robots advance to the point where they’re operating independently – especially if their skills are not hardwired, but learned – it becomes harder to argue that their actions are directly attributable to their maker.
Compare an autonomous robot to, say, a conventional blender. The blender does exactly what it was built to do, nothing more. If, in the course of normal use, it malfunctions, one can pretty easily pin the blame on the manufacturer.
Artificial intelligence, though, upsets that equation. The whole premise of machine learning is that the robot can acquire abilities it didn’t originally have. Can its actions still be pinned on the manufacturer at that point? Once it’s left the nest, so to speak, a robot may be capable of things its manufacturer never had in mind.
On top of that, there probably isn’t just one “manufacturer” to consider. Artificial intelligence is complex, building on multiple technologies that may originate from many different places. If the technology stems from the open source community, for example, it could have millions of creators. Would they all be liable for the robot’s actions? Only some? And how would we decide which ones?
Kate Brown, Senior Claims Expert at Swiss Re Corporate Solutions and a contributor at Risk & Insurance, suggested a few models to frame society’s response to robot-wreaked havoc:
- Robot as employee
- Robot as pet
- Robot as individual
One thing is certain: we need a set of civil laws to govern the way we deal with autonomous robots and the harm they may cause.
Last year, the European Parliament’s Committee on Legal Affairs issued a motion calling for just that. “The EU report acknowledged that improvements in the autonomous and cognitive abilities of robots makes them more than simple tools, and makes the traditional rules on tort and contract liability inadequate,” Brown said.
For example, the EU proposed registering smart robots, imposing a code of ethics on robot manufacturers and researchers, and recognizing the robots themselves as “electronic persons” subject to their own category of laws and regulations.
The EU is not the only entity turning wheels in that direction. Japan’s Ministry of Economy, Trade and Industry is drafting robot-related business and safety guidelines, and the U.S. Chamber of Commerce has recently begun examining robotics alongside other emerging technologies.
“Someday, in the not too distant future, lawyers and experts in courtrooms around the world may discuss and debate what evidence is needed to demonstrate the proper programming of a robot,” Brown said.
Whether or not a bot will ever have to appear before a judge remains to be seen.