Law and Legal Affairs: There's nothing artificial about reassessing the status of robots

As technology evolves, so must the laws that govern it

ISAAC Asimov might not be pleased, but Lilian Edwards has rewritten his most famous laws. Gone is the commandment that "a robot may not injure a human being" - replaced with a pragmatic 21st-century guide for their builders: "Robots should not be designed solely or primarily to kill, except in the interests of national security".

Edwards, appointed in January as professor of e-commerce law at Strathclyde University law school, helped formulate the laws for roboticists at a conference last year, an example of how Scotland is at the forefront of complex legal debates about technology.

"There is a worry that people's reaction to technology they don't understand is often hostile," explains Edwards. "So we were trying to think about the pervasive use of robots in society, before it happens.

"They are already being used, in warfare, as caring robots or in the sex-bot market, and some are incredibly humanlike. We are producing robots that learn, but that are not necessarily intelligent.

"So we cannot say a robot is ethically responsible for its actions. But should people deliberately build robots that kill? We decided that as a pragmatic reason that was going to happen, because it reduces human casualties.

"There are questions such as who pays if a robot causes damage? We used to have a debate about the liability for animals. Now we need to have that debate about robots.

"This could be a new area for insurance. We already talk about cyberinsurance and this could be another emerging market."

Robotics is just one of many constantly changing areas of technology with legal implications.

Edwards has taught IT, e-commerce and internet law at undergraduate level since 1996 and has been involved with law and artificial intelligence (AI) issues for 25 years. Many of Scotland's advances in technology are mirrored in its engagement with the legal issues they raise, she says.

"There is a history of interest in cutting-edge technology in Scotland - we even had the top engineer on Star Trek," she says. "We have a history of looking for the next thing. Solicitors in practices have been ahead of the game for technology law.

"I think the three laws will have to be rewritten again in the future.

"Regulation tends to be behind technology. There's a bit of futurology. Working in the field since 1996, I've seen how speculative things become part of the world.

"I was doing law and was a science-fiction fan and had a lot of friends from that, so I naturally became interested in the crossover of law and technology. When the internet came along, it transformed society in a way AI has not."

Strathclyde University was the first in the UK to have a course in IT law and is now revamping its programme to include topics such as social networking, cloud computing, open source, the computer games industry and even cyberwarfare.

Edwards will be primarily in a research post for the first three years but will then be involved in the LLM in Information Technology and Telecommunications Law, directed by Dr Konstantinos Komaitis. Widely consulted by government, business and the legal profession, Edwards says that while Facebook, email and Twitter are growth areas for libel and defamation, cyberbullying still happens mostly within schools.

The damage caused to children will most likely lead to regulation, rather than litigation, she believes.

"This idea that the internet is the ultimate free space really doesn't exist any more since becoming a place used for making money and by parents and children," Edwards adds. "It's prone to moral panics. There is a tendency, since the 1960s, to be more relaxed about print pornography, but when people get access to it on the internet, it becomes a moral-panic issue. It's the same with cyberterrorism and cyberwarfare."

It is easy to rush legislation through in reaction to news reports of concerns over file sharing, privacy or social networking. Edwards believes academics can stay at the cutting edge of the debate and provide context on whether new laws are needed.

"If you were examining the growth in insurance fraud, then would you say we need new laws? Is it about psychology or education instead of legislation? You need that kind of debate and then set it within the context of human rights.

"I think there's a strong feeling that the Digital Economy Act was done too quickly without consulting everyone and not just the music industry.

"It is being reconsidered by Ofcom, by a judicial review, and as part of the Hargreaves review of IP law.

"You could do more things that would not require new laws, such as turning on firewalls and getting people to not click on sly messages on Facebook.

"File sharing created a real problem because the content industries were not prepared. They're still behind the times in their reaction.

"The growth in legal online music sources could have happened five to ten years ago, but the industry didn't want to do that. It's not always up to law changing, but rather looking at how the whole of society is changing."

Edwards believes academics can act as "honest brokers", translating "from law to geek", but the work cannot be limited to those two sides alone; the interdisciplinary nature of technology and law is key to the work at Strathclyde.

"I was hired by Strathclyde to work on this problem of e-governance and of regulating the information society on an interdisciplinary basis, and I'm really happy with that," adds Edwards."You cannot work with new technology without talking to the people who are developing the technology.

"My aim is to foster that interdisciplinary relationship. It's about bringing together the worlds of law and technology and society."

It's a goal with which even Asimov, and his robots, might agree.

The Three Laws of Robotics (by Isaac Asimov)

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

The Three Laws for Roboticists (by Lilian Edwards)

1. Robots are multiuse tools. Robots should not be designed solely or primarily to kill, except in the interests of national security.

2. Humans are responsible for the actions of robots. Robots should be designed and operated as far as is practicable to comply with existing laws and fundamental rights and freedoms, including privacy.

3. Robots are products. As such they should be designed using processes which assure their safety and security (which does not exclude their having a reasonable capacity to safeguard their integrity).
