As machines become more autonomous, governments will have to introduce new legislation around safety and liability, says a safety-critical software developer.
Chris Hobbs of QNX Software Systems asked those at a Carleton University Technology Innovation Management lecture Thursday night to imagine an autonomous car driving down the street when a small object appears on the road. The car is 82 per cent certain that the object is a child. If the car were to veer to either side, it could cause a collision. What is the car to do?
“The [current autonomous cars] may have to make a random decision,” Mr. Hobbs said. “Nobody can blame you for a random decision.” People laughed, but were also a little unsettled. With the lives of millions of motorists potentially in the hands of these machines, random isn’t very comforting.
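The fallback Mr. Hobbs described can be sketched in a few lines. This is a hypothetical illustration, not QNX's actual logic: the function names, risk numbers, and options are invented for the example. The idea is that when several options tie for lowest estimated risk, the system has no principled basis to prefer one, so it picks at random.

```python
import random

# Hypothetical sketch of the dilemma Mr. Hobbs described: an autonomous
# car must choose among options, none of which is clearly safest.
def choose_action(p_child, risk_left, risk_right):
    # Estimated harm of each option (illustrative numbers, not a real model).
    options = {
        "brake_straight": p_child,   # risk the object really is a child
        "swerve_left": risk_left,    # risk of colliding to the left
        "swerve_right": risk_right,  # risk of colliding to the right
    }
    lowest = min(options.values())
    # Keep every option whose risk ties for the minimum...
    best = [action for action, risk in options.items() if risk == lowest]
    # ...and break the tie at random: "nobody can blame you".
    return random.choice(best)

# The 82-per-cent scenario: braking straight is the unique lowest risk.
print(choose_action(0.82, 0.9, 0.9))  # brake_straight
```

When two options tie exactly (say, swerving left and braking both carry 0.5 estimated risk), the function genuinely makes a random choice between them, which is the unsettling part of the talk.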
Mr. Hobbs is an expert in designing embedded software for systems that can involve high risk. Some of the software he’s worked on includes robots performing hip surgery, train brakes, autonomous cars, undersea drill heads and drug infusion devices.
In all these cases, a few bugs in the system could mean serious repercussions for a person’s health, and possibly even death, which is why Mr. Hobbs believes it is critical that governments consider these issues.
“I don’t think society has woken up to all the issues associated with liability,” he told OBJ after his presentation. “[Tech] has sort of crept up on society. You have something like 12 computers in your car. Many people don’t realize that.”
Liability can be particularly tricky, Mr. Hobbs explained. For instance, imagine an Uber autonomous vehicle that goes to pick up your child from a hockey game. If there is a collision and the autonomous vehicle is at fault, who is liable: Uber, the car manufacturer, the system designer, or the person who called the car?
“The government has to come out with laws and the infrastructures to answer [the questions around automation],” Mr. Hobbs said.
Two other major issues facing us today are security and hardware. Security, because nearly everything – from medical devices to trains – now has Wi-Fi and GPS connections, and these are easily hacked, Mr. Hobbs explained.
The average lifespan for a processor chip is about two years, “which is great for cellphone companies,” but not great for the consumer, he joked.
During the talk he explained the various ways different companies and cultures come to decide what exactly is ‘safe enough’.
For instance, under the ALARP principle (As Low As Reasonably Practicable) used by the British, Mr. Hobbs explained, a company assigns a monetary value to a human life and makes decisions based on that figure. Railway companies using ALARP weigh the cost of a safety enhancement against the value of the lives it would save; the enhancement is made only if it is cost-effective.
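The ALARP arithmetic reduces to a simple comparison. The sketch below uses invented figures purely for illustration; the function name and the dollar amounts are assumptions, not values from the talk.

```python
# Hypothetical ALARP-style cost-benefit check (illustrative numbers only).
# An enhancement is justified when its cost per statistical life saved
# falls at or below the value the company has fixed for a human life.
def alarp_justified(enhancement_cost, expected_lives_saved, value_of_life):
    """Return True if the safety enhancement is cost-effective under ALARP."""
    if expected_lives_saved <= 0:
        return False
    cost_per_life_saved = enhancement_cost / expected_lives_saved
    return cost_per_life_saved <= value_of_life

# Example: a $30M track-fencing upgrade, with a human life valued at $4M.
print(alarp_justified(30_000_000, 12, 4_000_000))  # True:  $2.5M per life saved
print(alarp_justified(30_000_000, 5, 4_000_000))   # False: $6.0M per life saved
```

The unsettling feature, as the talk suggested, is that once the value of a life is fixed, the same enhancement flips from required to optional depending on how many lives it is expected to save.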
“Within 25 years, fully autonomous cars could be commonplace in Ottawa,” Mr. Hobbs said, but he wonders whether Canadian legislation will be able to catch up in that time.