Driverless cars: the moral decision
Driverless vehicles will need to be programmed with a clear and agreed set of rules for decision-making, according to new research published by international law firm Gowling WLG.
In its report – The Moral Algorithm – Gowling WLG finds that concerns over the so-called ‘trolley problem’ – where a vehicle must choose between hitting defined individuals – may have been exaggerated, with most of the experts interviewed agreeing that autonomous vehicles (AVs) will never be programmed to make such distinctions.
Nevertheless, the paper argues that harmonised safety regulations will be needed for other decisions, such as when it is permissible for a car to break the rules of the road, or when determining the ‘assertiveness’ of a vehicle when it interacts with other road-users.
The report concludes with a series of eight recommendations, including the creation of an independent regulator to balance the legality, safety and commerciality issues surrounding autonomous vehicles, the development of a policy on how the moral algorithm will operate in major safety situations, and a programme of public education and consultation.
Commenting on the outcome of the research, Stuart Young, a partner at Gowling WLG, said: ‘It is important not to equate regulation with a burden. It can, in fact, facilitate new markets and important developments. Unless completely new legislation that accommodates new products in advance of them being produced is implemented, this is likely to impose huge additional risks on the companies producing them, as a result of regulatory uncertainty.’
‘The Moral Algorithm’ study took the form of interviews with industry specialists and representatives from the UK Autodrive consortium during September and October 2016, as well as desktop research and analysis of publicly available information.