What was discussed?
- The need to ensure that road safety is not reduced was a key theme: discussion focused on the need for road safety to improve, not deteriorate. The risk of death or injury on our roads is significantly higher than in everyday life generally, and than on other transport networks such as rail. Non-motorised road users such as pedestrians and cyclists bear a disproportionate share of this risk.
- There may be circumstances where it is necessary for emergency services to be able to talk to or control self-driving vehicles. If emergency services can get sufficient access to control a self-driving vehicle, it could be possible for someone with malicious intent to do so too. Hacking is a clear safety issue – and there is a risk that vehicles could be used as weapons without the need for the perpetrator to be in the vehicle.
- The need to regulate delivery robots which are designed to use pavements (for the final mile from where a lorry load is dropped off to an individual dwelling) differently from vehicles that use roads was discussed. It was also noted that pedestrians and other vulnerable road users need safe and accessible spaces, including pavements, and that this is essential to road safety. The question of delivery robots was said to go well beyond the safety of self-driving technologies and to need careful consideration. There are clear reasons for the existing restrictions on the use of road vehicles in spaces such as pavements.
- Accessibility, and the importance of inclusivity by design from the outset, were also discussed. There was emphasis, for example, on the importance of the technology being able to locate accessible drop-off points. Automated vehicles should not be permitted to operate as taxis without a built-in wheelchair accessibility requirement. The launch of app-based lift hire has resulted in a fall in the number of wheelchair accessible vehicles on the road (a higher proportion of licensed taxis are wheelchair accessible than app-based private hire vehicles).
What about insurance?
Previously the House of Lords commented: “We do think it is a very striking omission from the Bill that there appears to be no reference to insurance”.
An amendment dealing with insurance was proposed (Amendment 34) but not moved.
“LORD BERKELEY 34★ Schedule 2, page 80, line 4, at end insert—
“(c) after subsection (7), insert—
“(8) When a person on a road or other place in Great Britain suffers damage as a result of an accident involving an authorised automated vehicle and the person was not at the time an occupant of that vehicle, it will be assumed for the purpose of this section that the authorised automated vehicle caused the accident unless proved otherwise.””
Member's explanatory statement
This amendment provides that, where a person other than the occupant of an authorised automated vehicle suffers damage as a result of a collision with that vehicle, the vehicle can be assumed to have “caused” that collision for the purpose of determining liability (in accordance with section 2 of the Automated and Electric Vehicles Act 2018), unless proved otherwise.
This amendment proposed an assumption that the authorised automated vehicle caused an accident unless proved otherwise, shifting the burden of proof entirely onto the automated vehicle. This presumption of liability would apply regardless of whether the self-driving feature was active at the time of the incident. It was proposed to recognise the lack of balance between a pedestrian or an uninsured cyclist and an automated vehicle (with insurance behind it), and to redress that imbalance between the typical person on the street and large companies. It was noted that automated vehicles will have at least half a dozen cameras recording every factor in an accident, so more information would be available after an accident involving an automated vehicle than there is for crashes today. A compelling argument against the amendment was that it could encourage the perception that the safety of self-driving vehicles somehow reduces the obligations on other road users.
This did not look to change the insurance provisions set out in the Automated and Electric Vehicles Act 2018. Automated vehicles are still required to have Road Traffic Act 1988 compliant insurance with unlimited cover for third-party injury. The 2018 Act imposes a form of direct liability for automated vehicles, as an accident may not be caused by human error if, for example, the self-driving function fails. Without the 2018 Act, a victim of such an accident would only have recourse against the vehicle manufacturer. If an accident is due to the self-driving function, insurers can then seek to recover their outlay from product manufacturers.
It is understood that Mr Browne, a member of the House of Lords, has engaged with the Association of British Insurers on automated vehicle issues and the insurance industry.
The third reading of the bill in the House of Lords occurred on 19 February 2024. No amendments were suggested ahead of third reading. Key points raised included the dangers of accumulation of data, including personal, commercial and data associated with the security of the state. These points related to both personal privacy and the potential for hacking by a malign foreign power or individual hacker. The bill was passed and sent to the Commons. Its first reading in the Commons occurred on 20 February 2024.
Insurance – horizon scanning
Insurers should be aware of a potential move towards insuring the vehicle rather than the individual. There is also a move towards insurance being provided by manufacturers – to remove the cost of potential disputes between insurers and manufacturers over whether an accident was caused by a failure of the self-driving function.
The Hansard updates can be found here -