It’s been a busy couple of months in the progress of self-driving vehicles in the UK. At the end of July, the government announced a consultation on the ‘next steps for self-driving vehicles'. This is, chiefly, a move to further establish the laws necessary to regulate self-driving vehicles, as well as to work out how pilot schemes can be rolled out.
The regulatory side of self-driving cars is a labyrinthine issue, made up more of pitfalls and loopholes than solutions. After all, if a self-driving car misreads the speed limit as 120mph rather than 20mph, who’s accountable for the subsequent accident?
I don’t know, and I'm not convinced that the government does, either. But my guess would be that blame could fall at the feet of the vehicle manufacturer. What tentative legislation has been laid out so far in Germany and other territories points towards that answer, and Volvo’s statement (way back in 2015, no less) that it would accept liability for any accidents caused by malfunction of its autonomous vehicles suggests the same.
Regardless of that aspect of self-driving cars, the British government and the automotive industry are pressing ahead with making fully autonomous cars a reality, and it seems likely we’ll see real advances within the next few years. Initially, these will be pilot schemes for taxi and bus services.
Now, I can see the advantages of this. Obviously, the government is pushing the message about the jobs and economic incentives that the country will benefit from if we become a leader in self-driving tech. Some 38,000 jobs, no less. I’m all in favour of that. It’d be great for the UK to become even more established in tech development and production (automotive and otherwise). It’s a no-brainer, surely.

Similarly, in its press release, ‘Next steps for self-driving vehicles as future passengers help shape self-driving vehicles law’, the government states how autonomous transport will “reduce human error, which contributes to 88% of all road collisions.” Well, perfect. What’s not to like about that?
As I said in a previous column about driver aids, it is unarguable that modern safety assistance systems in vehicles are a good thing, and a direct reason why our roads are now a much safer place than they once were.
And yet, as I drove down a broad, open country road in a brand-new Fiat recently, with the car frantically telling me that the speed limit was 30mph, when it was without doubt a 60mph limit, I couldn’t help but feel that the tech is still not good enough. The same car was also the inspiration for my comment above, after it told me that the speed limit on a motorway (no roadworks or extraneous road furniture or confusion, just an ordinary motorway) was 120mph.
I’ve also spent a lot of time in a new Volvo EX30. So far, the speed limit recognition has been quite accurate, but the lane-keep assist is panicky and will steer you away from white lines, regardless of the small matter of oncoming vehicles. It also applied a random dab of emergency braking for absolutely no reason, which I’m sure the person driving behind me really enjoyed. Nice burst of adrenaline to wake them up. At least I know that Volvo would have taken responsibility for the accident…
I’m trying not to be a Luddite here, and I do believe that fully self-driving vehicles will be a part of our future, and can be extremely useful in certain scenarios. The tech is impressive, as Tesla has already shown by releasing videos of its cars driving autonomously around London and the notoriously awkward ‘magic roundabout’ in Swindon.

Yet ‘impressive’ is one thing. ‘Good enough’ is another. Every time I get in a car with all of the latest tech on board (and I include Tesla in this, by the way), it will do something to remind me that computers are still not as clever as humans. I can’t think of a single car I’ve driven that hasn’t mis-read a speed limit, mis-judged a road line, panicked and slammed on the brakes because of some long grass or an errant shadow, or done any number of things that my human brain was capable of computing without a second thought.
Modern car driver aids actually remind me of riding a horse; ironically, the absolute antithesis of self-driving vehicles. Everything can be going smoothly on your horse ride until, suddenly, trusty Dobbin gets spooked by an entirely inoffensive wheelie bin, or a leaf blowing in the wind, or some other mundane thing. And the next thing you know you’re perched on a tonne of animal that you’ve no hope of out-muscling, and which wants to flee into the far horizon regardless of whether you’re on board or not.
It’s a similar feeling when your Volvo randomly brakes for no reason, except obviously, I can’t hope to calm my Volvo down by patting its neck, making soothing noises, or walking it round in circles. Maybe I could get out and offer it a mint?
Anyway, when a car’s semi-autonomous systems panic, it’s normally a momentary thing that I can override. Because they’re only semi-autonomous, after all. I can apply steering force to override the lane-keep assistance, and I can ignore a mis-read speed limit.
But if the car were fully autonomous, what could I do? Could I override it, to stop my Fiat from doing 120mph, my Volvo from executing an unnecessary emergency stop, my Volkswagen from swerving away from a solid white line and into oncoming traffic, or my Mercedes from missing a free-roaming New Forest pony heading towards the road we’re on? No. I may even be sitting in the back seat, simply doom-scrolling on my phone while I wait for my autonomous taxi to get me to work.
So, I will say this again: driver aids are a good thing for road safety, and I am not even against the idea of fully autonomous vehicles. I really do believe that, in the long run, it could be useful tech that enhances transport options for a lot of people.
But it also fills me with fear. Because it is largely the same lidars, radars and cameras, the same complex ECUs and sensors, that control self-driving vehicles as control the semi-autonomous aids in today’s cars. Call me a control freak, but dealing with a car that’s got it wrong and which you can override is one thing. Being a passenger in a car that’s got it wrong and which you can do nothing about is far scarier.
Tesla can send its cars around tortuous roundabouts, Nissan can drive a Leaf from Wales to London, and Volvo can send out driverless taxis in California all they want. Until I see semi-autonomous driver aids in our everyday cars being routinely reliable and accurate, I won’t be convinced that we’re ready for full self-driving cars.
