Self-Driving Cars: Still Waiting


Tech Trouble In The Golden State

During the Chinese New Year celebrations heralding the Year of the Dragon in San Francisco, a self-driving Jaguar I-PACE taxi operated by Waymo (the self-driving car subsidiary of Alphabet/Google) narrowly avoided mowing down some revellers and then came to a halt. Within minutes, the crowd had shoved firecrackers through the windows and set the hapless machine alight. Some heavies smashed the windscreen as the crowd looked on and applauded. The episode was recorded on YouTube. The $150,000 vehicle was left a charred wreck.

Nobody is suggesting that resentment against self-driving taxis is restricted to the Golden Gate City’s vibrant Chinese community. While this was the first time one of the city’s 600 or so robotaxis had been destroyed since the pilot scheme was first rolled out in 2022, there have been numerous incidents which suggest that many townspeople are not fans. A group of vigilantes calling itself Safe Street Rebel has taken to immobilising the cars by placing traffic cones on their bonnets. Apparently, this deactivates the cars’ sensors, and the useless vehicles must then be recovered by their operators. Many self-driving vehicles have been vandalised.

And yet last week, the California state regulator allowed Waymo to extend its robotaxi services to the freeways which surround the city, where the speed limit is 65 miles per hour. Now one can ride autonomously from Silicon Valley to the Golden Gate Bridge. In August last year, the California Public Utilities Commission voted to allow Alphabet and the rest to begin citywide autonomous taxi services at all hours of the day. Waymo was permitted to drive at speeds of up to 65 miles per hour, even in bad weather, while Cruise (owned by General Motors) was limited to 35 miles per hour and was not allowed to drive in bad weather at all. Autonomous cabs, with empty driver seats and self-turning steering wheels, have become a common sight around San Francisco.

Naturally, human cab drivers (including those who work for ride-hailers Uber and Lyft) feel that they are being done out of a job. But much more than that, there is a widespread perception that autonomous cabs are dangerous. In San Francisco, there have been at least 75 incidents in which robotaxis have caused a nuisance or endangered life. They have knocked cyclists over, crashed into buses and stopped without warning in fast-moving traffic. In one incident, a pedestrian was knocked over and then dragged until the vehicle stopped, her body trapped underneath. Self-driving cars were supposed to be safer than those driven by fallible human beings, who often misjudge manoeuvres, drive drunk or even fall asleep at the wheel. Instead, robotaxis are perceived by many to be a menace.

Autonomous vehicles operated by Cruise have also gone wrong. In October last year, a Cruise robotaxi ran over a pedestrian in San Francisco. Nine Cruise executives were sacked, and the company suspended operations across the USA after admitting that it had misled the authorities about the seriousness of the incident.

For the last decade, Elon Musk has been assuring investors that his Tesla electric cars are no more than a year away from full autonomy. Tesla’s Autopilot function, say critics, is essentially a type of enhanced cruise control rather than full vehicular autonomy. The Washington Post recently revealed that, since it was introduced in 2014, Tesla’s Autopilot has been a factor in 736 accidents, of which at least 19 were fatal. There are ten or more lawsuits pending against Tesla in the USA. The Insurance Institute for Highway Safety recently rated Autopilot “poor”.

Earlier this month, the sister of Senator Mitch McConnell, the Republican minority leader of the US Senate, drowned after her Tesla reversed into a pond. Texas police are treating the incident as a criminal matter.

Back In The UK

Fully self-driving cars are not yet permitted on British roads. Transport Secretary Mark Harper MP announced last December that self-driving vehicles could be licensed to operate here as early as 2026. He told BBC R4’s Today programme that the government was drawing up legislation so that people could have “full confidence in this technology”. That was optimistic.

Yesterday (14 March), Mike Hawes, chief executive of the Society of Motor Manufacturers and Traders (SMMT), urged MPs to pass legislation to legalise autonomous vehicles on British roads, observing that other western governments had already done so. He thinks that Britain risks being “stuck in the slow lane”. In Europe, France and Germany have given limited permission for autonomous vehicles to operate, but in Britain “test” vehicles are still only permitted with a backup human driver.

The Automated Vehicles Bill, announced in the King’s Speech last November, was first mooted in 2018 and is currently working its way through parliament. The bill, if enacted, would allow autonomous vehicles without safety drivers in the front seat on Britain’s roads from as early as 2026, potentially paving the way for driverless taxis and delivery vehicles. Under the terms of the Bill, users of self-driving cars will not be prosecuted if their vehicle causes a fatal crash; instead, manufacturers may face legal or even criminal consequences. Tesla take note.

We have the technological potential in this country to make a difference. Oxford-based Oxa, which specialises in autonomous commercial vehicles, is already operating buses in Jacksonville, Florida. Oxa was founded in 2014 by Paul Newman, who is Professor of Information Engineering at the University of Oxford, and Ingmar Posner. Oxa is backed by IP Group and the insurance giant AXA XL.

The Challenge

It is now clear that the currently available technology behind self-driving cars is inadequate, even though it was assumed ten years ago that it would have been perfected by now. The question is: How long is it going to take to get it right? Essentially, there are at least three technical challenges associated with autonomous vehicles.

The first is so-called phantom braking: the vehicle’s sensors identify an obstacle that simply isn’t there, and the car slams on the brakes. This can cause accidents, especially in fast-moving traffic. One possible explanation is that cloud cover casts shadows which the poor machine assumes are objects in the road.
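To see why this failure mode is so hard to tune away, here is a minimal, purely illustrative Python sketch – not any manufacturer’s actual control logic, and the confidence threshold and frame count are invented for the example. A controller that brakes on any single high-confidence detection will slam on the brakes for a one-frame shadow, while one that demands the detection persist across several frames filters the shadow out, at the cost of reacting a fraction of a second later to a genuine obstacle. That trade-off between false alarms and reaction time is the engineering tension in a nutshell.

```python
# Illustrative sketch only -- not Waymo's or Tesla's actual logic.
# A naive controller brakes on any single spurious detection (e.g. a shadow
# read as an obstacle); a persistence filter requires several consecutive
# frames before braking, which screens out one-frame phantoms.

from collections import deque

CONFIDENCE_THRESHOLD = 0.6   # hypothetical tuning value
PERSISTENCE_FRAMES = 3       # obstacle must appear in N consecutive frames

def naive_controller(detection_confidence: float) -> str:
    """Brake as soon as any single frame reports an 'obstacle'."""
    return "EMERGENCY_BRAKE" if detection_confidence > CONFIDENCE_THRESHOLD else "CRUISE"

class PersistenceController:
    """Only brake if the obstacle is seen in several consecutive frames."""
    def __init__(self) -> None:
        self.history = deque(maxlen=PERSISTENCE_FRAMES)

    def step(self, detection_confidence: float) -> str:
        self.history.append(detection_confidence > CONFIDENCE_THRESHOLD)
        if len(self.history) == PERSISTENCE_FRAMES and all(self.history):
            return "EMERGENCY_BRAKE"
        return "CRUISE"

if __name__ == "__main__":
    # A passing cloud shadow: one frame of high 'confidence', then nothing.
    frames = [0.1, 0.9, 0.1, 0.1, 0.1]
    persistent = PersistenceController()
    for confidence in frames:
        print(naive_controller(confidence), persistent.step(confidence))
```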

With all the hype around AI, we tend to underestimate how extraordinarily subtle the human brain is. It has, after all, evolved over millions of years. We can judge what is real and what is a mirage in our field of vision, based on experience. Machines don’t have this ability – at least not yet. Turning across oncoming traffic (that’s turning right in the UK and Japan, and turning left in the USA and Europe) requires a kind of judgment which is difficult to replicate in machine learning.

Secondly, it follows that driverless cars can’t cope with unfamiliar situations. Sudden changes in the weather, ambiguous road markings or the erratic behaviour of other drivers can confuse an autonomous vehicle’s systems, causing it to speed up or slow down unexpectedly.

Thirdly: how safe do we need autonomous vehicles to be? There is no universally agreed standard. Should we aspire to build machines that never crash at all? Or is a certain amount of risk acceptable? We know that allowing people to drive their cars freely in the UK carries a price. In 2022, there were 1,711 fatalities on British roads; 27,742 people were seriously injured and there were 135,148 casualties of all kinds. These fatality statistics have been improving over time, and Britain suffers many fewer road deaths than comparable countries such as France. (Don’t even think about developing countries such as India, where the traffic fatality statistics are horrifying.)

As a society, then, we think that a certain number of deaths and injuries are an acceptable price to pay for motoring – though this is an issue we rarely discuss. If autonomous vehicles could be shown to cause fewer deaths and injuries than human drivers, would that make them acceptable? There is a philosophical (and political) debate to be had here. How safe is safe enough?
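To put the question in numbers, here is a rough back-of-the-envelope sketch in Python. The figure of 1,711 fatalities comes from the paragraph above; the roughly 320 billion vehicle miles driven in Great Britain in 2022 is my assumption (approximately the Department for Transport’s traffic estimate), so treat the resulting rate as illustrative rather than definitive. The point it makes is statistical: fatal crashes are so rare per mile that a robotaxi fleet would have to log an enormous mileage before anyone could say with confidence whether it beats the human baseline.

```python
# Back-of-the-envelope arithmetic for "how safe is safe enough?".
# Fatality count is from the article; total vehicle miles is an ASSUMPTION
# (roughly the DfT estimate for Great Britain in 2022), so the exact rate
# is illustrative only.

uk_fatalities_2022 = 1_711
assumed_vehicle_miles_2022 = 320e9          # assumption, not from the article

human_rate = uk_fatalities_2022 / (assumed_vehicle_miles_2022 / 1e9)
print(f"Human baseline: ~{human_rate:.1f} deaths per billion vehicle miles")

# Expected fatal crashes scale linearly with mileage, so even a fleet exactly
# matching the human rate needs billions of miles before the comparison
# becomes statistically meaningful.
for fleet_miles in (10e6, 100e6, 1e9, 10e9):
    expected = human_rate * fleet_miles / 1e9
    print(f"{fleet_miles/1e6:>8.0f}m miles -> ~{expected:.2f} expected fatal crashes at the human rate")
```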

Autonomous vehicles fare best in cities where the roads are spacious and uncongested and traffic flows freely. That may be why there have been fewer accidents in Phoenix, Arizona than in San Francisco. (Although in December last year, two of Waymo’s vehicles in Arizona hit the same truck as it was being towed away.) British cities, in contrast, are congested, and our roads are generally narrower than those in the USA.

It seems that, once again, the tech titans have oversold their technology, while governments, national and local, have given them the benefit of the doubt. As a result, regulators have licensed the use of autonomous vehicles without human backup drivers – vehicles which have the potential to do harm (as conventional, human-driven vehicles do).

In the UK, there has been active discussion in recent years about “smart motorways” (where the hard shoulder is opened up to traffic in certain conditions, sometimes with fatal results). But the debate about the safety of autonomous vehicles has not yet even begun.

***

The Master Investor Show last Saturday (9 March) was a huge success. If you were there at the Design Centre, Islington, I hope you will agree with me that it had a vibrant buzz – not least because we had a record footfall of enthusiastic people. These included experienced institutional and retail investors as well as everyday folk who just want to get their feet on the first rung of the investment ladder. The Arsenal football team even paid us a visit on their way to the Emirates Stadium. I suppose some of those footballers might need investment advice, come to think of it.

Just one tip for next year’s Show: if someone offers you a business card made of solid gold, don’t put it in your pocket, as I nearly did – you are supposed to scan it and return it with thanks. Another lesson learnt.

Listed companies cited in this article which merit analysis:

  • Alphabet Inc. (NASDAQ:GOOGL)
  • General Motors Corp. (NYSE:GM)
  • Tesla (NASDAQ:TSLA)
  • Uber (NYSE:UBER)
  • Lyft (NASDAQ:LYFT)
  • AXA SA (EPA:CS)

Comments (2)

  • Andrew Dawber says:

    There is a simple solution. Britain has a great railway network that is being very badly utilised. Take up the tracks, cover it in tarmac and open up this network to self driving lorries only. This would greatly free up the motorways. A driver can then take the trailer for the last 5 or 10 miles to delivery point. If a trailer full of baked beans smashes and burns on the network it is not the end of the world. The railway network is fenced off and generally free from pedestrians, cyclists and animals. It’s the ideal environment to learn from mistakes. Once lorries are proven safe then open up for self driving coaches and taxis etc.

  • Paul Bennett says:

    Tesla fsd version 12 looks like the real deal, pure neural networks
