Another November, another budget. The budget announced today is an interesting one for the tech world. With investment in 5G networks, the recruitment of extra computer science teachers, and funding for the AI industry, it’s clear that the Tories want to get behind tech. But it goes further than that. Chancellor Philip Hammond wants self-driving cars on UK streets by 2021.
The move comes as part of the post-Brexit measures, since the UK won’t be subject to the same constraints as the rest of the EU. Under the new regulations, self-driving cars can be tested on UK roads without a human operator on board. It certainly makes the UK more attractive as a production site for the car industry.
A segment on BBC Breakfast saw people canvassed for their opinions on self-driving cars. Many of those asked expressed concerns about “trusting their lives to a computer” – which they do on a daily basis but never mind.
Another problem was the perceived “lack of ethics”. One chap worried about whether the cars would be able to make decisions or distinguish between situations. As a perpetual pedestrian, I can confirm that we have the same fears about human drivers. I’m less likely to worry about self-driving cars, since I bet they’ll use their indicators, pay attention to stop lights, and not drive the wrong way along one-way roads.
But I digress.
Wait – are self-driving cars even safe?
Many detractors point to the fatal crash of a Tesla Model S in 2016. Joshua Brown’s car hit a truck while on Autopilot; it seems the system didn’t recognise the side of the truck, having only been programmed to recognise the front and back of vehicles.
Tesla pointed out the tech was in a “public beta phase”, so they were definitely anticipating bugs in the software. But a bug that ends up killing someone?
Still, autopilot doesn’t mean take your hands off the wheel completely.
At the moment, drivers are still expected to interact with their self-driving car. There’s nothing stopping you from grabbing the wheel or hitting the brakes. True, that does defeat the point of self-driving cars, but they’re still in the testing phase.
Often, other drivers are the problem.
In 2016, one of Google’s self-driving cars hit a public bus in Mountain View while trying to drive around sandbags in the road. The car knew the bus was there, but the software predicted the bus would slow to let the car pass. It didn’t, and the car hit the side of the bus.
That said, the human test driver also thought the bus would yield to the car. How much can you blame software when a human makes the same assumption?
Plus, let’s take this one step further. If both vehicles are piloted by software, then yes, there’s still the risk that the same collision could happen. But the software in the bus might have noted the car’s failure to slow and applied the brakes itself – a decision the human bus driver didn’t make.
A study by the University of Michigan’s Transportation Research Institute found that driverless cars were involved in twice as many accidents per mile as conventional cars. But before you condemn self-driving cars, be mindful that they were the ones being hit by human drivers. Some think the driverless cars stick so faithfully to the rules of the road that they throw human drivers, who are used to a bit of bad driving from other people.
So what are the advantages of self-driving cars?
Human error causes more than 90% of car accidents. Eliminate the human element and, in theory, you’d reduce the number of accidents.
You’d pretty much eradicate drink- and drug-driving. The same goes for texting at the wheel: passengers could text and call people to their heart’s content.
Then there’s the option of the smart city. Imagine heading into a city and finding your self-driving car has already communicated with the parking facilities near your destination. Your car takes you straight to the nearest parking space without endless driving up and down. That’s more time at your destination, enjoying dinner, watching a film, or hanging out with friends.
And the disadvantages?
I don’t think that the computer making the decisions is necessarily the problem with self-driving cars. No, the problem is the security aspect. A driverless car is the last thing you want hackers to get into. Would you want your car’s brakes to be turned off on the M1? Delivery cars could be manoeuvred off main streets and into quiet side alleys by nefarious individuals.
The Guardian, for their part, think we don’t actually need to worry about security. The extra tech, strangely enough, makes self-driving cars more secure.
But, given self-service checkouts still can’t distinguish between products, and the dreaded ‘blue screen of death’ is an ever-present danger in UK offices, can we trust the computers in self-driving cars not to go wrong too? It would be pretty hairy to have to turn your car off and back on again while you’re doing 60mph down the A1.
Will we see self-driving cars on UK streets?
Uber certainly thinks self-driving cars are the future. They’ve ordered 24,000 vehicles from Volvo as the start of their driverless fleet.
While it’s difficult to know how much Uber drivers make per hour, the company takes a 25% fee out of passenger fares. It usually costs me between £9 and £9.50 for a 15-minute trip. If the driver manages a similar return journey in that hour, then Uber is only making £4.50 – £4.75 from it. It doesn’t take a genius to see why Uber might want to dispense with the driver and net the other 75% of the fare as well.
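If you like to see the sums laid out, here’s a quick back-of-the-envelope sketch in Python. The £9 – £9.50 fares and the 25% commission are the figures above; the assumption that a driver fits two such trips into an hour is mine.

```python
# Rough hourly split between Uber and a driver, using the fares quoted above.
COMMISSION = 0.25  # Uber's reported cut of each fare

def hourly_split(fare_per_trip, trips_per_hour=2):
    """Return (Uber's take, driver's take) for an hour of such trips."""
    total_fares = fare_per_trip * trips_per_hour
    uber_take = total_fares * COMMISSION
    return uber_take, total_fares - uber_take

for fare in (9.00, 9.50):
    uber, driver = hourly_split(fare)
    print(f"£{fare:.2f} trips: Uber £{uber:.2f}/hr, driver £{driver:.2f}/hr")
```

Run it and Uber’s side comes out at £4.50 – £4.75 an hour; take the driver out of the picture and the full £18 – £19 lands with Uber instead.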
But regardless of what Uber thinks, the British government seems to see self-driving cars as the future…
Over to you – what do you think of self-driving cars? Let me know below!
Need me to write technology posts like this for your blog? Check out my Please Google and Your Users Bundle!