81 comments
Unrealistic expectation. No technology has ever been delivered in a perfectly safe form. Think of the history of commercial flight, or closer to home, the introduction of carbon fibre bicycle frames. Far fewer catastrophic failures these days.
My prediction is that some sections of British motorway will be opened to autonomous vehicles for large scale trials within 5 years.
Well, OK, it's an "unrealistic expectation" - which is why I suspect self-driving cars will never work, and why they'll turn out to be one of those tech fads that over-promise, under-deliver and end up a waste of money. Unless they turn all roads into motorways, with everyone not in a car banned from them.
You can't compare autonomous vehicles to carbon-fibre bike frames! The crucial point, as illustrated by this very case, is that the half-way house of self-driving cars that need a driver ready to jump in is intrinsically dangerous and can't be allowed on public roads. So they have to be either in 'perfectly safe form' or not allowed on the road.
This!
At present, the machine learning systems don't seem to be up to everything they'll encounter in a typical urban (or rural, for that matter) road environment.
How many people are we willing to sacrifice so that the machines can learn, so that the later models are safer and can deal with the Real World?
The difference is that every day, around the world, hundreds if not thousands of pedestrians are involved in collisions with cars driven by humans. Whilst the individual driver may or may not learn something from the experience, that lesson does not filter out to all the other humans. With software, every single mistake, every unusual occurrence, every unique scenario gets captured by a multitude of sensors and fed back for analysis, and the lessons actually do get learned by all the linked AIs.
Can't be arsed to go and find KSI figures for AI-driven cars vs human drivers, but I seem to recall they are already something like 1/10 as likely per mile to be involved in an RTC.
If nothing else, the sensors and software are already being installed in human-driven vehicles in the form of the collision-avoidance systems mentioned in the report, which Uber engineers had turned off in the test car. Right now in the UK, the only thing preventing autonomous vehicles from driving certain stretches of motorway and major A roads is legislation.
Interesting report, deserves wider publicity. Converting to metric, 5.6 seconds at 70 km/h means the radar identified the victim about 108 metres away. Average braking distance (without reaction time) on a dry road at that speed is about 27 metres, so at that point there was a considerable margin for safety (see the quick check of those numbers below).
What should have happened was controlled braking to slow the vehicle, which would have given the computers more time to ascertain the nature of the obstruction from the sensor input. That's what a human would do: if you see something in your headlights but aren't sure what it is or which way it's moving, you slow down.
The input-identify-react method used shows that the engineers think of it as a mechanical problem and don't appreciate the subtlety and nuance of human decision-making, which tells me they're a long way off making it work in all situations.
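A quick sanity check of those figures. This is a sketch, not anything from the NTSB report itself: the 7 m/s² dry-road deceleration is an assumed typical value.

```python
# Back-of-the-envelope check of the figures above.
# Assumed values, not taken from the NTSB report.
SPEED_KMH = 70           # approximate vehicle speed at first radar detection
DETECTION_TIME_S = 5.6   # seconds before impact that the radar registered the pedestrian
DECEL_DRY_ROAD = 7.0     # m/s^2, a typical dry-road braking deceleration (assumption)

speed_ms = SPEED_KMH / 3.6                               # 70 km/h ~ 19.4 m/s
detection_distance = speed_ms * DETECTION_TIME_S         # ~109 m from the pedestrian
braking_distance = speed_ms ** 2 / (2 * DECEL_DRY_ROAD)  # ~27 m needed for a full stop

print(f"Detected at ~{detection_distance:.0f} m; a full stop needs ~{braking_distance:.0f} m")
# Detected at ~109 m; a full stop needs ~27 m
# Roughly a 4x margin, which bears out the 'considerable margin for safety' above.
```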
It had both radar and lidar, both of which saw the pedestrian, but the software ignored her / believed her to be a stationary object. The computer failed to track her or work out her trajectory.
The Volvo's own pedestrian-avoidance measures had been turned off by Uber because they resulted in a jerky driving style.
Christian Wolmar's blog makes for very good reading; his take on self-driving cars is informative and refreshingly pessimistic. His views on the railways are always knowledgeable.
Two points: why was the car driving at a speed inconsistent with its headlight range?
And don't these self-driving cars have a radar system? Radar works in the dark, doesn't it?
The radar did pick up the pedestrian 5.6 seconds before the collision, but the AI didn't categorise her correctly and predicted that she wasn't in the path of the car. Lidar then picked her up 5.2 seconds before impact, but again predicted no collision - it registered her as stationary.
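A toy illustration of that failure mode, based on the NTSB finding that each reclassification discarded the object's tracking history. This is illustrative pseudologic, not Uber's actual code:

```python
# Toy sketch: a tracker that wipes its history whenever the classifier changes its
# mind can never estimate velocity, so a moving pedestrian is always "stationary".
positions = []  # tracked (time_s, lateral_offset_m) observations for one object

def observe(t, offset, new_class, last_class):
    """Record an observation, but reset the track when the object class changes."""
    global positions
    if new_class != last_class:
        positions = []          # history discarded on reclassification
    positions.append((t, offset))

def estimated_velocity():
    """A velocity estimate needs at least two observations of the same track."""
    if len(positions) < 2:
        return 0.0              # defaults to "not moving"
    (t0, x0), (t1, x1) = positions[0], positions[-1]
    return (x1 - x0) / (t1 - t0)

# The pedestrian walks across the road while the classifier keeps flip-flopping
# (the report describes her being classified variously as vehicle, bicycle, other):
labels = ["unknown", "vehicle", "unknown", "bicycle"]
last = None
for step, label in enumerate(labels):
    observe(t=step * 0.5, offset=6.0 - step * 1.5, new_class=label, last_class=last)
    last = label
    print(f"t={step * 0.5}s class={label!r} estimated velocity={estimated_velocity()} m/s")
# Every line prints 0.0 m/s: with the history reset on each reclassification,
# the system keeps predicting she isn't in the car's path.
```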
I'm assuming the cameras they used for the footage might not have been high quality, but I'm pretty sure the woman pushing the bike was in total darkness until lit by the car's lights a second before impact. It might have been avoided if the "vehicle operator" had been fully in control rather than being a bored passenger watching a phone or whatever, and it definitely would have been avoided if the programmers had thought that roads might contain people anywhere. But why was she crossing a road, in the dark, without waiting for the traffic to clear or making sure it had slowed down? I suspect the accident would still have happened if it had been a normal car in those circumstances.
If I recall, the video footage was much darker than the general lighting conditions would suggest, so the pedestrian would have been much more visible in person than she appears in the car footage. Personally, I don't think drivers can pay enough attention if they're not driving and are just passengers until they're suddenly in the middle of a situation.
Christian Wolmar wrote a piece for CUK last month. A snippet:
"my view is that the driverless car dystopia will never happen. Sure, there may be limited use for such vehicles, such as airport shuttles or other pre-fixed routes, but the vision of everyone being in driverless cars that are shared use is, frankly, a fantasy."
https://www.cyclinguk.org/cycle-magazine/christian-wolmar-are-driverless...
If you consider the complexity of programming something as complicated as a car operating in a public space of ever-changing and unpredictable dimensions and hazards, then factor in programmer error (like the Boeing 737 Max software) and the many possible points of failure, let alone the potentially disastrous chance of malware infection, it really does seem to be pie in the sky.
And at a time when there are more and more scientists, health professionals, NGOs and many others pushing for active travel, reduced numbers of cars in towns and cities (whatever their source of power), it has all the signs of an endless project that will suck up vast amounts of money and time but won't really produce satisfactory results.
No surprise then that the US bods want cycle helmets, and that 26,000 road deaths (a figure which has been rising for a few years now) are seemingly acceptable collateral damage.
The account talks repeatedly of 'jaywalking'. Jaywalking is only a 'thing' in certain legal systems (notably the US).
It's worrying if that concept, which is irrelevant to the UK, where there is no such thing, was built into the logic of the vehicle. It fits precisely what worries me about autonomous vehicles - that the law will be changed to accommodate them at the expense of pedestrians and cyclists, rather than the reverse.
I think that is more of a symptom of Uber being rubbish and cutting corners. It does seem that autonomous vehicles are a lot harder to get right than originally thought, but I still think that they have tremendous advantages in the long run. Give it another 10 years maybe.
Jaywalking does seem to have a particular hold over the American psyche. In just about every US TV prog or movie there is a scene where the protagonists chase each other across an otherwise deserted street which seethes with traffic the moment they race across, barrel-rolling over bonnets (hoods?) etc. Miraculously no one is run over, although the baddie escapes to prolong the drama when the goodie is encumbered by one final vehicle. By the next scene the traffic has melted away. Only in America.
Here's a link direct to the full PDF report:
https://dms.ntsb.gov/public/62500-62999/62978/629713.pdf
Cygnus, that link doesn't work for me, but here is a story from the LA Times about how the vehicle wasn't programmed to spot pedestrians:
https://www.latimes.com/business/story/2019-11-05/self-driving-uber-in-c...
The link in the original post should work now; I was fixing it whilst ol' Squirrel Nutkins (aka HawkinsPeter) was pointing it out as broken.
But basically, yes: unless you used a designated crossing point, Uber's AI did not see you as a pedestrian. So jaywalkers were not recognised as human (you could argue this reflects the thought process of many drivers, AI or otherwise).
You've got some trailing characters in your link. Try this instead: https://www.theregister.co.uk/2019/11/06/uber_self_driving_car_death/
Thanks for posting that, anyhow.
Yeah just noticed - and fixed.
The forum topic is the headline from the article. I've fixed it below:
Remember the woman (supposedly in control of the Uber self-driving car) who killed another woman crossing the street?