A French journalist has caught on camera the moment a Nissan driverless car passed a cyclist without leaving enough space.
The video, shot in London as Nissan showcased its driverless progress, shows the car's console registering the cyclist, but the car then fails to move over to give him space.
Tetsuya Iijima, global head of autonomous drive development at Nissan, is behind the wheel but fails to override the car and move out himself, the footage, spotted by BikeBiz, shows.
One of the French journalists in the car can be heard saying, in French: "I was a little scared for him."
Last year we reported how Adrian Lord, of the transport consultancy Phil Jones Associates, fears that once collision-avoidance technology that prevents vehicles from hitting pedestrians and cyclists reaches our roads, vulnerable road users may take advantage of knowing they cannot be injured.
He said: "Once people realise that an autonomous vehicle will stop [automatically], will pedestrians and cyclists deliberately take advantage and step out or cycle in front of them?
“If that’s the case, how long would such a vehicle take to drive down Oxford Street or any other busy urban high street?”
Meanwhile John Parkin, professor of transport engineering at the University of the West of England, told the Financial Times that much of the infrastructure being built to keep bikes and cars apart in inner-city environments will be made redundant once autonomous technology reaches maturity.
"When fewer cars are driven by humans, in cities at least," the professor said. "There would be less need to segregate cyclists from traffic. This would allow roads to be designed as more open, shared spaces."
47 comments
By the sounds of it they're learning to be more like real human drivers! Next thing they'll be bleating on about road tax in electronic voices.
Seriously though, do these algorithms contain strict red lines on adhering to safe distance minimums, or is there some allowance for taking reasonable (clearly not in this case) risks if it decides it's safe to do so? Hope it's not the latter...
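For what it's worth, here's roughly what that difference looks like in code. This is a minimal illustrative sketch, not anything from Nissan's actual planner - the function names, the 1.5 m minimum and the cost weights are all my own assumptions:

# Two ways a planner might treat passing distance. Purely illustrative -
# the 1.5 m figure and the weights are assumptions, not Nissan's values.

MIN_PASS_DISTANCE_M = 1.5  # hypothetical minimum safe passing distance

def distance(a, b):
    """Euclidean distance between two (x, y) points."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def hard_constraint_ok(candidate_path, cyclist):
    """'Strict red line': any path that ever gets too close is rejected outright."""
    return all(distance(p, cyclist) >= MIN_PASS_DISTANCE_M for p in candidate_path)

def soft_cost(candidate_path, cyclist):
    """'Reasonable risk' version: closeness is just one penalty among others,
    so a fast-but-close path can still win if its other costs are low."""
    closeness_penalty = sum(
        max(0.0, MIN_PASS_DISTANCE_M - distance(p, cyclist)) ** 2
        for p in candidate_path
    )
    travel_time = float(len(candidate_path))  # stand-in: one time step per waypoint
    return travel_time + 10.0 * closeness_penalty

With the hard constraint, a too-close pass is simply never an option; with the soft cost it's merely expensive, which would produce exactly the sort of pass in the video if the weighting is wrong.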
How many crashes has Airbus had due to automation? And that's without the complexities of being on a road in heavy traffic.
As a software engineer, I don't see self-driving cars being a thing anytime soon. At least not safe ones, or ones that don't require a lot of human intervention. Worse, this is just going to lead to drivers who are incapable on the occasions when the self-driving doesn't work.
Please tell us, Velo, just how many crashes have Airbus had due to automation? And for the same period how many more crashes due to human error?
I've just gone through this list of A320 incidents and found only one incident where the autopilot was mentioned, and several where the cause was attributed to human actions:
https://en.wikipedia.org/wiki/Accidents_and_incidents_involving_the_Airbus_A320_family
Automation vs. human
Maybe this last one is unfair... or maybe not. Outside of science fiction (at least as yet), computers aren't prone to emotional instability, and even then I'm willing to take my chances with Marvin the Paranoid Android from H2G2 over some of the psychos currently behind a wheel - HAL 9000, on the other hand...
Don't come at me with facts, this is the internet 3.0 (post-fact version)!
I thought the argument that's been made is that excessive automation 'de-skills' the pilots, so when they _do_ need to take over, they can't cope and screw things up? So it's not that the autopilot makes the error; it's that it leaves human pilots more prone to do so than they used to be.
It seems to be one of those plausible-but-debatable theses that some expert argues for, and which journalists who aren't as clever as they think they are then keep excitedly telling us all about as if we hadn't heard it already.
Might be true, might not, dunno, but it seems relevant to self-driving cars, given that the people using them will probably be drunk or asleep or watching movies when they're suddenly called on to intervene.
Point being, I don't think one can put much trust in self-driving cars that require the human to be able to 'take over' when things get tricky. They're going to have to be able to cope on their own.
Absolutely. Who wants a driverless car that you still have to be ready to drive at a moment's notice? I want it to drive me home from the pub and then go and pick someone else up, like a taxi without the bad-tempered driver.
The big advantage that cars have over aeroplanes, though, is that when they stop, they just sit there rather than plummeting out of the sky. So car autopilots can just slow and stop (hopefully safely*) if they can't work out what to do, no driver intervention required - something like the fallback sketched below. In a plane, the pilot HAS to take over or everyone dies.
* There are some situations where there's going to be a crash no matter whether the driver's human or cyborg.
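As a toy sketch of that slow-and-stop fallback (the states and the confidence threshold are illustrative assumptions, not any manufacturer's actual logic):

from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()     # normal self-driving
    MINIMAL_RISK = auto()   # decelerate and pull over
    STOPPED = auto()        # halted at the roadside, no human needed

def next_mode(mode, plan_confidence, speed_mps):
    """If the car loses confidence in its plan, it degrades to a safe stop
    rather than demanding a human take over. The 0.7 threshold is arbitrary."""
    if mode is Mode.AUTONOMOUS and plan_confidence < 0.7:
        return Mode.MINIMAL_RISK
    if mode is Mode.MINIMAL_RISK and speed_mps == 0.0:
        return Mode.STOPPED
    return mode

A plane has no equivalent of MINIMAL_RISK - it can't just stop - which is why the comparison with airliner autopilots only goes so far.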
Air France 296, June 1988. Controversial, but the plane delayed the pilot's command to throttle up before hitting the trees.
Air France 447, May 2009. Mixed blame, with the pilots not knowing how to react when the automated systems disengaged.
Air Asia 8501 (Qz8501), Dec 2014. Blamed on over-reliance on automation leading to an inability to control the aircraft without it.
Indian Airlines 605, Feb 1990. Controversial; some parties claim the crash was caused by the same throttle behaviour that downed Air France 296.
I could go on, but I think you get the idea. Yes, the automation didn't intentionally down the plane, but automation mixed with human pilot interaction has led to disasters. "Self-driving" cars will still have humans, and I presume the cars will still have manual controls for some time, probably past our lifetimes. The video in this article shows the degradation of skills: the idiot "driver" didn't take control when the vehicle came too close to the cyclist. That type of over-dependence is what made me think of the airlines.
That's why most manufacturers are now looking at going straight to SAE Level 4 - high automation, with no driver intervention needed within the conditions the car is designed for.
Good article on what the levels are and the current state of development (and it's where I've lifted the quote above from):
http://www.techrepublic.com/article/autonomous-driving-levels-0-to-5-understanding-the-differences/
Please stop highlighting plane crashes.
Nervous flyer?
Sounds great - are all cars on the road going to be separated by 90-second gaps? With assistant drivers? And traffic controllers in constant contact with drivers?
If the cyclist in question had swerved and been hit, who would be held responsible and taken to court - the driver or the computer programming team?
And if engineers are screening the results, I hope the team includes a few pedestrians, cyclists, mothers with young children and disabled pensioners. I rather suspect most highly paid engineers and computer programmers may not have the best interests of other road users in mind.
Le singe est dans l'arbre. ("The monkey is in the tree.")
Le singe est au volant. ("The monkey is at the wheel.")
My mate was the victim of a close pass by a Tesla the other day. When he confronted the driver, the response was: "Sorry mate, it was on autopilot."
Sounds like a plan for far more civilised cities to me.
The thought of these simpering geeks faffing about with this nonsense anywhere on public roads beggars belief.
People are obsessed by the idea that cyclists and pedestrians will 'take advantage of driverless cars'. It sounds like they're judging cyclists and pedestrians the same way motorists currently act towards vulnerable road users: 'it's OK, they'll get out of my way or they'll be dead'. The difference is that people in driverless cars aren't going to have their lives put in danger by the actions of a cyclist or a pedestrian on a daily basis; they're just going to have a slower journey. Oh, the humanity.
Aren't they going to pedestrianise Oxford Street anyway? So yeah, it's going to take a very long time to drive down there.