First driverless car pedestrian death

Hi everyone.

Looking at the video, which seems to have been captured with a very poor night-vision camera that detects far less than a human eye can, the person seems to appear instantaneously. But I am sure that a responsible "driver" would have seen this person and avoided killing them.

In addition, the human "driver," the car manufacturer that provided the vehicle, and the company that was involved in allowing the vehicle on public streets should all be brought up on charges. The fact that there was a human driver, being used as a scapegoat, should not be accepted as sufficient; as you can see from the video, there was not sufficient attention directed, on a continuous basis, to prevent this. It takes a human a significant amount of time to react even when 100% of their attention is directed to the single task of driving. It takes two to three times longer to react when distracted and thinking they do not need to control the vehicle.
Not something that should be done on public streets.

While I thoroughly dislike the ambulance chasers, this is one case where everyone involved deserves to pay for their actions, all of the above mentioned and any others who may be responsible, with sufficient additional punishment to stop them from doing it again until sufficient testing gets done and the technology improves to the level that it is capable of handling this task.

This did not need to happen, and it's important for everyone to go back and test in a controlled environment before this should be allowed on public roadways.

I am all for technology, but it is apparent that more and more of the people involved are a bunch of idiots who do not know what they are doing, and/or do not care to do the right thing, and are more than willing to jeopardize someone's life for no good reason.

It is my opinion that this technology will only be successful, and sufficiently safe, when all the vehicles are able to communicate with each other and be controlled by a single source. There are not sufficient data inputs, processing power, or AI to have the vehicles control themselves in a real, public environment yet.
This is one killing that could, and should, have been avoided.
 
Same here. I think the vast majority of drivers would have been able to make an aggressive lane change and would have missed the pedestrian.
I agree that humans would have probably reacted. But how many would have killed themselves and 5 kids by swerving, flipping, and rolling their minivan numerous times? While I do believe that a human would have reacted, my personal opinion (based on human driver behavior and reactions to unexpected scenarios) is that the majority would not react safely.

The technology in that car is not ready for the roads, not as a fully autonomous vehicle.
I don't disagree.
However, I disagree with blaming the company for this senseless death. Whether some Uber SW was keeping the car in the lane, keeping the speed (like cruise control) or whether the driver had their foot on the gas pedal, does not matter. The driver is the PIC equivalent on the road. She was responsible for the safe outcome of the trip. Otherwise why have the driver in the driver seat? Or why have the driver seat at all? Driverless cars should just have a back seat (or sofa), no?

If you set cruise control in your car, do you start texting because it frees up your time some? You still watch your speed and scan for obstacles and such so that you can stop in case you encounter something on the road.

If Uber cannot get out of this one, they need to hire a real lawyer.
 
The driver is the PIC equivalent on the road. She was responsible for the safe outcome of the trip. Otherwise why have the driver in the driver seat?
.
.
.
If Uber cannot get out of this one, they need to hire a real lawyer.


Precisely why Uber won't get out of this one.

The safety driver, Rafael Vasquez, is an Uber employee. Do you think Delta isn't held responsible if one of its PICs is negligent?
 
I agree that humans would have probably reacted.

There was a human driver. They didn't react.

I counted 25 frames, or less than a second, between the time the first glimmer of the ped's white shoes was visible and the collision.

If the car, or the human, had immediately applied emergency braking, then maybe the ped would have been injured instead of dead. But this is a "Miracle on the Hudson" scenario. Yes, Sully could have made the airport if he had turned immediately, but he couldn't turn immediately for a variety of reasons. This is the same deal.
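To put rough numbers on that, here is a quick back-of-the-envelope sketch. All inputs are illustrative assumptions (a ~30 fps dashcam, ~40 mph, a typical ~1.5 s alert-driver reaction time, ~0.7 g braking), not figures from the video or the investigation:

```python
# Rough check of "25 frames, or less than a second" against stopping distance.
# All numbers below are illustrative assumptions, not accident data.

FPS = 30              # assumed dashcam frame rate
FRAMES_VISIBLE = 25   # frames between first glimpse and impact (from the post)
SPEED_MPH = 40        # assumed vehicle speed
REACTION_S = 1.5      # typical alert-driver perception/reaction time
DECEL_G = 0.7         # hard braking on dry asphalt, in g

speed_mps = SPEED_MPH * 0.44704                     # mph -> m/s
time_visible = FRAMES_VISIBLE / FPS                 # seconds ped was in frame
reaction_dist = speed_mps * REACTION_S              # distance covered before brakes bite
braking_dist = speed_mps**2 / (2 * DECEL_G * 9.81)  # v^2 / (2*a)

print(f"time visible:        {time_visible:.2f} s")                   # ~0.83 s
print(f"distance in frame:   {speed_mps * time_visible:.1f} m")       # ~14.9 m
print(f"reaction + braking:  {reaction_dist + braking_dist:.1f} m")   # ~50 m
```

With those assumptions, the ped is on camera for roughly 15 m of travel while roughly 50 m would be needed to stop, which is the point: by the time the camera could see her, no human braking response could have prevented the impact.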
 
If they try to pin it on Vasquez he'll sue the crap out of them also.
 
There was a human driver. They didn't react.

I counted 25 frames, or less than a second, between the time the first glimmer of the ped's white shoes was visible and the collision.

If the car, or the human, had immediately applied emergency braking, then maybe the ped would have been injured instead of dead. But this is a "Miracle on the Hudson" scenario. Yes, Sully could have made the airport if he had turned immediately, but he couldn't turn immediately for a variety of reasons. This is the same deal.
A human sitting idle watching a car drive is going to react even slower than one responsible for steering the car.
 
If they try to pin it on Vasquez he'll sue the crap out of them also.


If Vasquez was at least attentive at the time of the accident I would agree with you, but he wasn't performing his role as safety driver.
 
If Vasquez was at least attentive at the time of the accident I would agree with you, but he wasn't performing his role as safety driver.
I agree, but I also think he's got a case saying it's impossible for any human to perform that role.
 
If Uber cannot get out of this one, they need to hire a real lawyer.

Uber won’t get out of it because the engineers will say the LIDAR should have seen the ped long before the headlights illuminated her in the visible light camera.

They’ll do the angle and speed math from the LIDAR head to where she was and then ask for the raw sensor data and show the sensor was either broken and the system didn’t stop and disable itself (or warn the driver that it was impaired), or the sensor was fine and the control software didn’t react.

They’re screwed either way. That LIDAR head should have seen her long before the human driver could. The street lighting and shadows seen in the video are completely irrelevant.
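For illustration, the angle-and-speed math described above might look something like this. The ~100 m effective range is an assumed figure for a roof-mounted Velodyne-class unit, and the speed is likewise assumed; neither comes from the accident report:

```python
# Sketch: how much margin a working LIDAR should have had, under assumed numbers.

LIDAR_RANGE_M = 100   # assumed effective range of the roof-mounted LIDAR
SPEED_MPH = 40        # assumed vehicle speed
DECEL_G = 0.7         # hard braking on dry asphalt, in g

speed_mps = SPEED_MPH * 0.44704                     # mph -> m/s
time_to_contact = LIDAR_RANGE_M / speed_mps         # s from first possible return
braking_dist = speed_mps**2 / (2 * DECEL_G * 9.81)  # distance needed to stop

print(f"time from first LIDAR return to impact: {time_to_contact:.1f} s")  # ~5.6 s
print(f"distance needed to brake to a stop:     {braking_dist:.1f} m")     # ~23 m
```

If those assumptions are even roughly right, a functioning LIDAR had seconds of margin where the camera (and driver) had a fraction of one, which is why the raw sensor data matters far more than the dashcam video.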
 
If the car, or the human, had immediately applied emergency braking then maybe the ped would have been injured instead of dead.
There's also the option of turning away from the ped. Granted, some will freeze in this situation and feel like the brakes are their only recourse.
 
Precisely why Uber won't get out of this one.

The safety driver, Rafael Vasquez, is an Uber employee. Do you think Delta isn't held responsible if one of its PICs is negligent?
Oh, she's an employee? I did not know that, thank you for the input.
So while HER negligence was a contributing factor to the collision, the maker of the car will be punished. Just like we should punish Colt, Smith & Wesson, and other manufacturers for thugs using devices of those brands to hurt people.
Just like in a "normal" car set on cruise control, she would have hit the bicyclist as well, IMHO. Then would the manufacturer of the vehicle be screwed as well?
We (as humans) need to start assigning blame where it belongs, on our actions. Shifting blame if we make a mistake has become an instinct.
 
There was a human driver. They didn't react.

I counted 25 frames, or less than a second, between the time the first glimmer of the ped's white shoes was visible and the collision.

If the car, or the human, had immediately applied emergency braking, then maybe the ped would have been injured instead of dead. But this is a "Miracle on the Hudson" scenario. Yes, Sully could have made the airport if he had turned immediately, but he couldn't turn immediately for a variety of reasons. This is the same deal.

I don't know when the brakes were applied - or if they were. This is why the video plus the data stored in the car will be interesting to compare timestamps. When did the car's system first recognize the pedestrian? When did the system first recognize the pedestrian was a threat? When did the car apply the brakes, and what percentage of braking was applied?

The driver was not paying attention; his reaction possibly occurred when a warning alarm sounded or the brakes were applied, which got him to look up from his phone. I can't tell from the video if he reacted to seeing the pedestrian or if he reacted to something the vehicle did and THEN noticed the pedestrian.
 
Oh, she's an employee? I did not know that, thank you for the input.
So while HER negligence was a contributing factor to the collision, the maker of the car will be punished. Just like we should punish Colt, Smith & Wesson, and other manufacturers for thugs using devices of those brands to hurt people.
Just like in a "normal" car set on cruise control, she would have hit the bicyclist as well, IMHO. Then would the manufacturer of the vehicle be screwed as well?
We (as humans) need to start assigning blame where it belongs, on our actions. Shifting blame if we make a mistake has become an instinct.

That would mean the software and hardware engineers in this case if the car was “driving”. If the passenger in the car seat was “driving” then you’re correct.

And both will be a legal construct not a moral one. The moral construct would make the engineers responsible.

And there’d be no concept of “a company” in a moral construct.

But law is rarely moral or just. The engineers won’t be charged with manslaughter or murder.

I bet the real reason this happened is that the car is tuned to look for car-sized-and-shaped things and person-sized-and-shaped things, and a person plus a bike confused the software.

Addition of a FLIR sensor and integration with the LIDAR would have given enough data to know the thing crossing in front of it was warmer than the surroundings and likely a human or alive even if it was shaped wrong.
 
I don't know when the brakes were applied - or if they were. This is why the video plus the data stored in the car will be interesting to compare timestamps. When did the car's system first recognize the pedestrian? When did the system first recognize the pedestrian was a threat? When did the car apply the brakes, and what percentage of braking was applied?

The driver was not paying attention; his reaction possibly occurred when a warning alarm sounded or the brakes were applied, which got him to look up from his phone. I can't tell from the video if he reacted to seeing the pedestrian or if he reacted to something the vehicle did and THEN noticed the pedestrian.

For me the video makes it seem very obvious that a human would have done no better. That is an important standard. If we hold the companies to a higher standard than that, then we're going to end up with the destruction of the industry - just like GA in the 90's before the liability laws were revised.

Now that said, as an engineer I'm *very* interested to see everything you talked about.
 
For me the video makes it seem very obvious that a human would have done no better. That is an important standard. If we hold the companies to a higher standard than that, then we're going to end up with the destruction of the industry - just like GA in the 90's before the liability laws were revised.

Now that said, as an engineer I'm *very* interested to see everything you talked about.
I'm not sure about letting the driver off too easily.

I have no idea what the car was actually seeing, but the dashcam shows a limited field of view, and not a very good exposure. The Mark I Eyeball is a lot better than it gets credit for. Human vision seems pretty good at catching movement, and a pedestrian walking across a road, backlit by oncoming headlights, *might* have been noticed before crossing the centerline, but the driver wasn't looking. This might be a case of "we'll never know". Would an average driver have seen something like that? Maybe; how many other pedestrians are run over at that location at that same time of night? Maybe not; pedestrians do continue to get hit by cars.
 
So while HER negligence was a contributing factor to the collision, the maker of the car will be punished. Just like we should punish Colt, Smith & Wesson, and other manufacturers for thugs using devices of those brands to hurt people.

This wasn't just someone "using" the Uber. This was someone employed by Uber for the specific purpose of ensuring safety.

When a company is required to provide a safety driver, and they hire an employee for that role, the employer is responsible for ensuring the employee is qualified and performs the role correctly.

If Smith & Wesson delivered a .357 that exploded in your hand because their employee was negligent in manufacturing the gun, S&W would indeed be liable for your injuries.
 
The engineers won’t be charged with manslaughter or murder.


This is, by the way, an example of an interesting oddity in the US system. (I'm going to over-simplify this a bit.)

Because manufacturers develop products (like automobiles) that move in interstate commerce and regulation of interstate commerce is reserved to the federal gov't, states do not generally have the ability to regulate engineers who practice in manufacturing industries. (Similarly, the states can't regulate engineers who practice in aerospace or other federally controlled areas.) Thus most engineers who design everything from toasters to self-driving cars are not required to be licensed. Engineering licensing (like medicine or law) is done by the states, not by the feds.

Engineers who design public works, OTOH, are required to hold state licenses. Not very many bridges or skyscrapers move across state lines.

So,...

Unless they did something criminal, the engineers who designed the Uber car won't face any legal penalties or barriers to continuing to practice engineering. Legal liability will rest on their employer. The engineers who designed the collapsed bridge in Miami, however, may very well lose their licenses and be barred from further practice should the collapse be traced to negligence in the design.

Kinda weird, huh?
 
Uber is going to show their training materials and say the driver was breaking numerous rules.

This whole thing is getting settled out of court anyway because they won’t want their engineering practices put on trial or in public record.

I bet most of these companies have budgeted for a number of death payouts too.
 
Uber is going to show their training materials and say the driver was breaking numerous rules.

That might save 'em from criminal charges but it won't help in a civil suit in front of a jury. I agree; they'll settle out of court.
 
If Vasquez was at least attentive at the time of the accident I would agree with you, but he wasn't performing his role as safety driver.
The level of attention that you are suggesting, in a monitoring-only role, cannot be maintained for a significant length of time. It is outside the limits of human capability. Even if you keep your eyes outside, they will continually relax to a relatively short focal length and you'll miss many of the approaching threats until it is too late to react. It is the act of constantly adjusting the steering to maintain your lane that keeps the brain focused on the task, and even then, attention can and does waver.

Any system which relies on this level of attention in a monitoring-only role is a faulty design.
 
Hi everyone.
I see many posts that seem to treat / accept this killing as a normal act.

As an engineer, or even as a human with common sense, do you think this scenario should have been tested before the car / technology was allowed on public roads?
Here is the minimum test configuration that should be expected: an object from any direction, the size of a baseball or smaller, in any type of environment, should be detected and evasive action taken, before the technology is released on public roads again.

Yes, the Test engineer, the CEO, the human Driver, politicians, and everyone else involved with this process should go on trial for murder.
The gun analogy is completely off the mark. None of the manufacturers of guns, that I know of, have created a gun that goes out on its own and kills people. If a gun is used to kill for anything other than self-defense, and the person involved is known, they would be charged with murder; why would these people be treated differently?
 
The level of attention that you are suggesting, in a monitoring-only role, cannot be maintained for a significant length of time. It is outside the limits of human capability. Even if you keep your eyes outside, they will continually relax to a relatively short focal length and you'll miss many of the approaching threats until it is too late to react. It is the act of constantly adjusting the steering to maintain your lane that keeps the brain focused on the task, and even then, attention can and does waver.

Any system which relies on this level of attention in a monitoring-only role is a faulty design.

So design it such that the monitor driver thinks they need to drive all the time and don’t tell them when the vehicle is driving. Shift back and forth.

There’s also various vehicles already with driver inattentiveness monitors. Should have been one in these things and the number of alerts monitored by the company.

Hell, the thing that’ll really hang them is when the prosecution asks whether videos of other drivers also show this level of inattentiveness, and they say they don’t know because they destroyed them after each shift when no accident happened.

Jury would hang them on that alone. They’ll look like they’re covering up something. Which, they are.

Which is why it’ll never go to trial.

Remember we’re talking about a company that’s so cheap they’re skimming the value of people’s vehicle depreciation after handing them back less than they’re losing driving the vehicles, to make a profit.

I bet there’s all sorts of corners cut in their engineering design.

One online blogger about money did a six-month stint with them and documented it. He would refuse pickups that were far away because he had calculated that he would lose money on them. The “management” would chastise him and say he had to take those trips or he would be let go, because he was the closest driver and not taking the trip “hurt the company” since another (dumber) driver would have to be dispatched from further away.

Uber also won’t accept vehicles older than certain years which makes the depreciation calculation even worse. Doesn’t matter if the older vehicle is fully maintained and as safe as it was when it rolled off the assembly line. You have to use the higher value vehicles.

Uber wants self driving cars to eliminate their biggest problem. They’re screwing their employees. The employees don’t really care, they want cash flow, not profit. It’s essentially a scheme to borrow from your vehicle depreciation incurred by driving it much more than you would have otherwise.
 
So design it such that the monitor driver thinks they need to drive all the time and don’t tell them when the vehicle is driving. Shift back and forth.
Of course not. The system must be designed so that it doesn't depend on the human operator to monitor it so closely as doing so is not humanly possible for anything other than short periods.
 
Doesn't Tesla sense when the "driver" doesn't have a hand on the wheel?

Yes, but Uber doesn't require that. Tesla uses level 2 automation (they're working on level 5, but none of that is available to anybody right now).

Uber is supposed to be level 4.

But really what Uber is doing is level 4 rules with level 2 tech.

[attached image: upload_2018-3-22_12-15-47.png]
 
Of course not. The system must be designed so that it doesn't depend on the human operator to monitor it so closely as doing so is not humanly possible for anything other than short periods.

Why? It’s a test system. Compare the reactions of both and give the human an override button too. There’s no operational need of actually letting the computer drive to get the data they need. It’s either ready to drive all the time and the human isn’t needed or it’s not ready to drive.
 
I agree that humans would have probably reacted. But how many would have killed themselves and 5 kids by swerving, flipping, and rolling their minivan numerous times? While I do believe that a human would have reacted, my personal opinion (based on human driver behavior and reactions to unexpected scenarios) is that the majority would not react safely.


I don't disagree.
However, I disagree with blaming the company for this senseless death. Whether some Uber SW was keeping the car in the lane, keeping the speed (like cruise control) or whether the driver had their foot on the gas pedal, does not matter. The driver is the PIC equivalent on the road. She was responsible for the safe outcome of the trip. Otherwise why have the driver in the driver seat? Or why have the driver seat at all? Driverless cars should just have a back seat (or sofa), no?

If you set cruise control in your car, do you start texting because it frees up your time some? You still watch your speed and scan for obstacles and such so that you can stop in case you encounter something on the road.

If Uber cannot get out of this one, they need to hire a real lawyer.

Funny you should mention a minivan. About 20 years ago, my wife and I were driving on I-285 in my 1991 Ford Aerostar minivan when I saw something floating above traffic in the opposite lane. I saw it sink down and then bounce back up, at which time I realized it was a tire and wheel that had most likely been a trailer spare that was now on the loose. I also noticed that it appeared to be stationary in my windshield, which we all know means we were on a collision course. I waited to see where on the van it was going to hit, and it looked like just about the center of the windshield, so I made a very aggressive lane change to the right. That took no more than a sharp tug on the wheel; the van responded as expected, and when I straightened the wheel, the van stayed in the new lane, no fuss, no drama. The wheel and tire glanced off the side of the van, only doing minimal damage. That surprised me; it sure was loud when it hit. That was in a car with no driver assists or stability control. Any modern car would respond the same way.

I do think this is Uber's fault, they sent an experimental vehicle out on the public roadways with what appears to be a terribly undertrained employee and killed someone. From what the dashcam footage shows, a human driver most likely would have avoided the same accident if it had not been an autonomous car. That sure sounds like wrongful death to me.
 
There is no single cause of this accident. Just like many aviation accidents, it was the culmination of a series of failures: the technology failing to recognize the pedestrian, the human driver being distracted, and the pedestrian stepping seemingly into oncoming traffic all contributed to the fatality.

I know it is cold, and it should and could have been prevented, but I have little sympathy for someone who steps into traffic in front of a moving vehicle. In my state pedestrians have almost unequivocal right of way, but there is also a provision in the law that states a pedestrian shall not create a hazardous situation. I see it ALL the time where the pedestrian ASSUMES the right of way, and that is just flat-out natural selection in my book.

Now, yes, this could have just as easily been a child chasing a ball... but it wasn't; it was a grown adult. The pedestrian is just as much at fault here, but that will not stop the ambulance chasers from doing their thing.
 
I can see a few stages: Both using current technology.

1) Driver assist. We already have vehicles with adaptive cruise control, lane assist to keep you between the lines, automatic emergency braking, parallel parking, highly integrated GPS.

2) Autonomous cars. Dedicated highway lanes where all vehicles are autonomous. This could be built.

The trick, for now, seems to be keeping autonomous cars and everything else separated.

Many years ago on the Interstate near me, two Econoline vans full of younger teens and a couple of adults were driving to or from a summer camp. They came over a hill and found a semi going well below the speed limit. The first van swerved and the second had no chance; nobody survived. A dedicated autonomous lane might be able to signal upcoming traffic speed and prevent something like this.

Rear end accidents are being prevented now with automatic braking systems, but better AI will be needed to integrate robot cars with humans for a while.
 
Of course not. The system must be designed so that it doesn't depend on the human operator to monitor it so closely as doing so is not humanly possible for anything other than short periods.

The goal is for that to eventually be the case. This is NOT finished, ready-for-prime-time technology. The safety driver is needed during development.
 
The goal is for that to eventually be the case. This is NOT finished, ready-for-prime-time technology. The safety driver is needed during development.
No argument on that. My point is that you can't expect the safety drivers to do something that people are unable to consistently do.
 
There is no single cause of this accident. Just like many aviation accidents, it was the culmination of a series of failures...

The vast majority of NTSB reports begin with...

“The pilot’s failure to...”

Fill in the blank.

If we applied aviation rules to this one it would read, “The pilots failure to avoid the pedestrian.”

It would continue and say, “Over-reliance on automation was a factor.”
 
I do think this is Uber's fault, they sent an experimental vehicle out on the public roadways with what appears to be a terribly undertrained employee and killed someone. From what the dashcam footage shows ...
From what the dashcam footage shows .... she was barely paying attention to the road, texting or playing a game on her phone.
No amount of training can FORCE a dumb and lazy person to do something they don't want to do.

For comparison, imagine if she was driving her regular car and while texting, hit the bicyclist. Now would it be a good idea to sue the state whose DMV provided the Driver Ed classes to the driver because she seems undertrained?

Same with the student pilot who ran out of fuel during his solo practice and the ensuing (pun intended) lawsuit against the school for improper training.

I do understand what you are saying but i will repeat my words: we, as humans, need to learn to accept our own responsibility in life, not hide behind some company to shift blame for our mistakes.

(for the record, I don't have a dog in this fight, I don't have any feeling for/against Uber or their automated vehicles, I do not own any of their stock or use them in real life - this is a moral debate to me)
 
Murder requires intent. None of the people involved intended to kill anyone.

Once driverless cars are ubiquitous, will turning off a computer driver “proven” to be better than humans be an act of negligence if it results in homicide?
 
Once driverless cars are ubiquitous, will turning off a computer driver “proven” to be better than humans be an act of negligence if it results in homicide?
It can only be murder if the death is intentional. The rest of your question is up to the lawmakers to answer.
 