Autonomous Car Problems

'It's a scam': Even after $100 billion, self-driving cars are going nowhere
Detractors including an industry pioneer are getting louder as the losses get bigger

https://www.autoblog.com/2022/10/08/autonomous-cars-slow-progress-losses-doubt/

Excerpt:

...Over the course of more than a decade, flashy demos from companies including Google, GM, Ford, Tesla, and Zoox have promised cars capable of piloting themselves through chaotic urban landscapes, on highways, and in extreme weather without any human input or oversight. The companies have suggested they’re on the verge of eliminating road fatalities, rush-hour traffic, and parking lots, and of upending the $2 trillion global automotive industry.

It all sounds great until you encounter an actual robo-taxi in the wild. Which is rare: Six years after companies started offering rides in what they’ve called autonomous cars and almost 20 years after the first self-driving demos, there are vanishingly few such vehicles on the road. And they tend to be confined to a handful of places in the Sun Belt, because they still can’t handle weather patterns trickier than Partly Cloudy. State-of-the-art robot cars also struggle with construction, animals, traffic cones, crossing guards, and what the industry calls “unprotected left turns,” which most of us would call “left turns.”

The industry says its Derek Zoolander problem applies only to lefts that require navigating oncoming traffic. (Great.) It’s devoted enormous resources to figuring out left turns, but the work continues. Earlier this year, Cruise LLC — majority-owned by General Motors Corp. — recalled all of its self-driving vehicles after one car’s inability to turn left contributed to a crash in San Francisco that injured two people. Aaron McLear, a Cruise spokesman, says the recall “does not impact or change our current on-road operations.” Cruise is planning to expand to Austin and Phoenix this year. “We’ve moved the timeline to the left for what might be the first time in AV history,” McLear says.

Cruise didn’t release the video of that accident, but there’s an entire social media genre featuring self-driving cars that become hopelessly confused. When the results are less serious, they can be funny as hell. In one example, a Waymo car gets so flummoxed by a traffic cone that it drives away from the technician sent out to rescue it. In another, an entire fleet of modified Chevrolet Bolts show up at an intersection and simply stop, blocking traffic with a whiff of Maximum Overdrive. In a third, a Tesla drives, at very slow speed, straight into the tail of a private jet.

This, it seems, is the best the field can do after investors have bet something like $100 billion, according to a McKinsey & Co. report. While the industry’s biggest names continue to project optimism, the emerging consensus is that the world of robo-taxis isn’t just around the next unprotected left — that we might have to wait decades longer, or an eternity.

“It’s a scam,” says George Hotz, whose company Comma.ai Inc. makes a driver-assistance system similar to Tesla Inc.’s Autopilot. “These companies have squandered tens of billions of dollars.” In 2018 analysts put the market value of Waymo LLC, then a subsidiary of Alphabet Inc., at $175 billion. Its most recent funding round gave the company an estimated valuation of $30 billion, roughly the same as Cruise. Aurora Innovation Inc., a startup co-founded by Chris Urmson, Google’s former autonomous-vehicle chief, has lost more than 85% since last year and is now worth less than $3 billion. This September a leaked memo from Urmson summed up Aurora’s cash-flow struggles and suggested it might have to sell out to a larger company. Many of the industry’s most promising efforts have met the same fate in recent years, including Drive.ai, Voyage, Zoox, and Uber’s self-driving division. “Long term, I think we will have autonomous vehicles that you and I can buy,” says Mike Ramsey, an analyst at market researcher Gartner Inc. “But we’re going to be old.”...
 
It's awfully hard to build a computer to drive a car. There are so many variables that it's almost impossible for an AI to learn how to safely navigate all of them. Or even most of them.
 
Personally, I still think liability is the hardest hurdle to handle.

Who is responsible when the car crashes? The driver (who can't do anything) or the company that produced the car? I don't see any companies willingly taking on that kind of risk, and it doesn't make sense for the driver to be responsible. Now suppose the company does take on liability: what happens if the accident is the result of the owner not replacing the brakes or tires when they're worn?

We also have the trolley problem. If a crash is imminent, how do we decide between two outcomes, e.g., killing a pedestrian or killing the car's occupants? What if the choice is between killing an infant and killing a 90-year-old? Who makes these decisions? How are they implemented?

Even if autonomous cars are technically feasible, there are other challenges that will likely (in my opinion) derail them.
 
Maybe... but that's a bandwagon I'm not jumping on anytime soon.

Not saying I'm in favor of pilotless passenger flights, just that technically it is a much more predictable situation 90+% of the time, and a much higher percentage if you stay away from air carrier hubs.
 
Personally, if we want to start taking advantage of self-driving capabilities I think we should replace HOV lanes with self-driving lanes on highways/interstates. They can be separated from the normal flow of traffic. You don't have to worry very much about pedestrians. There are no stop signs. It's a much easier problem to solve than residential and city streets.
 
There's this story about hackers playing with the JohnnyCabs (an incident that lasted for three hours): https://www.timesnownews.com/techno...

or this:
[attached image: driving.png]
 
Personally, if we want to start taking advantage of self-driving capabilities I think we should replace HOV lanes with self-driving lanes on highways/interstates. They can be separated from the normal flow of traffic. You don't have to worry very much about pedestrians. There are no stop signs. It's a much easier problem to solve than residential and city streets.

I would think that long-haul trucking is probably a much more attractive market for this kind of technology at this time. It makes a lot more sense to start off putting these systems in $150,000 trucks that may run 750,000 miles in the limited environment of carrying cargo on our freeway system, as opposed to a $60,000 minivan that may last 200,000 miles taking the kids to school in the morning.
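As a back-of-the-envelope sketch of why the economics favor trucks (the $50,000 system cost below is a number I'm assuming purely for illustration, not anything from this thread):

```python
# Amortizing a hypothetical $50,000 self-driving system over each
# vehicle's service life (vehicle lives from the paragraph above).
av_system_cost = 50_000        # assumed hardware/software cost

truck_life_miles = 750_000     # long-haul truck
minivan_life_miles = 200_000   # family minivan

print(f"truck:   ${av_system_cost / truck_life_miles:.3f} per mile")   # ~$0.067
print(f"minivan: ${av_system_cost / minivan_life_miles:.3f} per mile") # $0.250
```

The same system cost is diluted over nearly four times the mileage, on top of the driver wages a truck actually saves.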

Brian
 
I think many companies are still chasing the wrong goal. A much more limited scope would actually be viable. Such as highways only to start.

Tim
 
We also have the trolley problem. If a crash is imminent, how do we decide between two outcomes, e.g., killing a pedestrian or killing the car's occupants? What if the choice is between killing an infant and killing a 90-year-old? Who makes these decisions? How are they implemented?

Nobody makes that decision - particularly if it is based on AI. The car is not going to be able to know how old someone is.

But, who makes that decision now? The chances that a driver will be prepared are somewhere between slim and none. Most likely a human driver will just slam on the brakes and kill everybody. Not a very high bar to reach.
 
I've said it before - the standard is not to be able to get it right. The standard is to never get it wrong.
Except that the standard is probably only to get it wrong a very limited number of times.
 
I've said it before - the standard is not to be able to get it right. The standard is to never get it wrong.
It could be argued that rather than perfection, the goal should be to make it safer than human drivers. According to the article, traffic deaths in the U.S. are about one person for every 100 million miles driven, while market leader Waymo said last year that it had driven more than 20 million miles over about a decade. That means that they are far short of having enough data to even calculate a comparison, let alone prove that it's safer.
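To put rough numbers on that (my own back-of-the-envelope sketch; the fatality rate and mileage are the article's figures, while the "rule of three" is a standard statistical shortcut, not something the article invokes):

```python
# Can 20 million fatality-free miles prove an AV is safer than the
# human rate of ~1 death per 100 million miles? (Figures from the article.)
human_rate = 1 / 100e6    # fatalities per mile
waymo_miles = 20e6        # miles driven over about a decade

# Expected fatalities if Waymo drove at exactly the human rate:
print(f"expected at human rate: {human_rate * waymo_miles:.2f}")  # 0.20

# Rule of three: with zero events observed in n trials, the 95%
# upper confidence bound on the event rate is roughly 3/n.
print(f"95% upper bound: {3 / waymo_miles:.1e} per mile")         # 1.5e-07

# Fatality-free miles needed before that bound even reaches the
# human rate, let alone beats it:
print(f"miles needed: {3 / human_rate:,.0f}")                     # 300,000,000
```

At 20 million miles, even zero deaths is statistically indistinguishable from human-level risk; you'd need hundreds of millions of fatality-free miles to start making the "safer than humans" case.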
 
I’ve been saying it for decades now. Won’t happen until ALL cars are autonomous and talk to each other. Just too dangerous until then.
 
It could be argued that rather than perfection, the goal should be to make it safer than human drivers. According to the article, traffic deaths in the U.S. are about one person for every 100 million miles driven, while market leader Waymo said last year that it had driven more than 20 million miles over about a decade. That means that they are far short of having enough data to even calculate a comparison, let alone prove that it's safer.

"Better than humans" is a standard that won't work in our current legal environment. When an individual driver screws up, it is almost always settled within the limits of their insurance policy, because they usually don't have assets worth going after beyond that limit. When a company screws up, the liability is often much higher, because deep pockets. Unless a per accident/per person liability cap is placed for autonomously-driven vehicle accidents, the required performance standard is going to need to be at least 10x better than humans.
 
I’ve been saying it for decades now. Won’t happen until ALL cars are autonomous and talk to each other. Just too dangerous until then.

And can talk to each other in a secure manner. Good luck with that (looking at you, ADS-B).
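As a toy illustration of what authenticated vehicle-to-vehicle messages would even look like (my own sketch; real V2V security proposals such as IEEE 1609.2 use certificate-based digital signatures, not a shared key like this):

```python
import hashlib
import hmac

# Hypothetical shared key, for demonstration only. A real deployment
# would use per-vehicle certificates and asymmetric signatures.
SHARED_KEY = b"demo-key-not-for-production"

def sign(message: bytes) -> bytes:
    """Attach an authentication tag to an outgoing broadcast."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Accept a broadcast only if its tag checks out."""
    return hmac.compare_digest(sign(message), tag)

report = b"veh42: lat=32.22 lon=-110.97 speed=29m/s"
tag = sign(report)
print(verify(report, tag))               # True: genuine broadcast accepted
print(verify(b"spoofed position", tag))  # False: forged broadcast rejected
```

ADS-B, by contrast, broadcasts position reports with no authentication at all, which is why spoofing it is a long-standing concern.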
 
And can talk to each other in a secure manner. Good luck with that (looking at you, ADS-B).

And doesn't have any failure modes. And doesn't run slow or lag or crash or overheat or anything that computers sometimes do...
 
I would think that long-haul trucking is probably a much more attractive market for this kind of technology at this time. It makes a lot more sense to start off putting these systems in $150,000 trucks that may run 750,000 miles in the limited environment of carrying cargo on our freeway system, as opposed to a $60,000 minivan that may last 200,000 miles taking the kids to school in the morning.

Brian

This is the world I live in: long-haul trucking. They recently did a test from Tucson to Phoenix, fully autonomous. It did remarkably well: lane changes on the freeway around traffic, a left turn at an unprotected traffic light, stuff like that. I'll see if I can find the link.

That being said, I'm not going anywhere near the front of the line to put my guys out of work and "hope" AI can do their job 1000% of the time.

Someday it will happen... Today is not that day, IMHO.
 
This is the world I live in: long-haul trucking. They recently did a test from Tucson to Phoenix, fully autonomous. It did remarkably well: lane changes on the freeway around traffic, a left turn at an unprotected traffic light, stuff like that. I'll see if I can find the link.

That being said, I'm not going anywhere near the front of the line to put my guys out of work and "hope" AI can do their job 1000% of the time.

Someday it will happen... Today is not that day, IMHO.

Tucson to Phoenix is a very different environment than New York to Boston. Driving a small car between large trucks in rain or snow is difficult even for a human driver; I don't see how it would work when one or both are automated.

Our roads were built for human drivers. We would be better off building an entirely different infrastructure from scratch for automated vehicles.
 
The first cyclist or pedestrian to be killed by an errant self driving car will make his or her family very rich.
That's already happened. I don't know how much compensation was paid.
 
The first cyclist or pedestrian to be killed by an errant self driving car will make his or her family very rich.
It's already happened. Not sure there was any tremendous payday.
 
I’ve been saying it for decades now. Won’t happen until ALL cars are autonomous and talk to each other. Just too dangerous until then.

The only way to do that is to ban driver-controlled cars, and that will be a battle. No more classic cars.

I have enough trouble with the automation in my two-year-old, run-of-the-mill pickup truck. I sure wouldn't trust it to do any more than it is already trying to do.
 
I'd rather ban all autonomous cars, even if they were perfectly safe and capable, if having autonomous cars meant not ever getting to drive a real car again.
 
Give me any self driving car, and I guarantee I can find places within 50 miles of my house that would result in them being completely lost and either just stopping or crashing into something. It’s all fun and games until you run into a construction zone with conflicting lines painted everywhere or no lines at all, and you have to figure out where the hell you’re supposed to go.

I’ve been driving since the mid 1970s, and still find places that are a real challenge to sort out. Sure, theoretically you could have road crews put down some sort of guidance patches or devices or whatever. Good luck getting the low-bid paving contractor in BFE to do that right.
 
To me the best application of this technology would be to have autonomous vehicle lanes along major routes, probably mostly interstates. That’s a much more controlled environment than a city street or country road and it’s where you’d want it the most. Let the computer drive the hours long slog up the interstate and you’re only hands on for the first and last few miles.
 
I've said it before - the standard is not to be able to get it right. The standard is to never get it wrong.

That will never happen. It doesn't happen in aviation. Why does anyone think that it is even possible? (Never mind being practical or affordable.)

Remember, the most rigorous airworthiness standard is for airline safety, where the required probability of a catastrophic loss of an airplane is 10^-9 per flight hour. Putting it another way, there is less than a 50% probability of losing an airplane in the life of the fleet.

Notice that it is never zero.
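A rough sketch of that arithmetic under a Poisson model (the fleet size and airframe life below are numbers I'm assuming for illustration, not figures from this post or any certification document):

```python
import math

# The standard quoted above: catastrophic loss on the order of
# 1e-9 per flight hour.
rate = 1e-9

fleet_size = 1000      # assumed airplanes of one type
life_hours = 60_000    # assumed flight hours per airframe

total_hours = fleet_size * life_hours   # 6e7 fleet flight hours
expected_losses = rate * total_hours    # 0.06 expected losses

# Probability of at least one catastrophic loss over the fleet's life:
p_any = 1 - math.exp(-expected_losses)
print(f"P(at least one loss) = {p_any:.1%}")  # ~5.8%
```

The probability is small, but it never reaches zero, no matter how strict the standard.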
 
I’ve been saying it for decades now. Won’t happen until ALL cars are autonomous and talk to each other. Just too dangerous until then.


Yep, but that would also require autonomous motorcycles, bicycles, baby strollers, etc. Everything that might be on a street. Including pedestrians.
 
Yep, but that would also require autonomous motorcycles, bicycles, baby strollers, etc. Everything that might be on a street. Including pedestrians.

Well... Elon's got an answer for that, I'm sure! ;)
 
Your cell phone will be “car repellent.”
Toss your phone onto the highway and create a huge traffic jam. Remember when I said this.
 
I used to have an almost autonomous pickup.

It would shut itself off on its own.

Sometimes it shut off going down the road, but mostly it shut off when the light turned green...
 
Your cell phone will be “car repellent.”
Toss your phone onto the highway and create a huge traffic jam. Remember when I said this.

My thoughts had always turned to the fun I could have messing with the autonomous cars:
  • chaff canisters on the motorcycle, or handfuls of glitter tossed out a car window, should stop the cars behind
  • police-like hand signals should stop oncoming traffic if you happen to want to cross the street
  • taking advantage of their risk tolerance when their algorithm decides that my vehicle is not predictable
  • etc.
I thought it would be fun... then I started to get stuck behind autonomous prototypes at work while trying to leave the proving grounds for lunch or to go home for the evening, and I realized that rather than being fun, there will be a lot more road rage and car-to-car shootings than there are currently. The autonomous cars belligerently follow all traffic laws (because we live in a litigious society), and being stuck behind a car that accelerates like a 90-plus-year-old who is scared to drive or hit anything, then decelerates at a similar rate, then sits stationary for 3 full seconds (the timer doesn't start until all wheels are completely stopped) before repeating the process at the next stop sign is... absolutely maddening.
 
My thoughts had always turned to the fun I could have messing with the autonomous cars:
  • chaff canisters on the motorcycle, or handfuls of glitter tossed out a car window, should stop the cars behind
  • police-like hand signals should stop oncoming traffic if you happen to want to cross the street
  • taking advantage of their risk tolerance when their algorithm decides that my vehicle is not predictable
  • etc.
I thought it would be fun... then I started to get stuck behind autonomous prototypes at work while trying to leave the proving grounds for lunch or to go home for the evening, and I realized that rather than being fun, there will be a lot more road rage and car-to-car shootings than there are currently. The autonomous cars belligerently follow all traffic laws (because we live in a litigious society), and being stuck behind a car that accelerates like a 90-plus-year-old who is scared to drive or hit anything, then decelerates at a similar rate, then sits stationary for 3 full seconds (the timer doesn't start until all wheels are completely stopped) before repeating the process at the next stop sign is... absolutely maddening.
And then there are the ones that stop in the middle of the road because of some commonplace thing they can't figure out how to deal with. This was linked in the article (the trouble starts about 12:22 into the video):

[embedded video]

Eventually it starts up again and then stops in the middle of a very busy road after turning the corner.
 
I hope folks will read the article. It seems well researched and has some good insights into why it's such a difficult problem.
 