CNBC video: "The Future of Flying Is Here: Pilotless Planes"

I don't think a pilot would be shaking the yoke that much even in turbulence. Isn't that much over-control input hard on the control linkage?
 
Does the rail industry still employ locomotive engineers in train cabs?
 
Does the rail industry still employ locomotive engineers in train cabs?
Yes they do, although technology is such that they could run without them. It will be a long time before the unions and the government allow trains to run without someone on board.
 
Halfway through our flight, the remote operator on the ground told Olson that the navigation system guiding the aircraft was slightly off. It was likely why the aircraft wasn't exactly lined up with the runway centerline on takeoff.

It wasn't life or death, but a system reset was recommended and Olson was forced to take manual control for a few minutes while it was reset. It was a chance for Olson to familiarize himself with flying the self-flying plane.


https://www.businessinsider.com/flying-on-self-flying-plane-convinced-its-future-of-aviation-2021-5
 
It may be the future but I don’t think it happens in my lifetime.

Given the current state-of-the-art wrt "software safety" and reasonable near-term expectations in improvements, I certainly hope it doesn't happen in my lifetime.
 
Time the ransomware to trigger just after takeoff. It'll probably infect the plane from someone's device connected to the entertainment system. Someone in finance will have decided to save money, so the same computer and network will fly the plane and run the in-flight entertainment.
 
... Someone in finance will have decided to save money, so the same computer and network will fly the plane and run the in-flight entertainment.

good luck getting such an architecture certified.

Fortunately, there are times when Hollywood doesn't match reality.
 
Who said anything about Hollywood? Think Colonial.
Think DO-178, with which the oil and gas industry does not comply.

...and IFE has already brought down at least one piloted airplane.

Nauga,
standard
 
Colonial didn't have to get its system through any airworthiness certification.
Think DO-178, with which the oil and gas industry does not comply.

...and IFE has already brought down at least one piloted airplane.

Nauga,
standard
You guys are too trusting. Companies will cut costs where they can. Boeing 737 MAX, anyone? There's a whole thread on that, somewhere on PoA.
 
You guys are too trusting.
...not by a long shot, but I do understand what design, validation, and verification of flight critical software entail. I don't think the same can be said about a lot of people insisting it is doomed to fail.

Nauga,
stamped
 
I've said it many times - the standard is not that they can get it right. The standard is that they must not get it wrong.

As someone who grew up in the software engineering world, I will never get on a plane without trained pilots. There are simply too many things that can go wrong.
 
... The standard is that they must not get it wrong.

Not quite - the standard is that the probability of getting it wrong is acceptably low.

Note that zero is simply not possible.
 
Does the rail industry still employ locomotive engineers in train cabs?

Yes they do, although technology is such that they could run without them. It will be a long time before the unions and the government allow trains to run without someone on board.

Right now there are still two people in the front of every freight train. I'm not going to worry about pilot-less airliners until that goes away.
 
I've said it many times - the standard is not that they can get it right. The standard is that they must not get it wrong.
I agree that this is an informal goal for flight critical software. It has never been the standard for pilots and pilot training, and we all accept that without blinking.

Nauga,
who doesn't trust people either
 
good luck getting such an architecture certified.

Fortunately, there are times when Hollywood doesn't match reality.
Based on the 737 MAX fiasco, is FAA certification really all that high a bar?
 
Based on the 737 MAX fiasco, is FAA certification really all that high a bar?
Take a look at 14 CFR part 25, DO-178, DO-160, etc...
It is neither trivial nor perfect.

Nauga,
in another time and place
 
I agree that this is an informal goal for flight critical software. It has never been the standard for pilots and pilot training, and we all accept that without blinking.

If it does not become the formal goal, they will fall short of the required safety standard.

There are certainly things software can do better, but if the software fails at simple things, it is unacceptable. Say, for example, ordering nose-down trim that results in a crash in VFR conditions because it doesn’t recognize a failed sensor.

I will repeat: the software must not get it wrong. It must be capable of detecting errant data and resolving what’s correct, and do it with less sensory input than a human has. It must not follow a single, or even multiple, errant data points into a crash.
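The errant-data requirement above is essentially what redundant sensor voting does. A minimal sketch, assuming a hypothetical triplex installation with mid-value selection (illustrative only, not flight code):

```python
def mid_value_select(a: float, b: float, c: float) -> float:
    """Median of three redundant sensor readings: a single failed
    channel (stuck, hardover, noisy) cannot drag the selected value
    away from the two healthy channels."""
    return sorted((a, b, c))[1]

def miscompare(a: float, b: float, c: float, tolerance: float) -> bool:
    """Flag a channel disagreement exceeding the allowed tolerance,
    so the fault is annunciated rather than silently voted out."""
    return max(a, b, c) - min(a, b, c) > tolerance

# Two healthy AoA vanes near 5 deg, one failed hardover at 74 deg:
readings = (5.1, 74.0, 4.9)
selected = mid_value_select(*readings)        # 5.1 -- hardover outvoted
fault = miscompare(*readings, tolerance=2.0)  # True -- annunciate fault
```

A single-sensor design, by contrast, has nothing to vote against, which is the architectural criticism of MCAS raised elsewhere in this thread.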
 
In my half dozen years of doing Army UAS.....most of the accidents were still caused by human-induced failures.

My vote is to reduce the cockpit by one.....and introduce the systems with a man in the loop where passengers are involved. Then in 20 years when we have great lessons learned....make further reductions after we've proven the 10^-9 reliability.

In those applications where operations are over remote, sparsely populated areas....set the kraken loose.
 
If it does not become the formal goal, they will fall short of the required safety standard.

There are certainly things software can do better, but if the software fails at simple things, it is unacceptable. Say, for example, ordering nose-down trim that results in a crash in VFR conditions because it doesn’t recognize a failed sensor.

I will repeat: the software must not get it wrong. It must be capable of detecting errant data and resolving what’s correct, and do it with less sensory input than a human has. It must not follow a single, or even multiple, errant data points into a crash.

Your desired "formal goal" is not actually the formal safety standard for airworthiness.

look at the latest revisions to AC 25.1309; AC 23.1309; AC 20-115
 
...not by a long shot, but I do understand what design, validation, and verification of flight critical software entail. I don't think the same can be said about a lot of people insisting it is doomed to fail.

Nauga,
stamped
Unfortunately, you weren't one of the people at Boeing, as I'm sure it would have been done correctly. I'm not being sarcastic, and I'm not being patronizing. I won't even pretend to know anything about flight critical software.

But I'm not talking about software. I'm talking about companies and management who take shortcuts to save money in parts and labor (no backup critical AoA sensor?). Stupid management tricks will doom autonomous planes, not the technology.
 
Your desired "formal goal" is not actually the formal safety standard for airworthiness.

look at the latest revisions to AC 25.1309; AC 23.1309; AC 20-115

the difference between strategic and tactical.

I believe the goal is loosely encoded in the AC under failure modes. The system designers must be depended upon to catch every interaction and determine the correct severity. That is requiring them to not be wrong without stating it as a principle.
 
the difference between strategic and tactical.

I believe the goal is loosely encoded in the AC under failure modes. The system designers must be depended upon to catch every interaction and determine the correct severity. That is requiring them to not be wrong without stating it as a principle.

no, the difference between good enough and perfect.

I guess I'm getting hung up your use of "every" and "all" and "perfect"... your use of absolute terms.
 
no, the difference between good enough and perfect.

I guess I'm getting hung up your use of "every" and "all" and "perfect"... your use of absolute terms.

so it’s ok if system designers let some data interactions go through without understanding them? I mean, if it is only a few interactions, there can’t be a safety risk, right?

perfect IS the goal when it comes to not failing, because to say otherwise is to accept automated failure.
 
When I was a youngster and aspiring pilot in the 80s, a lot of the aviation literature of the time claimed we had seen the last of new piloted aircraft. Any next generation aircraft would be pilotless. I was so worried I would never get to become a pilot.

Fast forward 40 years later, we are still designing and building piloted airplanes.

They also figured flying cars would be the next big thing in transportation by the turn of the century.
 
so it’s ok if system designers let some data interactions go through without understanding them? I mean, if it is only a few interactions, there can’t be a safety risk, right?

perfect IS the goal when it comes to not failing, because to say otherwise is to accept automated failure.

well, perfect is the "goal", but perfect is not the requirement.

What is the acceptable probability of a failure leading to a catastrophic loss of an aircraft and all souls on board? (I think you can find that in AC 25.1309 and/or 23.1309 - it's been a few years since I've looked at them - after all, I AM retired)
 
I am surprised no one has mentioned the driver-less semi trucks being tested on the interstate highway system. The trucks don't make stupid driver mistakes and there is never a road rage incident.
What could possibly go wrong?
 
well, perfect is the "goal", but perfect is not the requirement.

What is the acceptable probability of a failure leading to a catastrophic loss of an aircraft and all souls on board? (I think you can find that in AC 25.1309 and/or 23.1309 - it's been a few years since I've looked at them - after all, I AM retired)

I vote for 0.000000000001 or better. It has to be better than a human.

Any fault is going to be amplified by the number of aircraft. The MAX fault crashed two airplanes, which is two more than acceptable.
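The fleet-amplification point can be made concrete with back-of-the-envelope arithmetic. Assuming the roughly 1e-9 per-flight-hour order of magnitude associated with catastrophic failure conditions in AC 25.1309, and an assumed worldwide fleet exposure (the 70 million hours/year figure is illustrative, not sourced):

```python
# Expected catastrophic events per year for a given per-flight-hour
# failure probability, across an assumed fleet exposure.
per_hour_prob = 1e-9          # order-of-magnitude target for catastrophic conditions
fleet_hours_per_year = 70e6   # assumed worldwide transport-fleet exposure

expected_events = per_hour_prob * fleet_hours_per_year
print(f"{expected_events:.3f} expected events/year")  # 0.070
```

Even at that bar, a large enough fleet sees an event every decade or so; a fault orders of magnitude more likely shows up in service quickly, as the MAX demonstrated.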
 