Single-pilot airliners and AI (split from "Envoy captain stricken after takeoff and passes")

It works well as part of one of those LinkedIn trigrams like "Thinker. Dreamer. Maker."

Nauga,
Coffee Holder. Sentient Ballast. Meat Servo.
I remember the first time I got checked out in a 206. The club required at least one of the checkout flights to be approximately at max gross weight, so I had the seats filled with sentient ballast (other club members).
 
Barking up the wrong tree with AI. Neural networks are an overhyped parlor trick; statistical pattern recognition dressed up to look like cognition. Even the name is misleading. And yes, I have developed CNN applications.

Remotely piloted aircraft have been fully operational for decades in the defense space. At some point the accident rate of UAVs will be sufficiently better than piloted aircraft to justify wider adoption of that technology for commercial use, including passenger carriage.

As an interim measure, we could see reversion-to-remote as a risk mitigation measure for single pilot ops. Or a backup pilot on board as a risk mitigation measure for remote piloting. Or both.
 
Barking up the wrong tree with AI. Neural networks are an overhyped parlor trick; statistical pattern recognition dressed up to look like cognition. Even the name is misleading. And yes, I have developed CNN applications....

Our brains are statistical pattern recognition devices, with the main variable being the quality of the network connections (or so-called intelligence).
 
Our brains are statistical pattern recognition devices, with the main variable being the quality of the network connections (or so-called intelligence).
No, they aren’t, which is why AI <> brains. Limiting the capacity of brains to something a person (specifically a programmer in this case) can understand will limit the capacity of AI.
 
Barking up the wrong tree with AI. Neural networks are an overhyped parlor trick; statistical pattern recognition dressed up to look like cognition. Even the name is misleading. And yes, I have developed CNN applications....
I've been wondering about that. Creating actual intelligence, as opposed to programmed responses, seems like a tall order.
 
I've been wondering about that. Creating actual intelligence, as opposed to programmed responses, seems like a tall order.
At least for Tesla, the way I understand it is that their AI(s) process data collected from the over three million in-service cars, and analysis of that data is used to improve the self-driving software. There is no AI, or learning of any kind, onboard the individual cars.

What this allows them to do is quickly collect data on any specific set of circumstances. They can instruct the cars to report on specific types of interactions with the real world and, within hours or days, start receiving back that information for their AI(s) to process. For example, the fleet can be instructed to report data on instances where another car merged into its lane unexpectedly. The AI(s) would look at the data for better ways to detect those merges in advance. A subsequent software update to the cars would include the improved code that results.
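
Roughly, and purely as an illustration (the class and function names here are made up, not Tesla's actual software), the fleet-feedback loop described above could be sketched like this:

# Hypothetical sketch of the fleet-feedback loop described above.
# All names (Car, Snapshot, unexpected_merge, ...) are illustrative only.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Car:
    vehicle_id: str
    recent_frames: List[dict] = field(default_factory=list)

@dataclass
class Snapshot:
    vehicle_id: str
    frames: List[dict]
    label: str = ""  # added later by human curators

def unexpected_merge(frames: List[dict]) -> bool:
    # On-car trigger: did another car cut into our lane? (placeholder logic)
    return bool(frames) and frames[-1].get("lane_intrusion", False)

def collect_from_fleet(fleet: List[Car], trigger: Callable) -> List[Snapshot]:
    # Back end: gather snapshots from every car whose trigger fired
    return [Snapshot(c.vehicle_id, c.recent_frames)
            for c in fleet if trigger(c.recent_frames)]

# Collect -> curate/label -> retrain off-car -> validate -> push an ordinary
# software update. The cars themselves only ever run inference.
fleet = [Car("A1", [{"lane_intrusion": True}]), Car("B2", [{}])]
reports = collect_from_fleet(fleet, unexpected_merge)
print(f"{len(reports)} snapshot(s) queued for labeling and retraining")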
 
I was at Dulles airport on May 25, 1972, along with thousands of other interested people, for a travel-related exposition, Transpo 72.

"On May 25, 1972, veteran test pilots Anthony LeVier and Charles Hall transported 115 crew members, employees, and reporters on a 4-hour, 13- minute flight from Palmdale, California, to Dulles Airport outside Washington, D.C., with the TriStar’s AFCS feature engaged from takeoff roll to landing. It was a groundbreaking moment: the first cross-country flight without the need for human hands on the controls. Fly-by-wire technology was here to stay." source: Lockheed Martin.

The Lockheed TriStar program was crippled by serious external events, such as the bankruptcy of Rolls-Royce, the manufacturer of the only engine that would fit the center tail-mounted engine housing.

The pilots monitoring the system taxied to the runway, lined up, and pushed the 'Go' button; the plane departed per a SID, joined the low- and then high-altitude airways, flew east, descended into the arrival routing, and landed. The Captain pushed the stop button and taxied to the desired location on the ramp. Obviously, the FAA fully cooperated in this flight plan, which was filed for the whole flight with no changes.

The state-of-the-art, mostly electromechanical analog computers were very primitive compared to those we have today, but they were capable of very high-quality control of the control surfaces from the outputs of the gyros and navigation radios. The digital matrices properly coordinated such tasks as raising and lowering the landing gear. A combination of the two controlled thrust and braking.

The Douglas DC-10 was older state of the art; it came to market sooner and cost much less, except in fuel efficiency. Fuel was much cheaper in those days.

Part of the Lockheed problem could have been the lack of skilled maintenance personnel to keep that equipment at top performance. I have had training in that field: in power plants that produce 250,000,000 watts from a single generator, we held output voltage to within 0.1 volt of the setting, hour after hour, untouched by the operators. The combustion air/fuel balance was held to less than 1% excess oxygen, with no carbon monoxide from the stack, while burning a train-car load of coal per hour. All of that was the result of tightly maintained analog computers, in the 1960s.

I think that aterpster has flown the Tristar, but the auto takeoff and auto land were not active on the planes he flew.

Airbus has heavily transferred the equipment-failure response into their computers; there are many fewer switches and valves that the pilot can use to control systems. With the installation of local precision augmented GPS, plus locally updated taxi diagrams, auto-taxi should be simple relative to a few years ago.

John Deere uses that kind of augmented GPS to steer their tractors with such precision that the planter runs within 3 INCHES of the cultivator cuts. The farmer plots the edges of the field by driving around it and sets the row and seed spacing; the John Deere does the rest, monitored by the farmer in the air-conditioned cab, studying for his remote college course.


Today, the problem as I see it is that the FAA ATC system must have a fully automated route and altitude clearance set up for the whole trip, in advance, and then adjust the clearance in the flight control computer, with the approval of the "Pilot Monitoring," as the plane advances through the system. Unless another plane fails to comply with the clearance it received on the ground, no adjustments would be needed. Only the high-volume airports would be likely to need many adjustments, but a missed approach there would cause a lot of "Recalculating"!

If the pilot died, no problem: the plane simply flies as it does now when communication is lost, per the last clearance received AND ACKNOWLEDGED. Other en-route planes would have their routes adjusted to keep that route available until the plane, with its dead pilot, reaches the destination.

Perhaps the head cabin attendant would be equipped with a cockpit key, and would be advised to check on the unresponsive pilot? We do need to take care of Plan B!

This concept would be based on algorithms similar to email routing, with the shortest route as the prime concern and weather avoidance a close second. The airline's requested route and altitude would be the starting point, as they would have done a weather-adjusted route and an altitude optimized for fuel burn.
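
Just to make that routing idea concrete: at its core it is an ordinary shortest-path search over a graph of waypoints, with segments through known weather given a cost penalty. The waypoints, distances, and penalty factor below are invented for illustration; a real flight-planning system would also weigh winds, altitudes, and fuel burn.

# Illustrative only: shortest-route search with a weather penalty.
# Waypoints, distances (NM), and the penalty factor are made up.
import heapq

graph = {
    "KDEN": {"FQF": 20, "HGO": 95},
    "FQF":  {"GLD": 100, "HGO": 80},
    "HGO":  {"GCK": 130},
    "GLD":  {"KICT": 150},
    "GCK":  {"KICT": 120},
    "KICT": {},
}
weather_segments = {("FQF", "GLD")}   # convective activity reported on this leg
WEATHER_PENALTY = 3.0                 # treat weather legs as three times as "long"

def route(start, goal):
    # Dijkstra's algorithm with weather-weighted segment costs
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, dist in graph[node].items():
            factor = WEATHER_PENALTY if (node, nxt) in weather_segments else 1.0
            heapq.heappush(queue, (cost + dist * factor, nxt, path + [nxt]))
    return float("inf"), []

# Without the penalty the KDEN-FQF-GLD-KICT legs would win; with it, the
# search chooses KDEN-HGO-GCK-KICT and stays out of the weather.
print(route("KDEN", "KICT"))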

Note that AI is not in this concept.
 
"Skilled" correction done, thanks.
 
Note that AI is not in this concept.
And I think that is the one missed point. It's not the millions of hours of status-quo flying that is the main issue, but the 10 minutes when the sheet hits the fan. Can the automation handle it or not, and what or who is required to intervene in the name of safety? That's really all certification requirements are: risk assessment and management tools based on qualitative assessments of said failures and their probable rates. Unfortunately, even in the certification world, it seems that when something is deemed "extremely improbable" to fail, or one in a billion, someone or something has to prove them wrong in the first 500 hours. However, for those who want to follow civilian autonomous flight, one needs to keep an eye on the eVTOL/UAM/AAM industry, not the Part 121 operators, as they are the closest to realizing this form of flight.
 
"On May 25, 1972, veteran test pilots Anthony LeVier and Charles Hall transported 115 crew members, employees, and reporters on a 4-hour, 13- minute flight from Palmdale, California, to Dulles Airport outside Washington, D.C., with the TriStar’s AFCS feature engaged from takeoff roll to landing. It was a groundbreaking moment: the first cross-country flight without the need for human hands on the controls. Fly-by-wire technology was here to stay." source: Lockheed Martin.
A lot of things are much easier when you're the only airplane in the sky (or, as in this case, when all other airplanes are moved out of your way; that's actually pretty easy to do).

Most weather and ATC delays would vanish if your delayed flight were the only aircraft in the sky. Weather and ATC constraints reduce system capacity, and flights have to be delayed so as not to exceed the system's capacity by too much.

The big city Monday morning commute would be a breeze if no other cars were on the road.

We have autoland (and HUD) systems to allow us to land in visibility less than what a human pilot can land in visually. There is little similar benefit from a potential auto-takeoff function, so no such system has ever been widely deployed.

Similarly, I don't see much benefit in auto-taxi, as you need someone to evaluate factors such as when thrust shouldn't be used because of the damage it would cause to whatever is behind the airplane, and the threat from ground equipment, vehicles, and FOD that are where they should not be. That would be an issue with a single-pilot airliner, too, as you don't have the necessary visibility from either pilot seat alone--you don't even have it all the time with both seats occupied, which is why we need marshallers and wing walkers.
 
However, for those who want to follow civilian autonomous flight, one needs to keep an eye on the eVTOL/UAM/AAM industry, not the Part 121 operators, as they are the closest to realizing this form of flight.
The astute observer will note that some 121 operators have been entering agreements with players in the eVTOL space ;)

Nauga,
with many irons in the fire
 
There is no AI, or learning of any kind, onboard the individual cars.

Yes and no. A neural network is deployed on the car to make decisions, but that network is trained elsewhere. That is primarily a limitation of computing power, but also a function of the supervised learning process.

Keep in mind, AI does not "learn". It is trained. That process involves feeding an AI model thousands of data sets, one at a time. For each iteration, the model calculates a result, compares it to the correct answer, and then adjusts its internal formula weights to move closer to the correct result. Once the model reliably delivers a correct outcome, it can be replicated and deployed to generate inferences on unknown data sets.

In general terms, the more data sets available for training, the more accurate the trained model will be. Tesla's advantage is their ability to collect training data sets from their huge fleet of cars. But that data still has to be curated and labeled, the model retrained, and the results validated before it can be deployed to make inferences in the wild.
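
For anyone curious what "calculates a result, compares it to the correct answer, and then adjusts its internal weights" looks like mechanically, here is a toy training loop: a single linear model fit by gradient descent. The data and learning rate are made up, and it is obviously nothing like a driving network, but the predict / measure error / nudge weights / repeat structure is the same idea.

# Toy supervised training loop: predict, compare to the known answer,
# nudge the weights, repeat. Data, model size, and learning rate are made up.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                 # 1000 labeled examples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=1000)   # the "correct answers"

w = np.zeros(3)                                # model starts knowing nothing
learning_rate = 0.1
for epoch in range(200):
    pred = X @ w                               # model calculates a result
    error = pred - y                           # compare to the correct answer
    grad = X.T @ error / len(y)                # direction that reduces the error
    w -= learning_rate * grad                  # adjust the internal weights

print(np.round(w, 2))   # close to [2. -1. 0.5]; the trained weights can now be deployed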
 
I don't see how it makes sense to call something that doesn't learn "intelligence," artificial or otherwise.
 
There are accidents with one pilot but there are also accidents with two pilots. The issue should not be about deciding what rules to force on everyone. The proper non-violent solution here is to unfetter the airplane builders and airlines and let the paying passengers have exactly what they choose to have. There will be automated flights that cost less and piloted flights that might cost more.



RIP captain.
This article came out yesterday -- Airlines want you to get comfortable with flying without a co-pilot.
Will the remote-initiated auto landing be the answer for an emergency like this?

https://www.theregister.com/2022/11/21/pilot_single/

https://fortune.com/2022/11/21/airlines-pushing-one-pilot-in-cockpit-passenger-jets-instead-of-two/
 
There are accidents with one pilot but there are also accidents with two pilots. The issue should not be about deciding what rules to force on everyone. The proper non-violent solution here is to unfetter the airplane builders and airlines and let the paying passengers have exactly what they choose to have. There will be automated flights that cost less and piloted flights that might cost more.
Just like there are crowded flights with uncomfortable seats that cost less and roomy flights with comfortable seats that cost more.

The airlines decide what to offer, and as long as there are enough cattle that will get on board, they pay no attention to what the customer wants.
 
There are accidents with one pilot but there are also accidents with two pilots. The issue should not be about deciding what rules to force on everyone. The proper non-violent solution here is to unfetter the airplane builders and airlines and let the paying passengers have exactly what they choose to have. There will be automated flights that cost less and piloted flights that might cost more.
I wasn't aware that the FAA was engaging in violence. :confused2:
 
The “problem” with AI, or computers in general, can be summarized in a word: adaptability. Even the best AI fails miserably when given a new set of variables that don’t fall into the pattern of what it’s “learned”. Even humans fail at this on a regular basis.

I think the biggest difference, honestly, is that we accept a human failing to come up with the right answer, but we do not accept a computer failing to do so. We want to believe computers are infallible. Everything in science fiction tells us that. We believe that a computer knows 1 + 1 is always 10, and extrapolate that out to infallibility, if you will.

But I will always trust myself over a computer, even if the computer is almost always better and it makes no logical sense to not trust it. I’d honestly feel better if I killed myself in an airplane (or car) than if the autopilot killed me. Not that I’d feel anything either way, but you know what I mean.

It probably doesn’t help that I’m a programmer and have seen too many instances of a program that’s executed properly millions of times suddenly do something completely unpredictable when given the tiniest bit of different input.
 
It takes 10 years to certify new airplanes or engines. There is no way that pilotless commercial aircraft are going to get certified and be flying in 10 to 20 years. The 737 MAX ended any hopes of that happening in my lifetime.

Where I do see automation coming to aviation in my lifetime is in air traffic control. ADS-B was the first step, but it won’t take much more advancement in technology to eliminate the need for air traffic controllers.
 
Where I do see automation coming to aviation in my lifetime is in air traffic control. ADS-B was the first step, but it won’t take much more advancement in technology to eliminate the need for air traffic controllers.
From my perspective, it would seem that there's an opportunity for automation in ATC to establish a sequence much earlier in the flight. Today, it is common for one sector to slow us down and then the next one to speed us up. Sometimes you have more than one slow/fast cycle on a flight.

I don't see that as eliminating, or even reducing the number of, controllers, but it would give them better information to work from so that subsequent sectors are working on the same plan instead of each having their own plan.
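
As one hedged illustration of what "establish a sequence much earlier" might mean computationally (the callsigns, times, and spacing below are invented): pick a metering fix, take each flight's unconstrained ETA over it, space those ETAs by a minimum interval, and give every sector the same target times, so the delay is absorbed smoothly instead of in slow-down/speed-up cycles.

# Illustrative arrival-metering sketch: schedule each flight over a common fix,
# spaced by a minimum interval, so every sector works to the same plan.
# All callsigns, ETAs, and spacing values are made up.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Flight:
    callsign: str
    eta_fix_min: float      # unconstrained ETA at the metering fix, minutes from now

MIN_SPACING_MIN = 2.0       # required interval between crossings of the fix

def meter(flights: List[Flight]) -> List[Tuple[str, float, float]]:
    # Returns (callsign, scheduled crossing time, delay to absorb en route)
    schedule = []
    slot = None
    for f in sorted(flights, key=lambda f: f.eta_fix_min):
        slot = f.eta_fix_min if slot is None else max(f.eta_fix_min,
                                                      slot + MIN_SPACING_MIN)
        schedule.append((f.callsign, slot, slot - f.eta_fix_min))
    return schedule

demand = [Flight("AAL12", 41.0), Flight("UAL88", 40.2), Flight("DAL07", 43.5)]
for callsign, sta, delay in meter(demand):
    print(f"{callsign}: cross the fix at t+{sta:.1f} min (absorb {delay:.1f} min en route)")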
 
but it won’t take much more advancement in technology to eliminate the need for air traffic controllers.

I would say ATC would be harder to replace with AI than piloting, given all the emergency, priority, defense, and unknown (say, lost-comm) situations. The decision-making in a given situation can be harder than AI can handle (similar to the dilemma posed to the AI driver: if a pedestrian appears in front of the car, does it run the pedestrian over, or swerve and crash, possibly killing the passengers?). Things can add up quickly in ATC decision-making that AI can't handle.
 
We believe that a computer knows 1 + 1 is always 10,...


Try that on a quantum computer and you'll find that there's merely a statistical range of answers with a very high probability that 1+1=10. :)

Part of the issue with AI and neural nets is that the training set is not nearly as vast as the training set a human uses to make decisions, nor can it ever be. Your personal neural net began at birth, and even though you probably don't realize it, your decision making process has been forming and changing all your life, and is affected by everything from the kid who bullied you in third grade to your college philosophy class to what your spouse said just before you left the house. A computer will never have the "life experience" that factors into forming human judgement. Much of a person's knowledge comes from sources that don't get associated with the subject matter under consideration when forming a training set.

A simple example:
In some of our early work on missile launch detectors, I recall the system false-alarming on reflections from swimming pools. A human understands what a swimming pool is and where it's likely to be located and also understands that there won't be a SAM site in a suburban backyard, but it's darned tough to put the entire vast data set of a human mind into an AI training set.
 
In some of our early work on missile launch detectors, I recall the system false-alarming on reflections from swimming pools...
Interesting. After a few dozen "slam on the brakes" false alarms in my Tesla, I realized that what was happening is the car was "seeing" reflections of traffic cones off the bumper of the car in front and it thought the cone was stationary in front of me. I figured this out when I noticed the phantom cones appearing on the screen in the car.
 
Much of a person's knowledge comes from sources that don't get associated with the subject matter under consideration when forming a training set.
Which is part of the reason pilots are non-deterministic controllers and neither explicitly repeatable nor predictable, which is kind of redundant, I know. For the time being we happen to be the best adaptive and reconfigurable controllers available, but our record has improved with the introduction of augmentation.

Nauga,
who will fix it in wetware
 
Which is part of the reason pilots are non-deterministic controllers and neither explicitly repeatable nor predictable...

This is where I'm at as a passenger. I'm all for technology that helps the pilots get me safely where I'm going, but if the SHTF, I want Sully to land me safely on the Hudson rather than a computer try to get me to Teterboro because that is what it "should" do.
 
This is where I'm at as a passenger. I'm all for technology that helps the pilots get me safely where I'm going, but if the SHTF, I want Sully to land me safely on the Hudson rather than a computer try to get me to Teterboro because that is what it "should" do.


Plus how many water landings of airliners are there in the data set that trained the computer? Between them, Sully and Skiles had flown just about every type of winged aircraft, an experience set far beyond the training used for a plane’s AI.
 
That’s all true, but more times than not you have a guy with far less diversity of experience. And the more automation we put in our GA and airline aircraft the less experience in actually flying the guy up front is gonna have.
 
I find myself wondering how good so-called "AI" is going to be at evaluating the weather it sees out the windscreen.
 
That’s all true, but more times than not you have a guy with far less diversity of experience. And the more automation we put in our GA and airline aircraft the less experience in actually flying the guy up front is gonna have.
This is going to be one of the bigger challenges to going single-pilot, in my mind.

Right now, we have a pretty good mentor/mentee system, where (usually) less experienced First Officers fly with more experienced Captains. This is where they learn collaboratively from each other, with the Captain typically mentoring the FO as unusual situations arise during the flight. Now, at a major airline, most every pilot who is hired has been PIC somewhere before, whether that was at the regionals, or military, or other 121 airlines, so it's not as much of a factor, but even when I got to my airline, I definitely learned from the Captains I flew with, even though I had years of heavy-jet left-seat time in the Air Force. This problem, though, would be much more pronounced at the regionals, where you typically have CFIs filling the right seat whose exposure to airline-type flying (faster airplanes, in the flight levels, through much more diverse and challenging weather systems) has probably been close to nil.

With a single-pilot system, where do the Captains in the seat gain the experience to sit there, without first sitting in the right seat under the tutelage of an experienced Captain?
 
where do the Captains in the seat gain the experience to sit there,
You'll find in the current discussions the term "experience" is not used much by those who believe in this route. Instead they use terms like increased training, skill sets, system managers, etc. when explaining their side. It seems they imply experience can be taught vs. learned. There is also a move to split this topic into separate definitions: reduced crew (single pilot) and extended minimum crew ops, with their unique acronyms SiPO and eMCO, respectively. Here are some links to the high-level discussions/papers mentioned in the OP articles. There's a similar move on the maintenance side, but the pilot side is getting all the coverage.
https://www.icao.int/Meetings/a41/Documents/WP/wp_101_en.pdf
https://www.icao.int/Meetings/a41/Documents/WP/wp_099_en.pdf
https://www.icao.int/Meetings/a41/Documents/WP/wp_323_en.pdf
https://www.icao.int/Meetings/a40/Documents/WP/wp_426_en.pdf
 
Plus how many water landings of airliners are there in the data set that trained the computer? Between them, Sully and Skiles had flown just about every type of winged aircraft, an experience set far beyond the training used for a plane’s AI.

Also think about the 767 that ran out of fuel at altitude and glided to a landing (the Gimli Glider)...

Or the crew coordination that helped the DC-10 land (sort of), without hydraulics, at Sioux City, Iowa.

Getting an AI trained to do some of that pilot, er, stuff will be quite challenging, especially with degraded aircraft systems.

Maybe the AI should graduate Test Pilot school...
 