Avionics Production Costs

I have no doubt that getting the government involved triples the cost - but upgrading my G1000 to have synthetic vision and WAAS is $40k.

My iPad and Stratus 2 are less than $2k.

The G1000 is very complicated to use. ForeFlight is not.

Over-design and needless complication have a hand in the higher price.
 
Not true.

Paperwork. The cheap consumer electronics don't need to track parts and suppliers. Plus, the cheap consumer electronics can use whatever parts they want to and can change them at will. A certified box cannot change any of the internal parts without an evaluation of the functional and safety impact.

Show me the certification requirements of a non-TSO radio.

There is no TSO to comply with.
 
Did we mention DO-160 and DO-178 yet? Also the whole record-keeping thing to keep the MIDO happy.
 
I think a great deal of what Dale is saying is spot on, but I also think that some of the price is purely them pushing the price up because they can.

The G1000 is the top dog right now, they know it.

Welcome to capitalism. If the price charged is too high, they won't sell and the price will be lowered. As long as they sell, they'd be stupid NOT to maximize their profits. The company wasn't founded on altruism.
 
One other thing that I don't think was emphasized enough, compatibility testing. These are not stand alone devices like an iPad. They have to work with a bunch of other products from disparate manufacturers. Look at all the different equipment everyone has in their aircraft and all of the different ways it interacts. What a mind bending challenge to get it all to work reliably enough for people to stake their lives on.
 
Not really, the inter-unit communications protocols have been standardized across the industry for quite some time.
 
Are you suggesting that Garmin can build a G500 and never test it with all of the various autopilots, nav heads, GPSs, etc.? As long as they have the standard communications interface, ship it?
 
That's pretty much the way it's done.
 
Do we forget that it is not the FAA that controls the manufacture of transmitters and communication devices?

And what manufacturer would manufacture a radio and call it an aircraft radio unless it would fit and function as such?
 
One other thing that I don't think was emphasized enough, compatibility testing. These are not stand alone devices like an iPad. They have to work with a bunch of other products from disparate manufacturers. Look at all the different equipment everyone has in their aircraft and all of the different ways it interacts. What a mind bending challenge to get it all to work reliably enough for people to stake their lives on.

iPads aren't standalone either.

As toys, people tolerate the Bluetooth dropping pairs, the wireless cycling, or poor audio filtering. That's unacceptable for avionics.

It's kinda interesting hearing how trivial all the non-engineers think engineering is.

As a rule of thumb, in software, you spend 50% of your time at all levels of testing. For safety critical systems, that goes much higher. TSO or not, no engineer wants a bug to kill people and no product lead wants a device getting people lost due to poor design or testing.
 
We all know you need to make it work. The question is how much certification costs after you know it works. I don't think anyone cares about evil profit maximization; we're just wondering how much profit there is after the gov't approval process meddles in the market.
 
That's pretty much the way it's done.
I can tell you've never done it. I'm a one-man shop and even I don't design and sell stuff that way. If you think it's a simple matter to just toss in some "standard" because everyone else adheres to it completely as documented, or that you can get away with just writing off compatibility with equipment that's got little quirks or non-compliant features... well... :dunno: dunno what to tell ya.

There's a reason I don't tell everyone how boats are piloted, since I have absolutely no clue about the subject and have never done it.
 
Do you think it's flight tested on every conceivable iteration of A/P? No, it's tested on a bench to make sure it provides the correct signals.
 
And then it's tested on said bench with other manufacturer's equipment, and then the real fun begins -- the part where you fix all the stuff you find that doesn't work. Regardless of what you seem to think, it's not trivial by any stretch of the imagination. Even making sure it works with your own equipment is going to take up significant development and testing time and resources. Then there's regression testing for every little change you make.

Protocols may have been standardized, but there's a lot more to it than just writing code to support how you think the protocol works.
 
We all know you need to make it work. The question is how much certification costs after you know it works. I don't think anyone cares about evil profit maximization; we're just wondering how much profit there is after the gov't approval process meddles in the market.

Do you really know that?

'Cause exhaustive testing of a million lines of code is thoroughly impossible.

You can only approximate testing, and getting a good approximation is an art. Have green or overworked folks doing this, and it's going to have big problems.

It's clear you don't do this kind of work. A LOT of devices that don't work or work poorly get out. Anyone remember the Windows 98 roll-out? Serious bugs in consumer devices are the norm, not the exception.
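The scale problem behind "exhaustive testing is impossible" is easy to put numbers on. A quick sketch with illustrative figures (the branch count and test rate are made up, chosen to be charitable):

```python
# Even 60 independent branch points -- trivial for a million-line
# codebase -- generate more path combinations than you could ever run.
branches = 60
paths = 2 ** branches                    # every combination of outcomes

tests_per_second = 1_000_000             # a wildly optimistic test rig
seconds_per_year = 60 * 60 * 24 * 365
years_to_cover = paths / (tests_per_second * seconds_per_year)
print(f"{years_to_cover:,.0f} years")    # tens of thousands of years
```

That's why test selection is an approximation, and an art.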
 
I think that what often comes into play in these situations are fixed vs marginal costs, and market segmentation. An example from a different but somewhat related field - healthcare.

I just signed up for a CPR course, and basically the same course is offered for $25 or $60. The difference between the two is that the $60 version comes with a certificate of completion, and the $25 version does not.

I don't need the certificate, because my interest is in being able to save a life if somebody nearby drops due to a heart problem. In my state at least I do not need proof of training to be completely safe from liability (as long as I'm not intentionally causing harm).

However, if I wanted a job as a receptionist in a doctor's office and they wanted their staff to be CPR trained, chances are I'd need the more expensive training. They might need documentation of training for insurance purposes, or to avoid lawsuits.

Looking at the cost side, the cost to deliver the course itself is the same either way (minus the piece of paper the certificate is printed on). However, the instructor probably has fixed costs (employee time between delivering courses, office space, etc). Most likely these expenses are being borne more by the certified classes. Those taking the certified class are less price-sensitive - if you need the certificate for your job then you HAVE to take the class. Those who don't need the certificate might go without, or just watch YouTube videos or whatever.

As long as you sell a product for more than its marginal cost of production you're better off selling it than not selling it. However, what you don't want to do is sell a product for marginal cost when you could have sold it for more, especially if you are at risk of not meeting your fixed costs.

Bottom line is that those who need TSOed equipment have little choice (largely commercial operations - casual users aren't really the target here though current regs snag them). So, they are going to pay more. The manufacturer could certainly sell any number of devices at the non-TSO price and come out ahead. However, if they ONLY sold devices at the non-TSO price they'd probably take a loss. At some point they're also in it to profit and I have no idea what the margins are. Barriers to entry are probably fairly high in these industries, but you can only sell at a high margin for so long before competition arises.
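The fixed-versus-marginal-cost logic above can be sketched with toy numbers. Every figure below is hypothetical (these are not real avionics margins or volumes):

```python
# Hypothetical: one product line, two market segments.
fixed_costs = 2_000_000        # certification, engineering, overhead per year
marginal_cost = 300            # parts + assembly per unit

tso_price, tso_units = 10_000, 300       # captive certified market
exp_price, exp_units = 2_000, 1_000      # price-sensitive non-TSO market

# Each sale above marginal cost contributes toward covering fixed costs.
contribution = (tso_units * (tso_price - marginal_cost)
                + exp_units * (exp_price - marginal_cost))
profit = contribution - fixed_costs
print(profit)      # 2610000: both segments together are profitable

# Selling ONLY at the non-TSO price doesn't cover the fixed costs.
exp_only = exp_units * (exp_price - marginal_cost) - fixed_costs
print(exp_only)    # -300000: a loss
```

With these made-up numbers, every non-TSO unit sold above $300 still helps, but the business only clears its fixed costs because the captive TSO segment pays the high margin.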
 
Do you really know that?

'Cause exhaustive testing of a million lines of code is thoroughly impossible.

You can only approximate testing, and getting a good approximation is an art. Have green or overworked folks doing this, and it's going to have big problems.

It's clear you don't do this kind of work. A LOT of devices that don't work or work poorly get out. Anyone remember the Windows 98 roll-out? Serious bugs in consumer devices are the norm, not the exception.

What are you calling "exhaustive testing"?

As a software engineer I can tell you that if you do good test driven development you can actually get an amazing amount of code coverage in your tests. You write your tests first, then code to the test.

The issue is most companies and shops don't want to pay for the testing and have a "we'll write it correctly the first time" mentality, which is murder. You pay for testing one way or the other, heh.

But just testing the code doesn't catch all the issues, which is what I think you're getting at. Every single PC out there has slightly different hardware in it. If you think about it, it's actually amazing Windows works as well as it does.

Those different hardware challenges would be impossible to test 100% across the board because it would be completely unrealistic to build every possible hardware combination. I imagine aircraft avionics suffer much the same issues.
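A minimal sketch of that test-first loop, using a made-up `crossing_altitude` helper (the function name and behavior are invented for illustration, not taken from any avionics product):

```python
import unittest

class TestCrossingAltitude(unittest.TestCase):
    # In test-first style these cases are written before the
    # implementation below; they define the contract the code must meet.
    def test_climb_levels_off_at_target(self):
        self.assertEqual(crossing_altitude(5000, 7000, 1000, 5), 7000)

    def test_descent_levels_off_at_target(self):
        self.assertEqual(crossing_altitude(7000, 5000, -500, 10), 5000)

    def test_level_flight_is_unchanged(self):
        self.assertEqual(crossing_altitude(6000, 6000, 0, 30), 6000)

def crossing_altitude(current_ft, target_ft, vs_fpm, minutes):
    """Predict altitude after `minutes` at a constant vertical speed,
    leveling off at the target rather than blowing through it."""
    predicted = current_ft + vs_fpm * minutes
    if vs_fpm > 0:
        return min(predicted, target_ft)
    if vs_fpm < 0:
        return max(predicted, target_ft)
    return current_ft
```

Run with `python -m unittest`; the point is that the assertions existed before the `min`/`max` clamping did, so the code was written to satisfy them.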
 
The question isn't how hard engineers' lives are or whether testing is sometimes insufficient. It's simply how much more certification testing (above what they would do normally) costs, and how much of the price increase is due to that testing versus how much is because they can charge that much.
 
What are you calling "exhaustive testing"?

As a software engineer I can tell you that if you do good test driven development you can actually get an amazing amount of code coverage in your tests. You write your tests first, then code to the test.

The issue is most companies and shops don't want to pay for the testing and have a "we'll write it correctly the first time" mentality, which is murder. You pay for testing one way or the other, heh.

But just testing the code doesn't catch all the issues, which is what I think you're getting at. Every single PC out there has slightly different hardware in it. If you think about it, it's actually amazing Windows works as well as it does.

Those different hardware challenges would be impossible to test 100% across the board because it would be completely unrealistic to build every possible hardware combination. I imagine aircraft avionics suffer much the same issues.

Ah, the code coverage myth.

That assumes all tests are good tests, and that all requirements are good requirements, and that nothing is implicit.

In reality, such coverage testing often includes tests that don't really test anything. Especially related to failure modes; failure to enumerate all possible faults is a very common problem. I've never been involved with a project that had complete and absolutely correct requirements.

It also assumes perfect segmentation. That's something a single very good developer can design in, but it is never realized in practice with a team.

Exhaustive testing means you know every line of code works. Coverage tools give you the illusion that you have that. They are the best we have, but it is still far short of exhaustive testing.

As a systems engineer -- not just a software engineer -- the system must work as a whole for its intended purpose. This means you can find faults in requirements and in the associated tests as well. No coverage testing tool will uncover those.
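A toy Python illustration of the coverage trap described above (the function, its rule of thumb, and the test are all invented for the example): one test can execute every line and still prove nothing about the failure modes.

```python
def density_altitude_simple(pressure_alt_ft, oat_c):
    """Rule-of-thumb density altitude: pressure altitude plus 120 ft per
    degree C above ISA.  ISA is 15 C at sea level, lapsing 2 C per 1000 ft."""
    isa_temp_c = 15 - 2 * (pressure_alt_ft / 1000)
    return pressure_alt_ft + 120 * (oat_c - isa_temp_c)

# This single test executes 100% of the lines above...
assert density_altitude_simple(0, 15) == 0

# ...yet it never exercises the failure modes that matter: what should
# happen when a faulty air-data source reports oat_c as None, NaN, or
# -900?  Full line coverage, zero coverage of the unenumerated faults.
```

A coverage tool would report this module as fully covered, which is exactly the illusion being described.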
 

Well if you write crap tests then yeah you're gonna have worthless testing. Garbage in, Garbage out.

It's also amazingly difficult to write software to a moving target, and any developer who has been in the industry more than a minute can tell you that the powers that be change their minds with the wind. Often vast swaths of previously required critical code just go out the window, including the exhaustive tests, which will likely be a little less exhaustive on the next go-around because, again, they don't want to pay for testing.

No client does.. And most product managers don't want to fight that fight with the client so they don't want it either..

I was an IBM engineer for 6 years and was given the "write it correctly the first time" speech more than once because they saw testing as a waste of time and money.

Having said that, if you have a good standard and your people follow it, you can get excellent code coverage whether you think you can or not. That code coverage, though, tests the classes, objects, methods, and functionality internally, so that you get out of a model or controller what you expected to.

People being people, though, you need someone to monitor the tests to make sure that they are written correctly, which, again, is rarely paid for. And no, testing isn't infallible, but it's a damn sight better than having no testing at all, which I can tell you from experience is exactly how much testing many large companies have.

Frankly, I find the best kind of testing also involves a warm body trying to make the code break. But again, they don't want to pay for that, and IBM, Dun & Bradstreet, and a few other major companies I worked for would outsource that stuff to China and India, where amazingly everything always worked! Even when we wrote intentional bugs into the code.

You get what you pay for. And I always tell my clients that testing is there to help find breakage, and you will pay for that one way or the other. All too often I think companies just go light on it and hope they don't get sued.
 
Warm bodies are really useful.

I have my favorite ops testers I like to go to. Show them the hot new feature and let them break it. These are the senior users who really know the high-level system well, and it's their job to get it to work at mission time.

Of course, this is an adjunct to formal testing, not a substitute. But it comes up with conceptual problems every once in a while, something no other testing paradigm can do. This makes these guys worth their weight in gold.

At roughly $4/second ops costs, management is very interested in front-loading the testing. What they aren't interested in is giving us the jet to test on (even on the ground). Some of the systems do require that.

The coverage tests give you verification. They don't in general validate the models.
 
Having been in electronics since 1978, I think the first reply ("Because they can be.") is the closest :)

The market has huge barriers to entry.

The biggest one is dealer loyalty, which is quite complex. It starts with the fact that certified avionics must be installed by an avionics installer (actually they don't quite have to be, but 99.9% of owners could not design and organise the installation themselves), and the installer makes about 25% on the supply of the gear. And 25% of $10k is a lot more than 25% of $2k.

If you don't believe this, buy some box (say a GTN750 - if you can get one without being a Garmin dealer, which is possible with some subterfuge) and offer to free issue it to an installer, and see how his quote changes from what it was when he was supplying the box.

Another factor, also part of dealer loyalty, is that most installers don't really understand the equipment interconnections, and don't have the equipment for debugging anyway. For example, almost nobody is able to look at an ARINC 429 data stream (with a data scope) and find out why some interconnection is not working. No, all he can do is phone up Garmin or whoever for help. Very few dealers would move to a new vendor, and even fewer if the product was cheaper.
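For anyone curious what that data scope is looking at, here is a rough sketch of splitting a raw ARINC 429 word into its standard fields (8-bit label, 2-bit SDI, 19-bit data, 2-bit SSM, odd parity). This is a generic illustration, not any vendor's implementation, and the per-label data encodings (where the real debugging pain lives) vary widely:

```python
def decode_arinc429(word: int) -> dict:
    """Split a raw 32-bit ARINC 429 word into its standard fields.
    Bit 1 is the LSB here; on the wire the label bits go out first."""
    label_raw = word & 0xFF              # bits 1-8
    # The label is conventionally read as octal with its bits reversed.
    label = int(f"{label_raw:08b}"[::-1], 2)
    return {
        "label_octal": oct(label),
        "sdi":  (word >> 8)  & 0x3,      # bits 9-10: source/destination
        "data": (word >> 10) & 0x7FFFF,  # bits 11-29: payload
        "ssm":  (word >> 29) & 0x3,      # bits 30-31: sign/status matrix
        "parity_ok": bin(word).count("1") % 2 == 1,  # bit 32 makes odd parity
    }
```

Pulling the fields apart is the easy part; knowing whether a given label's 19 data bits are BNR, BCD, or discrete, and whether the sending box actually follows the convention, is what takes the data scope and the experience.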

One doesn't need to come up with more reasons than the above, for why there isn't much more competition.

Certification is not hard or expensive but you need to know how to work the process and you need a relationship with the right people in the FAA, and for Europe in EASA. A newcomer won't have these connections, and a fresh start is a really daunting prospect. For example a while ago I looked at making a simple PMA part and even finding out the process was difficult.

It costs only about $300 to build a GTN750 and put it on the shelf. Somebody could make and sell one (with a 25% dealer margin built in) for say $2000 but that is quite irrelevant if nobody in the installation pipeline is going to buy it.
 