How long until glass panels cost five bucks?

...
And it is VERY disturbing that Garmin would release a product like Garmin Pilot, intended for use in an "emergency," with bugs serious enough to cause an outage. That by itself means you cannot trust it. Their process evidently does not include adequate testing for catastrophic bugs.

Note that this is not the worst failure they could have had. The worst failure (which is VERY common in Garmin non-aviation applications) is to continue to provide data, just incorrectly.


Garmin Pilot is not the emergency app in question - it doesn't provide anything other than a moving map.

Jay's other gyro-substitute apps also rely on GPS; if the GPS has an issue, they become useless.

I'm in full agreement that any standby instrument must work reliably and not itself rely on anything external... and that's why they cost what they do.

But no need to beat up on Garmin for the pilot app in this case.
 
In medicine, we often talk about a treatment looking for an indication.

I call it a solution looking for a problem. Another way of saying the same thing. :D

I even remember when I paid an extra $400 for a 20 meg hard drive upgrade. What a deal! That was 20 years ago, or so.

More than 20 years ago, I'm afraid. I spent about $300 on a 20 MByte drive around 1986. 26 years ago. Ouch!!! Consider how much main memory we put in PCs today. :D :D
 
And notice how they're not as many orders of magnitude faster at doing the same tasks, as the upgrade numbers would suggest they should be. :)

Bloat, bloat, baby. (Sung to the tune of Ice, Ice, Baby.) :(
 
And my first PC did not even have a hard drive; it did everything with true floppy disks. The problem with technology is that it is great as long as it is used appropriately; otherwise it can be quite harmful.
 

Single-digit MHz to 2 GHz, and now you have to wait. Push-button to fully operational on screen should be near instant today... but you have to wait.
Two billion instructions per second should not take almost a minute to boot up.
 

Common misconception there.

Very, very few applications are truly processor-bound, and it has been this way since at least the introduction of the 80386 in the late '80s. Other things dominate execution time: I/O (always orders of magnitude slower than the CPU), memory access, network latency, thread synchronization and other overhead (almost universally underestimated), graphics rendering, and so on. And of course feature bloat makes all of this much worse.
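The CPU-bound vs. everything-else distinction above is easy to see for yourself. Here's a minimal sketch (my own illustration, not from anyone in this thread) that contrasts wall-clock time with CPU time: a processor-bound task burns CPU for nearly the whole interval, while an I/O-style wait leaves the CPU idle.

```python
import time

def profile(fn):
    """Run fn once; return (wall seconds, CPU seconds) it consumed."""
    wall0, cpu0 = time.perf_counter(), time.process_time()
    fn()
    return time.perf_counter() - wall0, time.process_time() - cpu0

def cpu_bound():
    total = 0
    for i in range(2_000_000):   # pure arithmetic, no waiting
        total += i * i

def io_bound():
    time.sleep(0.2)              # stands in for a disk or network wait

w, c = profile(cpu_bound)
print(f"cpu_bound: wall={w:.3f}s cpu={c:.3f}s")   # cpu roughly equals wall

w, c = profile(io_bound)
print(f"io_bound:  wall={w:.3f}s cpu={c:.3f}s")   # cpu near zero, wall ~0.2s
```

A faster processor shrinks only the first number pair; the second pair is untouched no matter how many GHz you throw at it.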

Disk access in particular has not grown at anywhere near the same rate as processor speed, and access times in the milliseconds are still prevalent. This affects bootup time significantly.
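To put that in concrete numbers (round figures I'm assuming for illustration, not measurements): at two billion instructions per second, a single ~5 ms disk seek is ten million instructions' worth of idle time, and a boot sequence that issues a couple thousand small scattered reads spends ten seconds doing nothing but waiting on the platter.

```python
# Back-of-envelope only; both constants are assumed round numbers.
CPU_IPS = 2_000_000_000      # ~2 billion instructions per second
SEEK_S  = 0.005              # ~5 ms average seek on a spinning disk

# Instructions the CPU could have retired during one seek:
wasted = int(CPU_IPS * SEEK_S)
print(f"{wasted:,} instructions idle per seek")          # 10,000,000

# A boot issuing 2,000 scattered small reads waits on the disk for:
boot_reads = 2_000
print(f"{boot_reads * SEEK_S:.0f} s of pure disk wait")  # 10
```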

But the real reason bootup (and other applications) don't scale is poor design. People assume it's an easy job, and it isn't. And the languages that support threading automatically (Java and most stuff after it) encourage really bad design.
 

No misconception at all. And I understand what you're talking about. I used to build custom computers from scratch... as in, take a pile of IC chips, wire, and a soldering iron, design from a blank piece of paper, and go from there. Not even prepackaged designs or PC boards... completely new, right down to the machine code.

My point is that software nowadays is bloated to the point that it's crap, even though it works for the most part. A 2+ GHz computer (the commands in the software have to execute somewhere, somehow, not in magic land) can't keep up with essentially the same type of program (word processor, spreadsheet, whatever) that a 1 MHz Apple II ran without delay off a floppy drive. Yes, so what, an Apple II can't play a movie while editing video, running email, doing photo editing, and keeping 20 windows of real work open at the same time. But for something like a word processor or spreadsheet, you'd think the bazillion-times-faster computer could at least not make you wait... especially after it took a minute or so to load a ton of other junk just to get ready.
Software is like politics, only screwed up...
 
I remember making that argument when Windows 3.1 came out....that was the first 80x86 OS with a fully integrated GUI that you couldn't operate without (unlike Windows 2, which was an application running on top of MS-DOS).

It means the most basic operations require graphics rendering. 25 years later, that's still expensive.

Bloat is a (big) problem, but it's not THE problem. I've only met a handful of application programmers who understood how to design a multithreaded program to get within a factor of 2 or 3 of peak processor speed. And a factor of 10-100 is much more common. It's quite difficult to do, and expensive. When market forces say cost is the only variable, that goes out the window first.
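The factor-of-2-or-3 ceiling mentioned above has a textbook formalization in Amdahl's law: if only a fraction p of the work parallelizes, the serial remainder caps the speedup no matter how many cores you add. A quick sketch (standard formula, my own numbers for illustration):

```python
def amdahl_speedup(p, n):
    """Amdahl's law: overall speedup when a fraction p of the work
    parallelizes perfectly across n cores and (1 - p) stays serial."""
    return 1.0 / ((1.0 - p) + p / n)

# Even 95%-parallel code tops out well short of the core count:
for n in (2, 8, 64):
    print(f"{n:3d} cores -> {amdahl_speedup(0.95, n):5.2f}x")
```

With 95% of the program parallel, 64 cores buy you only about a 15x speedup, and the curve flattens from there. Real synchronization overhead makes it worse, which is why the 10-100x gap from peak is so common.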

Make no mistake -- leaving market forces alone to govern how software engineering is done guarantees that performance will always be barely tolerable. If you want to fix that, you'll need another control variable.
 
The Avare app on my Android phone has made the $Garmin$ 295, 296, 396, 496 series aviation GP$ OBSOLETE. :yes:
 