Android-based EFB discussion

Follow up on this... I took the ASUS cable that came with the wall charger out to the car and tried to charge my Nexus from the 2.1 amp outlet on the car charger. No joy. A cable from iGo in the same outlet charged the unit just fine.

For reference, here are links to Amazon.

iGo coiled cable: http://tinyurl.com/8coetzs
iGo micro-USB tip (A97): http://tinyurl.com/8zt35zk
Mediabridge dual charger: http://tinyurl.com/8u8all3

The iGo tip works with my iGo auto/AC adapter too... so I can run my laptop and my Nexus, or my phone, from the charger. Just a plug for iGo, as I bought one three years ago and it's powered all my different laptops and USB stuff properly.

The 2.1 amp outlet on the car charger must be made to the Apple standard, not to the worldwide (everyone else) standard.

I wonder how the iGo cable counters the information the N7 is receiving from the car charger?


Sent from my Nexus 7
 
The 2.1 amp outlet on the car charger must be made to the Apple standard, not to the worldwide (everyone else) standard.

I wonder how the iGo cable counters the information the N7 is receiving from the car charger?


Sent from my Nexus 7
My guess is that the iGo shorts things right at the tip, since it has interchangeable tips.
 
The 2.1 amp outlet on the car charger must be made to the Apple standard, not to the worldwide (everyone else) standard.

That's not my experience. I have multiple 2.1A chargers and ALL charge the iPad2 just fine. NONE would charge the Nexus 7, so it would appear there's something funky in what Asus is doing. Cables didn't make a difference. Cables that wouldn't charge it with my iPad adapters would charge it on the adapter that came with the Nexus.

I also learned from those forums there are times the Nexus is actually charging, but the status display won't indicate a charge since it is charging at a lower rate.

From a charging standpoint, my iPad2 is much more "industry standard" than my Nexus 7.
 
That's not my experience. I have multiple 2.1A chargers and ALL charge the iPad2 just fine. NONE would charge the Nexus 7, so it would appear there's something funky in what Asus is doing. Cables didn't make a difference. Cables that wouldn't charge it with my iPad adapters would charge it on the adapter that came with the Nexus.

I also learned from those forums there are times the Nexus is actually charging, but the status display won't indicate a charge since it is charging at a lower rate.

From a charging standpoint, my iPad2 is much more "industry standard" than my Nexus 7.

Well, if you consider Apple's new method to be the "industry standard", you're right. I'm afraid the rest of the world disagrees, but that's pretty irrelevant, now. We are now all stuck trying to find chargers that will really put out 2.1 amps to Android devices -- and there's no good way to tell by looking at them. How dumb is this?

Sent from my Nexus 7
 
My guess is that the iGo shorts things right at the tip, since it has interchangeable tips.

Makes sense. There are lots of guys making their own cables (over on the Android forums) by soldering the two appropriate wires together.

Sent from my Nexus 7
 
If you're willing to root your device, that may be another option. I have found a software mod that lets my HTC Incredible use full-current charging via "USB" connections like my 2.1A car charger, for instance.
 
Well, if you consider Apple's new method to be the "industry standard", you're right. I'm afraid the rest of the world disagrees, but that's pretty irrelevant, now. We are now all stuck trying to find chargers that will really put out 2.1 amps to Android devices -- and there's no good way to tell by looking at them. How dumb is this?

Most adapters DO tell you the amperage output. They're all either 1.0 or 2.0 or 2.1A. I can't fault Apple for needing more juice to charge the bigger battery in the iPad, rather than needing 12-15 hours to charge it at a lower rate. What frustrated me was that my Asus wouldn't recognize ANY of the 2.xA chargers that all worked fine with the iPad and several Android devices. I fault it for being finicky!
 
Well, if you consider Apple's new method to be the "industry standard", you're right. I'm afraid the rest of the world disagrees, but that's pretty irrelevant, now. We are now all stuck trying to find chargers that will really put out 2.1 amps to Android devices -- and there's no good way to tell by looking at them. How dumb is this?

Sent from my Nexus 7

I am rather curious -- what is this "new standard" and what is the "correct standard"? What is the technical difference and history behind the two?
 
I wish there was a viable alternative to ForeFlight on Android.

Garmin is a very viable alternative to ForeFlight...and cheaper at the moment. It's not as intuitive an interface as FF, but it's certainly usable. I've been using it more than FF of late just because of the convenient form factor of the Nexus 7 when flight instructing in and out of different cockpits all day.
 
Most adapters DO tell you the amperage output. They're all either 1.0 or 2.0 or 2.1A. I can't fault Apple for needing more juice to charge the bigger battery in the iPad, rather than needing 12-15 hours to charge it at a lower rate. What frustrated me was that my Asus wouldn't recognize ANY of the 2.xA chargers that all worked fine with the iPad and several Android devices. I fault it for being finicky!

Yes, the charging device says "2.1 amps" -- but if it's made to the Apple standard, your Android device won't know to draw 2.1 amps. It will pull the lower 500 milliamps, which is not sufficient to keep up with demand when you're running a big screen, GPS, Bluetooth... It will think it is connected to a PC, because it won't sense a short in the appropriate wires -- which is the worldwide standard.
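
To put rough numbers on "not sufficient to keep up with demand" (the figures below are my own guesses, not Asus specs), a quick back-of-the-envelope in Python:

# Back-of-the-envelope sketch with assumed numbers: ~1.2 A is my guess for
# the load with a big screen, GPS, and Bluetooth running; 16 Wh is roughly
# a Nexus 7-class battery (4325 mAh at 3.7 V nominal).
draw_a = 1.2        # assumed device load while running EFB apps
charge_a = 0.5      # PC-rate charge current the device falls back to
battery_wh = 16.0   # approximate battery capacity
volts = 3.7         # nominal battery voltage
net_drain_w = (draw_a - charge_a) * volts
print("about %.1f hours until empty" % (battery_wh / net_drain_w))  # ~6 h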

Sent from my Nexus 7
 
I am rather curious -- what is this "new standard" and what is the "correct standard"? What is the technical difference and history behind the two?

I am no engineer, so bear with my limited electrical vocabulary. That said, here is what I have gleaned from the interweb:

There was one primary way for devices to "know" they were plugged into a wall charger (and could therefore draw a full power 2.1 amp charge): The device sensed a short circuit in two of the wires, either in the USB cord, or in the unit itself, and that was the signal it used to know what to do.

No short, it knew to only pull 500 milliamps, like a PC would provide. Yes short, and it knew to pull 2.1 amps, like a car charger or wall charger would provide. Simple, elegant, and it worked. (I may have it backwards.)

Then, Apple thought up a "better way" to do it. Apparently (and this is conjecture -- no one seems to know, or at least they're not saying) Apple started using a resistor instead of a simple short to delineate between the two. This is just different enough that Android devices can't tell 'em apart.

The chargers will still work with Droids, but the device will always see it as a PC -- and will therefore never draw 2.1 amps.
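
Spelled out as a sketch (my wording, not anyone's spec; a real device does this in a charger-detection circuit, not in software):

# Hypothetical sketch of the device-side rule described above. In a
# standard USB cable the data wires are green (D+) and white (D-), which
# is the pair people are soldering together.
def charge_current_ma(data_lines_shorted):
    if data_lines_shorted:
        return 2100  # dedicated-charger signature: pull the full rate
    return 500       # anything else looks like a PC port: trickle only

# An Apple-style charger holds the data lines at fixed voltages instead of
# shorting them, so this test fails and the device trickles at 500 mA.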

And the device dies a slow death in heavy usage -- as it did to us Sunday.

Currently there is apparently no way to tell a charger made to the Android standard from one made to the Apple standard -- which is causing a fair amount of confusion in the Android world. The key seems to be to use only Android-device-branded chargers, which are (of course!) four times more expensive.

It's all just a stupid PIA.



Sent from my Nexus 7
 
The maximum charging current in the USB specification is 900 mA. The needs of devices jumped far ahead of the specifications.

A later addendum called the Battery Charging Specification was added in haste in December of 2010, long after Apple's original iPad had been designed and had already hit the streets. Many of Apple's computer models had also been updated to charge it months before the device shipped, since that hardware development work started far earlier than the iPad's release date.

The USB 3.0 spec (and all USB specs) includes the concept of digital negotiation between the device requesting power and the host device, in "unit loads" of 100 mA (150 mA under USB 3.0).

Various schemes for supporting more than the maximum six unit loads (900 mA total) that can be requested from a standard USB 3.0 host are currently in use.
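
As an aside, you can actually watch that request from the software side. A minimal sketch using the third-party pyusb library (my choice purely for illustration; it assumes pyusb is installed and you have permission to enumerate devices):

import usb.core

for dev in usb.core.find(find_all=True):
    try:
        cfg = dev.get_active_configuration()
    except usb.core.USBError:
        continue  # unconfigured or inaccessible device
    # bMaxPower is stored in 2 mA units on USB 2.0 (8 mA units on
    # USB 3.x), so five unit loads (500 mA) shows up as 250.
    print("%04x:%04x requests %d mA" %
          (dev.idVendor, dev.idProduct, cfg.bMaxPower * 2))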

Apple's scheme utilizes the presence of specific voltages on certain pins to extend the functionality of the USB ports on their computers, and the shorted-pin idea you've mentioned has been done on some other manufacturers' charging devices.

The shorting technique shorts the data pins. It's meant for charger-only devices, and data comm is not possible while simultaneously charging from such a cheap device. Hard to talk when your data lines are shorted together.

The iPad was one of the first widespread non-industrial devices that needed high-current charging, and Apple had to design their own solution because there was no standard at the time.

Later devices, such as some of the Barnes and Noble Nook readers, also had to draw high current and simply require a special charger.

While various manufacturers have decided one scheme or another is best for them, there's no ratified worldwide standard, even if USB 3.0 could truly be considered a standard. (Various host chipsets early on didn't meet the intent of the standard for things the standard was really created to solve, like data transmission speeds. Problems with hubs and power weren't straightened out for quite a while, and problems still persist in the real world.)

Even powered-off machine charging was something Apple had to consider, as did the chipset makers, early on before there was a standard.

The answer still lies with the manufacturers really.

Does the device manufacturer provide a tested and working car charger solution they're willing to put their brand-name on for their device?

That's the best way to get it right for most non-technical users.

Or wait for the third-party charger manufacturers or user community to start labeling charging devices as compatible with the device.

(Note, for the record, Apple doesn't make one either. They make the venerable airline adapter and an impressive international wall charging adapter kit, but left the liability and other problems with automotive 12VDC charging up to third party vendors.)

As far as your hope that micro-USB will fix all of your charging problems forever, don't count on it.

Just by picking micro-USB, if the manufacturer doesn't utilize the extra side connector with additional pins, they're choosing to lock the device at USB 2.0 for data rate, since micro-USB simply doesn't have enough pins in the single connector format to do USB 3.0.

Five pins isn't enough. So no 5 Gbps data transfers for any device that shackles itself to micro-USB.

So Jay, yes. It's a PITA, but not one of Apple's making.

The "standards" didn't keep up with device power requirements and all manufacturers plowed forward with engineering various solutions for their customers.

Since there are significant lead times between when a manufacturer asks a chipset maker to add a feature like high-current charging and when final production motherboards and daughterboards come off the assembly line, the USB 3.0 charging spec was too little, too late.

In the end, tiny-gauge wiring and circuit board traces, amongst various other things, hampered development of high-current charging ports. That, and there were really no devices that needed it on a widespread basis until the iPad.

USB was originally designed to replace parallel ports, traditional serial ports, etc. The fact that it has even held up this long, doing things it was never originally intended to do, is pretty impressive.

But Apple didn't cause any of your headaches with your new high current draw device. Those problems were already out there.

Ever see a USB Y-cable for portable USB hard drives in the USB 2.0 days? The drive needed more than 500 mA of power. The simple solution was to use two USB ports on the host.

USB shows some signs of age today. Most machine manufacturers would prefer not to have a 5V bus, and many don't, since lithium-ion batteries lead more directly to 3.3V operation in laptops. Charge pumps and other schemes raise the voltage to the 5V nominal spec.

If you look carefully at Apple's new connector, its pins are wide. I suspect one of the reasons they decided to drop the Dock connector was the current-carrying capability of the pins. When the new connector hits the iPad, watch for a faster charge time announced as a new feature, I bet. Fewer pins, wider conductors, bigger gauge wire inside the cable. Just a guess.
 
Excellent info. That clarifies much about the situation. Goes to show that few solutions are the "perfect" solution forever. (Except for Oreos....using TWO cookies and the filling in the middle. That has stood the test of time.)
 
Yes, the charging device says "2.1 amps" -- but if it's made to the Apple standard, your Android device won't know to draw 2.1 amps. It will pull the lower 500 milliamps, which is not sufficient to keep up with demand when you're running a big screen, GPS, Bluetooth... It will think it is connected to a PC, because it won't sense a short in the appropriate wires -- which is the worldwide standard.

Sent from my Nexus 7

Apple senses voltage levels on pins to determine available current.
The Universal Charging Solution "standard" calls for pins to be shorted to indicate a charging device. It's A standard, but calling it THE standard is a stretch, in my opinion. That would be like calling Betamax THE standard. It was a standard, it just wasn't successful. I can think of many other examples.

So if a charger is made to work with Apple devices, it won't short the pins at the charger, and your non-Apple device won't charge correctly.
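
From what I've read (community reverse engineering, nothing official from Apple), those voltage levels come from resistor dividers holding the data lines at combinations of roughly 2.0 V and 2.7 V to signal different current ratings. A sketch of how a detect routine might classify a charger, with those values as unverified assumptions:

# Sketch only; the voltage pairs are community-reported figures for Apple
# chargers, and I may well have the D+/D- assignments backwards.
APPLE_SIGNATURES = {
    (2.0, 2.0): 500,   # mA
    (2.0, 2.7): 1000,
    (2.7, 2.0): 2100,
}

def classify_charger(dplus_v, dminus_v, lines_shorted):
    if lines_shorted:
        # Universal Charging Solution style: data pins tied together.
        return ("shorted-pin charger", 1500)
    key = (round(dplus_v, 1), round(dminus_v, 1))
    if key in APPLE_SIGNATURES:
        return ("Apple-style charger", APPLE_SIGNATURES[key])
    return ("standard USB host", 500)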

It seems to me that this is a case where the incredible popularity of Apple devices results in economic pressures to conform to their standard, rather than others.

Sort of like POSIX/GOSIP? It doesn't really matter who publishes a standard; in the end, what matters is what people will pay for.
 
I, the consumer, want my iPad and my Galaxy S2 to charge from the same plug =(
 
Excellent info. That clarifies much about the situation. Goes to show that few solutions are the "perfect" solution forever. (Except for Oreos....using TWO cookies and the filling in the middle. That has stood the test of time.)

Agreed. Thanks for clarifying my admittedly muddy understanding of a very complicated situation...

Sent from my Nexus 7
 
I, the consumer, want my iPad and my Galaxy S2 to charge from the same plug =(

I'm seriously considering just cutting up a couple of USB extension cables so one always acts like the not-Apple standard and one acts like the Apple standard; then I don't have to figure out which of my chargers does which.
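
If you try it, here's my understanding of the standard wire colors inside a USB cable (a convention, not a guarantee; verify with a meter before cutting):

# Hypothetical cheat sheet for the cable mod; wire colors are the usual
# convention only.
USB_WIRES = {
    "red":   "VBUS (+5 V)",
    "white": "D- (data)",
    "green": "D+ (data)",
    "black": "GND",
}
# "Not-Apple" cable: cut the data lines and short green to white on the
# device side of the cut, so the device sees a dedicated-charger signature.
# The "Apple" cable is harder: a passive cable can't supply the divider
# voltages an Apple device looks for, so that half of the plan may need
# actual resistors in the plug.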
 
FYI... there's a new FREE Android moving map app... AVARE. Very simple, but effective. Full sectional and AFD coverage. Check it out.
 