Thermocouple wire connector

If both ends of the wires are in physical contact in a butt splice,
But it's still a butt splice; that was the point of my comment. You'll also note JPI permits soldering, which thermocouple OEMs (including Omega) say is a no-no, yet it is acceptable in this case, as is the splice. You'll find the OEM system tolerances drive the thermocouple practices: some systems are critical, like on turbines, and do not permit such variance in practice. It is what it is.
If it is in a more protected area, probably not much difference.
Exactly. Depending on the system the difference does not fall outside the indicator scale tolerance. Wire routing has a bigger influence on readings than connector selection.
 
It depends on the temperature difference between the two connectors. If near a hot exhaust pipe, one connector could be much hotter than the other, resulting in that same difference being added/subtracted from the temperature you are actually trying to measure. If it is in a more protected area, probably not much difference.

When you say "temperature difference between the two connectors" do you mean the difference between the male and female connector for a single sensor? Or between the connectors of two different sensors?

Because if it's the difference between the male and female connectors, I can't imagine there would be much difference in their temperatures since they are... you know, connected.
 
Every possible metal combination creates a small voltage that varies with temperature. For most metals commonly used for conductors (wiring) the voltage is too small to be useful. Some alloy combinations generate a voltage large enough to be useful, and that’s what thermocouples are made from. The Type K thermocouple commonly used in EGT probes has a Nickel-Chromium alloy on the + side and a Nickel-Aluminum alloy on the – side. When you connect those wires at the “hot junction” you get up to some tens of millivolts. At the other end where the thermocouple is connected to copper wires (the “cold junction”) it's colder so you get a lower voltage, in the opposite direction. From the difference between the hot and cold voltages you can calculate the temperature difference between the hot and cold junctions.
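The "tens of millivolts" figure above can be sketched with a quick calculation. A linear Seebeck coefficient of roughly 41 µV/°C is a common rule-of-thumb approximation for Type K; real instruments use the NIST polynomial tables, and the function name here is just illustrative.

```python
# Rough sketch of Type K loop voltage using a linear approximation.
# ~41 uV/degC is a rule of thumb; actual Type K response is nonlinear
# and gauges use the NIST reference tables instead.

SEEBECK_K_UV_PER_C = 41.0  # approximate Seebeck coefficient, uV per degC

def type_k_emf_mv(hot_c, cold_c):
    """Approximate net loop voltage (mV) between hot and cold junctions."""
    return SEEBECK_K_UV_PER_C * (hot_c - cold_c) / 1000.0

# Hot junction in the exhaust at 760 C (~1400 F), cold junction at 20 C:
print(round(type_k_emf_mv(760.0, 20.0), 2))  # 30.34 mV -- "some tens of millivolts"
```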

Note that a thermocouple by itself can't give you an absolute temperature, just the difference between the hot and cold junctions. Thus the gauge is calibrated assuming a standard temperature for the cold junction (commonly a 32°F ice bath for precise laboratory measurements or 60°F for real-world measurements). Some gauges may include some other means to measure the cold junction temperature and compensate the reading accordingly. I can think of a number of ways to do that but I don’t know what’s common in aircraft instruments, I’ve only used the uncompensated ones.

What that means for an uncompensated gauge with the cold junction being at ambient temperature is that if the air temperature rises, the indicated temperature will be lower for the same actual engine temperature. So assuming you have a gauge calibrated at 60°F and your EGT is 1400 and the outside air temperature is 90°F, the gauge will read 1370, not the 1400 the engine is actually operating at.
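The 1400/1370 example above is just this arithmetic, sketched here assuming a gauge calibrated for a 60°F cold junction (function names are mine, not from any instrument):

```python
# Sketch of the uncompensated-gauge error described above. The gauge is
# calibrated assuming a 60 F cold junction; any cold-junction temperature
# above 60 F shifts the reading down by the same amount.

CAL_COLD_F = 60.0  # assumed calibration temperature of the cold junction

def indicated_egt_f(actual_egt_f, cold_junction_f):
    # The thermocouple only reports (hot - cold); the gauge adds back the
    # assumed 60 F, so extra cold-junction heat reads as lower EGT.
    return actual_egt_f - (cold_junction_f - CAL_COLD_F)

print(indicated_egt_f(1400.0, 90.0))  # 1370.0, matching the example above
```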

Now if the cold junction is inside the engine compartment where it could be even hotter, well, you can do the math. Ideally, you would keep the cold junction at the temperature it’s calibrated at. If the extension wires all the way back to the gauge are the same Ni-Cr and Ni-Al then the cold junction is at the gauge and if you have cabin heat so it's never too hot or cold behind the instrument, the gauge will be reasonably accurate.
 
When you say "temperature difference between the two connectors" do you mean the difference between the male and female connector for a single sensor? Or between the connectors of two different sensors?

Because if it's the difference between the male and female connectors, I can't imagine there would be much difference in their temperatures since they are... you know, connected.

Thermocouple is a two-wire device - positive and negative. Each wire might go through a connector (male/female combo made from a different metal type). The temperature difference I was referring to is between the two connectors. If the connector is made from the same metal type as the thermocouple wire (the positive wire is different material than the negative wire), then this temperature difference does not play any role.
 
On the planes where I've installed CHT and EGT sensors, the connectors from each sensor are less than 1/2" apart. So any ambient heat affecting one will have pretty much the same impact on the other.

But back to my original question, how much of a difference in the reading would using the wrong connectors create?
 
But back to my original question, how much of a difference in the reading would using the wrong connectors create?

As long as they make good contact, no difference. Using the wrong (thermocouple vs. copper) wire between those connectors and the gauge would make a difference.
 
But back to my original question, how much of a difference in the reading would using the wrong connectors create?
Not much.

Every time you connect any two dissimilar metals you have the potential for a thermal effect; how much depends on the difference in temperatures at the connections and the types of metals. But let's assume that you have a loop of Nickel-Chromium wire connected with copper crimp connectors (ignore the tin coating for now) plugged into each other. How much voltage do you get? Not much, because with the two connections about 1/4 inch apart, they are going to be pretty near the same temperature. Temperatures elsewhere in the circuit don't matter, because there are no dissimilar metal junctions there; you just have a loop of wire with nothing going on. Put those same connectors in a larger thermocouple circuit and you get the same non-effect: the two junctions are very close together and probably very near the same temperature, so no net effect.
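The cancellation argument can be put in numbers. This sketch assumes a made-up Seebeck coefficient for the copper/Nickel-Chromium pair (the real value isn't quoted in this thread); only the relative magnitudes matter:

```python
# Sketch of why two copper connectors in a thermocouple circuit mostly
# cancel. Each Cu/NiCr junction adds an EMF proportional to its own
# temperature, with opposite polarity, so only the temperature
# DIFFERENCE between the two junctions matters.

CU_NICR_UV_PER_C = 20.0  # hypothetical coefficient, uV/degC (illustrative)

def parasitic_emf_uv(t_junction1_c, t_junction2_c):
    """Net stray EMF (uV) from two opposing dissimilar-metal junctions."""
    return CU_NICR_UV_PER_C * (t_junction1_c - t_junction2_c)

# Connectors 1/4 inch apart, perhaps 0.5 C apart in temperature:
print(parasitic_emf_uv(40.5, 40.0))  # 10.0 uV -- noise next to ~30,000 uV of signal
```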

Now, I'm not saying it's best practice to introduce a bunch of random metals in a thermocouple circuit. But, yea, chances are it ain't gonna give a very large error.
 
I do this for a living. Reading some of these comments was making my head hurt, so I drew a picture that "might" help explain this a bit. The bottom line is that if your junctions (butt-splice, knife connector, etc.) are at the same temperature from one side to the other (which is probably 1" or less), then there will be no effect on the final temperature reading. You DO NOT have to use special splices or special connectors. But you DO have to use the proper thermocouple wire from the sensor all the way to the instrument. (You cannot switch to copper wire along the way.) Any modern instrumentation package/system/controller that is designed to accept thermocouples directly will have internal temperature compensation. Ambient temperature in the cockpit will have no effect on the displayed temperature.

thermocouple.png
 
Any modern instrumentation package/system/controller that is designed to accept thermocouples directly will have internal temperature compensation. Ambient temperature in the cockpit will have no effect on the displayed temperature.

I guess they're not "modern", but uncompensated gauges are common in ultralights and experimentals since they require no electrical power. Interestingly, since an air-cooled engine without a thermostat will generally run the same temperature difference above ambient, on hot days when the engine runs hotter, the uncompensated gauge reads lower by the same amount, so it always points to the same place when the engine is running properly.
 
The instrument should have some method of accurate temperature measurement of the area where the thermocouple wires terminate to allow for compensation of the reading.
How is that accomplished?

Isn't the system installed to measure the temperature in the termination area? Or, do you need ANOTHER system to compensate for the first system? If so, where does that end?

As you can see, I am confused by the statement above.
 
How is that accomplished?

Isn't the system installed to measure the temperature in the termination area? Or, do you need ANOTHER system to compensate for the first system? If so, where does that end?

As you can see, I am confused by the statement above.
The thermocouple measures the differential temperature between the hot and cold junctions. By using a different method (RTD, thermistor, etc.) to measure the actual temperature of the cold junction, the actual temperature of the hot junction can be calculated. Those other devices don't have the hot/cold junction issue to deal with, but they're not as robust as a thermocouple.
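That two-sensor scheme can be sketched in a few lines. Assumptions here: the thermistor/RTD reading is taken as exact, and the same linear ~41 µV/°C Type K approximation stands in for the NIST tables a real instrument would use:

```python
# Sketch of cold-junction compensation as described above: a second
# sensor (RTD/thermistor) at the terminal block measures the cold
# junction's absolute temperature, and the instrument adds that to the
# differential temperature implied by the thermocouple's loop EMF.

UV_PER_C = 41.0  # linear Type K approximation; real units use NIST tables

def hot_junction_c(measured_emf_uv, cold_junction_c):
    """Recover the absolute hot-junction temperature from the loop EMF."""
    delta_c = measured_emf_uv / UV_PER_C  # (hot - cold) from the thermocouple
    return cold_junction_c + delta_c      # add the RTD's cold-junction reading

# 29,520 uV measured, cold junction at 40 C in a warm engine compartment:
print(round(hot_junction_c(29520.0, 40.0)))  # 760
```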
 
The thermocouple measures the differential temperature between the hot and cold junctions. By using a different method (RTD, thermistor, etc.) to measure the actual temperature of the cold junction, the actual temperature of the hot junction can be calculated. Those other devices don't have the hot/cold junction issue to deal with, but they're not as robust as a thermocouple.

Exactly!
 
How is that accomplished?

Isn't the system installed to measure the temperature in the termination area? Or, do you need ANOTHER system to compensate for the first system? If so, where does that end?

As you can see, I am confused by the statement above.

Sorry - to be more clear, the statement should have been "The instrument should have some method of accurate temperature measurement of the area where the thermocouple wires terminate to the instrument to allow for compensation of the reading."
 