4-20 mA standard?


Thread Starter

arip

Does anybody know why it must be 4-20 mA? Is there any technical background behind it? Why
must it be 4-20 and not other ranges? Same goes for 3-15 psi. Is there a standard for this?
Appreciate some answers. Thanks.

 
> Does anybody know why it must be 4-20 mA? Is there any technical
> background behind it? Why must it be 4-20 and not other ranges? Same goes
> for 3-15 psi. ...

Physically and technically it's arbitrary, but not without some reasons. Having a "live" zero level is useful since it allows differentiating
between an informational zero and the thing just not working. Plus, there's energy available even in a zero signal to do useful work, e.g.,
loop-powered devices. As for why choose 4-20 mA or 3-15 psi or 10-50 mA: divide those by some small value and you have 1-5 somethings. Ultimately, I think it's just a convenient choice.
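
To make that "divide by some small value" point concrete, here is a minimal sketch in Python of the usual linear scaling of a 4-20 mA reading into engineering units; the 0-150 degC span and the function name are made up purely for illustration:

    # Linear scaling of a 4-20 mA signal: 4 mA is the live zero, 16 mA is the span.
    def ma_to_engineering(ma, lo=0.0, hi=150.0):
        return lo + (ma - 4.0) * (hi - lo) / 16.0

    print(ma_to_engineering(4.0))    # 0.0    (live zero = 0% of range)
    print(ma_to_engineering(12.0))   # 75.0   (mid-scale)
    print(ma_to_engineering(20.0))   # 150.0  (full scale)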

> ... Is there a standard for this?

I think there are formal standards that deal with this, but it's about as de facto a standard as I've heard of.

--
Ken Irving <[email protected]>
 
While I was not involved in selecting 4-20 mA by ISA as a standard, I do remember some of the period before the 4-20 mA standard. My
recollection is that there were at least three current signals in use in the US: 4-20 mA, 10-50 mA, and 1-5 mA. I believe that 4-20 mA was introduced by Honeywell in the late '50s; 10-50 mA was Foxboro's standard and 1-5 mA was
Taylor's. The typical signal voltage for Foxboro's 10-50 mA and Honeywell's 4-20 mA was 1-5 V, while Taylor's was different (0.25-1.25 V, I think). The standardization on 4-20 mA occurred, I believe, in the mid '70s. I believe that the 10-50 mA
signal was not chosen as the standard because of safety considerations, both personnel safety and intrinsic safety (Foxboro's supply voltage was around 60-70 volts). I might also suspect that 4-20 mA was chosen over 1-5 mA because it gave a better power supply margin to supply instruments and a somewhat better signal voltage level compared to Taylor's 1-5 mA (0.25-1.25 V). There may also have been political issues at work, since at the time Foxboro and Taylor were the major instrumentation suppliers in the US and 4-20 mA may have been
an acceptable compromise. Internationally, I believe there might also have been a 2-10 mA current signal in use.
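
As a quick check on the 1-5 V figure, the arithmetic works out if you assume the usual receiver resistors (250 and 100 ohms are the common values, assumed here rather than quoted from the above):

    # Signal voltage developed across a typical receiver resistor.
    print(0.004 * 250, 0.020 * 250)   # 1.0 5.0  -> 4-20 mA gives 1-5 V across 250 ohms
    print(0.010 * 100, 0.050 * 100)   # 1.0 5.0  -> 10-50 mA gives 1-5 V across 100 ohms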

Some people have already mentioned some of the advantages of current loops, but to reiterate a few and maybe add a few: current loops can
send signals over long distances, can provide power for loop elements via the signal wires, have improved noise immunity, are tolerant of poor connections, make it easy to have multiple instruments
in a loop, and can easily provide a single-reference differential signal voltage. In addition, the standard current loop provides the advantage of a live zero.

The 3-15 psi standard is a bit more vague. I believe the 3-15 psi standard was in use in the late '40s and early '50s with the introduction of pneumatic transmitters. My recollection is that it was certainly the standard by the early '70s. The origin of the 3-15 psi signal may have come from the design of pneumatic transmitters, and in particular the design of the
flapper/nozzle assembly. A flapper/nozzle clearance of 0.002" is common in pneumatic instruments, and 3-15 psi provides a fairly linear response of flapper/nozzle position vs. nozzle backpressure in that region. There was also a
6-30 psi signal around, which I believe was typically used on the output side, probably due to the available power in the signal.

Bill Mostia
===========================================
William(Bill) L. Mostia, Jr. PE
Independent I & E Consultant
WLM Engineering Co.
P.O. Box 1129
Kemah, TX 77565
[email protected]
281-334-3169
These opinions are my own and are offered on the basis of Caveat Emptor.
 
The standard is 4-20 mADC because the ISA S50 committee many years ago decreed it would be. The reason it is not 0-20 mADC is that it was thought that having a "live zero" would provide a safety indicator if a twisted-pair cable were cut, while a "dead zero" would indicate 0 whether there was a circuit or not.

The same is true of 3-15 psi.

Walt Boyes

---------------------------------------------
Walt Boyes -- MarketingPractice Consultants
[email protected]
21118 SE 278th Place - Maple Valley, WA 98038
253-709-5046 cell 425-432-8262 home office
fax:801-749-7142 ICQ: 59435534
---------------------------------------------
 

Curt Wuollet

Short answer: because it's the standard!

I think this came about through experience gained in analog computing and telecom as well as instrumentation. The range is a good compromise for a lot of the issues involved in moving analog information over distance.

First of all: Why a current loop at all?

Because current is the same in all parts of a series circuit: you automatically get out what you put in. Analog voltages are attenuated by wire resistance times line length. Making the signal a controlled constant current provides quite a bit of noise immunity as well; induced noise tends to produce a voltage change across the high impedance of the source, not the low impedance of the line, so the current stays constant. Multiple nodes all see the same current.
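
A back-of-envelope sketch of that point, in Python; the wire gauge, run length and receiver impedance below are illustrative assumptions only:

    ohms_per_ft = 0.010                       # roughly 20 AWG copper
    run_ft = 1000.0
    wire_ohms = 2 * run_ft * ohms_per_ft      # out and back: about 20 ohms

    # Voltage signal: the line resistance forms a divider with the receiver input.
    receiver_ohms = 10_000.0
    v_sent = 5.0
    print(v_sent * receiver_ohms / (receiver_ohms + wire_ohms))   # ~4.99 V; error grows with length

    # Current signal: the transmitter forces the current, so the 250 ohm sense
    # resistor at the receiver sees the full signal regardless of wire_ohms
    # (as long as the supply has enough compliance voltage).
    print(0.020 * 250.0)                      # 5.0 V, independent of line length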

Why 4-20 ma?

Higher currents use excessive power and cause more line losses, requiring larger conductors and larger power supplies. The active devices available when this was standardized also played a role.

Lower currents lead to contact problems and "dry switching" type concerns. Leakage is a fact of life and the signal has to be of a magnitude where practical leakage values are an insignificant part of the error budget. This is also part of the reason for the 4 ma. "zero".

The 4 ma. zero provides open wire indication and bias power for devices that are line powered.
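
A receiver can exploit that along the lines of the sketch below; the exact thresholds are a later convention (roughly the NAMUR NE 43 numbers) and are my assumption here, not something the 4-20 mA standard itself dictates:

    # Using the live zero for diagnostics; thresholds assumed for illustration.
    def loop_status(ma):
        if ma < 3.6:
            return "fail low: open wire or dead transmitter"
        if ma > 21.0:
            return "fail high: transmitter fault"
        if ma < 4.0 or ma > 20.0:
            return "slightly out of range"
        return "ok"

    print(loop_status(0.0))    # fail low: open wire or dead transmitter
    print(loop_status(12.0))   # ok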

Mostly it was arrived at from what was demonstrated to work and from practical limitations. This is a very good way to establish a standard, perhaps the best; unfortunately there are few established this way. There was a great deal of foundation work done on telemetry and measurements as the electronics industry was growing up in the '50s and '60s, before everything became digital. We take a lot for granted with today's accurate, stable and long-lived equipment.

Regards

cww
 
Decades ago there was much debate about this.

An elevated zero is required to power the instrument at a zero signal output.

4 to 20 mA won, with 10 to 50 mA a close second.

3-15psi??..... sorry.... I'm not that old.

Vince
 

Bruce Durdle

The reasons for the 4-20 mA standard (and, before it, the 10-50 mA standard) have a lot to do with the original 3-15 psi/20-100 kPa air
signal. The origin of the suppressed or "live" zero in the pneumatic system was twofold. First, if you want to reduce a passive pneumatic signal quickly, you need to maintain some differential between the pressure and the "sink" - in this case, atmospheric air. Second, the basic characteristic of a flapper-nozzle assembly also has a bit to do with it - the "linear" range of the flapper pressure is about 3-15 psi for a 20 psi supply. Next question - why a 20 psi supply? Anyone else remember 6-30 psi equipment for
final actuators needing a bit more grunt?

The original electronic instruments were direct imitations of the pneumatic systems, with bellows replaced by magnet-coil assemblies. The first I came across were used in a geothermal power project (1958 vintage) where the pressure at the power station was controlled by manipulating valves in the steam field 1 1/2 miles or so away. The flapper-nozzle was replaced by a
contact moving between positive and negative terminals - I guess a bit of mains hum in the amplifier kept it jittering. In the mid-'70s Foxboro produced a range of pressure transmitters using the magnetic principle which used the same basic mechanism as their pneumatic equipment - but the size of the coils made the assemblies a lot bigger ...

Note re "suppressed" vs "elevated" zero - the official ISA definition is that a "suppressed" zero is when 0 psi or deg C or m of water ... is below to 0% value on the range - "elevated" zero is when o units is above 0% (and can be above 100%). Yes - it is extremely confusing - and some
manufacturers use the erms the other way round.

Bruce.
 

david mertens

Another reason why 1-5 mA was dropped is that it required a separate mains supply to power the instrument. The 4 mA standard gives the
instrument builders 3.6 mA to use as power consumption for the supply of the instrument itself, thus allowing the 2-wire system already very popular with 10-50 mA devices (which were banned because of safety considerations). At that time 3.6 mA was only sufficient to supply the
least energy-consuming devices available, pressure transmitters and valve positioners. Most other devices still required a separate supply voltage until very recently.
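
To put a rough number on how tight that budget was, here is an illustrative calculation; the 24 V supply and 250 ohm receiver resistor are typical assumptions, not figures from the original designs:

    # Rough power budget for a 2-wire transmitter at the bottom of the range.
    supply_v = 24.0
    receiver_ohms = 250.0
    i_zero = 0.004                                        # loop current at 0% signal

    v_at_transmitter = supply_v - i_zero * receiver_ohms  # 23 V across the transmitter
    print(v_at_transmitter * i_zero)                      # ~0.092 W available in total

    # Only about 3.6 of the 4 mA can actually feed the electronics, so the whole
    # instrument has to run on well under a tenth of a watt.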
 