4-20 mA Standard Instrumentation Measurement

Thread Starter

vinayak34

1. Why is 4-20 mA the standard instrumentation measurement? Why can't it be 3-19 mA or any other range? I know the live-zero concept: when the cable is cut, you won't be able to get 4 mA from the field. But if we used 3-19 mA as the standard, what would be wrong with that?

2. Why is a 250 ohm resistance in parallel used as the standard? Is it to get linearity in the voltage output?

Kindly give your valuable suggestions.
 
> 1. Why is 4-20 mA the standard instrumentation measurement?

ISA 50.00.01-1975 previously defined the standard as 10-50 mA. The change to 4-20 mA was likely driven by the rise of transistor-based electronics, or by the advent of intrinsically safe circuits.

You'll have to look into the details more if you're interested because my memory is fuzzy.

> 2. Why is a 250 ohm resistance in parallel used as the standard?

A 250 ohm resistor (or the common 249 ohm standard value) conveniently converts 4-20 mA into the 1-5 V voltage signal standard by way of Ohm's law, V = IR:
(4 mA)(250 ohm) = 1 V, and (20 mA)(250 ohm) = 5 V.
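To make the arithmetic concrete, here is a minimal sketch (the function names and the ~3.6 mA fault threshold are my own illustrative choices, not from any standard) showing the shunt-voltage conversion and how the live zero lets a reader distinguish a true 0 % signal from a broken loop:

```python
SHUNT_OHMS = 250.0            # parallel shunt resistor
I_MIN, I_MAX = 0.004, 0.020   # 4-20 mA span, in amps

def loop_voltage(current_a):
    """Ohm's law, V = I * R: 4 mA -> 1 V, 20 mA -> 5 V across the shunt."""
    return current_a * SHUNT_OHMS

def to_percent(current_a):
    """Scale 4-20 mA linearly to 0-100 % of span.

    Anything well below 4 mA (threshold here is an illustrative 3.6 mA)
    can only mean a fault such as a cut cable -- that is the point of
    the live zero.
    """
    if current_a < 0.0036:
        raise ValueError("loop fault: current below live zero")
    return (current_a - I_MIN) / (I_MAX - I_MIN) * 100.0

print(loop_voltage(0.004))  # -> 1.0 (volts)
print(loop_voltage(0.020))  # -> 5.0 (volts)
print(to_percent(0.012))    # -> 50.0 (% of span, mid-scale)
```

The same linear scaling applies whatever engineering units the transmitter represents; only the span endpoints change.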

Also refer to this thread: http://control.com/thread/1026235722