Strange 4-20mA pressure sensor readings

Thread Starter

Ron Davis

We are experiencing strange behavior reading the 4-20mA signal from a pressure transducer on a water storage tank.

We are reading the 4-20mA signal into a 0-10V analog input on an AB MicroLogix 1100 using a resistor (initially 500 ohms) across the input. The system is being powered by a 12VDC battery and associated charger.

Initially we used an existing transducer that is rated to operate on 10-36VDC power. Measured voltage powering the sensor was 13.2VDC. We found that this sensor was not capable of driving the output to 10V at 20mA, but it appeared able to drive 5V across a 250 ohm load at 20mA. During testing we would bleed the pressure from the sensor to read zero and then restore the pressure to around 42 psi (15mA). If we restored the pressure too rapidly, the sensor would "saturate" and give us approximately a 20mA signal, and it would not drop back to 15mA after the pressure equalized.
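For reference, the expected drop across the sense resistor follows straight from Ohm's law (a quick sketch in Python, assuming an ideal current loop):

# Expected voltage across the sense resistor (assumes an ideal 4-20mA loop).
def drop_volts(current_ma, resistor_ohms):
    return (current_ma / 1000.0) * resistor_ohms

for r in (500, 250):
    print(r, "ohm:", drop_volts(4.0, r), "V at 4 mA,", drop_volts(20.0, r), "V at 20 mA")
# 500 ohm: 2.0 V at 4 mA, 10.0 V at 20 mA
# 250 ohm: 1.0 V at 4 mA, 5.0 V at 20 mA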

Only by momentarily shorting across the 250 ohm load would the signal clear and return to expected readings.

After "resetting" the signal we left the system only to find a short time later that the signal had once again saturated.

A new sensor from a different manufacturer was purchased in an attempt to address this problem. We found similar behavior from the new sensor. The manufacturer confirmed the need to "reset" the sensor after it reaches a saturated state. After being placed into service, this sensor successfully read the tank level down into the 14mA region but then "stuck" at that reading, even after the tank was refilled to a higher level.

We have installed dozens of similar systems without ever experiencing this phenomenon. The only difference is that the other systems drove the sensor at 24VDC instead of 12VDC.

Is anyone familiar with this phenomenon and can explain how to address it?

Thanks!
Ron Davis
 
In my experience a transmitter requires 10 volts minimum (some even 12.5 volts) at its input terminals in order to operate. You have already said that the PLC input resistor is 500 ohms, so at the maximum signal current of 20mA the drop across the PLC input is 10 volts. Allow, say, 2 volts of drop along the transmitter wires. This means that your power supply voltage for the loop should be 10 + 10 + 2 = 22 volts minimum. So work to a standard of 24 volts and your loop should function normally. If you use a lower voltage, the loop will not be able to reach its maximum output of 20mA. Simple Ohm's law, really.
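That budget is easy to check in a couple of lines (a sketch only; the 10 volt lift-off and 2 volt wire allowance are assumptions to verify against your transmitter's datasheet):

# Rough loop voltage budget for a 2-wire 4-20mA transmitter.
i_max = 0.020          # full-scale loop current, amps
r_input = 500          # PLC input (sense) resistor, ohms
v_transmitter = 10.0   # assumed minimum voltage across the transmitter terminals
v_wire = 2.0           # assumed allowance for wiring drops
v_supply_min = v_transmitter + i_max * r_input + v_wire
print(v_supply_min)    # 22.0 -> so a standard 24 volt supply is the safe choice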
 
The failure to drive the output to 10V at 20mA is a classic symptom of insufficient voltage from the power supply.

The operating voltage needed depends on the total loop resistance.

The documentation for most 2 wire loop powered transmitters has either a chart that shows voltage versus loop resistance or a formula for calculating the minimum voltage needed to run the loop. The higher the resistance in the loop, the higher the voltage needed to run the loop.

I'm looking at a brand name voltage vs resistance chart that shows that the transmitter will fire up and run at 10.6Vdc with no load, but a 250 ohm load requires 16.3Vdc.

I'm guessing that your 13.2 volts is very marginal for driving a 500 ohm load.
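Working backwards from that chart gives a feel for how little loop resistance a 13.2 volt supply can support (a sketch; the 10.6Vdc lift-off figure is the one from the chart above, so check your own transmitter's curve):

# Maximum loop resistance a given supply can drive at full-scale current.
v_supply = 13.2        # measured supply voltage from the original post
v_liftoff = 10.6       # minimum transmitter operating voltage (from the chart)
i_full_scale = 0.020   # 20 mA
r_max = (v_supply - v_liftoff) / i_full_scale
print(r_max)           # 130.0 ohms -- well short of a 250 or 500 ohm load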

Many industrial pressure transmitters are configured to drive the current output to a fail-safe value when a fault is detected; that fail-safe value can be either upscale or downscale. Typically, the upscale fail-safe value is well above 20 mA, some value above 21 mA. Likewise, the downscale fail-safe value is 3.6 or 3.7 mA, well below 4.0 mA.

European vendors typically comply with NAMUR NE 43 (explanation here: http://tinyurl.com/8xqv67k) for fail-safe output values.
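If you want the PLC to flag these conditions, the reading can be classified against NE 43-style limits (a sketch; confirm the exact thresholds against your transmitter's documentation):

# Classify a 4-20mA reading using NAMUR NE 43-style limits.
def classify(current_ma):
    if current_ma <= 3.6:
        return "downscale failure signal"
    if current_ma >= 21.0:
        return "upscale failure signal"
    if current_ma < 3.8 or current_ma > 20.5:
        return "outside the measuring range"
    return "valid measurement"

print(classify(15.0))   # valid measurement
print(classify(21.5))   # upscale failure signal -- what a 'saturated' reading looks like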

I wonder whether a marginal supply voltage is stressing the transmitter to the point that it 'faults' upscale. Maybe the fail-safe value is 21+ mA, but the output can't get there due to insufficient voltage.
 