Square root relation with 4-20 mA signal


Thread Starter


I am facing a problem with the simulation of some current instruments installed in a plant. On some instruments, after square-root extraction is applied, the value in the DCS at 4 mA does not come to zero. Instead, it reads 25.00 when 4 mA is forced through HART (ABB DHH) on a range of 0-2000 Nm3/hr. Can somebody tell me why this is happening and how to rectify it so that the value reads zero at 4 mA, as required? I have tried changing some parameters on the DCS side but did not get the desired result. Can somebody give me a solution?
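For reference, square-root extraction on a DP flow signal should map 4 mA to exactly zero flow, and it strongly amplifies any small offset just above 4 mA. A minimal sketch (the 0-2000 Nm3/hr range is from the post above; the function name is my own):

```python
import math

def flow_from_current(i_ma, span=2000.0, lrv_ma=4.0, urv_ma=20.0):
    """Square-root extraction: flow is proportional to the square root
    of the DP signal's fraction of span."""
    fraction = (i_ma - lrv_ma) / (urv_ma - lrv_ma)
    fraction = max(fraction, 0.0)  # clamp inputs below 4 mA
    return span * math.sqrt(fraction)

print(flow_from_current(4.0))   # 0.0 exactly at 4 mA
print(flow_from_current(4.1))   # ~158 Nm3/hr: sqrt magnifies small offsets near zero
```

The second call shows why square-root scaling is so sensitive at the bottom of the range: a 0.1 mA offset (0.6% of span) already reads as nearly 8% of flow.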
We are using an ABB 800xA system with an AI810 module configured for analog input. If I source 4 mA to a particular channel through a barrier using a Fluke multimeter, the reading in the DCS fluctuates between 0 and 11.8 Nm3/hr. By 0 to 11.8 I mean that only these two readings appear, alternating, with none of the intermediate values shown.

The range for the instrument in question is 0-2000 Nm3/hr (it's a DPT). The 0 reading sometimes holds steady but soon jumps to 11.8 Nm3/hr. I can't tell whether this is caused by some external factor. I have already checked all my settings in the DCS; they seem fine and are the same as the settings for previous instruments. I would also add that a couple of days ago, during calibration of the instruments with a pressure gauge, everything was fine and all readings came through correctly, but during loop checking this problem appeared.

I would really appreciate it if somebody could help me!
If you are seeing a change from 0 to 11.8 Nm3/hr on a 4-20 mA span scaled to 0-2000 Nm3/hr, it means you are seeing a very small input change (from 4 to about 4.000557 mA).
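You can verify that figure by inverting the square-root scaling to back-calculate the loop current that corresponds to a given reading (a sketch; the function name is my own):

```python
def current_from_flow(flow, span=2000.0, lrv_ma=4.0, urv_ma=20.0):
    # Invert flow = span * sqrt((i - lrv) / (urv - lrv)):
    #   i = lrv + (urv - lrv) * (flow / span) ** 2
    return lrv_ma + (urv_ma - lrv_ma) * (flow / span) ** 2

print(round(current_from_flow(11.8), 6))  # 4.000557 mA
```

So an 11.8 Nm3/hr reading represents only about 0.56 uA above 4 mA, well within the noise and accuracy of typical calibrators and loop components.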

You might want to check the specs on both your current source and the measurement. Calculate by: flow = 2000 x sqrt((I - 4 mA) / 16 mA).
When S800 AI I/O modules are used with the AC450, the default setting applies a 0.2% deadband to the measurement. Remove it; otherwise you get precisely this kind of effect.
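To see why a deadband produces a two-value display, here is a simplified model of deadband reporting (my own sketch, not the actual S800 algorithm): the reported value updates only when the input moves more than the deadband away from it, so a signal that is noisy in engineering units after square-root extraction snaps between a few discrete readings instead of tracking smoothly.

```python
def deadband_report(samples, deadband):
    """Update the reported value only when the input moves more than
    `deadband` away from the last reported value."""
    last = samples[0]
    reported = []
    for s in samples:
        if abs(s - last) > deadband:
            last = s
        reported.append(last)
    return reported

# Near 4 mA, square-root extraction turns tiny mA noise into large
# engineering-unit swings, each big enough to trip the deadband.
raw_eu = [0.2, 1.1, 11.8, 10.9, 0.4, 11.6]
print(deadband_report(raw_eu, 4.0))  # [0.2, 0.2, 11.8, 11.8, 0.4, 11.6]
```

Note how intermediate movements smaller than the deadband are swallowed, leaving only jumps between a near-zero value and a value around 11.8, which matches the symptom described.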

You should also check the filter time settings of your AI modules as standard practice during commissioning. If your PID scan time is 1 second, a filter of 100 to 200 ms will eliminate a lot of noise without significantly affecting response.
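As an illustration of why such a filter is harmless at a 1-second scan, here is a generic first-order lag (the sample time and time constant are assumed for the sketch, not the AI810's actual implementation):

```python
import math

def first_order_filter(samples, dt, tau):
    """Discrete first-order lag: each period, the output moves toward
    the input by a fixed fraction alpha = 1 - exp(-dt / tau)."""
    alpha = 1.0 - math.exp(-dt / tau)
    y = samples[0]
    out = []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

# A unit step filtered with dt = 50 ms and tau = 150 ms settles within a
# few hundred ms -- negligible lag against a 1-second PID scan, while
# high-frequency noise is strongly attenuated.
step = [0.0] + [1.0] * 20
filtered = first_order_filter(step, dt=0.05, tau=0.15)
```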