Accuracy of Field Transmitters

Thread Starter

Waqar

I want to know about the accuracy of field pressure transmitters. The transmitter's specification sheet usually states a reference accuracy, e.g. 0.065% of span, and then the effects of ambient temperature, static pressure, etc. on the transmitter's accuracy. What I interpret from this is that the transmitter's reading can drift by the value specified by the accuracy.

Can anybody tell me how this drift happens, especially in modern smart transmitters?

How long can we expect the transmitter to remain accurate after it is calibrated in the workshop using standard instruments? Is this drift dependent on the current reading of the transmitter, or is it fixed throughout the span?
 
Every transmitter on the market today comes with a spec sheet that gives you an acceptable inaccuracy. You should not concern yourself with this too much, since these are acceptable inaccuracies and are found on all transmitters.

What you should do to get very good accuracy is look at your application and select a transmitter with only slightly higher capability. For an application where you want to measure, say, 0 to 10 bar, find a transmitter that can measure, say, 0 to 15 bar or less. The closer your calibration gets to the transmitter's maximum capability, the more accurate and reliable your transmitter will be.

Where these inaccuracy tolerances come into play is when you use a very big transmitter on a very small application, for example a transmitter with a capability of 0 to 100 bar on an application where you calibrate it for only 0 to 5 bar. The inaccuracy tolerance is acceptable for a big range, but the same error becomes very large on a small application, since it is still a percentage of the transmitter's capability and not of your calibration. This causes continuous shifting in the readings that can at times look like drifting. These shifts are not noticeable on a big calibrated range, but they will be very noticeable on a small calibration for a transmitter of this size.

To give a practical example, say you have calibrated this 0 to 100 bar transmitter for only 0 to 5 bar. If the inaccuracy tolerance is equal to, say, 20 kPa, that 20 kPa shift would not even be noticed if you had calibrated the transmitter for 0 to 80 bar, but it will be very noticeable on a calibration of 0 to 5 bar. This is in fact not drifting, only an inaccuracy tolerance shift that takes place on all transmitters.
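
A minimal sketch (in Python) of the arithmetic above: a fixed error band stated as a percentage of the transmitter's upper range limit (URL) becomes a much bigger percentage of a narrow calibrated span. The 100 bar URL and the 0.2%-of-URL (20 kPa) error band are illustrative numbers taken from the example, not any particular manufacturer's spec.

```python
# Fixed error band as % of URL vs. what it looks like as % of the calibrated span.
URL_BAR = 100.0          # transmitter upper range limit (illustrative)
ERROR_PCT_OF_URL = 0.2   # fixed error band as % of URL (0.2 bar = 20 kPa)

error_bar = URL_BAR * ERROR_PCT_OF_URL / 100.0

for span_bar in (80.0, 5.0):
    error_pct_of_span = 100.0 * error_bar / span_bar
    print(f"Calibrated 0-{span_bar:g} bar: {error_bar:.2f} bar error "
          f"= {error_pct_of_span:.2f}% of span")

# Output:
#   Calibrated 0-80 bar: 0.20 bar error = 0.25% of span
#   Calibrated 0-5 bar: 0.20 bar error = 4.00% of span
```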

Get a smaller transmitter with a capability closer to your maximum calibration value and you will not have any "drifting" problems anymore.

Drifting is mainly found in capillary differential pressure transmitter applications, such as level measurement on closed, pressurized vessels, but there again we have found ways and means to counteract and minimize these drifts. Drifting is when you see the reading slowly increase and then decrease again, smoothly and continuously.

If you monitor this reading for a while, you will find the highest and lowest drifting points; split the error in the middle to find a good zero starting point. Add your upper range value to this zero starting point, and the drift error will then sit both above and below the calibrated span line rather than only above or only below it.
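
A rough sketch of that "split the error in the middle" re-zeroing, assuming you have logged the drifting reading at the nominal zero for a while. The function name and the readings are hypothetical; the idea is simply to take the midpoint of the highest and lowest observed values as the new zero and add the upper range value to it.

```python
def split_drift_zero(observed_readings_bar, upper_range_value_bar):
    """Return (LRV, URV) with the zero placed mid-way through the drift band."""
    high = max(observed_readings_bar)
    low = min(observed_readings_bar)
    zero_offset = (high + low) / 2.0   # split the error in the middle
    return zero_offset, zero_offset + upper_range_value_bar

# Example: readings logged over time at the nominal zero of a capillary DP level loop
readings = [0.02, 0.05, 0.08, 0.06, 0.03]   # bar, drifting around zero
lrv, urv = split_drift_zero(readings, upper_range_value_bar=2.5)
print(f"Re-range to LRV = {lrv:.3f} bar, URV = {urv:.3f} bar")
# -> Re-range to LRV = 0.050 bar, URV = 2.550 bar
```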

So the bottom line is that on normal pressure and DP applications, and especially with SMART transmitters, you will not have any drifting or inaccuracy problems, provided that you use the right size of transmitter for your application.
 
> Can anybody tell me how this drift happens

The factors you already listed: ambient temperature, meter body temperature, and humidity all affect instantaneous readings. One manufacturer states its reference conditions as 25°C (77°F) and 10 to 55% RH.

Chemical reactions at the wetted diaphragm can cause effects like hydrogen ion migration, which affects performance.

> How long can we expect the transmitter to remain accurate after it is calibrated in the workshop using standard instruments?

I suspect that the use of 'standard instruments' to attempt to calibrate a modern smart pressure transmitter makes the transmitter less accurate, not more accurate, than factory calibration.

Component aging affects stability. One manufacturer's spec is ±0.015% of URL per year for stability.
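
Some quick worst-case arithmetic on that stability figure, assuming the per-year drift accumulates linearly. The 100 bar URL and the 5-year calibration interval are assumptions for illustration only.

```python
# Worst-case accumulated long-term drift between calibrations (assumed linear).
URL_BAR = 100.0                 # assumed upper range limit
STABILITY_PCT_PER_YEAR = 0.015  # +/- % of URL per year (the spec quoted above)
YEARS_BETWEEN_CALS = 5          # assumed calibration interval

worst_case_drift_bar = URL_BAR * STABILITY_PCT_PER_YEAR / 100.0 * YEARS_BETWEEN_CALS
print(f"Worst-case long-term drift after {YEARS_BETWEEN_CALS} years: "
      f"+/-{worst_case_drift_bar:.3f} bar")
# -> Worst-case long-term drift after 5 years: +/-0.075 bar
```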
 
The accuracy of a pressure transmitter depends on the span. For a typical Yokogawa transmitter, the accuracy can be given as:

Accuracy = ±(0.015 + 0.05 × (maximum measurable value / span)) % of span

So when the span equals the maximum measurable value, the accuracy is at its best; as the span decreases, the error percentage grows and the accuracy gets worse.
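
As a quick check, here is a sketch evaluating that formula for a hypothetical transmitter with a 100 bar URL; the 100 bar figure is an assumption, and the coefficients are taken as quoted in the post. Note that the span = URL case reproduces the 0.065% of span reference accuracy mentioned in the original question.

```python
# Evaluate +/-(0.015 + 0.05 * URL/span) % of span for a few calibrated spans.
URL_BAR = 100.0  # assumed maximum measurable value (URL)

def accuracy_pct_of_span(span_bar, url_bar=URL_BAR):
    return 0.015 + 0.05 * (url_bar / span_bar)

for span in (100.0, 50.0, 10.0):
    print(f"Span 0-{span:g} bar: +/-{accuracy_pct_of_span(span):.3f}% of span")

# Output:
#   Span 0-100 bar: +/-0.065% of span
#   Span 0-50 bar: +/-0.115% of span
#   Span 0-10 bar: +/-0.515% of span
```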
 