Confusion in defining calibration range


Thread Starter

Gopi

Hello,

I am confused about how to define the calibration range.

As far as I know, the calibration range defines the values at which the corresponding 4 mA and 20 mA outputs should be generated for a given sensor. Is this correct?

In my work I prepare data sheets for field instruments such as pressure transmitters and temperature transmitters, and these data sheets have fields like:
Range Limits
Calibration Range

I define the range limits by considering the maximum operating pressure/temperature, which comes to about 66 % of the desired range, so I am clear about range limits.

But I am not clear about how to define the calibration range. When I did a little research on the internet, some sources said the calibration range runs from the minimum operating pressure/temperature to the maximum operating pressure/temperature, but other people contradicted this, so I am still not sure. Please clarify the calibration range for me.

To add to this, for my temperature transmitters I am using both RTDs and thermocouples. When I spoke with some vendors, they said there is no point in defining a calibration range for temperature sensors at all, because everything is predefined in the form of tables, and that a calibration range is needed only for pressure sensors and other sensors besides temperature sensors.

But other vendors contradict this statement.

I am now totally confused about defining the calibration range. If it must be defined, please guide me on how to decide the calibration range with respect to the sensor's application (pressure, temperature, flow, etc.).

Please help me as soon as possible.
Thanks in advance:)

With Regards,
Gopi
 
It is confusing as you say.

The calibrated range is where you define your 4 and 20 mA points.
For example, it might be possible to set the 20 mA point of a pressure transmitter anywhere between 50 and 150 psi, but you set it at 100.

The calibration range is 0 - 100 psi

The span can be anywhere from 0-50 to 0-150 psi; sometimes on a data sheet you see this expressed as a span of 50 - 150 psi.

I normally specify a transmitter so that the calibrated range ends up near the mid point of the span or higher.
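As a rough illustration of the example above (a minimal Python sketch, assuming a simple linear transmitter; the 0-100 psi and 50-150 psi figures are just the numbers from this example):

```python
def pressure_to_ma(pressure_psi, lrv=0.0, urv=100.0):
    """Linear 4-20 mA output for a transmitter calibrated from lrv to urv (psi)."""
    fraction = (pressure_psi - lrv) / (urv - lrv)   # 0.0 at the 4 mA point, 1.0 at the 20 mA point
    return 4.0 + 16.0 * fraction

print(pressure_to_ma(50.0))              # 12.0 mA at mid-scale of the 0-100 psi calibrated range
print(pressure_to_ma(50.0, urv=150.0))   # ~9.33 mA if the 20 mA point were instead set at 150 psi
```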

Hope this helps a little.
Roy
 

James Fountas

Hi Gopi,

The calibration range may not be the same as the span of an instrument. For example, if you want a temperature sensor with a span of 600 to 1400 C, but the critical process temperature is around 1150 C, you might do the following: order a temperature sensor with a span of 600 to 1400 C but request a three-point calibration at 1125, 1150 and 1175 C. You want accuracy around 1150 C, and outside of that you don't need high accuracy. Calibrating the instrument across the whole range may not give you great accuracy around the point that is most critical to you.

It is best to talk to the people at the other end about the paperwork and make sure that everyone understands it in the same way.


Regards,
James Fountas
 
Hi,

Just to add: range limits are basically a limitation on the operating range itself. Some are around 60 % and some are up to 75 % of the maximum operating limit. For example, if a transmitter has a maximum operating limit of 100 bar and a minimum operating limit of 0 bar, then the range limits will be around 0-60 or 0-75 bar; it depends on the vendor.

About the calibration range: I was told it is the range to which you assign 0 to 100 %, i.e. the 4-20 mA output corresponds to the values over which you want the transmitter to operate. Say you have a temperature transmitter with a maximum operating range of 100 C (using an RTD sensor, for example), but you only need it to operate from 12 C to 75 C (it depends on the process; this is just an example). You then calibrate it so that 12 C is 0 % or 4 mA and 75 C is 100 % or 20 mA. (Some say this is re-ranging rather than calibration, because calibration sets the 0-100 % values exactly equal to the maximum and minimum limits of the instrument.)

That's all I can tell you. You may also want to look on the internet, perhaps at an instrumentation e-learning site, because I am a little confused myself about the difference between calibration and re-ranging. Sorry if this doesn't help much.
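To put the 12-75 C example into numbers, here is a minimal sketch (again assuming a simple linear transmitter; the receiving DCS or indicator has to be configured with the same range to turn the current back into a temperature):

```python
LRV, URV = 12.0, 75.0   # re-ranged values for 4 mA and 20 mA, deg C (the example above)

def temp_to_ma(t):
    """Temperature to transmitter output current."""
    return 4.0 + 16.0 * (t - LRV) / (URV - LRV)

def ma_to_temp(ma):
    """What the receiving system does with the signal - it must use the same range."""
    return LRV + (URV - LRV) * (ma - 4.0) / 16.0

print(temp_to_ma(43.5))    # 12.0 mA at the mid-point of the range
print(ma_to_temp(20.0))    # 75.0 deg C at full scale
```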
 
Not all instruments require calibration... PRTs don't need it unless you are being extra fussy about accuracy, though some instruments do. The factory calibrated range is the range of values over which the instrument has been tested. This isn't the same as the operating range, which is the range over which it is actually used or can be used, nor is it the range of the outputs; that depends on the purpose of the instrument in the application.

So, if you have a PRT temperature sensor and want to use it over the range 50-70 degC, then you can set the span and bias on the 4-20 mA to be whatever you like. You might set them as 0 and 100 degC for an indicator (but match the indicator's span and bias to your sensor's span and bias), or 50 degC as 4 mA and 70 degC as 20 mA if it is for control.
 
And if you weren't confused enough before, you will be even more confused now after reading all the previous posts, LOL.

Roy
 
For range, calibration, sensor limits, and trim, please refer to the
technical white paper on this page:
http://www.eddl.org/DeviceManagement/Pages/Calibration.aspx

Range Limits: This should be called sensor limits. This is the minimum and maximum input the sensor can handle.

Calibration Range: The confusion is in the term "calibration" because it means three different things:

- Sensor trim: correct the reading when the sensor has drifted

- Range setting: set the points corresponding to 4 mA and 20 mA respectively

- Current trim: adjust the output current if it does not match the DCS AI card

For instance, a differential pressure transmitter may have a sensor that can measure from -250 inH2O to +250 inH2O. These are the lower and upper sensor limits.

For that DP transmitter you may set a smaller range, 0-200 inH2O, to measure the level in your tank. These are the lower and upper range values.

When you receive your DP transmitter from the manufacturer, the sensor trim has already been done in the factory, so there is no need to redo it. However, after years in operation the sensor may drift and need a sensor trim. In your workshop your European deadweight tester may use metric weights, so you cannot test at exactly 200 inH2O; instead you pick something close to your range values: 0 and 5000 mmH2O. These are your lower and upper trim points.
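As a quick back-of-the-envelope check of that conversion (the only assumption here is the usual 1 inH2O = 25.4 mmH2O water-column factor):

```python
MM_PER_IN = 25.4              # 1 inH2O = 25.4 mmH2O (same water column, same reference conditions)

urv_inh2o = 200.0             # upper range value of the DP transmitter
print(urv_inh2o * MM_PER_IN)  # 5080.0 mmH2O - the exact metric equivalent of 200 inH2O
print(5000.0 / MM_PER_IN)     # ~196.85 inH2O - what a round 5000 mmH2O trim point represents
```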

Cheers,
Jonas
 
Hi Gopi,

As far as I have worked, studied, and learned from my seniors, the difference between the calibration range and the instrument range is as follows.

Instrument Range: this is the maximum range over which an instrument can work. For example, if the instrument range of a temperature transmitter is -50 to 1000, then that is the maximum temperature the transmitter can bear while still functioning normally.

Calibration Range: this is the range that you set according to the process requirement. For example, if in the above temperature transmitter case our process requirement is -30 to 800, then we will calibrate the transmitter between these two temperatures.

So in the data sheet we put -50 to 1000 as the instrument range and -30 to 800 as the calibrated range.
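A trivial sanity check of those two data sheet fields might look like this (a sketch using the example figures above):

```python
def calibration_range_ok(instrument_range, calibration_range):
    """True if the calibrated range sits inside the instrument range."""
    (inst_lo, inst_hi), (cal_lo, cal_hi) = instrument_range, calibration_range
    return inst_lo <= cal_lo < cal_hi <= inst_hi

print(calibration_range_ok((-50, 1000), (-30, 800)))   # True  - consistent data sheet entry
print(calibration_range_ok((-50, 1000), (-60, 800)))   # False - below the instrument's lower limit
```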

Regards,
Maria
 

Mark Van Donsel

All,

So, to summarize what I have read here:

1. The calibrated range is whatever you program the 4-20 mA signal to be, e.g. 4 mA = 0 degrees F and 20 mA = 100 degrees F.

2. The instrument range is a totally different animal, encompassing the entire range an instrument is able to measure.

3. If I had an RTD that was calibrated (3-point) from 0-120 degrees C, a transmitter that had a calibrated range of 0-100 degrees C, and a normal operating range of 50-70 degrees C, my accuracy would be unaffected by the fact that the ranges of the RTD and transmitter are different, because the operating range and the calibrated range of the transmitter are well within the calibrated range of the RTD. I am going to take a leap of faith here and say that it would change nothing and have no effect on accuracy or performance if I had changed my transmitter's calibrated range to 0-120 degrees C under the same circumstances. Would you agree?
 
Are you aware that an RTD is calibrated on 'paper' only? There is no means of adjusting the output of an RTD.

The RTD calibration certificate tells you how much error the RTD had at the given calibration points. Whatever error occurs at a given temperature is faithfully replicated by the 'calibrated' RTD at that temperature.

The fact that you have a 'calibrated' RTD might be of interest to the bureaucrat who audits your documentation in a regulatory environment but means nothing to the temperature transmitter.

There are systems (PAC/PLC/DCS) that allow for a numerical offset, a two-point zero/span adjustment, or even a multi-point look-up table adjustment of an analog input signal value that 'corrects' for known RTD or T/C error, if you have those features and go to the effort to use them.
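A multi-point look-up correction of that kind is just linear interpolation between the known error points from the sensor's calibration certificate. As a sketch (the certificate figures below are made up purely for illustration):

```python
# Hypothetical certificate data: (indicated deg C, error deg C) at three calibration points
CAL_POINTS = [(0.0, 0.12), (60.0, 0.05), (120.0, -0.08)]

def corrected(indicated):
    """Correct an indicated temperature by interpolating the certificate errors."""
    if indicated <= CAL_POINTS[0][0]:
        err = CAL_POINTS[0][1]
    elif indicated >= CAL_POINTS[-1][0]:
        err = CAL_POINTS[-1][1]
    else:
        for (x0, e0), (x1, e1) in zip(CAL_POINTS, CAL_POINTS[1:]):
            if x0 <= indicated <= x1:
                err = e0 + (e1 - e0) * (indicated - x0) / (x1 - x0)
                break
    return indicated - err   # error = indicated - true, so subtract it

print(corrected(60.0))   # 59.95
print(corrected(90.0))   # 90.015
```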

There are likely high-end temperature transmitter models out there that do similar things. And there are also cheap analog TTs with low-end specs.

With regard to whether changing the transmitter's range from 0-100 to 0-120 deg C has an effect on accuracy: what does the temperature transmitter's spec sheet say the accuracy spec is?
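For example, if the accuracy spec happens to be quoted as a percentage of the calibrated span (a hypothetical +/-0.1 % figure is assumed below), then widening the range does change the absolute error:

```python
ACCURACY_PCT_OF_SPAN = 0.1   # hypothetical spec: +/-0.1 % of calibrated span

def abs_error_degC(lrv, urv):
    """Absolute error implied by a %-of-span accuracy spec for a given calibrated range."""
    return (urv - lrv) * ACCURACY_PCT_OF_SPAN / 100.0

print(abs_error_degC(0.0, 100.0))   # +/-0.10 deg C for a 0-100 deg C calibrated range
print(abs_error_degC(0.0, 120.0))   # +/-0.12 deg C for a 0-120 deg C calibrated range
```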
 

Mark Van Donsel

I am aware that RTDs are calibrated on paper. Given that the operating range of the process being measured is 50-70 C, the only thing that could possibly happen by changing the calibrated range from 0-120 to 0-100 is that it may become slightly more accurate.

What I am really looking for here is confirmation of what my intuition is telling me: that the ruffled feathers over the calibrated-range change are only bureaucracy and really have nothing to do with the operation or accuracy of the instruments in question.

Thank you very much for your response.
 