RTD Calibration

Thread Starter

Anonymous

Hi guys. I need to know what type of device is required to calibrate RTDs. I've looked through tons of catalogs and I am really confused. My PLC system uses platinum RTDs connected to an AB Flex I/O analog input module. I want to make sure the temperature that is being displayed is accurate and also create a benchmark so a preventive maintenance program can be adopted. Thanks for any help.
 
Mike Ryan at ICT

Try this for a quick calibration at the 0C and 100C points. Using distilled water, make crushed ice and mix with more (distilled) water. Put this mixture in an insulated container (like a picnic jug) and circulate the mixture. Insert the RTD and check for 0C/32F. Now boil the water. Insert the RTD and check for 100C/212F.

Using common tap water will introduce errors due to dissolved minerals. The distilled-water/ice mixture sits at 0C as long as ice and liquid water co-exist. The mixing (a slow-speed paint mixer attached to a drill) maintains a uniform temperature distribution. It is not necessary to use a mixer at the boiling point. All of this is affected by ambient pressure, but I don't think there is much error at normal pressure.

If you are only interested in validating the factory calibration, this should work and it is not expensive. If you need to test at other points, you will probably need to contact a calibration service. Testing RTDs requires precision resistors and current sources that are traceable to NIST (formerly the National Bureau of Standards).
 
Saeed Beheshti Maal

Usually, for calibration of an RTD you require a precise temperature-controlled oil bath to immerse the RTD in, and a precise resistance measuring device, normally a Wheatstone bridge. The temperature/resistance tables for RTDs can be found in the IEC standards, against which you can check the calibration of your RTD.
 
The Pt (platinum) RTDs we use have no adjustments. You have to correct for any problems in your software. I use Concept FBD for my programming and made my own custom block for adding/subtracting a correction factor to a measured temperature.
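If you are doing the same kind of correction somewhere other than Concept FBD, the idea is just a one- or two-point trim applied to the measured value. A minimal sketch in Python (the function name and the example readings are mine, purely for illustration):

```
# Two-point (zero/span) trim: map what the loop reports at two known
# reference temperatures onto the reference values themselves.
def make_trim(measured_lo, actual_lo, measured_hi, actual_hi):
    gain = (actual_hi - actual_lo) / (measured_hi - measured_lo)
    offset = actual_lo - gain * measured_lo
    return lambda measured: gain * measured + offset

# Hypothetical example: the loop reads 0.8 C in the ice bath and 99.1 C in
# boiling water, so stretch/shift all readings accordingly.
trim = make_trim(0.8, 0.0, 99.1, 100.0)
print(trim(50.0))   # corrected mid-range reading, roughly 50.05 C
```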
 
M
Interesting -- the RTDs I've used don't have any calibration pots on them. <smile>

For a quick 'sanity check' just dunk the RTD into a cup of ice water.

Mark
 
Bob Peterson

Depends on what you want to do.

If you have a well-calibrated thermometer, you can easily set up a three-point calibration yourself.

There really is no way to "calibrate" an RTD itself though.

If it is connected to a transmitter, you can tweak the span and zero on the transmitter to get the results you want, or if it goes to a PLC you can tweak the scaling.
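The "tweak the scaling" part is ordinary linear scaling from raw counts to engineering units. A minimal sketch, with made-up count and temperature ranges (check your module's data sheet for the real ones):

```
def scale(raw, raw_min, raw_max, eu_min, eu_max):
    # Linear scaling from raw analog-input counts to engineering units.
    return eu_min + (raw - raw_min) * (eu_max - eu_min) / (raw_max - raw_min)

# Hypothetical example: 0..32767 counts mapped to -50..+250 degC.
# Nudging eu_min/eu_max is the software equivalent of a zero/span trim.
print(scale(16384, 0, 32767, -50.0, 250.0))   # about 100 degC at mid-scale
```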
 
D
Took this out of the RSLogix help file (search for "RTD"):

This tab, reached after entering an RTD module in your configuration and accessing its Advanced Configuration (Cal) tab, allows you to disable the automatic calibration feature of the module.

Leaving this check box empty (default setting) enables periodic calibration, which occurs once every 5 minutes. Calibration is performed once when the bit is cleared to zero and every 5 minutes thereafter. You can program the calibration period to occur at a particular time or periodically through the use of ladder, using the enable and disable bits. Register C:e.6 is used to enable or disable periodic calibration.
Placing a check in this check box sets the calibration disable bit to 1 and disables periodic calibration.

RSLogix 500 - Copyright Rockwell Software 2000, 2001, 2002

PS: In my experience with RTDs your readings are more than likely correct, unless there is other programming scaling the final value.
 
Curt Wuollet

You could divide and conquer by using lab-standard resistors to calibrate the setup. Temperature standards are problematic, but a stirred bath with traceable instrumentation should get you close. Absolute accuracy in temperature measurement is the Holy Grail.

Regards

cww
 
Hello Anonymous

I am not sure what you mean by "benchmark".

We have developed a procedure for the food processing areas of the plant I work at to test the RTDs. We use the manufacturer's specs as the rule for finding out-of-tolerance loops. The equipment used to test is a simple hot box and an ice bath when needed. The hot box is NIST certified and the thermometers are calibrated. All of our testing equipment is NIST certified and sent out annually to maintain the certification. Periodically the reports are reviewed by FDA and USDA reps to ensure we maintain a safe, healthy product. Since there is normally no way to adjust an RTD, any out-of-tolerance loop is due to the transmitter or terminations deteriorating.

When the manufacturer's specs are used, all of the loop devices are considered in that equation, as is the tolerance of the test equipment. We always test in place.

Regards,
Mark
 
Saeed Beheshti Maal

The calibrating medium for 0° C is OK, but the method described for 100° C is only valid if the boiling is done at sea level; otherwise the boiling point of water decreases below 100° as altitude above sea level increases.
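To put a number on that, the boiling point can be estimated from the local barometric pressure with the Antoine equation for water. A rough sketch (the constants are the commonly published ones for water between about 1° C and 100° C, so treat it as an estimate only):

```
import math

def boiling_point_c(pressure_mmhg):
    # Antoine equation solved for temperature: water, P in mmHg, T in degC.
    A, B, C = 8.07131, 1730.63, 233.426
    return B / (A - math.log10(pressure_mmhg)) - C

print(boiling_point_c(760))   # ~100 degC at standard sea-level pressure
print(boiling_point_c(632))   # ~95 degC, roughly the pressure at 1500 m
```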
 
No discussion of calibration can take place without first discussing what level of accuracy you desire. None of the comments in this list mention this, which is quite surprising to me. The ice-water method is not very accurate for many reasons, including impurities in the water, mixing issues, etc.

If your desired uncertainty is 3C, then no problem. If you need less than 0.1C, then as a rule of thumb in metrology, you need a standard that is 0.01C. Actually, when you get into this area, you need to consider rental or purchase of primary standards as defined by the ITS-90 schedule (google it if you don't follow). The temperature scale is defined around the triple point of water which is set at exactly 0.01C. A good TPW cell will last many years and give you 0.0001C uncertainty.

We use these routinely to calibrate our temperature probes.

Also, if you are measuring only a small range, then thermistors are by far the best and most accurate technology, with 10-year stability.
 
RTDs need not be calibrated by customers as they are factory calibrated and fixed. Their accuracy depends upon the material's temperature vs. resistance characteristic. Platinum has a quite linear characteristic. Calibration is only done on the transmitter, where ZERO and SPAN adjustments are provided. For SMART transmitters, calibration is done via a SMART communicator. Refer to the RTD table for your type of RTD for the correct resistance value; then you can inject the corresponding resistance into your transmitter in place of your RTD.
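For reference, the resistance values to inject for a standard (alpha = 0.00385) Pt100 can be computed from the IEC 60751 polynomial rather than looked up in a table; a short sketch for check points at and above 0° C:

```
# IEC 60751 coefficients for standard platinum RTDs, valid for 0 degC and up.
A, B = 3.9083e-3, -5.775e-7

def pt100_ohms(t_c, r0=100.0):
    return r0 * (1 + A * t_c + B * t_c ** 2)

for t in (0, 50, 100, 200):
    print(t, round(pt100_ohms(t), 2))
# 0 -> 100.0, 50 -> 119.4, 100 -> 138.51, 200 -> 175.86 ohms
```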
 
Gerald Beaudoin

We get around the sea level/barometric pressure issue by boiling water and then checking the temperature with a certified reference thermometer. This then indicates the boiling temperature for "the next little while"... unless a relative low or high pressure system moves into the area rapidly. We then use the indicated temperature instead of 100C or 212F as the proper boiling point of water on that particular occasion.
 
Ron Ainsworth

PRTs and RTDs are calibrated all the time by instrument technicians and calibration technicians in companies concerned about quality and risk management. Although PRTs and RTDs cannot be physically adjusted, the transmitter, indicator, or PLC will typically allow you to adjust the coefficients that characterize the PRT or RTD.

Manufacturers will use the tables referenced by IEC 60751 and ASTM E1137 by default. These are standard coefficients for the Callendar-Van Dusen equation, which describes PRT and RTD behavior. Corrected coefficients can be acquired by measuring temperature and resistance pairs across the span of the thermometer and solving the equation algebraically (or with software).
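For reference, the Callendar-Van Dusen form above 0° C is R(t) = R0·(1 + A·t + B·t²), with an extra C·(t − 100)·t³ term below 0° C; the IEC 60751 default coefficients are A = 3.9083e-3, B = -5.775e-7, C = -4.183e-12. Here is a minimal sketch of solving for corrected A and B from measured pairs with a least-squares fit (the readings are invented example numbers, and numpy is assumed to be available):

```
import numpy as np

# Measured calibration pairs: temperature (degC, from the reference probe)
# and resistance (ohm, from the readout) -- example numbers, not real data.
t = np.array([0.0, 50.0, 100.0, 150.0])
r = np.array([100.02, 119.45, 138.58, 157.36])

# Above 0 degC: R = R0 * (1 + A*t + B*t^2).  Take R0 from the ice-point
# reading and solve the remaining linear system for A and B.
r0 = r[0]
M = np.column_stack([t, t ** 2])
(A, B), *_ = np.linalg.lstsq(M, r / r0 - 1.0, rcond=None)
print(A, B)   # corrected coefficients to enter in the transmitter/indicator
```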

The right equipment to use for this depends on the temperature range and the accuracy of the thermometer, but in general you need a temperature source to heat or cool the sensor and you need a traceable temperature display to indicate the actual temperature. A dry-well, Metrology Well, or Microbath will typically meet the requirements.

To calculate new coefficients you will need an instrument to indicate the resistance of the RTD or PRT at each temperature. A DMM, or even better, a specialized temperature calibration readout will do the job.

The accuracy required will depend on the accuracy of the device being calibrated. Most organizations require a 4:1 test uncertainty ratio. So a ±1°C device typically needs a ±0.25°C calibrator.
 
What is considered a "small range" in the realm of temperature measurement? We go from about 65 degF (cold water) to about 242 degF (boiling food mixture).
 
One thing to remember, which I have just fallen foul of, is to calibrate the RTD with the leads attached. I have just had six PT100 sensors made up with 6m long leads, but the leads were (from memory) 0.075 ohms per metre. Over 12m (out and back), this is 0.9 ohms, which for PT100s is over 2 degrees Celsius (a quick calculation is sketched after the list below). I believe you can get around this by:

a) using lower resistance cables. I don't know why we selected that cable. I expect it's because I'm thick.

b) using three- or four-wire PRTs. Four-wire PRTs are the most accurate; two of the four wires form a Kelvin connection. Three-wire PRTs effectively cancel out the lead resistances, provided the leads are similar, which they usually are.

c) using PT1000 PRTs. These have a resistance of 1000 ohms at the freezing point, so any resistance in the leads distorts the readings by only a tenth as much.
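A quick sketch of the numbers above, using the nominal 0.385 ohm/degC average sensitivity of a Pt100 between 0 and 100 degrees Celsius (ten times that for a Pt1000):

```
# Apparent error from 2-wire lead resistance, Pt100 vs Pt1000.
lead_ohms = 2 * 6.0 * 0.075      # 2 conductors x 6 m x 0.075 ohm/m = 0.9 ohm

print(lead_ohms / 0.385)         # Pt100: about 2.3 degC of offset
print(lead_ohms / 3.85)          # Pt1000: about 0.23 degC -- a tenth as much
```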
 
OK, OK, I've read everyone's responses, and hey, there are some great ideas. But everyone seems to miss the one big problem: unless you're an OEM manufacturer, you can't calibrate an RTD. You can bombard them with gamma rays if you like and you're still not going to change them to indicate correctly. Now, that said, you can "verify" their calibration, and the best way is always a third-party certified device. And hey, there are hundreds of those out there in the world.
 