process instrumentation testing


Thread Starter

Mark Ray

Hello A-list
How or where can I find information and guidelines for testing RTDs, thermocouples, and pressure transmitters for accuracy? I would like to know the accepted practices and the recommended intervals for performing these tests. It has become my responsibility to establish and implement an accuracy-verification schedule for the process control instruments used on our pasteurizers.
Mark Ray
I agree Fluke has the best handheld instruments for RTDs and other thermal devices, but you will probably need a HART calibrator for pressure transmitters. In addition, each device should have its relevant spec sheet with it, along with test and calibration procedures.
Ron L
A number of companies make HART calibrators, including Fluke, Rochester, Beamex, and others. These calibrators typically come programmed with the required test procedures for a wide variety of HART field devices. Calibrating the sensor requires appropriate references and special calibrators to supply the pressure, temperature, and other reference sources.

On the other hand, calibration of the loop current and reranging the field device is typically done using standardized HART commands. You should expect even the simplest HART host application to be able to rerange the field device or test the loop current. This is really simple stuff.

Wally Pratt ([email protected])
Chief Engineer
HART Communication Foundation
At present I try to keep it simple by using an ice bath and boiling water for physical constants to test the temperature devices. Pressure is applied with compressed air and measured with a NIST certified instrument. All of our devices go back to a plc and it is scaled accordingly.
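The two-point check described above (ice bath near 0 °C, boiling water near 100 °C, then linear scaling in the PLC) amounts to simple interpolation. A minimal sketch follows; the raw count range and the 0-150 °C span are hypothetical, not from the original post:

```python
def scale(raw, raw_lo, raw_hi, eng_lo, eng_hi):
    """Linearly map raw ADC counts to engineering units."""
    return eng_lo + (raw - raw_lo) * (eng_hi - eng_lo) / (raw_hi - raw_lo)

# Hypothetical RTD input channel: 4000-20000 counts mapped to 0-150 degC.
temp_at_ice = scale(4000, 4000, 20000, 0.0, 150.0)    # ice bath: expect ~0 degC
temp_at_boil = scale(14667, 4000, 20000, 0.0, 150.0)  # boiling water: expect ~100 degC
```

Reading the scaled value at both fixed points and comparing against the expected constants is exactly the verification being described; any offset or span error shows up as a deviation at one or both points.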
The handhelds work fine for setup and simple loop testing, but they do not test the sensor accuracy. We have them all: HART, Smart, BRAIN, and ProLink, plus plenty of Flukes. My concern about using the communicators is that the sensor performance is not tested unless a known physical input is applied at several different values. Furthermore, I want to know if there are recommended intervals for testing the temperature and pressure sensors given the type or application.
I am surprised there has not been more response to this topic yet, since I am talking about the pasteurization process, which has been around a while.
Mark Ray

Bruce Durdle

A point to watch, if you want "accuracy", is that any equipment used for checking transmitters needs to be about 3x more accurate than the item being tested. With a typical smart pressure transmitter specified to 0.1%, and the newer devices to 0.025%, you either need access to a very good test calibrator, or you simply rely on the transmitter to do its job and use only basic go/no-go tests. Try to check a 0.1% transmitter with a 0.1%-or-worse dead-weight tester and you're wasting your time, unless you are prepared to go to a great deal of trouble. (What is your local acceleration due to gravity? Here, it's 9.801 m/s^2 - assume 9.81 and the error in calibration is immediately 0.1%.)
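Bruce's gravity figure is easy to verify: a dead-weight tester generates pressure as P = m*g/A, so the relative pressure error equals the relative error in g. A minimal sketch, using his numbers:

```python
# Error from assuming g = 9.81 m/s^2 when local gravity is 9.801 m/s^2.
# Since P = m*g/A, the relative pressure error equals the relative g error.
g_assumed = 9.81
g_local = 9.801
gravity_error_pct = abs(g_assumed - g_local) / g_local * 100  # ~0.092 %

# The 3:1 rule of thumb: the reference should be ~3x more accurate
# than the unit under test.
transmitter_spec_pct = 0.1
required_reference_pct = transmitter_spec_pct / 3  # ~0.033 % of span
```

So the gravity assumption alone eats roughly the whole 0.1% error budget of the transmitter, which is the point being made.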

The ice point is a pretty good test, but the boiling point depends on atmospheric pressure, and therefore on altitude and weather conditions. Again, if you want "accuracy", you will need to take, e.g., a Fortin barometer reading and correct your readings accordingly. But a thermocouple is accurate to about 1 deg, and the effects of the thermowell etc. make the accuracy of the transmitter not very relevant in the overall system, even with RTDs. Even with pressure instruments, you may have to consider things like the head added by fluid in the impulse lines.
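To put a number on the barometric effect on the boiling point, here is a minimal sketch using the Antoine equation for water (coefficients valid roughly over 1-100 °C). Treat it as an illustration of the size of the correction, not a metrology-grade formula:

```python
import math

# Antoine equation for water, roughly valid 1-100 degC:
#   log10(P_mmHg) = A - B / (C + T_degC)
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_degC(p_mmhg):
    """Invert the Antoine equation: temperature at which water boils at p."""
    return B / (A - math.log10(p_mmhg)) - C

bp_std = boiling_point_degC(760.0)   # standard pressure: ~100.0 degC
bp_low = boiling_point_degC(712.6)   # ~950 hPa (about 500 m altitude): ~98.2 degC
```

A modest weather- or altitude-driven pressure change shifts the boiling point by well over a degree, i.e. more than the ~1 deg accuracy of a thermocouple, which is why the boiling-water point needs a barometer correction while the ice point does not.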

The present "state of the art" in instrument calibration reminds me of the hi-fi scene about 20 years ago. The aficionados would go into raptures about squeezing another Hz out of the frequency response of the amplifier circuit, but forget all about the cruddy dynamics of the pickup and cartridge assembly, not to mention the speakers. And, of course, most of the loss came in the wetware anyway - why try for a top frequency above 20 kHz when your hearing starts to roll off at 10?

Even with ordinary instruments, there is probably more error introduced by shoddy installation or incorrect location of instruments - with the capability of modern equipment, it is doubtful that there is any benefit in repeatedly checking for accuracy.

Now let's see if THAT gets some discussion going!


Curt Wuollet

Hi Bruce

Why would that provoke people? It's mostly true, except the hi-fi thing was more like 40 years ago here. People tend to take their measured readings way too seriously. I have seen very few installations that could possibly return readings that taxed the accuracy of the instruments. This is particularly true of temperature measurement. There are simply too many ways to degrade accuracy. Fortunately, most applications can get by with relative readings or are interested in changes. Really precise measurement with thermocouples is a laboratory thing, not something strung out all over a plant. RTDs are better, but you'd be amazed at how few people doing it know what a Kelvin measurement is about. Most of the recent instrumentation is not going to be a problem, leaving just the physics problems to be solved. They are too often ignored.


Hello, friends!

In everyday measurement and control work, what really matters is not the absolute accuracy or precision of your device, but its accuracy relative to the process you intend to control. There is no single optimal calibration procedure for temperature, pressure, or any other physical variable; there are only cost/performance trade-offs appropriate to each process. In my experience in the medical field, what is technically possible is almost always far beyond what is operationally relevant, and unless you are marketing a product where selling irrelevant capability is a way of differentiating yourself from the competition, you first need to specify your process characteristics before selecting a calibration procedure.