Accuracy Definition


Thread Starter


Dear Control Nerds,

I have a basic query on the accuracy definitions of pressure transmitters vs. flow meters.

Pressure transmitters almost always have their accuracy specified as a % of configured span. Flowmeters such as magnetic, Coriolis, or vortex flowmeters have their accuracy specified as a % of flow rate (% of reading).
Why is this so? Could the accuracy be defined the other way round for each?

Thanks in advance.
Static pressure is a scalar variable, traceable to independent calibration standards.

Flow measurement, however, is a variable that depends on the fluid, the uniformity of flow profile, fluid properties, pressure and temperature, flow rate, and pipe layout.

Only in extremely rare cases is a traceable calibration of the flow measurement possible for a specific fluid and set of flow conditions.

The accuracy is a combination of the uncertainty of the transmitter electronics and the sensors used, and the various uncertainties in the process.

In the past, indicator accuracy (or uncertainty) was often defined in terms of mid-range uncertainty (not full scale) as a more useful estimate. Flow measurement specifications tend to reflect that practice.
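To make the practical difference concrete, here is a minimal sketch comparing the two spec styles. The numbers are hypothetical (a 0.1%-of-span pressure transmitter on a 0-100 unit span vs. a 0.5%-of-reading flowmeter); the point is that a %-of-span error is constant in engineering units, so its relative error balloons at low readings, while a %-of-reading error stays proportional:

```python
# Compare absolute and relative error of "% of span" vs "% of reading" specs.
# Hypothetical numbers for illustration only, not any vendor's actual spec.

span = 100.0             # configured span, engineering units
pct_of_span = 0.001      # 0.1% of span: constant absolute error
pct_of_reading = 0.005   # 0.5% of reading: error scales with the reading

for reading in (100.0, 50.0, 10.0):
    err_span = pct_of_span * span           # same absolute error at any reading
    err_reading = pct_of_reading * reading  # shrinks as the reading drops
    rel_span = err_span / reading           # %-of-span error, relative to reading
    print(f"reading={reading:6.1f}  span-spec err={err_span:.2f} units "
          f"({rel_span:.2%} of reading)  reading-spec err={err_reading:.3f} units")
```

At 10% of span, the 0.1%-of-span transmitter's fixed 0.1-unit error amounts to 1% of the actual reading, ten times worse relative to the measurement than at full scale, whereas the %-of-reading flowmeter spec stays at 0.5% throughout its stated turndown.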