I have an application where volumetric air flow is measured through venturi tubes and SMART Differential Pressure Transmitters (DPTs).
The DPT sensor's Lower Range Limit and Upper Range Limit are -10 mbar and +10 mbar, respectively.
The DPTs are calibrated for 0 to 2 mbar (LRV to URV).
In the PLC, the loops are configured for 0 mbar to 4 mbar. The new LRV/URV were set in the DPTs through a HART communicator, without applying these pressures.
So the transmitters operate beyond their calibrated range. I think this is not O.K., but can someone be a little more specific in explaining this situation and its possible drawbacks (e.g., what about measurement accuracy, repeatability, etc.)?
My e-mail is firstname.lastname@example.org
It depends on how you use the word 'calibrate'.
Does 'calibrate' mean factory custom calibrated (not the standard factory calibration over the full working range) with the factory's pressure source and calibration gear or does it mean configured to a certain range within its working range limits?
All the major vendors will custom calibrate to a specific range, which more or less invalidates the general stated accuracy when re-ranging to a different range. Without submitting that particular DPT to a calibration check at the increased range (which requires fairly expensive calibration gear), you have no means of determining how the change in slope of the custom calibration affects a 0-4 mbar range, which lies outside the 0-2 mbar custom calibrated range. It might be negligible, it might be significant; there's no way to know.
If the 0-2 mbar range is just a configured range within the standard calibrated range of -10 to +10 mbar, then using the vendor's accuracy calculation from their spec sheet will tell you the accuracy difference between the 0-2 mbar and 0-4 mbar ranges. The calculated accuracy is likely to get slightly better over the wider range.
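As a rough sketch of that spec-sheet calculation: many vendors express reference accuracy as a percentage of span that degrades with turndown (URL divided by span). The formula below is purely hypothetical, for illustration; the real coefficients come from your transmitter's datasheet.

```python
URL = 10.0  # upper range limit of the sensor, mbar

def reference_accuracy_pct_of_span(span_mbar, url=URL):
    """Hypothetical turndown-based accuracy spec:
    +/-(0.015 + 0.005 * URL/span) % of span. Illustrative only."""
    turndown = url / span_mbar
    return 0.015 + 0.005 * turndown

for span in (2.0, 4.0):
    pct = reference_accuracy_pct_of_span(span)
    abs_err = pct / 100.0 * span
    print(f"0-{span:g} mbar span: +/-{pct:.4f}% of span = +/-{abs_err:.5f} mbar")
```

Note how the percentage-of-span figure improves at the wider 0-4 mbar span (lower turndown), consistent with the point above, even though the absolute error in mbar grows with the span.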
The assumption when a device gets a custom factory calibration is that the range for the application is known and will not be changed over the life of the instrument.
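One more point worth quantifying for a venturi application: flow is proportional to the square root of the differential pressure, so a fixed DP error has a relative flow impact that shrinks at high DP and blows up at low DP. The sketch below assumes a hypothetical lumped flow coefficient K; the 0.004 mbar uncertainty is an arbitrary example value, not your transmitter's spec.

```python
import math

K = 100.0  # hypothetical flow coefficient (flow units per sqrt(mbar))

def flow(dp_mbar):
    """Venturi square-root characteristic: Q = K * sqrt(DP)."""
    return K * math.sqrt(max(dp_mbar, 0.0))

dp_err = 0.004  # example DP uncertainty, mbar (illustrative)
for dp in (0.1, 1.0, 2.0):
    q = flow(dp)
    rel_err = (flow(dp + dp_err) - q) / q
    print(f"DP {dp} mbar: flow {q:.1f}, flow error ~{100 * rel_err:.2f}%")
```

Near full scale the relative flow error is about half the relative DP error, but at the bottom of the range the same DP error becomes a much larger fraction of the reading, which is where an unverified re-range hurts most.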