We have a differential pressure transmitter (orifice plate) measuring flow with the square root function enabled. I would like to know the procedure to calibrate the instrument bench. Is there a mathematical formula to calculate the values of pressure that I put in the instrument input for the signal 4 ~ 20mA output with the square root function enabled?
Q = k × sqrt(2gH)

where:
k = flow coefficient (found by calibration)
g = gravitational acceleration (m/s²)
H = differential head across the orifice (m)

You first have to calibrate for k, and since k also changes with Q, you have to measure k at various test values of Q. Only then can you use the equation.
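As a minimal sketch of that equation (SI units assumed, with a k already obtained from calibration against a known flow reference; the function name and default g are illustrative):

```python
import math

def orifice_flow(k, head_m, g=9.81):
    """Volumetric flow Q (m^3/s) from Q = k * sqrt(2*g*H).

    k      -- flow coefficient, obtained by calibrating against a
              known flow reference (it varies with Q, so use the k
              measured near the operating point)
    head_m -- differential head H across the orifice, in metres
    g      -- gravitational acceleration, m/s^2
    """
    return k * math.sqrt(2.0 * g * head_m)
```

Note the square-root shape: quadrupling the head only doubles the indicated flow.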
What the engineering book formulas leave out is that the calculations for a real world calibration are based on the percentage of design maximum flow rate and the percentage of DP at the max flow rate.
The square root calculation is done on a percentage value, not on a pressure value (in any of dozens of engineering units) and the result is a percentage of maximum flow.
Either percentage value (flow or pressure) can be converted back to pressure or flow values with respective engineering units, but the calculations are done with percentage values.
The 100% flow rate and 100% DP are the values provided in a 'sizing' document for the primary flow element.
To calculate differential pressure:
(flow rate as a percentage of the 100% flow rate, in decimal form)² = percentage of the differential pressure at 100% flow rate
To calculate flow rate:
square root of (percentage of the 100% DP) = percentage of flow rate at 100% design flow rate
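The two percentage relationships above can be sketched as follows (fractions in decimal form; the function names are illustrative):

```python
import math

def flow_fraction(dp_frac):
    """Flow as a fraction of 100% design flow, from DP as a
    fraction of the DP at 100% flow."""
    return math.sqrt(dp_frac)

def dp_fraction(flow_frac):
    """DP as a fraction of the DP at 100% flow, from flow as a
    fraction of 100% design flow."""
    return flow_frac ** 2
```

For example, 50% flow corresponds to only 25% of full-scale DP, which is why the low end of a square-root installation is so sensitive to small DP errors.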
There's a table of values as an example posted here:
I do not know the process flow.
I have a pressure transmitter with a range of 0 ~ 28 kPa and output of 4 ~ 20 mA. The square root function is enabled.
Today I calibrate using the following values:
4 mA = 0% = (0)² × 28 kPa = 0 kPa
8 mA = 25% = (0.25)² × 28 kPa = 1.75 kPa
12 mA = 50% = (0.5)² × 28 kPa = 7 kPa
16 mA = 75% = (0.75)² × 28 kPa = 15.75 kPa
20 mA = 100% = (1)² × 28 kPa = 28 kPa
Can someone deduce a formula?
mA output = SQRT(KPA / 28) * 16 + 4
Why not set the transmitter for linear output first, calibrate it as a normal DP transmitter over 0–28 kPa, and then reset it for square root output? The calibrated span should not change for 0 and 28 kPa. You can then check the calibration at various points with the above formula if you want.
Otherwise, use the above formula with the transmitter set for square root.
From the original information given I assume that the orifice plate has already been sized and the DP at maximum flow has been determined as 28 kPa. For the calibration of the transmitter the flow rate is irrelevant, as the transmitter has to be calibrated for a DP of 0–28 kPa. Just plug any kPa value from 0–28 into the formula I have given and you will get the corresponding mA output of the transmitter, which you can use for calibration. Set the output to 4 mA for 0 kPa and 20 mA for 28 kPa, and use the formula to verify intermediate kPa values.
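The linear-first check can be sketched like this (28 kPa span assumed): with square root enabled, the two transfer curves should agree only at the 0% and 100% endpoints.

```python
import math

SPAN_KPA = 28.0

def ma_linear(kpa):
    # plain linear 4-20 mA transfer
    return kpa / SPAN_KPA * 16.0 + 4.0

def ma_sqrt(kpa):
    # square-root extraction enabled
    return math.sqrt(kpa / SPAN_KPA) * 16.0 + 4.0

# Endpoints match; every intermediate point reads higher in
# square-root mode because sqrt(x) > x for 0 < x < 1.
for kpa in (0.0, 7.0, 14.0, 21.0, 28.0):
    print(f"{kpa:5.1f} kPa  linear {ma_linear(kpa):5.2f} mA  "
          f"sqrt {ma_sqrt(kpa):5.2f} mA")
```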
The unit for Q in the given equation is m³/s, and the DP must be expressed in metres of head.
The standard flow calculation for an orifice uses DP measured in metres, not percent, although you can of course work in percentages too.
The formula gives no direct result on its own. It has to be calibrated against some other absolute flow measurement before it can provide meaningful results; otherwise it is rubbish in, rubbish out.
I like what David Todd said about calibrating the transmitter in linear mode (techs are very familiar with that) and then setting transmitter for square root output. Excellent, pragmatic and very clear.
When done calibrating in linear (4,8,12,16,20), set the transmitter for square root output and check the loop current performance.
It should show:
4 mA = 0% = (0)² × 28 kPa = 0 kPa
8 mA = 25% = (0.25)² × 28 kPa = 1.75 kPa
12 mA = 50% = (0.5)² × 28 kPa = 7 kPa
16 mA = 75% = (0.75)² × 28 kPa = 15.75 kPa
20 mA = 100% = (1)² × 28 kPa = 28 kPa
See the calibration tutorial here:
DP (%)     DP (kPa)   SQRT (%)   Output (mA)
  0.00        0          0.00        4.00
 25.00        7         50.00       12.00
 50.00       14         70.71       15.31
 75.00       21         86.60       17.86
100.00       28        100.00       20.00
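The table above can be regenerated with a few lines (28 kPa span assumed from earlier in the thread):

```python
import math

SPAN_KPA = 28.0  # DP at 100% flow

for dp_frac in (0.0, 0.25, 0.50, 0.75, 1.00):
    kpa = dp_frac * SPAN_KPA
    flow_frac = math.sqrt(dp_frac)       # square-root extraction
    ma = 4.0 + 16.0 * flow_frac          # 4-20 mA loop current
    print(f"{dp_frac:7.2%} {kpa:6.1f} kPa {flow_frac:8.2%} {ma:6.2f} mA")
```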
Ditija: an RTD simulator has a manual input and an ohm output to the transmitter input. A TC simulator has a manual input and a mV output to the transmitter input.
If you are looking for Fieldbus and Profibus, those are used in transmitters, not in simulators.
Beamex has a calibrator with RTD and TC simulation capability which can also communicate over Fieldbus or Profibus. However, Fieldbus and Profibus are used for documentation, not for simulation; the simulation value is entered manually.
What is it you are trying to do? Is this a lab experiment?