Measuring Steam Quality

Thread Starter

ryanb

We're trying to measure the quality (dryness fraction) of the saturated steam feeding a hospital sterilizer, but we're having a hard time finding a good way to do it.

A throttling calorimeter seems to be the popular way to do it, but that requires drawing a sample, which we would rather avoid. We also need to take the measurement right at the sterilizer, and we can't draw a sample inside the sterilization lab.
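
For reference, the throttling calorimeter works by an enthalpy balance across its orifice: the wet steam expands to a known lower pressure where it ends up slightly superheated, the exit temperature is read, and the quality follows from h_f + x*h_fg at line pressure equalling the exit enthalpy. A minimal sketch of that arithmetic in Python, with approximate steam-table values standing in for real readings:

# Sketch of the throttling-calorimeter enthalpy balance. The enthalpy values
# below are approximate steam-table numbers for the example conditions and
# would need to be replaced with values for the actual line and exit pressures.

def dryness_fraction(h_f_line, h_fg_line, h_exit):
    # h_f + x * h_fg (saturated, at line pressure) = h_exit (superheated,
    # at the calorimeter exit pressure and measured temperature)
    return (h_exit - h_f_line) / h_fg_line

# Example: ~3 bar(a) line steam throttled to ~1 bar(a), exit reading ~110 C.
h_f = 561.5      # kJ/kg, saturated-liquid enthalpy at ~3 bar(a)
h_fg = 2163.8    # kJ/kg, latent heat of vaporisation at ~3 bar(a)
h_exit = 2696.0  # kJ/kg, slightly superheated steam at ~1 bar(a), 110 C

x = dryness_fraction(h_f, h_fg, h_exit)
print(f"dryness fraction x = {x:.3f}")  # about 0.99 for these numbers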

Does anyone know of a better way to do this? A method that doesn't require taking a sample would be ideal.
 
As long as you have a properly calibrated temperature sensor in the incoming steam, the amount of super-heat tells you everything.

Super-heat is the number of degrees above the condensation (saturation) point at the operating pressure; there is also a slight dependence on atmospheric pressure. So if you monitor the incoming, chamber, and discharge (condensate) temperatures, you have everything you need.
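
A minimal sketch of that check, assuming a short saturated-steam lookup table and placeholder readings (the pressures, temperatures, and table values below are illustrative, not taken from this thread):

# Compare a measured steam temperature to the saturation temperature at the
# measured absolute pressure. Near-zero superheat means steam at or near
# saturation; a clearly lower reading suggests condensate at the sensor.

SAT_TABLE = [(1.0, 99.6), (2.0, 120.2), (3.0, 133.5), (4.0, 143.6)]
# (absolute pressure, bar) vs (saturation temperature, C), approximate values

def t_sat(p_abs_bar):
    # linear interpolation within the short table above
    for (p1, t1), (p2, t2) in zip(SAT_TABLE, SAT_TABLE[1:]):
        if p1 <= p_abs_bar <= p2:
            return t1 + (t2 - t1) * (p_abs_bar - p1) / (p2 - p1)
    raise ValueError("pressure outside table range")

def superheat(t_measured_c, p_abs_bar):
    return t_measured_c - t_sat(p_abs_bar)

print(superheat(134.0, 3.05))  # ~0 C: near saturation at a typical sterilizer pressure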

 
>As long as you have a properly calibrated temperature
>sensor in the incoming steam, the amount of super-heat tells
>you everything.

ryanb clearly stated that the steam is saturated, so there is no superheat. I don't know of any method of measuring the quality of saturated steam other than by sampling.
 
Agreed on the technical definition of saturated steam, but the issue is a matter of "dryness fraction," suggesting some condensation is occurring before use.

So by knowing the incoming temperature (for a given chamber pressure), you can ensure that most of the condensation occurs where it is needed. Is it a perfect indication? No, but it does give you a practical check on the sterilization settings.

The sterilizer manufacturer has already done this homework; it is covered in the operating manual. You can measure the condensate temperature as well.
 