How to Calibrate the Gas Turbine Vibration Sensors


Thread Starter

Any

We have an Ansaldo V64.3 model gas turbine. The turbine supervisory instrumentation (TSI) vibration sensors are being removed and re-fitted for a GT major overhaul. Could you please send, as early as possible, the procedure for calibrating the vibration sensors, along with the procedures to follow before removal and before re-fitting?
 
Any,

To verify the accuracy of the calibration of any sensor, one needs to know how the sensor works. If you are referring to velocity-type vibration pick-ups (sensors), commonly called "seismic" vibration pick-ups, they have a small weight suspended from the top of the sensor housing by a spring. There is a permanent magnet on that weight, and a coil surrounding the permanent magnet in the housing. When the sensor is "shaken" (vertically, up and down) the permanent magnet moves and generates a small voltage in the stationary coil. The magnitude of that voltage is a function of the movement of the permanent magnet/weight, which in turn is a function of the force caused by the vibration: more movement, more voltage.

The output is scaled, usually in millivolts/unit of measure (in/sec, or mm/sec, for example).
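As a minimal sketch of that scaling, the conversion from output voltage to vibration velocity is just a division by the sensitivity. The 500 mV/(in/s) figure below is a hypothetical example value, not a spec for any particular sensor; use the number from your sensor's data sheet.

```python
# Convert a velocity pick-up's output voltage to a vibration velocity reading.
# SENSITIVITY is a made-up example value -- take the real figure from the
# manufacturer's data sheet for your specific sensor.
SENSITIVITY_MV_PER_IPS = 500.0  # millivolts per (inch/second)

def velocity_from_output(output_mv: float) -> float:
    """Return vibration velocity in in/s for a measured sensor output in mV."""
    return output_mv / SENSITIVITY_MV_PER_IPS

print(velocity_from_output(250.0))  # 250 mV -> 0.5 in/s
```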

None of the velocity vibration pick-ups I have worked on had an ability to adjust the sensitivity of the output. So, this idea of "calibrating" velocity vibration pick-ups is incorrect. The best one can do is to find what's commonly called a "shaker table", which has the ability to generate a variable amount of vertical movement at some frequency, mount the vibration sensor on the table, and measure the output versus the applied movement ("vibration"). It's a pass or fail test--either the output is proportional to the applied movement of the shaker table, or it's not. There's no adjustment on the vibration sensor.

To test linearity of the sensor output, one usually applies several different "vibrations" from the shaker table and records--then analyzes--the sensor output to ensure it's linear.
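The analysis step described above can be sketched in a few lines: fit a straight line to the recorded points and flag the sensor if any point deviates from the fit by more than a tolerance. The applied velocities, measured outputs, and the 2%-of-full-scale tolerance below are all illustrative assumptions, not values from any standard or data sheet.

```python
# Linearity check sketch: several known shaker-table velocities vs. the
# sensor outputs recorded at each one. Data values are made up for
# illustration (a perfectly linear 500 mV-per-in/s sensor).
applied = [0.1, 0.2, 0.4, 0.8]          # in/s applied by the shaker table
measured = [50.0, 100.0, 200.0, 400.0]  # mV recorded from the sensor

# Ordinary least-squares fit of measured = slope * applied + intercept
n = len(applied)
mean_x = sum(applied) / n
mean_y = sum(measured) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(applied, measured))
         / sum((x - mean_x) ** 2 for x in applied))
intercept = mean_y - slope * mean_x

# Worst-case deviation from the fitted line, as a fraction of full scale
max_dev = max(abs(y - (slope * x + intercept))
              for x, y in zip(applied, measured))
full_scale = max(measured)

print(f"sensitivity ~ {slope:.1f} mV/(in/s), "
      f"max deviation {100 * max_dev / full_scale:.2f}% of full scale")
# Since there's no adjustment on the sensor, the result is pass/fail:
print("PASS" if max_dev / full_scale <= 0.02 else "FAIL -- replace sensor")
```

Since the post stresses there is no adjustment to make, the only decision the script supports is pass (keep the sensor) or fail (replace it).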

But, there's no "calibration" to be done--presuming we are talking about the majority of velocity (seismic) vibration pick-ups.

If the turbine at your site uses another type of vibration sensor (proximity, or accelerometer) then one has to understand how the sensor works and then find the appropriate "simulator" to measure output. But, in both of these cases (proximity and accelerometer) I've never encountered sensors with adjustment, either. One can only check that the output is as per specification by applying a simulated input, and verify linearity. But, an adjustment cannot be made to the scaling/sensitivity of the devices.

Most device manufacturers have some information on their website about testing methods or criteria. And, if they don't--they usually have some descriptions of how their sensors work that can be used to devise a test.

But, to the best of my knowledge the overwhelming majority of vibration sensors (of just about every common type) cannot be "calibrated" in the field like a pressure switch or temperature transmitter can be calibrated. There's no adjustment of sensitivity or zero or span or scaling--it is what it is. And, if a test determines the output is not linear, or the device is not producing the specified output per input, then it must be replaced.

Hope this helps!
 
I think that the information given by CSA answers the original question very well. I would like to emphasize the fact that vibration sensors are never calibrated. They can be checked, but not calibrated. What operators usually mean by "calibrating" is just checking that the output signal of the sensor is linear. If it is not, the sensor has to be replaced.

The fact that some of the devices used to perform this verification are called "calibrators" helps perpetuate the misunderstanding.

These are examples of the equipment described by CSA to check the linearity of proximity sensor systems:

www.ge-mcs.com/download/test-and-calibration-equipment/1q07_nps_tk3.pdf

http://stiweb.com/downloadManuals/CMCP-TKPro_Manual.pdf

And also for accelerometers:
www.modalshop.com/calibration.asp/9110-Portable-Vibration-Calibrator?ID=784

Hope it helps. If you need further information on how these devices work, you can search with your preferred search engine; there are some interesting demo videos showing them in use.

<b>Moderator's Note:</b> When copying and pasting long URLs, make sure you remove any spaces the forum software places in the URL.
 
An historical reply related to the older TSI instrumentation...

One can "loop calibrate" the detector and the demodulator together: that is, input a simulated vibration signal and trim the card to give the proper display.

For the older GE seismic detectors, the expected output was something like 0.800 VAC rms for 15 mils at 3600 rpm. The recommended TSI card calibration was just to use a variable AC signal source to calibrate the card over 2.00 to 10.00 VDC at TP Blue.
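Taking those quoted figures at face value, they are easy to sanity-check: treating the vibration as a pure sine, 15 mils peak-to-peak displacement at 3600 rpm (60 Hz) works out to about 2.0 in/s rms velocity, which would make 0.800 VAC rms an implied sensitivity of roughly 400 mV/(in/s). This is only a back-of-envelope check on the arithmetic, not a statement of the actual GE detector specification.

```python
import math

# Sanity-check the quoted GE seismic figure: 0.800 VAC rms for 15 mils
# peak-to-peak displacement at 3600 rpm. For a pure sine, peak velocity
# is omega times the displacement amplitude (half of peak-to-peak).
freq_hz = 3600 / 60.0          # 3600 rpm -> 60 Hz
disp_pp_in = 15e-3             # 15 mils peak-to-peak, in inches
disp_peak_in = disp_pp_in / 2.0

v_peak = 2 * math.pi * freq_hz * disp_peak_in  # in/s, peak
v_rms = v_peak / math.sqrt(2)                  # ~2.0 in/s rms
sensitivity = 0.800 / v_rms                    # implied V per (in/s) rms

print(f"velocity ~ {v_rms:.2f} in/s rms, "
      f"implied sensitivity ~ {sensitivity * 1000:.0f} mV/(in/s)")
```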

I have taken the shaker table to the turbine and, using a test accelerometer, set the detector vibrating and trimmed for output. This will also compensate for cabling and connection losses.

An easier way was to bench-check the detectors and use the as-found bench output as the AC input into the TSI card for its calibration. The problem here is that if a marginal detector has its loop compensated this way, what was done should be fully documented.

Besides magnitude, the phase polarity of the vibration and of the detector's output becomes critical if the detectors are used for balancing. Although I have found the internal wiring of detectors in error from the factory (B pin high), the more probable error will be in the cabling and connections, so a polarity check on a mounted detector should be performed. The cabling can easily be checked pin-to-pin for continuity, but the detector itself will need a "drop spike" measured with a strip recorder.
 