Hello everybody,
My question is how to calibrate the IGVs of an MS5001 gas turbine with a Mark VI, starting from the beginning, step by step.
I appreciate your feedback.
Have a good day.
*One* of the reasons these procedures aren't published is that there are subtle but very important differences between units (Frame 5 or Frame 7EA or Frame 9FA+e) in the exact steps to be followed, depending on the auxiliaries in use on the machine. (I'm not excusing the packagers of GE-design heavy duty gas turbines from the responsibility of providing proper instructions--they should--but we're not going to change those corporations or their deliverables, unfortunately.) So asking for these instructions for a particular machine in a public forum like this is a very difficult request to answer.
Also, many Mark VIs are being used as upgrades or retrofits to older, existing turbine control systems (Mark I, Mark II, Mark IV, etc.), and the auxiliaries in use on those machines also varied greatly. Many retrofit panels are SIMPLEX panels, and some are TMR panels; this is also an important consideration. To be able to detail the exact process step by step, we would need to see the P&IDs (what GE calls the "piping schematics" or schematic piping diagrams) for your particular unit, because it's much more than the AutoCalibrate procedure accessed via Toolbox: it involves establishing hydraulic pressure, and measuring the fully closed and fully open positions using the Manual positioning feature of AutoCalibrate, to be able to tell AutoCalibrate what the end-points of the stroke are. So trying to develop a "cook book" step-by-step procedure such as you're requesting via an Internet forum like this is nearly impossible without understanding exactly how the unit at your site is configured and controlled, and the P&IDs (piping schematics) are crucial to such an undertaking.
We're going to try to help you, but one of the questions I would like you to answer is why you feel the need to re-calibrate LVDT feedback, because there are only a few cases where it's required. One would be if the inlet compressor casing was disassembled, another would be if the IGVs were replaced, and another would be if the IGV actuator and/or linkage was disassembled. In other words, only if something was done to affect the physical stroke of the device--such as replacing the actuator or disassembling the linkage--is it necessary to recalibrate the LVDT feedback. Replacing a servo-valve does *not* require calibration of LVDT feedback, because simply replacing a servo does *nothing* to affect the physical stroke of the device. Only something that would affect the stroke length and its endpoints would really require a recalibration. (Replacing the LVDTs would require running the AutoCalibrate routine, but it wouldn't be necessary to measure the stroke of the device to put into the calibration procedure, since nothing was done to physically change the stroke; one could just re-use the previous stroke measurement. It would be prudent to verify the calibration after performing it.)
Also, simply "calibrating" LVDT feedback during a maintenance outage is really just asking for trouble, especially with units equipped with DLN combustion systems. As has been said before on control.com, most technicians never calibrate a temperature switch or a pressure transducer without first establishing the 'as-found' condition. And if the device is found to be calibrated properly, that's the end of the exercise; people don't continue to re-calibrate something that's found to be working properly. But almost *nobody* ever checks LVDT feedback before initiating a calibration sequence. If nothing has been done to affect the stroke of the device, it's not really necessary to calibrate the feedback; it should be verified, and calibrated only if necessary. I've seen more units end up in trouble when people "calibrate" LVDT feedback during a maintenance outage and don't do it correctly, especially the gas valves on DLN combustor-equipped units.
Calibrating IGVs *properly* requires a modified machinist's protractor, one with the rule cut down to allow it to be inserted between the IGVs when they are fully closed. Most sites don't have the proper equipment and simply use the indicator on the compressor casing. This is generally acceptable, and is really the norm. But, depending on the condition of the indicator (most mechanics/millwrights walk on them or drop things on them, and I've *never* seen one checked for accuracy or adjusted after a maintenance outage), this can be a risky proposition. Power output is very much a function of IGV angle; if they are not at the proper position when the unit is at Base Load, it can be either over-fired or under-fired. Over-firing means the combustion gas temperature is hotter than optimal/design, and hot gas path parts will deteriorate faster. Under-firing means the unit will not be producing as much power as it could (this isn't really related to efficiency, just power output).
One of the things about AutoCalibrate is that it's really kind of "dumb," meaning it has to be told what the device's position (the IGVs in this case) is at the minimum and maximum positions, so those positions have to be measured *before* an AutoCalibration is attempted. This is key to allowing AutoCalibrate to work properly: it has to know the physical limits of travel at both ends of the stroke. All AutoCalibrate can do is output maximum current to move the device to each of the endpoints, wait for the LVDT feedback voltage to stabilize, and then use the measured values to perform the calculation of the "offsets and gains." In the case of IGVs, one measures the angle in degrees and enters the angles in degrees into Toolbox. In the case of other devices, one measures the physical stroke (if appropriate) and then calculates the physical stroke as a percent of the specified 100% stroke length, so they're not entered in mm or inches, they're entered in percent.
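To make the "offsets and gains" arithmetic concrete, here's a minimal sketch of the standard two-point linear scaling that the calibration effectively performs. This is NOT GE's actual firmware code, and the voltages and angles below are made-up illustrative numbers--every unit differs, which is exactly why the endpoints must be physically measured first:

```python
def lvdt_offset_gain(v_min, v_max, pos_min, pos_max):
    """Two-point linear calibration: position = offset + gain * voltage.

    v_min, v_max     -- stabilized LVDT feedback voltages at the two
                        mechanical stops (fully closed, fully open)
    pos_min, pos_max -- the *measured* positions entered by the technician
                        (degrees for IGVs; percent of stroke for most valves)
    """
    gain = (pos_max - pos_min) / (v_max - v_min)
    offset = pos_min - gain * v_min
    return offset, gain

# Illustrative numbers only (assumed, not from any Control Specification):
offset, gain = lvdt_offset_gain(v_min=0.70, v_max=3.50,
                                pos_min=32.0, pos_max=86.0)

# Once scaled, any mid-stroke feedback voltage maps to an angle:
angle = offset + gain * 2.10
```

If the default factory endpoint values were used instead of measured ones, the same arithmetic would run but the resulting angles would be wrong for this machine--which is the point being made above.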
I will outline the steps involved and you can ask questions, being prepared to provide details if requested. I don't have a working copy of Toolbox connected to a turbine, and it's been many years since I've done this, so I'm trying to recall all the steps. Perhaps others here can provide clarifications as necessary. Also, this procedure presumes the LVDT zero-stroke voltage has been set correctly and that the outputs to the servo-valve have been verified for proper polarity; both of these are critical to a successful calibration.
You must generally be in OFF or CRANK mode to calibrate LVDT feedback. The unit speed must generally be less than approximately 28% of rated speed.
Establish hydraulic pressure. Some units have Auxiliary AC motor-driven pumps for Aux. L.O. and Aux. Hydraulic supplies; some units have to be cranked with the starting means to develop L.O. and Hydraulic pressures. This is one of the important differences.
Open Toolbox, provide the appropriate password to be able to force logic and perform AutoCalibration.
*Usually*, with 20TV-1 de-energized (presuming your unit has a 20TV-1) and hydraulic pressure established, the IGVs will move to the minimum (fully closed) mechanical stop. If the IGVs on your unit move to the minimum mechanical stop, measure and record the IGV position; you must use either a machinist's protractor or the IGV position indicator on the compressor casing (not the indicated angle from the LVDT feedback).
You need to find the servo-valve output (regulator) which has been assigned to the IGV servo, then right-click on the regulator and select 'Calibrate.' (The AutoCalibrate screen, with some details of each of the buttons, can be found in the Toolbox Help file; select Search, set the configuration to Maximum if it hasn't already been done, and when the database is complete, type "calibrate" in the search window and then double-click on the proper title in the lower window.)
Energize 20TV-1 (presuming you need to energize 20TV-1 to be able to position the IGVs) and use the 'Verify' (Manual position) feature of AutoCalibrate to move the IGVs to the maximum (fully open) mechanical stop; the easiest way to do this is to put in a position reference much greater than the expected maximum angle. Measure and record the IGV angle using either the machinist's protractor or the IGV position indicator on the compressor casing.
Here's where I'm a little fuzzy on the details. I don't recall where or when you put in the measured min and max positions--whether it's before or after you initiate an AutoCalibrate procedure. But at some point you need to input the minimum- and maximum position values so that when AutoCalibrate moves the device to the endpoints and you click on 'Fix,' it knows what position each LVDT feedback voltage corresponds to. You need to do this for both the minimum and maximum endpoints (fully closed and fully open; minimum mechanical stop and maximum mechanical stop). Then, when you click on 'Calibrate,' Toolbox will tell the VSVO card to calculate the "offset and gain" values for the LVDT feedback calibration.
When that's done, you need to verify the calibration using the 'Verify' (Manual positioning) feature. Do so by moving the IGVs to at least two different positions (one of them the maximum operating angle) and measuring the actual position (either with a machinist's protractor or with the IGV position indicator on the compressor casing). If everything is okay, you then need to click on 'Save' in the Calibrate area to write the values to the Control Constants so they will be retained after a future reboot, and don't forget to upload the new configuration to the HMI.
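That verification step amounts to comparing the indicated (LVDT-derived) angle against the physically measured angle at two or more positions. A sketch of the check, where the 0.5-degree tolerance is my own assumption for illustration, not a GE specification:

```python
def verify_calibration(points, tolerance_dga=0.5):
    """Compare indicated vs. physically measured IGV angles.

    points -- list of (indicated_dga, measured_dga) pairs taken at two or
              more positions, one of them the maximum operating angle.
    Returns True only if every pair agrees within the tolerance.
    """
    return all(abs(indicated - measured) <= tolerance_dga
               for indicated, measured in points)

# Illustrative readings only (assumed numbers):
ok = verify_calibration([(34.0, 34.2), (84.0, 83.9)])
```

If any point falls outside the tolerance you've decided is acceptable for your site, the calibration should be repeated before clicking 'Save.'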
Hopefully, you can answer some questions about your unit, and with the help of others who have access to Toolbox and Mark VI panels, we can develop a detailed procedure for you. But, that means we are going to need help from you, patience, and maybe you will be the one to provide the Toolbox details since you have access to Toolbox and a Mark VI panel.
But, I'd really like to know what's prompting the need to (re-)calibrate the LVDTs.
Hello everybody,
Thanks a lot for your reply; your explanations about my queries were helpful.
In fact, we are working at the South Baghdad-2 power station, which is currently under construction, so I asked about the IGVs because we will be doing the pre-commissioning on our 8 gas turbines (Frame 5).
Thanks for your help again.
I am a big fan of yours; I always study your answers, and I have acquired good knowledge within one year. I have 2 questions.
1. Before or after starting the AutoCalibrate procedure, do I have to enter the values of the min and max IGV angles for the 96TV-1 LVDT channels in the PCAAH1A hardware tab, and then download the parameters to the PCAA card?
2. When do I click Fix Min and Fix Max--when the servo current reaches its min and max values in the trend displayed during AutoCalibrate?
Please reply, sir.
You are referring to a Mark VIe turbine control system, correct?
1. Before you begin an AutoCalibration procedure, if you don't tell AutoCalibrate what the fully closed- and fully open IGV positions are then it can't properly calibrate (scale) the LVDT feedback. Without setting and downloading the min- and max positions AutoCalibrate won't know what the positions are when it drives the IGVs closed and open, so it can't scale (calibrate) the LVDT feedback.
2. Fix Min and Fix Max are to be done when the IGVs are at the minimum and maximum angles--and that is, as you correctly wrote, when the servo current reaches positive maximum (for closed) and negative maximum (for open) AND the LVDT feedback has stopped changing (or isn't changing). When you click on Fix Min, you are telling AutoCalibrate to use the min position value you downloaded to the PCAA card as the minimum position for this particular value of LVDT feedback. And when you click on Fix Max, you are telling AutoCalibrate to use the max position value you downloaded as the maximum position for this particular value of LVDT feedback. AutoCalibrate then calculates the offset- and gain values for scaling the LVDT feedback so that at any position the LVDT feedback should be nearly exactly equal to the actual physical position of the IGVs.
Without telling AutoCalibrate what the minimum- and maximum positions are, when you click on Fix Min and Fix Max AutoCalibrate will use the default values entered at the factory--which may or may not be correct. (For IGVs, they are usually incorrect, because the mechanical stops and IGV actuators are NOT always adjusted exactly the same for every machine.) One has to physically measure the fully closed (minimum) and fully open (maximum) IGV angles, input the unit-specific values into the proper fields of the PCAA card, and download them to the PCAA card BEFORE beginning the AutoCalibration procedure, so that when you click on Fix Min and Fix Max it knows what those positions are and can properly scale (calibrate) the LVDT feedback for that machine.
Even if the Control Specification says the minimum operating angle for the IGVs is, for example, 34 DGA, and the maximum operating angle is, for example, 84 DGA, the adjustment of the mechanical stops is to be slightly less than 34 DGA and slightly greater than 84 DGA--to prevent the actuator from hitting the stops when operating. When shut down, it's okay if the IGVs are slightly less than 34 DGA--the Mark VIe doesn't care about that (unless it's MUCH less than 34 DGA).
Hope this helps! (Thank you for the kind words! It's gratifying to hear my writings have been of use.)
>Hope this helps! (Thank you for the kind words! It's
>gratifying to hear my writings have been of use.)
Hello Mr. CSA, yes, very useful; we learn a lot from you and the control.com forum.
I have a question, please: in the Control Specification papers I found, for example, a "GCV Position calibration"--is this a real calibration procedure?
You are the best at clarifying any type of doubt regarding GT control systems!!
Exactly, I am asking about the Mark VIe system only.
But one more small doubt, CSA.
Since I have to download the min and max angle values for the csgv-1,2 LVDT channels in PCAAH1A before starting the AutoCalibration, I have to vary the IGV angle to the min and max values so the mechanical person can measure them physically. Should I go to Regulators in PCAA -> Calibrate Valve -> and hit the min and max position buttons to do that?
AutoCalibrate has a method for manually stroking the device. Put in a manual setpoint/reference of 0 DGA and measure the angle, and then change the setpoint/reference to 100 DGA and measure the angle. Stop the manual positioning feature, input the measurements for min and max, download to the PCAA(s), then perform an AutoCalibration.
When the servo current and LVDT feedback stop changing at the min and max positions, then, if I recall correctly, there's a button to calculate or something like that (I haven't done this for over a year). And that will be that. I think there's also a button to upload the new scaling information to the .tcw file and EPROM.
I'm told (and I don't believe everything I'm told until I can verify it) that the 'Help' information in ToolboxST is much improved. Though I seriously doubt it gives a step-by-step procedure, if one understands all of the buttons/functions it's likely one can develop a procedure from that.
And the information you've learned from control.com!
I don't have access to ToolboxST at this writing, but we have covered the basic steps and using the ToolboxST 'Help' file you should be fine. I will have access to ToolboxST next week.
Hope this helps! Please write back to let us know how you fare!