MK-V IGV LVDT Feedback Problem


Thread Starter


I came across a problem in our Frame 9E IGV calibration. During calibration I used DIAGC to check the LVDT feedback voltages, and they were steady. I followed through the manual calibration procedure as per the GE guidelines, downloaded the I/O configuration, rebooted, etc. Then I stroked the IGVs and cross-checked from the AutoCal screen. Here I noted that I get different LVDT voltages read by the R, S, and T cores (though DIAGC was stable). This results in different servo currents, and ultimately I always get an error between CSRGV and the feedback (always close to 0.5-0.7 deg).

I did this calibration because we experienced problems during starting, where the unit trips on IGV trouble. For some reason the IGV actuator responds more slowly during starting, but when we stroke the IGVs before starting the problem does not occur.

If anyone has experience with the above phenomenon or has a solution, please post.

To provide any further assistance requires answers to all of the questions below.

You have made several statements that require data.

First, you say the unit is tripping on IGV fault during start-up. Precisely which process alarm is being annunciated when the trip is occurring? If the trip is L3IGVFLT, which of the three possible conditions that can cause the trip are actually causing the trip: L3IGVF1, L3IGVF2, or L3IGVF3? (NOTE: Some units don't use L3IGVF3 for this signal, but many do; in any regard, there are multiple conditions that can cause L3IGVFLT so it's important to understand which one is the culprit.)

Second, exactly when during starting is this trip occurring? 3% TNH? 20% TNH? 60% TNH? When?

Third, you say the IGV response is slow, and that when you stroke the IGVs before starting the problem does not occur. What is the L.O. Header temperature (usually signal name LTTH or LTTH1) when you try to start the turbine without stroking the IGVs? What is the approximate turbine compartment temperature when you try to start the turbine without stroking the IGVs?

Fourth, please provide the signal names and values that you are reporting as differing between AutoCalibrate and DIAGC when you are comparing them. We need the signal names from both applications, and the magnitude of the instability (+/- 1%, +/- 5%, ???).

Fifth, you say that you always get an error between CSRGV (the reference) and CSGV (LVDT feedback) and different servo currents for the three processors. Please provide the values you are seeing for CSRGV, CAGV, and CSGV for all three processors.

Sixth, what is the difference between CSRGV, CSGV and the physical IGV angle at, say, a reference of 57 DGA? And 84 DGA?

Now, you say you are using a manual LVDT calibration procedure from GE. Are you using AutoCalibrate to manually stroke the IGVs to obtain the voltages at closed and open positions? Are you then extrapolating the 0% and 100% stroke voltages to put them into the I/O Configurator? (AutoCalibrate will extrapolate the voltages automatically.)

Have you verified the polarity of the servo currents being applied to the individual coils of the IGV servo valve to be sure they are correct?

How are you measuring the IGVs to determine if the LVDT feedback matches the actual physical angle of the IGVs?

What is the servo current gain for the IGV servo output (from the I/O Configurator)?

What is the null bias current value for the IGV servo output (from the I/O Configurator)?

I have seen similar problems at sites which experience cold ambient temperatures and when the unit is started from a "cold" condition (no cooldown, or a very short cooldown period prior to START) the oil in the lines to the IGV actuator can cause sluggish IGV action. But, this is not a problem with LVDT feedback calibration; it's a problem with cold ambient temperatures and poor compartment space heating and cold L.O. temperature.

As for DIAGC, it's been said many times before: DIAGC uses a configuration file, DIAGC.DAT, that must exactly match the cards and PROMs used in the Mark V panel. Unfortunately, that wasn't usually checked as part of the commissioning process of the unit. And, if new revisions of cards and/or PROMs have been installed in the panel since the original commissioning, it's very likely that DIAGC.DAT was not properly upgraded. Therefore, unless you are working with GE directly and they have verified that DIAGC.DAT is correct, or you know for certain that DIAGC.DAT matches the cards and PROMs in use in your panel, the data is suspect, and DIAGC should be used as a guideline only.

One last question: Before you calibrated the IGV LVDT feedback, did you check the LVDT feedback versus the actual IGV angle to see if the LVDT feedback needed to be calibrated?

Again, we need much more specific information and the answers to all of the questions above to provide any additional assistance. It's a lot of data, but you've made a lot of statements that we need to quantify and understand.

Thanks for the response and sorry for the late reply as I was traveling.

On your first question, the trip is from L86GVT and our values are set at 7.5 deg with a 5 second delay. We always get this at the moment the IGVs start to open during start-up, which is at 80% speed.

LTTH1 is around 54 deg C and the compartment temperature is around 40-50 deg C. I do not see this as an issue, as we never get ambient temperature fluctuations in this part of the world.

In DIAGC they are named LV9 and LV10, and I took the readings at base load: 3.13 and 3.02.
In AutoCal it looks like below:
<pre>         <R>    <S>    <T>
LVDT 1  -3.12  -3.10  -3.09
LVDT 2  -3.02  -3.05  -3.04 </pre>

From the Prevote display:
<pre>        <R>    <S>    <T>
CSRGV   84     84     84
CSGV    84.4   84.7   84.3
CAGV    1.4    5.3    0.5 </pre>
I am not using the AutoCalibrate function to calibrate; I just use the screen to look at the values. What I do is use another user-defined screen, created to stroke the IGVs by forcing JADJ and manually feeding an IGV reference. In this process I stroke the IGVs to 34 deg, measure the LVDT voltages, and measure the actual physical angle using a protractor. I do the same at 84 deg, then calculate the 0 deg and 100 deg voltages and the gain. These are then entered in the I/O Configurator and downloaded to each processor, one at a time.
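The two-point calculation in that manual procedure can be sketched as a straight-line extrapolation (a minimal sketch; the voltage values below are hypothetical placeholders, not site data):

```python
# Linear extrapolation of LVDT calibration voltages from two measured
# (angle, voltage) points, as in the manual procedure described above.
# The measured voltages here are hypothetical, for illustration only.

def extrapolate_cal(a1, v1, a2, v2, a_lo=0.0, a_hi=100.0):
    """Fit V = m*angle + b through two measured points and extrapolate
    to the 0 deg and 100 deg stroke angles."""
    m = (v2 - v1) / (a2 - a1)            # gain, volts per degree
    b = v1 - m * a1
    return m * a_lo + b, m * a_hi + b    # (voltage at 0, voltage at 100)

# Hypothetical voltages measured at 34 deg and 84 deg:
v0, v100 = extrapolate_cal(34.0, 1.10, 84.0, 2.85)
print(round(v0, 3), round(v100, 3))     # -> -0.09 3.41
```

Note the extrapolated endpoints are only as good as the two measured points; any protractor error at 34 deg or 84 deg gets stretched over the full stroke.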

I have verified the polarity and it is fine.

Current gain is 10 and null bias is at 3.

On DIAGC, I am also not sure whether the PROMs match the version. I will check this when we get a shutdown of the unit.

The calibration had to be done because we removed the IGVs for bushing replacement during a major inspection in Dec 2009.

You should verify the value of servo current gain with the packager of your turbine. I believe it's much lower than it should be.

Positive servo current will close the IGVs; negative servo current will open the IGVs. When everything is working correctly, and everything is balanced the servo currents should be -2.67% for <R>, <S>, and <T>--that's under ideal conditions (which rarely happens).

In your case, all three processors are indicating positive current. Why? Because the actual feedback is more than the reference, and positive servo current works to close the IGVs. In the case of <R>, it's not too positive because the error isn't too large. In the case of <S>, the value is much more positive because the error between CSRGV and CSGV is larger. And <T>, well, it's kind of 'along for the ride', but it's also positive because, again, the actual is greater than the reference.
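That error-to-current relationship can be sketched with a simplified proportional model. The gain-times-error-minus-null-bias form is my simplification for illustration, not the actual Mark V regulator math; the gain of 10 and the -2.67% null bias come from this thread:

```python
# Simplified sketch of the servo regulator behavior described above:
# positive current closes the IGVs, so feedback (CSGV) above the
# reference (CSRGV) drives positive current. This proportional form
# is an illustrative assumption, not the real Mark V regulator.

NULL_BIAS = 2.67   # percent, the ideal balanced value from this thread
GAIN = 10.0        # percent per degree of error (reported servo gain)

def servo_current(csrgv, csgv, gain=GAIN, null_bias=NULL_BIAS):
    return gain * (csgv - csrgv) - null_bias

# Prevote data from this thread: CSRGV = 84 on all three cores
for core, csgv in (("R", 84.4), ("S", 84.7), ("T", 84.3)):
    print(core, round(servo_current(84.0, csgv), 2))
```

With these assumed numbers the model lands in the same ballpark as the reported CAGV values (1.4, 5.3, 0.5): the larger the feedback error on a core, the more positive that core's current.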

There's a little-known problem with the Mark V and LVDT calibration. When you calculate 0% and 100% voltages and you put those values into the I/O Configurator, those same values get downloaded to <R>, and <S>, and <T>. But, <R>, and <S>, and <T> really need their own individual values of 0% and 100% voltages to work correctly. That's because they don't see the same voltages, as indicated in your data. That's caused by component tolerances on the discrete components used on the TCQA cards. In other words, the values of resistors, and capacitors, and such (including wiring/cabling voltage drops) are *NOT* the same for the same components and circuits on the three cards/processors. So, all three cards need individual 0% and 100% voltage values, not "average" values.

If you had used AutoCalibrate you would see exactly what is being described. When you use AutoCalibrate to calibrate LVDT feedback not only will it, in the case of the IGVs, calculate the 0% and 100% stroke voltages but it will do so in <R> and <S> and <T>! And, then you will be tasked with averaging the 0% values and the 100% values and putting the average value into the I/O Configurator, and downloading that value to <R> and <S> and <T> and it will still result in unbalanced feedback, and unbalanced servo currents. It's just the nature of the process.
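The effect of downloading one averaged calibration to all three processors can be sketched numerically. All voltages below are hypothetical, chosen only to mimic the small card-to-card differences shown in this thread:

```python
# Sketch of why one averaged 0%/100% voltage pair leaves the three
# processors disagreeing. Voltages are hypothetical placeholders.

def angle(v, v0, v100, a_max=100.0):
    """Scale an LVDT voltage to degrees using 0% and 100% cal voltages."""
    return (v - v0) / (v100 - v0) * a_max

# Hypothetical per-processor 0% and 100% stroke voltages (AutoCalibrate
# would find an individual pair like this for each card):
per_proc = {"R": (0.70, 3.50), "S": (0.72, 3.54), "T": (0.69, 3.49)}

# Averaged values, as downloaded via the I/O Configurator to all three:
v0_avg = sum(v0 for v0, _ in per_proc.values()) / 3
v100_avg = sum(v1 for _, v1 in per_proc.values()) / 3

# Each card sees its own voltage at the same physical position. With
# its own cal pair each card would read exactly 84.0 deg; with the
# averaged cal the three feedbacks spread apart:
for core, (v0, v100) in per_proc.items():
    v_at_84 = v0 + 0.84 * (v100 - v0)    # true voltage this card sees
    print(core, round(angle(v_at_84, v0_avg, v100_avg), 2))
```

With these made-up numbers the three feedbacks land more than a degree apart at the same physical IGV angle, which is the same flavor of disagreement shown in the prevote data above.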

However, when you complete an AutoCalibrate procedure, what it does is change the RAM values of the 0% and 100% stroke voltages to be exactly what they need to be for <R> and <S> and <T>. And the three feedbacks will indicate very close to the same values, if not identical, for all three processors. In AutoCalibrate. Not in the Prevote Data Display, because that's the raw input voltage seen by each processor. Each TCQA card will use the RAM values of 0% and 100% stroke voltages to scale the voltages it sees, so the calculated feedbacks are nearly, if not exactly, identical. And the servo currents will usually be pretty well balanced, as well.

Then when you average the values, and put them into the I/O Configurator and download them and reboot the panel, you will see unbalanced feedbacks and unbalanced servo currents.

When you do your calibration and put the 0% and 100% stroke values into the I/O Configurator and download them to <R>, <S> and <T> and reboot and then stroke the IGVs and verify the accuracy of the calibration by re-measuring the actual angle versus the reference, what kind of error do you get? I'll bet it's not 0.1 DGA, or even 0.5 DGA. It's probably on the order of 1.0 DGA or greater, right?

I think part of your problem is that you believe that all three processors should show the exact same voltages and feedbacks. And they won't. And likely never did and never will. It's just not possible when each processor needs its own individual 0% and 100% stroke voltage values, because each processor does not see the exact same voltages, as indicated by your data, and they won't, because of component tolerances between the three cards/circuits/processors.

GE recognized this when they used three LVDTs on steam turbine applications, and they had to come up with a method for downloading individual 0% and 100% stroke voltages to each individual processor to get the feedbacks to be properly calibrated. Some heavy duty gas turbine sites adopted this method, but because it was never properly documented it has fallen out of use at many sites and can even cause lots of problems if people aren't aware of how it works. (This GE stuff can be really fun, can't it?)

So, part of your problem is that you believe that all three processors should see the same voltages, the ones you read at the TB with your voltmeter, and they won't. And, you believe that all three processors should report the identical feedbacks when you download a single set of 0% and 100% stroke voltages to all three processors, and they won't.

If you don't believe me, just try AutoCalibrate on the IGVs on your next shutdown. I'll even give you the procedure here. Close the IGVs hard against the mechanical stop (by putting in a reference of 0 DGA) and measure the angle with your protractor (let's say you measured 31 DGA). Open the IGVs hard against the mechanical stop (by putting in a reference of 100 DGA) and measure the angle with your protractor (let's say you measured 86.4 DGA).

Open the file F:\UNIT1\ACALIB.DAT with a text editor, find the Servo Valve Output #5 section (which is the IGV servo-valve output on your unit, based on the two LVDTs you reported, 9 and 10), and enter the two measured angles (in this example, 31 DGA closed and 86.4 DGA open).
Save the file and exit your text editor.

You should still have 20TV-1 forced to allow IGV operation (as you needed it to stroke the IGVs to get the two angle measurements above).

Now, open AutoCalibrate, go to the Servo-Valve Output #5 screen (for the IGVs) and then enable AutoCalibrate (sometimes you have to be in CRANK mode, but not starting, just select CRANK mode) and perform an AutoCalibrate. When it's finished print the screen. And take a look at the calibrated LVDT feedback values, they will be remarkably close to each other. Look at the three servo currents, they should be very well balanced, and close to values they should be (if everything is correct; I have my suspicions on this).

Now, stroke the IGVs (using any method you want, AutoCalibrate or the Demand Display) to the max open mechanical position by putting in a reference of 100 DGA, and then put in the MAX open IGV angle for your unit (usually 84 DGA or 86 DGA, whatever it is for your unit). Now, take a measurement with your protractor, one in each quadrant of the inlet and average the values and compare them to the reference. They should be within 0.5 DGA or 1.0 DGA of the reference.

Next put in a reference of 57 DGA (or whatever the MIN operating angle is, the FSNL angle) and when the IGVs finish moving take another four readings with your protractor and average them and they should also be within about a degree or so of the reference.
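The four-quadrant verification in the two steps above is just an average-and-tolerance check, sketched below with hypothetical protractor readings:

```python
# Sketch of the four-quadrant verification step: average one protractor
# reading per inlet quadrant and compare against the reference angle.
# The readings below are hypothetical, for illustration only.

def verify(reference, readings, tol=1.0):
    """Return the averaged reading and whether it is within tolerance."""
    avg = sum(readings) / len(readings)
    return avg, abs(avg - reference) <= tol

# Hypothetical readings at the FSNL reference of 57 DGA:
avg, ok = verify(57.0, [56.8, 57.3, 57.1, 56.6])
print(round(avg, 2), ok)   # -> 56.95 True
```

Averaging one reading per quadrant smooths out any slight out-of-plane tilt of the IGV ring before comparing to the reference.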

And, during all of this, keep an eye on the three LVDT feedback values and the servo currents. They should all be pretty equal and pretty balanced.

Do all of the above without ever leaving the AutoCalibrate display (using the Manual positioning feature of AutoCalibrate) and remember to open the IGVs fully before taking any measurements, and take the measurements while stroking them closed, not open, just as in the procedure above.

If that doesn't prove to you that AutoCalibrate will do what it's supposed to do, then I don't know what will. It should also prove that the three processors will each calculate--and require--individual 0% and 100% stroke voltage values.

When you're done, calculate the average 0% and 100% stroke voltage values for the two LVDTs, put the values into the I/O Configurator, and download them to <R>, <S> and <T>. BUT DON'T RE-BOOT THE PROCESSORS! As long as you don't re-boot the processors, the AutoCalibrate-calculated values will remain in RAM and be used by the three processors! When any one of the three processors does get re-booted, it will then use the average values which were downloaded to EEPROM, and there will likely be slight disagreements with the other processors, and servo current disagreements, as well.

And, if you want, you can re-boot the three processors to use the average values at any time. And, if you recorded the values from your previous manual calibration, you could even put them back in the I/O Configurator and re-download them and re-boot the processors to get back to the values we started from.

But, I still believe the value of servo current gain for the IGVs is way too low, by at least a factor of three. But, verify that with your packager to be sure before changing anything. It doesn't explain why you don't get the error if you manually stroke the IGVs before you start the unit, which is just plain strange.

I also would like to know how you verified servo current polarities for the IGVs.

But, I don't see any problems with the voltage readings you provided. None. It's the nature of the "system" that they will not be identical. And, unless <R>, <S> and <T> have their own 0% and 100% stroke voltage values for the voltages they see, they won't report the same feedback values.

As for why your IGVs are not working correctly with such a low servo current gain, I will only hazard a guess after you tell us how you verified servo polarities. And, please, ask your packager what the correct servo current gain should be for the IGVs for your unit.
We are currently in a forced outage due to a similar issue with the IGV Control Trouble trip, L4IGVT. Both LVDT #1 and LVDT #2 have stable voltages from 0%-100% of IGV travel. We have steady hydraulic pressure and the IGVs aren't sticking.

The servo current is one of our main questions. When we have both LVDTs connected, the servo currents look as follows:
<pre>  <R>     <S>     <T>
 -0.5   13.01   -2.34 </pre>
When we disconnect LVDT #2, the servo currents across the cores are nearly the same, drastically different from the above. We suspected a bad mil-spec connector but have eliminated that from the equation. The elevated <S> servo current while both LVDTs are connected is what we can't explain. Any suggestions would be appreciated.