We have installed a rebuilt Young & Franklin SRV on one of our 2000-vintage GE 7EA machines with DLN 1.0. It has an ~.035" swing in manual mode and it won't give the 0.7-3.5 vrms range on the new style RVDT that is wired per the instructions. The best we can get is 1.3-3.5 vrms. While this unit was on outage, we took a good working valve out of it and put it in another unit that was swinging, to fix that unit.
Has anyone swapped from the old style RVDT (4 wire) to the new style RVDT (5 wire) using a Mark 5 "B" panel? Does reducing the feedback span from 2.8 vrms to 2.1 vrms cause a swing due to reduced resolution?
Many of the RVDTs used with Fisher Cam Vee-Ball SRVs had a different linear range than the LVDTs used on older SRVs and other devices. In some cases it was necessary to reduce the zero-stroke voltage to something around 0.5 VAC RMS, and the 100% stroke voltage was sometimes around 2.1 VAC RMS, or even less. (As with many things, that was never well documented...) The Mark* doesn't care what the zero- and 100%-stroke voltages are, as long as the output between them is linear. If the instability is only at mid- to upper stroke, it may be that the zero-stroke voltage is set too high, which causes the output voltage to start to "roll over" at larger strokes.
Certainly, it doesn't help if the linear range is small, but it should be possible to get relatively stable operation.
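On the original resolution question: a narrower calibrated span does mean the same amount of feedback noise reads as a larger position error. A minimal sketch in Python (the 10 mV noise figure is purely illustrative, not a Mark V specification):

```python
def pct_stroke(v_rms, v_zero, v_full):
    """Linearly interpolate percent position from a transducer's RMS voltage."""
    return 100.0 * (v_rms - v_zero) / (v_full - v_zero)

# The same feedback noise reads as a larger position error when the
# calibrated span is narrower:
noise_v = 0.010
err_full_span = 100.0 * noise_v / (3.5 - 0.7)     # 2.8 V span
err_reduced_span = 100.0 * noise_v / (3.5 - 1.4)  # 2.1 V span
print(err_full_span, err_reduced_span)
```

So the narrower span magnifies noise proportionally (2.8/2.1, about a third more position error for the same electrical noise), but by itself that is a modest effect, which is consistent with the point above that stable operation should still be achievable.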
Some of the other possible causes of instability could be: one servo coil receiving the wrong polarity; the servo being the wrong one for the application (happens all too frequently, unfortunately); a problem with the hydraulic actuator (which also seems to happen more than it should); and incorrect servo null bias current. You didn't say what the servo null bias current value was in the I/O Configurator. And, you didn't say if the servo current gain values were the same for both units. There were a lot of commissioning people who would change the gain value to achieve what they perceived to be stable operation, but that caused problems when the valve was replaced. What are the current gain values for the two units? And the other I/O Configuration Constants for the two SRVs--are they all the same? (Some are on a different screen in the I/O Configurator.)
There have been reports of incorrect servo null bias spring adjustment on new, out-of-the-box servos. And, many refurbished servos have had spring adjustment problems, also. (It's virtually impossible to duplicate test bench conditions in the field to properly check or set null bias springs.)
I have had repeated experience with new, out-of-the-box servos with incorrect coil lead colors, so verifying polarity is very important.
Finally, some of the RVDTs were just never right and had to be replaced. In one instance only one output was good, the other output had a linear range--but the two couldn't be adjusted independently to achieve a linear range for both that was linear from valve closed to valve open.
Hope this helps! Please write back to let us know what you find.
Thanks for replying.
The servo polarities have been confirmed as well as the servo part number. The two RVDT's are linear throughout.
We swapped out the hydraulic cylinder after noticing some leak-by, and the amplitude of the cycles dropped (this made us question the integrity of the "rebuild"). I dropped the Current Gain from 3.9 to 2.5, slowing the swing even further. The Current Bias was 3.5 with the old servo; it was changed to 2.8 to match the average servo currents at 50%. In desperation, I moved the Position Reference gain from .1 to 1.5 to get the swing as close to tolerable as possible.
We haven't run with this setup yet, as we are a peaking facility. I'll let you know what happens. Please let me know if you see any errors in my as-left values.
Thanks for the feedback!
You are correct--if you have to deviate from the as-found values using a rebuilt device, you should be questioning the quality of the work done on the device.
The really great thing about working on GE-design heavy duty gas turbines is that one doesn't have to tune regulators for servo-operated devices. That's because the servo gains were calculated by the design engineers knowing the hydraulic system pressure, the desired slew rate of the hydraulic actuator/device, the flow-rate of the servo-valve and the volume of the hydraulic actuator. That made things really simple (and easy!)--just make sure the right gains and settings are in there (the ones from the turbine factory--not the control system factory!) and you're practically guaranteed to have trouble-free operation.
Anyway, I'm not saying that in some cases--when the unit HAS to run--that sometimes changes aren't appropriate. What I am saying is that deviating too far from the as-found value (which may or may not have been correct, either...) is cause for concern.
As for changing the current (null) bias value, the starting point for any new servo should always be 2.67. This equates to one-third of the total current (for a TMR panel) specification for the servos GE buys and sells. And, the allowable range of adjustment for particular applications is +/-1.33. So the total allowable range of adjustment for a good servo valve is 1.33 to 4.0; any current (null) bias value outside that range is suspect--VERY suspect.
I don't subscribe to the method of calculating null bias values in GE Control Specifications--it just doesn't work. It's an ivory tower calculation, and in the real world ivory tower calculations don't always work. And in this particular case, can lead to some VERY questionable null bias values which are VERY outside the allowable range. Again, any time a new servo is installed, the current bias value should be reset to 2.67, if it's not found at that value. And any adjustments made from there--if necessary, and only up to the allowable limits (1.33 & 4.0).
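The range arithmetic above can be captured in a trivial check (the 1.33/2.67/4.0 figures are exactly the ones quoted above; nothing else is assumed):

```python
# Limits quoted above: nominal 2.67 (one-third of the TMR total current
# spec), allowable adjustment range 1.33 to 4.0.
NULL_BIAS_MIN = 1.33
NULL_BIAS_NOM = 2.67
NULL_BIAS_MAX = 4.0

def null_bias_suspect(value):
    """True when a current (null) bias value falls outside the allowable range."""
    return not (NULL_BIAS_MIN <= value <= NULL_BIAS_MAX)

print(null_bias_suspect(2.8))  # the as-left value from this thread
print(null_bias_suspect(4.5))  # well outside the allowable range
```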
I'm curious about why there was a difference in the servo currents at 50% stroke (if I understood correctly). When the SRV is in calibration mode (Auto or Manual), it is in position mode. So, the reference is position (it's pressure in normal, running mode) and the feedback is provided by the RVDTs (in this case). So, if the servo currents are unbalanced at any position, and the three regulators have the same reference (which they will when in Calibration mode), then there's something amiss with the feedback.
Which brings me to my question: What did you do with the LVDT adjustment? The 0.7-3.5 VAC RMS specification is what GE tells their LVDT suppliers to guarantee the output is linear between. So, when they want an LVDT with a stroke of 3.0 inches, the LVDT output must be linear over a three-inch stroke when the zero-stroke voltage is set to 0.7 VAC. But, they often used 3.0-inch stroke LVDTs on devices with 2.0 or 2.25 inches of travel--which means that the maximum output at full stroke would be LESS than 3.5 VAC RMS. It would go to 3.5 VAC RMS--and be linear--if the device had a full three-inch stroke, but if it doesn't travel three inches on the device it's installed on, the output will never go to 3.5 VAC RMS.
It's rarely possible to get an LVDT that actually gets all the way to 3.5 VAC RMS at full stroke when the zero stroke is set to 0.7 VAC RMS. I could count on one hand the number of times I saw that (in thirty-plus years).
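The point about shorter-travel devices can be worked through numerically. A sketch, assuming a perfectly linear LVDT over its rated stroke:

```python
def expected_full_stroke_voltage(v_zero, v_linear_max, rated_stroke_in, device_travel_in):
    """Expected RMS voltage at full device travel, assuming the LVDT output
    is linear from v_zero to v_linear_max over its rated stroke."""
    volts_per_inch = (v_linear_max - v_zero) / rated_stroke_in
    return v_zero + volts_per_inch * device_travel_in

# A 3.0-inch LVDT installed on a device with only 2.25 inches of travel:
print(expected_full_stroke_voltage(0.7, 3.5, 3.0, 2.25))  # 2.8 V, well short of 3.5
```

In other words, with the zero-stroke voltage set to 0.7 VAC RMS, a 2.25-inch-travel device on a 3.0-inch LVDT tops out around 2.8 VAC RMS--and that's with an ideal transducer.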
Early versions of the rotary SRV had RVDTs; later versions have LVDTs. One of the reasons was that no RVDT could be sourced that had a linear output of 0.7-3.5 VAC RMS over 90 degrees of travel. And, it was felt that a lot of the instability of the SRV was because of the low range of RVDT output over the 90 degrees of travel. And, it was subsequently learned that most of the RVDTs had a very different linear range, from 0.5-1.2 VAC RMS in some cases. So, sometimes when the zero-stroke voltage was set to 0.7 VAC RMS, it was found that the RVDT output would decrease as the SRV rotated to 90 degrees--which drives the Mark V (any Mark*!) nuts. If the RVDT output increases to some point as the device is opening, and then starts to decrease as the device opens further, the Mark* just doesn't really know what to do. It can lead to some very interesting "calibrations."
Please do write back with your results!
Here's a little tip that I have found to be VERY helpful when using AutoCalibrate on a Mark V. When the AutoCalibration is complete, the scaling values are in the RAM of each of the Mark V control processors. What one is supposed to do is take the average of the three 0%- and 100%-stroke values for each of the LVDTs and put the average values into the I/O Configurator, then download them to the three control processors, and re-boot the control processors. (UNLESS one is using the super-cool method of having individual I/O Configuration files for each processor--that's for another day. I'm referring to the normal I/O Configurator data file which is generic to all three control processors, IOCFG_Q.DAT (for <I>s) or IOCFG_Q.AP1 (for GE Mark V HMIs).)
When the AutoCalibration is finished, the scaling values for the LVDTs are exactly what they need to be for each individual control processor. And, they are in RAM, which is what the Mark V uses when running the turbine. Sometimes there are very large differences in the individual 0%- and 100%-stroke values for each control processor; this is the result of component differences in the analog input circuitry (and could also be affected by any corrosion on the I/O card ribbon cables/connectors). When you calculate the average values and then download them to the control processors (to get them into EEPROM), and then re-boot the control processors (to get the EEPROM values into RAM), it's possible to cause some large differences in what each control processor thinks the LVDT feedback is (especially if the AutoCalibrate values had large differences).
The tip is to calculate the average values of the 0%- and 100%-stroke values, put them into the I/O Configurator, download them to the control processors--but DON'T re-boot the control processors. This way the exact values each control processor needs are in each control processor's RAM (determined from the AutoCalibration) and the LVDT feedback in each control processor is going to very closely match, if not be equal to, the LVDT feedback values in the other control processors--because each control processor is using the precise scaling values determined by AutoCalibrate. If there's a need to re-boot a processor for some reason the average values will have been downloaded to EEPROM and will get downloaded to RAM during the re-boot and while they won't be the exact values from AutoCalibrate they will be sufficient for normal running operation.
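The averaging step itself is simple; a sketch with hypothetical per-processor AutoCalibrate results (the R/S/T labels and voltages are made up for illustration):

```python
# Hypothetical per-processor 0%- and 100%-stroke values from AutoCalibrate:
autocal = {
    "R": (0.71, 3.38),
    "S": (0.69, 3.42),
    "T": (0.70, 3.40),
}
n = len(autocal)
avg_zero = sum(z for z, _ in autocal.values()) / n
avg_full = sum(f for _, f in autocal.values()) / n
# These averages go into the (generic) I/O Configurator file and get
# downloaded to all three control processors' EEPROM; per the tip above,
# you just don't re-boot afterward, so each processor keeps its own
# exact AutoCalibrate values in RAM.
print(round(avg_zero, 3), round(avg_full, 3))
```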
When I'm re-starting a unit with a Mark V after a maintenance outage--or even for the first time--the LAST thing I do just before the START button gets pushed is to perform LVDT calibrations on all the devices with LVDTs using AutoCalibrate. I check the values from these calibrations against previous calibrations to see that they haven't changed by very much, if at all, and make sure the average values in the I/O Configurator are correct (which they should be from prior calibrations), and then declare the unit Ready to Start. And, the LVDT feedbacks are the best they will ever be--because each control processor is using its exact scaling values as determined by AutoCalibrate. (I always do LVDT calibrations well before start-up, along with verifications to be sure the indicated position is very close to the actual position, and I keep printed records of the AutoCalibrate screens. I do calculate the averages of these calibrations and put them into the I/O Configurator and download them to the control processors, but usually during a start-up (new unit or after a maintenance outage) there are multiple re-boots of control processors, so the AutoCalibrate values get overwritten by the EEPROM values. That's why at the very last minute I do AutoCalibrations again, check the values against previous AutoCalibrations, and declare the unit Ready to Start if no glaring differences are found.)
This only applies to Mark V Autocalibration, and only to units which DO NOT use individual I/O Configuration data files for the control processors. As mentioned above, there is a method for creating individual control processor-specific I/O Configuration Data files which can be used to download the processor-specific 0%- and 100%-stroke values from AutoCalibrate to each control processor's EEPROM. But, that takes some VERY careful attention to detail and procedures--which were never published by GE--and site personnel have to know this method, and if anyone comes on site to help with controls issues site personnel need to make sure they follow the proper procedures when making any I/O Configuration Constant changes. Without proper written procedures and site attention things can get messed up VERY quickly.
Hope this helps!
Thanks again for all of your insight.
We ran the engine yesterday after returning the Position Reference gain to .1 and re-zeroing the zero stroke on both RVDT's. Apparently the MK5 doesn't like starting with a negative position on the SSR, and it can't control with a dead band greater than its setpoint. The first problem showed itself on the first start-up when the SSR was released for control and P2 equaled P1. The second showed on the second start as an SSR-not-following-reference trip. After those were ironed out, the third start was successful. The valve controlled from minimum load through base load with only the usual swing.
Thank you for the feedback! "Feedback is the most important contribution!"(c) here at control.com.
SSR (Stop/Speed Ratio) = SRV?
The Mark V (all Speedtronic panels, actually) has a check on the position of devices with LVDTs to see if the position feedback is "good"--and one of the checks is that if the feedback is less than -5% (usually), the feedback is deemed not good. Is that what you're referring to? With a rotary SRV (SSR valve) there is no closed-end overtravel as there is with a Young & Franklin combined SRV/GCV assembly; the rotary SRV should be 0% at closed, and 100% at open. If the RVDT feedback was less than approximately 0%, then there was something amiss with the calibration, or the RVDT shifted. It would be nice to know what the RVDT feedback voltage was at 0% and 100% when you finished adjusting it.
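The sanity check described above amounts to something like this sketch; the -5% figure is the approximate threshold mentioned here, not a verified panel constant:

```python
def position_feedback_good(pct_position, low_limit=-5.0):
    """Illustrative version of the Speedtronic position-feedback check:
    feedback below roughly -5% is deemed not good. The exact limit and
    any upper-end check are assumptions for illustration."""
    return pct_position >= low_limit

print(position_feedback_good(-1.2))  # small negative: still accepted
print(position_feedback_good(-6.0))  # deemed not good
```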
> The valve controlled from minimum load through base load with only
> the usual swing.
What do you mean by the "usual swing"? Do you mean the typical swing of most rotary SRVs, or do you mean the same swing it had before you started working on it? It's not a well-known fact, but the Mark V servo-valve outputs have some dither built in to the outputs. Dither is a function that very slightly varies the output in order to try to prevent valve build-up problems--something that happens more frequently on steam turbine valve stems or when there's bad gas on a gas fuel valve with a stem. It's NOT adjustable, nor can it be disabled--it's just always there. One of my theories is that with the design of the rotary SRV and the early actuators (the ones with RVDTs) the dither seems to cause more "instability" in the rotary SRV than other SRVs and hydraulically-operated devices. That's just my theory--I don't have any actionable data to back it up.
I don't think you said whether you changed the Current Gain value back to the Control Spec. value (or to match the other units' SRV Current Gain value(s)); you said you had changed it trying to get the valve to be more stable during testing after installation. Perhaps changing the Current Gain by small amounts could help with the stability; many have reported that it seems to help. But, unfortunately, many rotary SRVs seem to have more instability than non-rotary SRVs, even with other turbine control systems.
Again, thanks for the feedback, and if you would please tell us what the as-left 0% and 100% stroke RVDT voltages were, that would be very much appreciated!