Automatic Generation Control

Thread Starter

jorjani

Hello,

I have a question about the AGC function. As we know, following a disturbance in the system, the speed governors of all the generating units will contribute to the process of stabilizing system frequency at a new level. Then, the AGC function will calculate the Area Control Error (ACE) and change the setpoints of selected generators based on that to return frequency to its nominal value and free up the primary control (governor) reserve. In the literature, it is stated that AGC acts from seconds (usually about 30 seconds) to minutes. My question is: what causes the AGC to take no action immediately after the disturbance occurs? What causes it to react only after about 30 seconds? After the disturbance, the ACE will change, so I think the AGC should send new setpoints to the generators right away. If that is true, it means the governor and AGC actions are happening simultaneously. Is that OK, or will it cause problems?
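For reference, the ACE I am referring to is the usual textbook combination of tie-line interchange error and a frequency-bias term. A minimal sketch, using one common sign convention and purely illustrative numbers (not from any particular grid):

```python
# Textbook-style Area Control Error (one common sign convention; real grids
# use their own bias values and conventions).
# ACE = delta_P_tie + B * delta_f
#   delta_P_tie : actual minus scheduled net interchange (MW, export positive)
#   delta_f     : actual minus scheduled frequency (Hz)
#   B           : area frequency bias (MW/Hz, positive in this convention)
# A negative ACE means the area is under-generating, so AGC should raise
# the setpoints of the regulating units.

def area_control_error(delta_p_tie_mw: float, delta_f_hz: float,
                       bias_mw_per_hz: float) -> float:
    return delta_p_tie_mw + bias_mw_per_hz * delta_f_hz

# Illustrative numbers only: the area is importing 5 MW more than scheduled
# and the frequency is 0.05 Hz low, with an assumed bias of 200 MW/Hz.
print(area_control_error(-5.0, -0.05, 200.0))   # -> -15.0 MW, raise generation
```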
 
jorjani,

Your question is about AGC, but to answer it we also need to understand what happens at the prime mover governor during a frequency deviation/disturbance.

AGC (Automatic Generation Control) can be accomplished in any of several different ways. We don't know which method is being used at your site. The signals coming from the entity which is operating or supervising the grid may be analog signals or discrete signals. AGC signals may be sent to the governors all the time, or they may only be sent after a disturbance. There's just a lot we don't know about the AGC scheme being used at your site.

But, we can try to present some information that may be of help in investigating and understanding the system. Let's say the governors of all the generator sets (turbine generators; diesel generators; hydro generators; etc.) are all operated in Droop Speed Control mode (that is, they are not producing maximum, rated power). And, further, let's say the grid frequency decreases because somewhere one of the generator sets (a large one) is suddenly tripped off the grid, meaning the power it was providing is suddenly lost. What happens in this case is that the power being produced by the remaining generator sets is LESS than the total load of all the electric motors and lights and televisions and computers and computer monitors and tea kettles connected to the grid. The load (the total of all those electric motors and lights and televisions and computers and computer monitors and tea kettles) hasn't changed (let's say it was 100 MW), but the amount of generation has suddenly changed, dropping from 100 MW to 95 MW when one of the generator sets tripped off line. So--and this is very important to understand--in this case (our example) the load on the grid remains the same (100 MW) but the amount of generation has decreased (from 100 MW to 95 MW).

What happens next is that the frequency of the grid drops, because the generation (95 MW) can no longer match the load (100 MW). How far it drops depends on the droop settings of the remaining governors and on how sensitive the load is to frequency, not simply on the raw 5 MW deficit. The generator sets remain running, but the grid frequency falls due to the loss of generation versus the total load--which means the frequency of ALL the generator sets synchronized to the grid falls with it.

The governors of the prime movers driving the generators, operating in Droop Speed Control mode, sense the change in frequency, and this causes the amount of energy flowing into the prime movers to increase in proportion to the change in frequency. This happens because the load hasn't changed (in our example), but the amount of generation available to power the load has changed, which resulted in a change in frequency. The load (the total of all the electric motors and lights and televisions and computers and computer monitors and tea kettles) still needs to be powered--the 100 MW of load. If you added up all of the generator outputs after the generator supplying 5 MW tripped off line, it would still add up to (roughly) 100 MW. But there simply wasn't enough energy flowing into the prime movers of the generators synchronized to the grid to keep the frequency equal to the rated frequency of the grid.
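To put some rough numbers on that, here is a minimal sketch of how the remaining units on Droop Speed Control would share the 5 MW deficit and where the frequency would settle. The unit ratings and droop values are illustrative assumptions only, and the frequency sensitivity of the load is ignored:

```python
# Primary (governor/droop) frequency response for the 100 MW example above.
# Each unit on droop picks up load in proportion to the frequency error divided
# by its droop setting; the grid settles where the pickups cover the deficit.

RATED_HZ = 50.0
deficit_mw = 5.0                      # the 5 MW generator that tripped off line

# (rating_mw, droop_pu) for the units still on line -- illustrative values only
remaining_units = [(50.0, 0.04), (30.0, 0.05), (15.0, 0.04)]

# Steady-state per-unit frequency deviation (load damping ignored):
#   sum_i (df_pu / droop_i) * rating_i = deficit
stiffness = sum(rating / droop for rating, droop in remaining_units)  # MW per per-unit frequency
df_pu = deficit_mw / stiffness
print(f"Frequency settles about {df_pu * RATED_HZ:.3f} Hz low")

for rating, droop in remaining_units:
    pickup = (df_pu / droop) * rating
    print(f"{rating:>5.1f} MW unit picks up {pickup:.2f} MW")
```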

Having the governors add a little more fuel in response to the frequency decrease is what prevents the grid from just grinding to a halt. The 100 MW load is still being supplied (none of the electric motors had to stop, none of the lights had to be extinguished, none of the televisions had to be turned off, none of the computers and computer monitors had to be shut down, none of the tea kettles had to be switched off)--it just isn't being supplied at rated grid frequency. Without the extra energy added in response to the frequency decrease, the grid just couldn't keep running, and unless some of the load were removed from the grid the frequency would gradually decay to the point that protective relays on the grid would start tripping loads and generators off the grid, and in the worst case, the entire grid would just go "black" because no power was available for the loads connected to the grid.

Now, the grid operators/supervisors can see this drop in frequency; they may--or may not--know which generator tripped off line. But they can use their AGC output signals to start raising the setpoints of the governors of the prime movers of the remaining generators to start increasing the frequency of the grid. Every generator doesn't have to receive this signal; just enough generators of sufficient capacity have to receive it and increase their output, and gradually the grid frequency will begin to increase. Eventually, if the grid operators/supervisors are patient and the prime mover governors are all working properly, the grid frequency will settle out at rated.
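Very loosely, that secondary (AGC) action looks something like the sketch below--a slow integration of the frequency error (or ACE) that nudges the setpoints of the regulating units until the frequency is back at rated. The gain, deadband and cycle time are illustrative assumptions, not values from any real dispatch centre:

```python
# A much-simplified sketch of the secondary (AGC) control loop: after the
# governors have arrested the frequency, slowly integrate the frequency error
# and distribute the resulting MW correction among the regulating units.

def agc_step(freq_error_hz: float, correction_mw: float,
             ki_mw_per_hz_s: float = 2.0, dt_s: float = 4.0,
             deadband_hz: float = 0.01) -> float:
    """One AGC execution cycle: return the updated total MW correction to be
    shared among the regulating units by their participation factors."""
    if abs(freq_error_hz) < deadband_hz:
        return correction_mw                      # leave things alone near nominal
    return correction_mw - ki_mw_per_hz_s * freq_error_hz * dt_s

correction_mw = 0.0
for freq_error_hz in (-0.11, -0.09, -0.06, -0.03, -0.005):   # a recovering grid
    correction_mw = agc_step(freq_error_hz, correction_mw)
    print(f"raise the regulating units by a total of {correction_mw:.2f} MW")
```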

The maths of what is happening on each and every prime mover governor can be quite intimidating--but the above description is pretty concise without getting into all of the tiny little details. AND, you should know that a LOT of generator sets are NOT being properly controlled when they are synchronized to the grid--and many of them will NOT respond properly when a grid frequency disturbance occurs. There are <b>MANY</b> operators, Operations Supervisors, and Plant Managers who believe that during a grid frequency disturbance the output(s) of the generator(s) at their plants should remain stable and not change. And, because of the mode of operation used on many governors, when the governor senses a grid frequency change and tries to respond appropriately it actually counter-acts itself (YES!) and tries to maintain load. In our example, the governor would start to increase load, but the load-control mode selected to operate the unit would sense the increase and start decreasing load, then the governor would see the load start to decrease and would try to increase it again--and this can continue indefinitely if the operators don't take action. This can cause the grid frequency to oscillate, which can make the disturbance worse. And, often when the operators do take action, if the load has increased they manually reduce load to return to their previous load--which is the WRONG thing to do for grid frequency and stability. So, we have governors that are not responding properly to grid frequency disturbances and can actually accentuate (make worse) grid frequency problems, and we have operators, their Supervisors and their Plant Managers who believe the generator outputs should remain unchanged during a grid frequency disturbance--and who will take action to try to maintain their previous generation output when it should change to help support grid frequency and stability.
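Here is a very simplified sketch of that counter-acting behaviour (all the numbers are illustrative assumptions): the droop response picks up load when the frequency dips, and an outer "hold my MW setpoint" load controller then walks the speed reference back down and throws that pickup away.

```python
# A unit on 4% droop picks up load when the grid frequency dips, but an outer
# load controller told to "hold 30 MW" removes the pickup again.

DROOP = 0.04
RATED_MW = 50.0

def droop_load_mw(speed_ref_pu: float, grid_speed_pu: float) -> float:
    """Load carried by a unit in Droop Speed Control (simplified, steady state)."""
    return (speed_ref_pu - grid_speed_pu) / DROOP * RATED_MW

speed_ref = 1.0 + (30.0 / RATED_MW) * DROOP      # reference that gives 30 MW at 1.0 pu speed
print(droop_load_mw(speed_ref, 1.000))           # 30.0 MW before the disturbance
print(droop_load_mw(speed_ref, 0.998))           # ~32.5 MW -- the droop response the grid needs

# The outer "hold 30 MW" controller now lowers the reference step by step,
# removing the frequency support; if it keeps chasing a moving frequency it
# can hunt back and forth instead of settling.
while droop_load_mw(speed_ref, 0.998) > 30.0 + 0.01:
    speed_ref -= 0.0005
print(droop_load_mw(speed_ref, 0.998))           # back to ~30 MW, support thrown away
```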

Now, with this knowledge, do you expect the AGC signals to be immediate when a grid frequency disturbance occurs? Would you agree a small time delay would be prudent, to let things stabilize before changing anything--which could otherwise make the problem worse?

Now, you ask: if the energy flowing into the prime movers of the generators receiving AGC signals increases, what happens to the load of those generators? Actually, it changes relatively little at first--but the extra energy flowing into those prime movers helps to start increasing the frequency of the grid and returning it to normal.

On an AC (Alternating Current) transmission and distribution grid, the prime movers driving the generators consume a certain amount of power just to maintain synchronous speed (to match grid frequency). (Think about how much power is required just to maintain rated speed/frequency when you are synchronizing the unit to the grid!) When the total amount of energy entering all the prime movers of the generators synchronized to the grid (to maintain rated speed AND to produce power for the grid) exactly matches the total load of all the electric motors and lights and televisions and computers and computer monitors and tea kettles, then the grid frequency will be at rated. Any imbalance can upset this equilibrium. Even starting and running a very large electric motor can momentarily cause the grid frequency to decrease--until some generator(s) somewhere on the grid increase their power output to restore the balance and maintain rated frequency.

In many industrialized countries of the world there are definite changes in frequency which can be observed at certain times of the day. In the morning, when people are awakening and turning on tea kettles and heaters (or air conditioners) and lights and going to offices and factories and beginning work, the grid frequency will often be slightly below normal until the load stabilizes and the grid operators/supervisors can balance everything out. Then at the end of the day, when people turn off appliances and lights and televisions and computers and computer monitors in their homes and go to bed, the grid frequency can be slightly higher than rated, until the load stabilizes and the grid operators/supervisors can balance everything out again. They use AGC for this balancing process as well. The better grid operators/supervisors can anticipate changes in load during the day, and even at different times of the year, and can be pretty good at keeping the grid frequency very stable.

But it's really a balancing act, and depending on the nature of the grid disturbance it's probably not in the grid's best interest to immediately change generation, but to wait for things to try to stabilize, thereby keeping things more stable if not exactly at rated.

So, the <i>initial</i> reaction of generators and their prime movers to a grid disturbance occurs at each generator via its prime mover governor. The later act of restoring the grid to rated frequency is done by AGC (on a well-regulated grid). Really, any signal coming to a prime mover governor via AGC is there to maintain, or to restore, grid frequency to normal, even if the deviation is very small. The same goes for small frequency deviations detected by the governor itself--if the governor is acting in "free governor mode" or "true" Droop Speed Control, and NOT in some form of load control (where the individual generator's setpoint is determined by the local plant operator and the governor will NOT properly respond to grid frequency deviations/disturbances). Many grids are now requiring plant operators and owners to run their units in "free governor mode" (sometimes also called "Primary Frequency Response", and there are many other names--including Droop Speed Control) to help stabilize grids that have historically been prone to some pretty wild (bad) deviations and even black-outs.

Hope this helps! I would surmise that the times listed in whatever document you are reading are maximum expected delays--but the actual delay will really depend on the nature of the disturbance and the suddenness of the event.
 
Hi CSA,

Thanks for your detailed explanation.

For a GE Frame 6 gas turbine, what acts as the governor of the generator's prime mover? Is it the GCV? Also, is the function of the AVR the same as that of the AGC you described above?

thanks in advance for your support.
 
Technically, the governor of a prime mover is the speed control system. For most GE-design Frame 6B heavy duty gas turbine prime movers, the governor is part of the Mark* turbine control system (the asterisk (*) is meant to represent the trademark GE owns on the name "Mark" and the various versions of the "Mark" turbine control system (denoted by Roman numerals, like V, for Mark V, the fifth version of Speedtronic turbine control system produced by GE in Salem, Virginia, USA)).

The Mark* turbine control system typically does many functions, not just speed control, such as automatic control of auxiliaries like pumps and fans and solenoid-operated devices like some valves. Some older governors did only speed control and nothing, or very little, else. But speed control is a critical part of the Mark*.

AGC is a term used to represent a method of controlling the amount of power produced by a prime mover and generator by a person or entity (like an electric transmission & distribution grid operations organization) from a remote location (outside of the power plant), usually to help regulate grid frequency and stability.

AVR is a term typically used to refer to the control system that keeps the terminal voltage of a synchronous generator at some setpoint. AVR stands for Automatic Voltage Regulator, which is really an abbreviation for excitation control system, or exciter regulator. The excitation control system's purpose is to apply DC (Direct Current) voltage and current to the rotating magnetic field of a synchronous generator, thereby controlling the synchronous generator's terminal voltage. (Most excitation control systems have both an Automatic voltage control mode, sometimes called the AC control mode, and a Manual control mode, sometimes called the DC control mode.)
 
Hi CSA,

Can you explain how the Mark V or Mark VIe controls the speed of a gas turbine?

So the AVR is mainly used at start-up to make the terminal voltage equal to the bus voltage. Once the unit is synchronized, it doesn't have much of a role. Am I right?
 
esdauto18,

>Can you explain how Mark V or Vie controls the speed of gas
>turbine?

Well, basically, when the unit is generating power there's a speed setpoint, usually point name TNR (Turbine Speed Reference). The Mark* compares the actual speed of the turbine shaft, usually point name TNH, to that reference and adjusts the fuel flow (as a function of fuel control valve position, usually) to make the actual turbine speed equal to the turbine speed reference. Both values are expressed in percent, so comparison is easier. And, because different GE-design heavy duty gas turbines operate at different speeds, yet the speed control is the same in just about every version of Mark*, they just use percent speed (percent of rated--and rated speed is defined somewhere in the Mark* I/O configuration setup).

When the unit is accelerating, there is an acceleration rate reference (TNHAR, Turbine Speed-High-pressure shaft Acceleration Reference) and the Mark* looks at the actual acceleration rate (TNHA, Turbine Speed-High-pressure shaft Acceleration) and adjusts the fuel flow (again as a function of control valve position) to make the actual acceleration rate equal to the acceleration rate reference. The acceleration rate reference is NOT to be field-adjusted. It's usually set to maximize hot gas path parts life (the combustion liners, and combustion transition pieces, and the turbine nozzles and buckets), and increasing the acceleration rate can shorten the hot gas path parts life. Which costs a lot of money--in parts and labor.
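A minimal sketch of that acceleration-control idea (NOT GE's actual algorithm--the gain and numbers are illustrative assumptions): compare the measured acceleration rate with the acceleration reference and trim the fuel command so the unit does not accelerate faster than the reference.

```python
# Trim the fuel stroke reference (FSR) based on acceleration error.

def acceleration_control(fsr_pct: float, tnha_pct_per_s: float,
                         tnhar_pct_per_s: float, gain: float = 0.5) -> float:
    """Return a trimmed fuel stroke reference (%) for one control pass."""
    error = tnhar_pct_per_s - tnha_pct_per_s    # positive -> room to add fuel
    return max(0.0, fsr_pct + gain * error)

fsr = 20.0
# Accelerating slightly faster than the reference -> pull a little fuel back.
print(acceleration_control(fsr, tnha_pct_per_s=0.6, tnhar_pct_per_s=0.5))  # -> 19.95
```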

When the unit is synchronized to a grid and producing power, from 0 MW to Base Load the unit is operating on Droop Speed Control. And Droop Speed Control has been covered SO MANY TIMES on control.com, I think the name of the website should sometimes be droopspeedcontrol.com. Use the 'Search' feature for MANY descriptions, but essentially what is happening is that the actual turbine speed is fixed (by the frequency of the grid the generator is synchronized to). BUT, when the operator is changing load what is actually happening when RAISE SPEED/LOAD or LOWER SPEED/LOAD is being clicked on, or even when the Pre-Select Load Control Setpoint is being changed is that the Turbine Speed Reference is changing. And the difference between the Turbine Speed Reference, TNR (usually a value between 100% and 104%) and the actual turbine speed, TNH, controls how much fuel is going to be admitted to the turbine combustors. Increase TNR and the difference between TNR and TNH increases (because TNH is fixed by grid frequency), and that increases the fuel flow--which increases the load being carried by the turbine-generator. Decrease TNR and the difference between TNR and TNH decreases, and that decreases the fuel--which decreases the load being carried by the turbine-generator.
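A simplified sketch of the Droop Speed Control arithmetic (the 4% droop and the FSR scaling are illustrative assumptions, not GE's actual constants): with TNH pinned at 100% by the grid, raising TNR raises the TNR-TNH error, which raises FSR and therefore load.

```python
DROOP_PCT = 4.0          # 4% droop: a 4% speed error corresponds to full load
FSR_AT_NO_LOAD = 15.0    # fuel needed just to hold rated speed, unloaded (%)
FSR_AT_BASE_LOAD = 85.0  # fuel at Base Load (%)

def droop_fsr(tnr_pct: float, tnh_pct: float) -> float:
    """Fuel stroke reference from the speed error, for a unit on droop."""
    load_fraction = (tnr_pct - tnh_pct) / DROOP_PCT           # 0.0 .. 1.0
    load_fraction = min(max(load_fraction, 0.0), 1.0)
    return FSR_AT_NO_LOAD + load_fraction * (FSR_AT_BASE_LOAD - FSR_AT_NO_LOAD)

print(droop_fsr(tnr_pct=100.0, tnh_pct=100.0))   # unloaded: FSR ~15%
print(droop_fsr(tnr_pct=102.0, tnh_pct=100.0))   # ~50% load
print(droop_fsr(tnr_pct=104.0, tnh_pct=100.0))   # Base Load
```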

Finally, for natural gas fuel (or just about any gaseous fuel), GE for years used a two-valve system (a Stop-Ratio Valve, the SRV, and a Gas Control Valve, the GCV). The flow through the GCV was proportional to the valve position (in GE-speak valve position is called "stroke")--so flow is proportional to stroke. That made using the error between TNR and TNH very easy to directly translate to valve position--which was proportional to flow. Increase the Gas Control Valve position reference, FSR (Fuel Stroke Reference), and the fuel flow-rate through the GCV increases. Decrease FSR, and the fuel flow-rate through the GCV decreases. Even though gas fuel flow-rate is typically measured by the Mark* <i>it's <b>NOT</b> used for control purposes</i>, UNLESS the unit has water- or steam injection for NOx emissions reduction.

For liquid fuel, the actual liquid fuel flow-rate is measured (using magnetic speed pick-ups on the Liquid Fuel Flow Divider), and FSR is converted to a liquid fuel flow-rate and compared to the scaled feedback from the liquid fuel flow divider and the Liquid Fuel Bypass Valve is adjusted as necessary to make the actual flow-rate equal to the flow-rate reference. (That sounds a lot more complicated than it is.)
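A rough sketch of that liquid fuel loop (the gain and scaling are illustrative assumptions, not the Mark*'s actual logic): FSR is treated as a flow reference, compared with the flow-divider feedback, and the bypass valve command is trimmed until the measured flow matches the reference.

```python
def liquid_fuel_trim(bypass_cmd_pct: float, fsr_pct: float,
                     measured_flow_pct: float, gain: float = 0.2) -> float:
    """One pass of the bypass-valve trim (all values in percent)."""
    flow_reference_pct = fsr_pct            # assume FSR is already scaled to % flow
    error = flow_reference_pct - measured_flow_pct
    # More flow is needed -> close the bypass valve further (return less fuel).
    return min(max(bypass_cmd_pct - gain * error, 0.0), 100.0)

# Measured flow a little below the reference -> bypass closes slightly.
print(liquid_fuel_trim(bypass_cmd_pct=60.0, fsr_pct=40.0, measured_flow_pct=37.0))
```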

For shutdown of the gas turbine, the Mark* doesn't control deceleration rate like it does acceleration rate during start-up. It just reduces the fuel flow-rate in steps, which <i>should</i> cause the turbine to decelerate while still maintaining flame in the combustors down to about 20% turbine speed (when burning natural gas fuel, or about 50% speed when burning liquid fuel). The steps are programmed into the Mark* and sometimes require adjustment as the unit ages and fuels change (and they can change over time). Keeping the flame "on" as low as possible during shutdown helps to maximize hot gas path parts life also.

One more thing which just amazes people when they hear it: The Mark*, as sophisticated as it is, <b>DOES NOT</b> calculate or measure air/fuel mixtures. It doesn't. All of that is done by the designers and programmers of the Mark* using known values of fuel nozzle orifices and air flows through the unit at various IGV (Inlet Guide Vane) angles. And most of that information is considered proprietary by the designer (GE). So, many inexpensive automobiles with electronic fuel injection have more sophisticated fuel control systems than the Mark*: they measure the oxygen content of the exhaust gases (which is a function of the air/fuel mixture) and adjust the fuel injection rate to maintain a desired air/fuel mixture. (It's true!)

I've answered the AVR question in another thread.

Hope this helps!!!

Again, many of your questions have been asked and answered more than once over the past fifteen years on control.com. And all of that information is accessible using the 'Search' feature to access all the past threads in the control.com Archives. Enjoy!!!
 