Automatic Voltage Regulation

Thread Starter

Dave

Hello,

We are doing a process simulation of a Turbine Generator for a client who will use it for Operator training purposes. My question is, when the generator is synchronized with the grid, the frequency is fixed to whatever the grid frequency is. Also, the generator terminal voltage is basically fixed to the grid voltage. In our model, generator voltage is controlled by the exciter before synchronization, and after sync, the exciter controls VArs. However, in the actual process, the Mill uses automatic voltage control even after synchronization. (They can also run in Automatic VAr control, but they don't use it.) Since theoretically the voltage should be locked to the grid voltage, what is the AVR doing? I know from looking at the actual process that the generator voltage was less than the voltage at the utility tieline. I also watched the exciter voltage vary several volts up and down a couple of times a second, over a range of about +/-5 volts.

My question is, where does this voltage difference come from? Is it just imperfections in the equipment or voltage drops in the Mill distribution system? The other client we are doing work for also has AVR running after synchronization. Since we are trying to model this behavior, can someone tell me why AVR is necessary after synchronization, and where the voltage variations come from that necessitate the use of AVR after sync?

Please understand that I am in no way questioning the use of AVR after sync; it seems to be normal. I'm only trying to understand where these voltage variations come from so we can introduce them into our model.

Thanks
 
Hello again, Dave,

(I can't help but think of '2001: A Space Odyssey' every time I write that, feeling like HAL. Though for some reason, my train of thought while writing this was not very logical; sorry.)

AC synchronous generator terminal voltage is a function of two things: speed and the strength of the generator electromagnetic field. If one is held constant and the other one is changed, the generator terminal voltage will vary. The AVR controls the DC current/voltage applied to the generator electromagnetic field windings, thereby controlling the strength of the magnetic field.

As you've said, once the generator is synchronized the speed is fixed. So, the generator terminal voltage can only be changed by changing the strength of the generator's electromagnetic field using the AVR.

Using the AVR to increase the strength of the magnetic field before or after synchronization will (try to) increase the generator terminal voltage. When synchronized, it's not so easy to change the generator terminal voltage because it's more or less equal to the grid voltage. So, that extra DC power being applied to the generator electromagnetic field will cause lagging reactive current (lagging VArs) to "flow" in the generator stator windings--when synchronized.

After synchronization, if the AVR tries to lower the generator terminal- and grid voltage, then leading reactive current (leading VArs) will "flow" in the generator stator windings.

In either case, the grid voltage and generator terminal voltage may change, and that's a function of the grid and many intangible factors, but usually any change is fairly imperceptible.
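If it helps the simulation work, here is a minimal sketch of that behavior, assuming the textbook model of an internal EMF (E) behind a synchronous reactance (Xs) tied to a stiff bus. All the numbers are illustrative per-unit assumptions, not plant data:

import math

def pq_delivered(E, V, delta, Xs):
    """Real and reactive power at the terminals (per-unit, round-rotor model)."""
    P = E * V * math.sin(delta) / Xs
    Q = (E * V * math.cos(delta) - V ** 2) / Xs
    return P, Q

V, Xs, P_fixed = 1.0, 1.2, 0.5   # assumed bus voltage, reactance, steady watts
for E in (1.0, 1.2, 1.4):        # AVR output: low / medium / high excitation
    delta = math.asin(P_fixed * Xs / (E * V))  # load angle carrying the fixed watts
    P, Q = pq_delivered(E, V, delta, Xs)
    label = "lagging VArs (boost)" if Q > 0 else "leading VArs (buck)"
    print(f"E = {E:.1f} pu -> Q = {Q:+.3f} pu  ({label})")

With the watts held constant, raising E swings the machine from leading to lagging VArs while the terminal voltage barely moves in this model--which is the behavior described above.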

As for why the AVR voltage is fluctuating (which is what I think you're referring to) when the load (watts; KW; MW) is stable, well that's likely a function of the tuning/adjustment of the AVR control loop. Generally the AVR DC voltage/current output should be fairly stable when load is stable, unless it is being changed in response to a command to the AVR. So, it sounds like there is some instability which might be affected by an adjustment to the AVR control loop.

As the grid voltage changes (and it does change throughout the day, every day) if the generator terminal voltage does not change then reactive current (VArs) will flow in the generator stator windings. The "direction" and magnitude will vary depending on whether or not and by how much the grid voltage is changing. That's why, even if the load is stable all the livelong day, the operators likely have to make adjustments to the AVR when trying to maintain a VAr or power factor setpoint--because the grid voltage is changing and the AVR is only trying to maintain a generator terminal voltage setpoint.

If the generator terminal voltage is higher than the "running" voltage to which the generator is being synchronized, then lagging reactive current (lagging VArs) will flow in the generator stator windings when the generator breaker closes.

If the generator terminal voltage is lower than the "running" voltage to which the generator is being synchronized, then leading reactive current (leading VArs) will flow in the generator stator windings when the generator breaker closes.

A lot of times, differences between generator terminal voltage and tie breaker (grid) voltage are due to instrumentation differences and some small inaccuracies or calibration errors. There can also be voltage drops in the mill. As was said above, if at the generator breaker (which is usually where the voltage sensing for the AVR is located) there is no reactive current (no VArs; 1.0 power factor), then the generator terminal voltage is equal to the "grid" voltage at that point.

Again, these are observations of things I have experienced over decades of work in the power generation industry. The actual formulas and mathematics (and the EMFs and counter EMFs and armature reactions and such) are not what I use to explain this to a power plant operator. He or she just wants to know what's going to happen when they "increase" the AVR, or "decrease" the AVR. Actually, lately the power plant operators I've observed don't really know what will happen when they "increase" or "decrease" the AVR. If they "increase" when they should have "decreased", they just "decrease" a little more. And vice versa. They just know that a change in VArs or power factor will occur, and they "bump" it in one direction ("increase", for example) and if they don't get the change they wanted they bump it in the other direction ("decrease"). (It's not only sad, but extremely frustrating. Because when a problem occurs and one tries to get information from an operator, they can't tell you what happened or what they did; just that what they thought should happen didn't, or that it had "never done that before"....)

Hope this helps!
 
If I may elaborate on the question:

I am thinking of an AVR as a PID controller of some sort. If one is not connected to the grid and the AVR is OFF or in MANUAL, then the operator can increase or decrease the excitation voltage directly. If we are still off the grid, and in AVR control, then the process variable becomes the generator voltage and the Setpoint can be increased or decreased by the operator. So before sync-ing to the grid, an operator tries to match the generator voltage with the bus (grid) voltage before closing the breaker. He could do that with the AVR turned off or on.

However, once we are connected to the grid, as you mentioned, the generator voltage does not change - practically. So what becomes the Process Variable in a PID controller? It's like whatever the AVR does, the only effect is in VARs, but not Volts. However, I have seen generator controls that include VAR control, PF control, and AVR control - selectable by an operator. So if an operator selects AVR control, what is the AVR controlling? I understand that whatever they select, it is basically all the same control, one cannot be changed without the other, but I still wonder what the AVR algorithm is looking at (Volts, VARs, ??) when it is in control?
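To make my question concrete, here is roughly the controller I am imagining - a pure sketch with invented names and gains, not any real exciter product:

class ExciterControl:
    """One regulator; only the process variable changes with the mode."""

    def __init__(self, kp=0.5, ki=0.1):
        self.kp, self.ki = kp, ki
        self.integral = 0.0
        self.mode = "AVR"          # or "MANUAL", "VAR", "PF"
        self.setpoint = 1.0        # per-unit volts, VArs, or PF depending on mode
        self.field_cmd = 1.0       # DC field current/voltage command (pu)

    def step(self, meas_volts, meas_vars, meas_pf, dt=0.1):
        if self.mode == "MANUAL":
            return self.field_cmd  # operator raises/lowers the field directly
        pv = {"AVR": meas_volts, "VAR": meas_vars, "PF": meas_pf}[self.mode]
        error = self.setpoint - pv
        self.integral += error * dt
        self.field_cmd = 1.0 + self.kp * error + self.ki * self.integral
        return self.field_cmd

In these terms my question is: when synchronized and in AVR mode, does meas_volts actually move enough for this loop to have anything to regulate on?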

On older non-computerized units, I've seen an AVR switch that could be put into MAN or AUTO. Then there were two more 'momentary switches', the first 'MANUAL VOLTAGE ADJUST' with increase and decrease positions, and the second 'AUTO VOLTAGE ADJUST', also with increase and decrease positions. However, when connected to the grid, and in AUTO AVR control, the AVR obviously does not adjust voltage - although the switch description says so. What is the AVR holding steady? What is it controlling? If in this mill they have a 'Load tab changer' and they lower the grid bus voltage that the generator is connected to, what is going to be the AVR's response and why?
 
The AVR regulates generator terminal voltage. It does this by regulating field current in the exciter.

The tap changer brings the generator and grid voltage within the control range of the AVR - AVR and Exciter are not limitless.

If there is a disturbance in grid voltage it may not be desirable for the generator voltage to follow the grid - adjusting excitation to maintain generator voltage will support the grid.

Adjusting Excitation can also be used to import or export Vars.

AVR will also suppress over voltage on load rejection.
 
Rudy,

"AVR" means Automatic Voltage Regulator, but it is very commonly used to refer to the entire generator rotor excitation control system, which usually includes two "regulators" (PID loops) or "modes": Manual, or sometimes called "DC", and Automatic, or sometimes called "AC".

The Manual (DC) Regulator/PID/loop/mode of the "AVR" is always functioning (whether or not the generator is synchronized) and is trying to stably control the amount of DC voltage/current being applied to the generator rotor electro-magnetic windings. Remember: the strength of an electro-magnetic field is related to the number of turns (which are fixed on a synchronous generator rotor) and the number of amperes applied to the turns (i.e., amp-turns). The generator terminal voltage is a function of speed and field strength--but since the generator rotor speed is relatively constant (as long as grid frequency is constant or the generator is at rated speed)--the only way to change generator terminal voltage is to change field strength, by changing the amp-turns, by changing the DC applied to the rotating field coils (turns).

The Manual (DC) Regulator doesn't care what the generator terminal voltage is (the operator adjusting the AVR in Manual Mode does--but the AVR doesn't, when it's in Manual Mode), it just cares that the amount of DC voltage/current being applied to the generator field windings is stable, and goes up or down in response to the operator's commands. The Manual (DC) Regulator mode of the AVR is only looking at the DC voltage/current feedback and is adjusting its output to keep that feedback stable at the operator-set value.

The Automatic (AC) AVR Regulator/PID/loop/mode has feedback from one or more generator terminal voltage PTs (Potential Transformers). When the AVR is in Automatic Mode, it is comparing a generator terminal voltage setpoint to generator terminal voltage feedback, and is sending a signal to the Manual (DC) Regulator of the AVR to increase or decrease DC current/voltage in order to maintain the generator terminal voltage equal to the setpoint. The Manual (DC) Regulator is looking at its DC voltage/current feedback to keep it stable.

It's kind of an "inner/outer loop" control scheme.
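A rough sketch of that cascade, if it helps the simulation work--the gains, limits, and names here are my assumptions for illustration, not any manufacturer's design:

class PI:
    def __init__(self, kp, ki, lo, hi):
        self.kp, self.ki, self.lo, self.hi = kp, ki, lo, hi
        self.integral = 0.0

    def step(self, sp, pv, dt):
        error = sp - pv
        self.integral += self.ki * error * dt
        out = self.kp * error + self.integral
        return max(self.lo, min(self.hi, out))  # clamped: exciters are not limitless

# Outer Automatic (AC) regulator: terminal-volts error -> field current setpoint.
ac_reg = PI(kp=20.0, ki=5.0, lo=0.0, hi=3.0)
# Inner Manual (DC) regulator: field-current error -> exciter bridge output.
dc_reg = PI(kp=10.0, ki=50.0, lo=0.0, hi=10.0)

def avr_step(v_setpoint, v_terminal, i_field, auto=True, manual_sp=1.0, dt=0.01):
    # In Automatic the AC loop supplies the DC loop's setpoint; in Manual
    # the operator's raise/lower setpoint goes straight to the DC loop.
    i_sp = ac_reg.step(v_setpoint, v_terminal, dt) if auto else manual_sp
    return dc_reg.step(i_sp, i_field, dt)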

Most "AVRs" are operated in Automatic (AC) Regulator mode whether or not the generator is synchronized or not (when the unit is at rated speed). Manual (DC) Regulator is a "back-up" to Automatic (AC) mode that can be used if the generator terminal voltage feedback circuit isn't working or is unavailable.

Some "AVRs" are operated in Manual Mode until synchronization, or until the generator breaker is closed. It depends on the manufacturer and their philosophy at the time the "AVR" was built.

So, when the "AVR" is in Automatic (AC) Regulator it is controlling generator terminal voltage (the AC generator terminal voltage) by sending signals to the Manual (DC) Regulator to increase or decrease DC voltage/current in order to maintain the generator terminal voltage equal to the generator terminal voltage setpoint. When the "AVR" is in Manual (DC) Regulator, it doesn't care about the generator terminal voltage; it's only looking at the DC voltage/current feedback and trying to make it equal to the DC voltage/current feedback setpoint.

Again, "AVR" stands for Automatic Voltage Regulator, but is very commonly used to refer to the entire generator excitation control system--which typically includes an Automatic (AC) mode and a Manual (DC) mode. The setpoint for Automatic (AC) mode is generator terminal voltage (AC voltage), and the setpoint for Manual (DC) mode is generator rotor voltage/current (DC voltage/current).

[Note: I refer to the DC being applied to the generator rotor windings as "voltage/current" because some "AVRs" look only at DC voltage and others look only at DC current; again, it depends on manufacturer philosophy.]

Hope this helps!
 
CSA,

thank you for your detailed response. It clarifies a lot.

However, my understanding was that when a generator is connected to the grid, its voltage is 'locked' to the grid voltage, just like shaft speed is 'locked' into 60 Hz grid frequency (in the US).

In our setup, the grid is supplying 161 kV, which goes through a load-tab-changer, down to about a 13.8 kV BUS that is connected (through a breaker) to the generator. My assumption was that the generator terminal voltage, when synch-ed, will be 13.8 kV no matter what the exciter control is doing, because it is 'locked' to the grid and can't be changed. My confusion came with how the AVR can control voltage if voltage is constant.

It seems, however, that the voltage of the grid-connected BUS is not constant and can be changed with the exciter the way you described.

If the load-tab-changer is used to bring the BUS voltage to e.g. 14 kV, the AVR will bring it back to 13.8 kV. And VARs will start to 'flow'.

Can you comment on the above? Is the generator voltage really independent of the grid reduced voltage, or is it not?
 
Rudy,

I've always heard it called a load *tap* changer. A tap is essentially another name for a "contact". A load tap changer is a device which can change "contacts" when current (apparent (real + reactive)) is flowing through the transformer. (Some transformer taps can't be changed under load.) Taps are used to connect or disconnect groups of windings, so it's kind of like a step-change variation in transformer ratios to change the amount the voltage is stepped-up or -down, depending on which direction the *tap*-changer is being moved. (I tried using my three different Internet search engines to look up "load tab changer" (with quotes) and couldn't find anything. They all asked me, "Did you mean to say 'load tap changer'?")

The "stiffness" of the generator terminal voltage is a function of a lot of things, but the voltage is not fixed. Generator terminal voltage will vary as excitation is varied. How much variation you will see depends on the "resolution" of the PTs (Potential Transformers) used to convert the 13.8 KV to whatever your voltage display device (analog meter; digital meter) requires and converts to KV or volts or whatever units it displays the PT secondary voltage in.

In the same way that one is trying to increase grid frequency when one increases the energy flow-rate into the generator prime mover--but can't have much of an appreciable effect--when one increases excitation one is trying to "boost" grid voltage (even through a transformer), but the effects are usually negligible and difficult to see. If one were using a digital voltmeter with enough display digits to measure the actual generator terminal voltage one would see some change. Most generator exciters are rated for about +/-5% of rated generator terminal voltage (that's a span of about 1400 volts on a 13.8 KV generator).

Increasing the energy flow-rate into the prime mover doesn't appreciably increase speed (it does--but the amount is extremely difficult to see without a high-resolution digital frequency counter, probably on the order of thousandths of a Hz or less) and the difference in power becomes amps (real amperes) in the generator stator windings. Increasing the excitation also has a negligible effect on generator terminal voltage (10s of volts or less, probably) and grid voltage, but the difference in excitation energy gets converted to reactive power (VArs).

Now, I have been at locations on a grid where a generator terminal voltage increase or decrease had a noticeable effect on grid voltage, but that has to do with a lot of different factors (distance to nearest substation; distance to nearest generation; length of transmission lines; temperature of transmission lines (changes resistance); and on and on and on).

But, in general, it's not possible to have much of an effect on grid voltage by changing one generator's excitation. There is an effect--the question is whether it can be seen with the site instrumentation.

When the generator is synchronized to the grid, the generator terminal voltage is usually adjusted to be equal to or just slightly greater than grid voltage--and the frequency of the generator is usually set to be just slightly higher than grid frequency. At the instant the generator breaker is closed, the speed of the generator (and its prime mover) is pulled down to match grid frequency, and the extra power that was making the generator frequency just a little higher than grid frequency becomes real amperes (watts) flowing in the generator stator windings. This is to make sure that reverse power does not flow into the generator after synchronization.

A similar thing happens with the generator terminal voltage when the generator breaker is closed during synchronization. At the instant of synchronization the generator terminal voltage (again usually adjusted to be slightly higher or equal to grid voltage during synchronization) drops to be approximately equal to grid voltage, and the "extra" excitation that was making the generator terminal voltage greater than grid voltage during synchronization causes lagging reactive current (lagging VArs) to flow in the generator stator windings. This is to prevent leading VArs from flowing in the generator, which most generators are not built to handle very much of and which, if left unchecked, can ultimately result in a slipped pole. Many generators also have loss of excitation relays and other protective relays that detect low excitation power and/or low generator terminal voltage and trip the generator breaker and/or the prime mover to protect against slipping a pole.
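For a training simulator, that synchronizing practice can be boiled down to a toy permissive check. The window values here are illustrative assumptions, not a synch-check relay standard:

def ok_to_close_breaker(gen_hz, grid_hz, gen_kv, grid_kv, angle_deg):
    """Close only with the generator slightly fast and slightly 'hot' on volts."""
    slip = gen_hz - grid_hz
    dv = (gen_kv - grid_kv) / grid_kv
    return (0.0 < slip <= 0.10           # a touch fast: first load is watts, not reverse power
            and 0.0 <= dv <= 0.02        # equal to ~2% high: first VArs lagging, not leading
            and abs(angle_deg) <= 10.0)  # phase nearly matched

print(ok_to_close_breaker(60.05, 60.00, 13.85, 13.80, 4.0))  # True
print(ok_to_close_breaker(59.95, 60.00, 13.70, 13.80, 4.0))  # False: slow and low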

For all intents and purposes, the instrumentation in place at most generators does not permit an operator to see much of a grid voltage change when the generator excitation is changed. And, that change in excitation (power) is converted into reactive power (VArs) in the generator--the "direction" and magnitude of which is a function of whether the excitation is being used to try to increase ("boost") grid voltage or decrease ("buck") grid voltage.

It's the same with frequency and prime mover energy flow-rates. The frequency does change--but is the change visible to most operators with the instrumentation normally in place on most generators? No. Therefore, for all intents and purposes the frequency is considered to be fixed. The acceleration of the prime mover and generator does change when the energy flow-rate into the prime mover changes, but acceleration isn't monitored by most power plants. And, the generator can't be turned (under normal conditions) so that the rotor magnetic fields can jump past the stator magnetic fields. The "load angle" (the amount of "twist" applied to the generator by the prime mover) changes (and that's a function of acceleration and equilibrium), but the speed doesn't change appreciably.

If generator frequency changed appreciably when the energy flow-rate into the prime mover changed, then Droop Speed Control would never work properly. (And, as much as I hate to write this,... generator exciters have droop also.)
 
Rudy,

The function, and the use, of load tap changers seem to vary depending on the site. I see sites with load tap changers that almost never change taps, and control VArs/power factor by varying excitation as required.

And I see sites with load tap changers that almost never change excitation, they are always changing taps to control VArs/power factor.

As I understand it, a load tap changer will allow a wider range of "response" to BUS (grid) voltage variations than would be possible with the average exciter control system.

The AVR is usually adjusted by the turbine operator, and the AVR is typically used to control the generator terminal voltage--not the grid voltage. The turbine operator is informed of the desired VAr or power factor setpoint by the supervisor, and uses the AUTO RAISE or AUTO LOWER buttons/targets or bat-handle switch to adjust the excitation as required to maintain the VAr or power factor setpoint.

Now, if the grid voltage is stable and the load (watts; KW, MW) on the generator is stable then the operator won't have to make many "AVR" adjustments during the course of the shift. But, if, as is the case in many locales, the grid voltage goes up and down during the course of the day (as load goes down and up), then the relationship of the generator terminal voltage with respect to the BUS (grid) voltage will change, and that will cause the VArs and power factor to change. And that will require the operator to change excitation to maintain the setpoint dictated by the supervisor. (If the load (watts; KW; MW) on the generator changes significantly during the shift even if the BUS voltage does not, then the operator will also have to adjust excitation to maintain the dictated setpoint.)

Now, let's say the "AVR" is set for 13.8 KV (in reality--the operator never knows what the generator terminal voltage setpoint is!) and the load tap changer is set such that the VAr meter reads 0 VArs and the power factor meter reads 1.0. That means the amount of excitation being supplied to the generator exactly equals the amount required to keep the generator terminal voltage equal to BUS voltage.

Now, let's say the load tap changer is used to increase the BUS voltage to 14.0 KV (a difference of 200 volts out of 13,800 volts, which is about a 1.4% difference). The "AVR", being set to 13.8 KV, will not change its output, BUT the VAr meter and the power factor meter will swing in the Leading direction, and the generator terminal voltage will probably increase--there are too many intangibles to say by exactly how much, but it will get close to 14.0 KV. In this case, the amount of excitation being supplied to the generator rotor is NOT sufficient to make the generator terminal voltage equal to grid voltage, and so Leading reactive current will "flow" in the generator stator windings.

The turbine operator seeing the Leading VArs/power factor will use the AUTO RAISE button/target/switch to increase the excitation to return the VAr/power factor to the dictated setpoint (or at least to make it go to 0 VArs/unity power factor).
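To put rough numbers on that swing, using the same simple EMF-behind-reactance sketch from earlier in the thread (all values are per-unit assumptions, not measurements from any mill):

import math

Xs, P = 1.2, 0.5                 # assumed synchronous reactance and fixed watts (pu)

def q_out(E, V):
    delta = math.asin(P * Xs / (E * V))       # load angle carrying the fixed watts
    return (E * V * math.cos(delta) - V ** 2) / Xs

V1 = 1.0                                      # BUS at 13.8 KV = 1.0 pu
E0 = math.sqrt(V1 ** 2 + (P * Xs / V1) ** 2)  # excitation giving exactly 0 VArs
print(round(q_out(E0, V1), 3))                # ~0.000 at 13.8 KV

V2 = 14.0 / 13.8                              # tap changer raises the BUS ~1.4%
print(round(q_out(E0, V2), 3))                # negative = swings Leading, as described

The leading swing in this toy model is small because only the machine reactance separates the generator from the bus; the swing at a real mill depends on all the intangibles mentioned above.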

Exactly why the mill in your example has a load tap changer and what drives the decision-making to change taps we don't know. I would imagine it has something to do with the load in the mill causing mill BUS voltage to change, and/or the voltage swings experienced by the grid. I'll lay odds that there isn't anyone around in the mill that can say exactly why the load tap changer was installed, just that when the BUS voltage gets to a certain value the load tap changer is used to make it go to a different value. In other words, the typical turbine operator answer: "Because we've always done it that way!" That's what tends to happen over years, and even decades, at most power plant control rooms.

I hope this helps. I don't know if I understand the question, but I do know I don't understand why a load tap changer is used at the mill in your example. And, we don't know what the immediate effect of a tap change is on the generator terminal voltage (probably the instruments can't display the change with very much accuracy), or what effect the tap change has on the generator VAr meter or power factor meter.

Now, some AVRs get a "remote" signal to change excitation based on another set of PTs or the VAr or power factor measurement at some other location in the plant. That's not common, but it does happen. However, usually the turbine operator "controls" the AVR.
 
CSA,

your knowledge and willingness to share are priceless. Many thanks.

Yes, the load-tap-changer is what I had in mind.

I think my confusion can be pointed out from two paragraphs you wrote in the previous posts:

> The Automatic (AC) AVR Regulator/PID/loop/mode has feedback from one or more generator terminal voltage PTs (Potential Transformers). When the AVR is in Automatic Mode, it is comparing a generator terminal voltage setpoint to generator terminal voltage feedback, and is sending a signal to the Manual (DC) Regulator of the AVR to increase or decrease DC current/voltage in order to maintain the generator terminal voltage equal to the setpoint.

and

> In the same way that one is trying to increase grid frequency when one increases the energy flow-rate into the generator prime mover--but can't have much of an appreciable effect--when one increases excitation one is trying to "boost" grid voltage (even through a transformer), but the effects are usually negligible and difficult to see. If one were using a digital voltmeter with enough display digits to measure the actual generator terminal voltage one would see some change.

Basically, I am reading that changing excitation on a grid-connected generator does not change the generator terminal voltage. At least not enough to be practically observable, even by plant instruments.

But then, when the AVR is in control, it is looking at the generator terminal voltage and adjusts the DC voltage/current to maintain generator terminal voltage at setpoint.

My confusion is that the first paragraph says that generator terminal voltage cannot be practically changed with excitation, yet the second paragraph states that voltage is controlled by excitation.

An operator at one of our mills states that when a generator is connected to a 13.8 kV BUS and a load-tap-changer is used to increase the voltage of the BUS to e.g. 13.9 kV, the BUS voltage gauge shows the increase to 13.9 kV, but within seconds the gauge drops back to 13.8 kV, because the AVR responded to the change and decreased excitation. I think I understand what happens to VARs here, so I don't worry about that, but I question whether the above is even possible. You mentioned that, just like the prime mover trying to increase speed, the excitation trying to change the grid-connected BUS voltage cannot do so in a measurable way. Yet, the operator claims the AVR can bring the BUS voltage 100 or even 200 volts down.
 
Don't forget that the concept of the "infinite bus" is a concept only, and doesn't exist in practice. In the real world, any electrical system will have a degree of departure from the ideal, and what happens if you change things depends on how far from the ideal you are.

So if you are connected to a continental-scale electrical system and your generator has a capacity which is microscopic in comparison to the total connected generating capacity, a change in one generator power output will not affect the frequency. If you are on an island with a generator which is a few % of total connected capacity, changing the power will indeed change the frequency.

The bus voltage assumption is similarly based on a system connected to an ideal voltage source, having no impedance. Under these conditions, the terminal voltage of the machine will not change. In practice, there will be some reactance which will depend primarily on the distance and cable parameters connecting the plant to the "body" of the generation system - think of this as the centre of gravity of the network. A transformer typically has 5 % reactance - this means that at rated current it will have a voltage drop of about 5 % of nominal. You are talking about 100 or 200 V on a 13.9 kV bus, or 0.7 to 1.4 % - this is quite feasible for a relatively loose connection to the body of the grid.
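A quick per-unit sanity check of those magnitudes, taking the figures above as given (a rough sketch, not a system study):

dV = 150 / 13800       # a 100-200 V mismatch on a ~13.8 kV bus, about 1%
X = 0.05               # typical transformer reactance, 5% (0.05 pu)
I_reactive = dV / X    # reactive current driven through the reactance
print(f"{I_reactive:.2f} pu")   # ~0.22 pu of rated current, circulating as VArs

So a voltage difference of around 1 % held across a 5 % reactance is enough to drive a very visible reactive current.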
 
MarktheSecond

Your statements are all basically correct - and a lot of insightful stuff from the contributors.

Yes the generator voltage is equal to the grid voltage when synchronised - there can be no difference. If your AVR tries to raise/lower voltage, then as the grid voltage does not change this converts into changes in VARs produced by your generator.

The grid has a large load on it. This load should be considered as both real load (watts) and reactive load (VARs). The real load is shared by all the generators according to their governor settings and the reactive load is shared by all generators according to their AVR settings.

Your generator is designed for a certain "lagging" Power Factor (i.e. the ratio of real load to total load) and the AVR should be adjusting the excitation constantly to ensure the current is not too high in the generator for a given load.

Hope this helps
 
As usual, Bruce Durdle's comments are very helpful. Markthesecond's are also helpful.

Reactance between generators and portions of a transmission and distribution system (internal to the plant, as well as external to the plant) has a LOT to do with how generator terminal voltage changes (read: excitation changes via the "AVR"--I think it's still not clear that the "AVR" is ALWAYS working, whether or not the generator is synchronized).

It's also not clear exactly where the reported voltage measurements in the mill are being sensed. We don't know if they are being taken from the generator terminal PTs, or from some other location between the generator and the transformer with the load tap changer.

It might be helpful to say that if excitation were removed (i.e., the "AVR" were shut off) the generator terminal voltage would still be "equal" to the grid voltage, but that leading VArs would be flowing in the generator stator windings (and it's very likely the generator rotor would be damaged, either from heat or from the effects of slipping poles).

In the same way that an increase in energy flowing into a generator's prime mover when it's NOT synchronized to a grid would cause the prime mover/generator rotor speed to increase, but that same change in energy flow-rate when synchronized would cause an increase in amperes (watts; KW; MW) flowing in the generator stator windings with no appreciable change in speed, an increase in excitation (from the "AVR") when the generator is NOT synchronized to a grid would cause the generator terminal voltage to increase, while the same change in excitation (from the "AVR") when the generator is synchronized to a grid would primarily result in a change in reactive current flowing in the generator stator windings. The actual generator terminal voltage change will depend on a LOT of external factors: transformer characteristics; type of loads on the grid; distances between generators and transformers and the types of load between them; switchyard characteristics and configuration; and so on. This is where power system analysis and studies are useful in understanding the effects of various generators on grids, regionally and grid-wide.

Rudy, I referred to the "stiffness" of system voltage in a previous post. All of the factors above--and more--contribute to the "stiffness" (or lack of stiffness) at any particular point in a grid or system. In some places excitation changes (via the "AVR") will have no appreciable effect on grid voltage, while in others excitation changes will have a pronounced effect on grid voltage. Most generators are connected to grids through one or more transformers (with impedances and reactances).

And, finally, AVRs have "droop" as well. When generators are connected to a grid and the operator increases the "generator terminal voltage setpoint" (do the operators at the mill get any indication of generator terminal voltage setpoint?) the amount of increase of generator terminal voltage will not always exactly equal the setpoint increase--and the increased error between setpoint and actual will be, in part, responsible for the increased excitation (very similar to the way Droop Speed Control works on prime mover governors).
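If it helps the model, that exciter droop is usually implemented as reactive droop compensation--the regulated voltage is biased down as lagging VArs rise, so paralleled machines share reactive load instead of fighting each other. A minimal sketch; the 4% figure is an illustrative assumption:

def regulated_setpoint(v_set_pu, q_pu, droop=0.04):
    """Voltage the AVR actually regulates to, given reactive output (pu)."""
    return v_set_pu - droop * q_pu

print(regulated_setpoint(1.00, 0.0))  # 1.0 pu at zero VArs
print(regulated_setpoint(1.00, 0.5))  # 0.98 pu at 0.5 pu lagging VArs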

I'm becoming more curious about what's driving the AVR to return the "voltage" back to 13.8 KV. Are you sure there isn't some external controller/scheme that drives the "AVR" to adjust excitation to maintain 13.8 KV (from your example of load tap change from 13.8-14.0 KV)? If the operator isn't causing the change, what is causing the "AVR" to change?

Again, I'm not sure where the PTs for the sensing are located. And, you have said you are comfortable with the VAr flows in the plant, but that might help us to understand/explain what's happening. Is there some kind of VAr control at work in the plant?

Also, what is the criteria used for determining when to change transformer taps? Is it the voltage on the high-side of the transformer, or the voltage on the low-side of the transformer, or VArs (flow and/or magnitude)?

Grid voltage is similar to grid frequency, but not exactly the same. I am NO expert on grids, but I do know that under certain conditions grid voltage sags or drops, and when that happens the grid operators ask some plants to increase excitation (via the "AVR") to help support/maintain grid voltage. I think this usually happens when there is a lot of inductive load on the grid (such as when a lot of air conditioners are running--which have induction motors for the compressors, the air handlers, the condenser fans, etc.). More current flowing in the transmission lines results in more heat in the lines which results in an increased voltage drop in the lines and so on.

And, then there's the effect of the VArs "required" by the induction motors, which shifts the voltage- and current sine waves out of phase with each other which, I believe, aggravates the voltage situation even more.

But, the upshot of all this is: Generator terminal voltage is going to be basically equal to grid voltage, and "AVR" changes are not generally going to have much of an effect on generator terminal voltage under most conditions--the larger effect will be on VAr magnitude and/or direction (leading or lagging). Some generators, depending on many factors, will see more or less change in terminal voltage as the "AVR" increases or decreases excitation.

And, I think it would take a pretty sharp power systems engineer some time to review the exact configuration and parameters at the mill to be able to say why your operator is experiencing and reporting what you are relaying to us.

I wish I could be more help, but at this point we're up against my limit of understanding and explanation without going back to university, and probably without a lot of maths and formulas (that I don't like to use to confuse most operators, technicians and managers with).
 
CSA,

I do not know enough about the setup and physical configuration to answer most of your questions.

It seems, however, from your post that what the operator is reporting is not what you would expect. Namely, if the LTC is used to increase BUS voltage e.g. 200 volts, the AVR should not be able to decrease the BUS back down the 200 volts. Maybe the operator, and I have seen this before, is misinterpreting what he is seeing, looking at the wrong gauge, or is simply wrong. I have not observed what he claims first hand. Or, the mill configuration really allows what he is observing, and we just do not know what that configuration is. Here is, however, some factual information:

They have 4 turbines on 3 buses. Each bus is connected through a load tap changer to a 161 kV bus, which is connected to a US grid. Of course, there are breakers everywhere.

Bus #1 has only one turbine, #1, connected to it and then some mill loads. This Turbine #1 has a computerized control system (DCS, PLC). On this turbine, I see that the excitation can be run in AVR, VAR, Field Current, and Power Factor modes. When synchronized and online, they run it in AVR mode only - all the time. In this mode, I can see a Setpoint of 13800 Volts, and a PV of 13800 Volts - all on a DCS screen. I assume that in this mode, the AVR will change field current/voltage in order to keep Generator Terminal Voltage at Setpoint. Operator claims: when the LTC is used to increase BUS #1 voltage to 13.9 kV, the control system will show 13900 as the PV, and the AVR will decrease the excitation to bring the voltage down to the SP, which is 13800 Volts, within seconds.

Bus #2 has only one turbine, #2, connected to it and then some mill loads. Everything is operated with local panels, no computerized system. For the AVR they have the bat-handles. The AVR can be either OFF, Manual or AUTO. A Manual Voltage Adjust bat-handle has INCREASE and DECREASE positions. Another Automatic Voltage Adjust bat-handle has INCREASE and DECREASE positions. When synchronized and online, they run the AVR in AUTO mode. In this mode, they use the AUTO Voltage Adjust bat-handle to adjust the setpoint of the AVR control. They DO NOT SEE what the setpoint is. I assume: when the (invisible) SP is increased with the bat-handle, the AVR will increase the field current/voltage to change the generator terminal voltage to keep it at the new Setpoint. Operator claims: when the AVR is in AUTO and left alone and the LTC is used to increase BUS voltage to 13.9 kV, the gauge will show such a voltage, but then the AVR will bring the voltage back to the setpoint, which may be 13.8 kV, within seconds.

The remaining two turbines and the third BUS are very similar to bus #2 above, so let's skip that one.

You write:

> I'm becoming more curious about what's driving the AVR to return the "voltage" back to 13.8 KV. Are you sure there isn't some external controller/scheme that drives the "AVR" to adjust excitation to maintain 13.8 KV (from your example of load tap change from 13.8-14.0 KV)? If the operator isn't causing the change, what is causing the "AVR" to change?

But isn't it the AVR's responsibility to do exactly that - bring it back to Setpoint? When the LTC is used, the Setpoint of 13800 Volts in the AVR did not change. Isn't the AVR changing field voltage/current to maintain generator terminal voltage at Setpoint? As such, if the AVR has a setpoint of 13.8 kV, then changing the actual voltage with the LTC to 13.9 kV will create an error of 100 Volts, which the AVR will respond to by decreasing field current until the error is eliminated - and that is only when the Generator voltage is back to 13.8 kV?

Another question you had:
Criteria to use the LTC: I was told that operators were told to use AVR control to adjust the VARs, not the LTC. So if they need to change VARs, they are supposed to use the AVR and leave the LTC alone. Actually, the LTC is rarely used, but they have the option to do so if they need to. They told me there are two schools of thought, one is to strictly use the AVR to adjust VARs and the other to use only the LTC to adjust VARs. They are following the first school of thought.

Thanks for your time.
 
Rudy,

You are correct: I would not expect to see what you are reporting.

However, that doesn't mean it's not happening. I haven't seen every type of mill and control system.

The thing that's missing from your reporting--for me to comprehend things--is what are the VArs doing when transformer taps are changed and the AVR is then changing the generator terminal voltage? At the high-side of the transformer AND at the generator terminals?

> But isn't the AVR's responsibility to do exactly that - bring it back to Setpoint?

I would agree--this is the AVR's job to make the generator terminal voltage equal to setpoint. And if that happens, then the reactive current is going to change somewhere, either at the generator terminals or at the high-side of the transformer (grid). And you don't know what's happening with the VArs.

For me, operators don't generally watch the generator terminal voltage setpoint--they watch either the VAr meter or the power factor meter. Particularly if they are synchronized to a grid. If they are islanded (separated from the grid), then voltage is very important. And, the generator terminal voltage setpoint value you see on your DCS screen is very unusual for me. The only time generator terminal voltage is really important is during synchronization--after that, operators generally watch the VAr meter or the power factor meter, and generator terminal voltage is what it is.

Admittedly, most of my work has been done at independent power producers, cogenerators (at mills of one sort or another) and peaking power plants. And, my experience has been that after synchronization (it's only important to "match" generator terminal voltage to grid voltage before synchronization) generator terminal voltage is pretty much a forgotten quantity/value--it's all about VArs or power factor.

I have learned from our exchange, and it's caused me to think and re-think some things. Hopefully I have added to your understanding more than made you question what you thought you understood or what I have told you. There are formulas and maths for all of this stuff, but everyone I've ever used them with to explain anything has just rolled their eyes. Especially operators!
 
I have been following along with this discussion and I see a few things that need to be stressed more. Dave stated that the simulation was being developed for operator training.

From my experience, the operation of the voltage regulator, be it automatic or manual, can have a very big impact on the efficiency and life expectancy of the equipment the generator is powering as well as the generator itself. It has been stated that the voltage doesn't change because it is locked to the grid just the same as frequency is locked to the grid.

While a 5% variance in the voltage might not sound like a lot, the difference in the temperature of the transformer windings at 13.6 KV versus 13.9 KV can be quite large.

The reason I bring this up is that if someone is developing operator training material, it needs to be stressed that the proper use of voltage control involves monitoring the transformer and generator winding temperatures fairly closely, in addition to line voltage and the VAR meter.

As the generator voltage is raised the generator winding temperature is going to rise and the transformer winding temperature is going to fall. A balance must be maintained and generally speaking the changes are going to be rather gradual and therefore can be adjusted for early enough to prevent damaging either the generator or transformer.

If I were developing a computer simulation to train operators in the proper use of the voltage regulator I would make certain that the simulation showed the effect of high or low voltage on the generator and transformer winding temperatures.
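One minimal way to show that in a simulation is a first-order thermal lag driven by I-squared losses. The constants below are placeholders, not data for any real machine:

def step_winding_temp(temp_c, current_pu, dt=1.0, ambient_c=40.0,
                      gain_c=45.0, tau_s=1800.0):
    """One Euler step: temperature relaxes toward ambient + gain * I^2."""
    target = ambient_c + gain_c * current_pu ** 2
    return temp_c + (target - temp_c) * dt / tau_s

# Pushing VArs raises stator current, and the winding slowly heats up:
temp = 85.0
for _ in range(3600):            # one simulated hour at 1.1 pu current
    temp = step_winding_temp(temp, 1.1)
print(round(temp, 1))            # creeps toward ~94 C in this toy example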

Perhaps Dave has taken this into account already in the computer simulation.

One of the shortcomings I have seen in a lot of operator training material is that small things, like more closely monitoring generator windings when "pushing VARs", aren't stressed enough.

Thanks
Mark Allen
 
I would tend to agree with Mark Allen, but, again the specifics of the site and configuration and its operation are very important in monitoring the effects of VArs and voltage changes--particularly on generators.

I've been struggling to try to come up with some kind of "table" or other method to try to work out what might be happening when operators are changing transformer taps and the AVR is changing generator terminal voltage. And, I haven't found a way to try to approximate what might be happening.

The most recent site I visited with a load tap-changer on the step-up transformer (a nearly 30 year-old peaking power plant) didn't use the AVR at all for adjusting voltage, neither during synchronization, nor during loaded operation. The AVR was in AUTO mode all the time, and if it was necessary to adjust voltage or VArs/power factor the load tap-changer on the transformer was used exclusively. (Of course, when asked why, the response was, "Because we've always done it that way!" Another learning opportunity missed.)

And, at all the other plants where I've worked, the load tap-changer was used ONLY when the VAr flow *across the utility tie breaker* couldn't be controlled with the AVR (meaning the grid voltage at the utility tie breaker was much higher or much lower than the AVR could respond to).

This gets very complicated very fast when a tap changer is involved--at least for me. Let's try this; say the tap was in a position such that the VArs at the utility tie breaker were zero (0) AND the VArs at the generator output were zero (0). This would mean that the generator terminal voltage was equal to the low-side transformer voltage and the high-side transformer voltage was equal to grid voltage. Further, let's say the load on the generator was stable (watts; KW; MW were not changing).

Now, as the day wears on the grid voltage increases--a lot. To my mind this is going to cause the VArs across the utility tie breaker to increase, in the Leading direction. The increase in voltage on the high side is going to cause the voltage on the low side to increase, which would also cause the VAr meter on the generator output to increase in the Leading direction. But the AVR (in Rudy's example) would lower its output to decrease the generator terminal voltage, which would bring the generator VArs back to nearly zero.

Now here's where I can't work out what's happening (Bruce Durdle: HELP! I'm happy to be wrong if I can learn from the experience!!!): What happens to the voltage and VArs on the high side of the transformer, at the utility tie breaker, when the AVR changes the voltage on the low side of the transformer? Also, assuming the mill (in this example) is being powered by the generator or the low side of the transformer (when mill load exceeds generator output), what happens to mill bus voltage? How does the mill reactive load affect all of this?

I'm presuming that most, if not all, mill load is driven by the 13.8 KV bus that the generators are connected to and the low side of the transformer is also connected to.

Perhaps I'm over-thinking the situation; I presume the "infinite" grid affects voltage/VArs under these conditions, but would expect the mill reactive load would have to figure in to the situation, also. (It's said, "VArs don't travel far.") And, does the flow of real power (watts; KW; MW) affect what happens at the transformer?

And because we don't know what's happening to the VArs at the generator terminals and at the utility tie breaker I can't try to formulate an explanation for what the operator at Rudy's mill is reporting. To my way of thinking, VArs must change when there is a voltage "differential" between the generator terminals and the "grid" voltage, because in my experience changing excitation doesn't have much effect on terminal voltage in MOST plants--it only affects the VArs. And operators don't see generator terminal voltage setpoint and don't generally monitor generator terminal voltage when adjusting the AVR when it's in AUTO mode; they're just looking at the VAr or Power Factor meter as they are increasing or decreasing the unseen generator terminal voltage.
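The one piece I can put a rough number on is that "differential": the reactive flow between two nearly-in-phase buses through a reactance. This is a toy per-unit calculation (with an assumed 5% transformer reactance and the mill load ignored), not a power-flow study:

def q_through(v1, v2, x):
    """Reactive flow from bus 1 toward bus 2 (pu); angles ~0 with little real power moving."""
    return v1 * (v1 - v2) / x

Xt = 0.05                   # assumed transformer reactance, 5%
v_grid_ls = 13.9 / 13.8     # grid voltage seen on the low side after the tap raise
v_mill = 1.0                # AVR holds the mill bus/generator at its 13.8 KV setpoint
print(f"{q_through(v_mill, v_grid_ls, Xt):+.2f} pu")  # negative: leading VArs at the tie

If the operator's report is accurate, holding the bus down against the new tap position should show up as a sizeable leading VAr swing somewhere--which is exactly the measurement we're missing.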

When the AVR is placed in VAr or Power Factor Control, it's still technically in AUTO, it's just that the generator terminal voltage is changed as necessary to make the actual VAr- or Power Factor equal to the VAr- or Power Factor setpoint.

As for generator heating during high reactive loads, I would refer to the generator reactive capability curve provided by the generator manufacturer. It generally defines the limits of generator operation (watts; VArs) based on generator cooling medium (air; hydrogen) temperature, sometimes based on stator winding temperature.

Same for the transformer; I would refer to the transformer manufacturer's data sheet/specifications to protect the transformer against operation outside its design limits.
 
My question is: if the AVR is shifted from Auto mode to Manual mode due to some transient with the generator breaker closed, how does one shift the AVR back to Auto mode? Should the null-balance indicator be at the zero position before shifting the AVR back to Auto mode?
 
We have an on-load tap changer on the transformer. When generator MVArs are high (on the lagging side) due to grid demand, we increase our tap position (e.g., from 13 to 14) to reduce generator MVAr and keep excitation and stator parameters within limits. However, the system voltage is further reduced due to the increase in tap position. Can you please explain what happens in reality?
 