Hello, hopefully this question is suitable for this forum.
I'm an engineer at a consulting company. One of the things we do is create customer specific simulations for operator training purposes. We have modeled many processes in the past, but we are now doing several turbine generator simulations. The mechanical side is working well, but I have some questions about how to model the electrical side.
I will preface the questions by saying I’ve done a lot of research on the internet, as well as on this forum, and have gained a much better understanding of the subject than before I started, but I still have some fundamental questions.
My main questions revolve around reactive power, how it’s “generated” and how it’s dealt with by turbine generator operators. I’ll tell you what I think (mostly based on info from this site) and hopefully someone can set me straight.
VArs are a function of the type of load a generator “sees” and need to be dealt with by changing the excitation on the rotor.
Controlling VArs at the generator (done by varying excitation) is necessary, and VAr creation must match VAr load or the system voltage will change, just like power (watts) creation must match power load, or the frequency will change.
Increasing excitation above what is required to maintain generator voltage equal to grid voltage will cause lagging VArs to flow out of the generator.
Decreasing excitation below what is required to maintain generator voltage equal to grid voltage will cause leading VArs to flow into the generator.
Leading VArs and Lagging VArs of the same magnitude will “cancel” each other out.
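For what it’s worth, here is the simplified model I’m planning to start from in the simulator: the classical round-rotor generator on an infinite bus, where internal EMF E is proportional to excitation. It’s only a sketch (the per-unit values below are made up for illustration), but it reproduces the behavior described above: excitation above the “matched” value pushes lagging VArs out, excitation below it makes the generator absorb VArs.

```python
import math

def gen_pq(E, V, delta, X):
    """Classical round-rotor generator on an infinite bus (all per unit).

    E     internal EMF, proportional to rotor excitation
    V     grid / terminal voltage
    delta rotor angle in radians, set by mechanical power input
    X     synchronous reactance

    Returns (P, Q) delivered to the grid.
    """
    P = E * V * math.sin(delta) / X
    Q = (E * V * math.cos(delta) - V**2) / X
    return P, Q

# Illustrative per-unit values (assumptions, not from any real unit)
V, X, delta = 1.0, 1.2, 0.3

# Excitation that exactly balances grid voltage at this angle -> Q = 0
E_match = V / math.cos(delta)
_, q_match = gen_pq(E_match, V, delta, X)

# Over-excite: Q > 0, lagging VArs flow out of the generator
_, q_over = gen_pq(1.1 * E_match, V, delta, X)

# Under-excite: Q < 0, the generator absorbs VArs (leading)
_, q_under = gen_pq(0.9 * E_match, V, delta, X)
```

Note that delta (driven by the governor/mechanical side) mostly sets watts, while E (driven by the exciter) mostly sets VArs, which matches the separation of controls the operators see.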
Now for some basic questions:
There is a number on the DCS screen for MVAR. How does that number come to be? Is it a directly measured quantity? Is it back calculated using the power triangle from other quantities that can be directly measured? (Remember, we need to simulate this).
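In case it helps frame the question: for the simulator I was planning to back-calculate the MVAR display from quantities the model already produces, using the power triangle. A sketch (the example numbers are made up) looks like this:

```python
import math

def mvar_from_measurements(v_ll_kv, i_amps, p_mw, lagging=True):
    """Back-calculate reactive power from quantities a DCS can read
    directly: line-to-line voltage, line current, and true power.

    S = sqrt(3) * V_LL * I        three-phase apparent power
    Q = sqrt(S^2 - P^2)           power triangle, signed by lead/lag
    """
    s_mva = math.sqrt(3) * v_ll_kv * i_amps / 1000.0  # kV * A -> MVA
    q_mvar = math.sqrt(max(s_mva**2 - p_mw**2, 0.0))
    return q_mvar if lagging else -q_mvar

# Hypothetical readings: 13.8 kV bus, 1000 A, 22 MW, lagging
q = mvar_from_measurements(13.8, 1000.0, 22.0, lagging=True)
```

Is that roughly how the real instrumentation arrives at the number, or do modern transducers compute Q directly from the sampled waveforms?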
Is the only way a particular plant and operations team “see” and react to VArs by measuring and reacting to small differences in the grid voltage vs. the voltage the generator is currently producing?
Are large capacitor banks there to cancel out some of the lagging load seen by a particular generator, or group of generators at a specific plant? Say you had a capacitor bank with an effective rating of 3 MVAr, and the breaker to that bank was open. Also say that you had a 7.5 MVAR reading on your DCS, and the power factor was .95 in the lagging direction. If you were to suddenly close the breaker to the capacitor bank, would the MVAR reading drop to 4.5, and the power factor would increase toward unity?
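Working my own numbers through the power triangle (again just as a sanity check, not claiming this is how the plant instrumentation does it), the MW load is inferred from the 7.5 MVAr / 0.95 PF reading, the bank subtracts its 3 MVAr, and the new PF falls out:

```python
import math

pf_before = 0.95
q_before = 7.5  # MVAr, lagging, from the DCS

# Infer true power from Q and PF:  tan(phi) = Q / P
tan_phi = math.sqrt(1 - pf_before**2) / pf_before
p_mw = q_before / tan_phi

# Close the breaker on the 3 MVAr capacitor bank
q_after = q_before - 3.0

# New power factor from the power triangle
pf_after = p_mw / math.hypot(p_mw, q_after)
```

Running this gives q_after = 4.5 MVAr and a power factor closer to unity, which is what I expected. Does that match what an operator would actually see?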
I realize that everyone has a different opinion on, and way of understanding, VArs. What I’m trying to accomplish is to model how a plant “sees” and reacts to VArs in a real-world way. These simulations don’t need extremely complex math models, nor do they need to work in every possible scenario, but if we get the fundamentals right, they should behave as expected.
I also have an additional question that arose during my research. It seems there is a governing body that has said it is NOT OK for power plants to run in automatic VAr Control mode. I have no opinion one way or the other, I’m just curious about the reasoning behind why plants should not be allowed to do this.
Thanks,
Dave