Controlling load to a generator


Thread Starter

Mark Licari

Hello,

I'd like a description (in layman's terms if possible) of how load gets shifted from the utility to a generator and vice versa, when the utility and generator are paralleled. Assuming the utility doesn't change, what changes in the generator take place that "attract" the load from the utility and "distract" the load from the generator?

- thx
Mark
 
Not just VArs; the total load, both watts and VArs. Here's the case:

A gen syncs with the utility and the load slowly goes over to the gen (approximately 80 kVA/minute). When 90% of the load has transferred, the utility breaker opens and the gen now has all of the load (a closed-transition transfer). To go back to the utility, the reverse happens. What changes take place in the gen that cause the load to shift?
 
Mark Licari,

I only want to address the real load portion of this discussion in this reply--the watts, not the VArs.

In an electrical system the real power (watts; kW; MW) being produced/transmitted is a function of the voltage multiplied by the amperes multiplied by the square root of 3 (~1.732) multiplied by the power factor. (The square root of three is for a three-phase system, which most electrical grids are.) If we assume there is no reactive power (no VArs) then the power factor is 1.0 (unity) and all of the power being produced/transmitted is real power, watts. Since most generators operate at a uniform, relatively constant voltage, the only variable in the equation is amperes. If you want to produce or transmit more power, you need to increase the amperes.
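If it helps to see that arithmetic, here's a minimal sketch in Python (the 480 V and the ampere figures are just illustrative numbers, not from your system):

import math

def three_phase_power_kw(volts_line_to_line, amps, power_factor=1.0):
    # P = sqrt(3) * V * I * PF, converted from watts to kW
    return math.sqrt(3) * volts_line_to_line * amps * power_factor / 1000.0

# At a constant 480 V and unity power factor, more power means more amperes:
print(three_phase_power_kw(480, 100))  # ~83.1 kW
print(three_phase_power_kw(480, 200))  # ~166.3 kW: double the amps, double the watts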

Try to think of the entire electrical system at once. There are generators and there are loads--loads are electric motors, lights, computers, computer monitors, etc.

The generators are driven by prime movers (turbines, reciprocating engines, etc.). The prime mover is providing torque to the generator, and the generator converts the torque into amperes at some voltage. Those amperes are transmitted via wires to the load(s), where the amperes are converted back into torque (in the motors) and other useful work (light; computers)--though sometimes the usefulness of computers is questionable.... (Some days I feel like I work for my computer instead of the other way around.)

So, the prime movers are really doing the work of the motors and lights and computers--electricity is just the way of producing a lot of torque in one location and converting it to amperes that can be easily transmitted and distributed to many smaller loads in remote locations and converted back into useful work. And, again, it's being done at a relatively constant, stable voltage (if the generator excitation system is working correctly!). If we didn't have electrical transmission and distribution systems we would have millions of small prime movers driving water pumps and fans and small electric generators to power lights and computers and computer monitors.

A generator is just a device for converting torque into amperes, and a motor is just a device for converting amperes into torque. (Really, there is no difference between the two, other than the type--synchronous or induction.) So, if one wants to increase the power "output" of a generator (really, the work being done by the generator) one has to increase the torque being applied to the generator. If the prime mover driving the generator is a steam turbine, the steam turbine control valves are opened and more steam is admitted to the turbine. That would TEND to increase the speed of the turbine, but when it's connected to a properly regulated electrical grid ("utility") the speed will NOT increase (by any appreciable amount). The extra torque that would TEND to increase the speed is converted into amperes instead, which means some of the motors and lights and computers and computer monitors connected to the utility are now being powered by this prime mover (via the generator).
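The way this torque adjustment is usually managed when paralleled to a grid is droop speed control in the governor. Here's a rough numeric sketch of the idea--4% droop is a common figure, but that's an assumption here, and real governors have more to them:

def droop_load_percent(speed_reference_pct, grid_speed_pct, droop_pct=4.0):
    # On a large grid the actual speed is pinned at ~100% (grid frequency),
    # so raising the governor's speed reference raises torque (load), not speed.
    return (speed_reference_pct - grid_speed_pct) / droop_pct * 100.0

print(droop_load_percent(100.0, 100.0))  # 0% load
print(droop_load_percent(102.0, 100.0))  # 50% load
print(droop_load_percent(104.0, 100.0))  # 100% (full rated) load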

As for your scenario, it's a bit unusual, but it seems to describe what's known in the industry as "islanding"--carrying an isolated load that is slightly less than the rating of the generator's prime mover.

In your scenario, after the generator was separated from the utility it would be necessary to resynchronize the generator to the grid, then reduce the torque being produced by the prime mover driving the generator to reduce the load being carried by the generator and its prime mover.
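Your 80 kVA/minute transfer is just that torque change being made slowly and automatically. A sketch of the idea, assuming a hypothetical set_load_kw() hook into the governor or load-share controller (real controllers do this internally):

import time

def ramp_generator_load(set_load_kw, start_kw, target_kw,
                        rate_kw_per_min=80.0, step_s=1.0):
    # Nudge the load setpoint toward the target at a fixed rate; while both
    # breakers are closed, the utility picks up (or gives up) the difference.
    step = rate_kw_per_min * step_s / 60.0
    load = start_kw
    while abs(target_kw - load) > step:
        load += step if target_kw > load else -step
        set_load_kw(load)
        time.sleep(step_s)
    set_load_kw(target_kw)

# e.g. loading:   ramp_generator_load(governor.set_load_kw, 0.0, 720.0)
#      unloading: ramp_generator_load(governor.set_load_kw, 720.0, 0.0)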

Generators don't actually do any work; they just convert torque into amperes. Motors convert amperes into torque. The prime mover driving the generator develops and applies torque to the generator, which converts it to amperes, and then at the other end of the wire(s) the amperes are converted back into useful work. Electricity is just the way torque is transmitted and distributed--a very ingenious way--but generators don't do any work. The prime movers driving the generators do the work--of the loads at the other end of the wire(s). The prime movers actually pump the water, or drive the refrigeration compressors, or the forced air unit fans, or power the lights and the computers and computer monitors.

If the prime mover is a reciprocating internal combustion engine, then to increase the torque being produced by the engine it's necessary to increase the fuel being admitted to the engine. Increasing the fuel increases the torque, which increases the amperes.
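In terms of the droop sketch above, "more fuel" is what happens when the governor's reference is raised; the grid holds the speed, so the extra torque shows up as load (illustrative numbers again):

# Using droop_load_percent() from the sketch above: a 1% rise in the
# governor's reference (more fuel) becomes 25% load at 4% droop, not speed.
print(droop_load_percent(101.0, 100.0))  # 25% load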

Again, just to be clear--the generator terminal voltage is usually held relatively constant for most generators. In fact, most generators have a voltage rating of only +/-5%, or even less, which means voltage could only theoretically affect the power by approximately +/-5%--if voltage were used to vary real power, which it isn't.

We can talk about reactive power (the VAr leg of the power triangle, which relates watts, VArs, and the kVA total) if this is clear.
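For when we get there: the power triangle is just a right-triangle (Pythagorean) relationship between watts, VArs, and kVA. A quick Python sketch with illustrative numbers:

import math

def power_triangle(kw, kvar):
    # kVA is the hypotenuse: kVA^2 = kW^2 + kVAr^2, and PF = kW / kVA
    kva = math.hypot(kw, kvar)
    return kva, kw / kva

kva, pf = power_triangle(400.0, 300.0)
print(kva, pf)  # 500.0 kVA at 0.8 power factor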
 