I have a farming facility with 8 diesel gen sets in parallel. 2 pcs of 2000kVA, 1 pc of 1250kVA, 2 pcs of 820kVA and 3 pcs of 500kVA all synchronised and powering an irrigation distribution and a factory plus domestic houses.
We have experienced frequent power trips when we run the system at 0.95 pf, but when we reduce it to 0.85 pf it is stable.
I am told that this is due to reverse power when a large load is suddenly connected to the system. The load is all induction motors and lighting. A pump station can have 6x 160kW motors which could be started in quick succession despite instructions to the contrary.
Is there something available to control this reverse power, because we could save a lot of diesel by getting the pf up higher?
How exactly do sudden load changes affect pf, and why do they cause the whole system to trip when the pf is set higher?
The power factor of a system (the sum of all the motors and lights and computers and computer monitors and televisions and tea kettles) is what it is. It's not possible to "control" the power factor by changing "a" generator setting.
I think what you'll see is that when you try to "increase" the power factor, what is actually happening is that the system voltage is changing. When the system voltage is at rated, the power factor of the system will be the sum of the loads on the system: the amount of VArs being provided will equal the amount of VArs required, and the voltage will be at rated.
You say that trips are occurring--but you don't say what relays are causing the trips. And, if you have multiple gensets do you have some kind of load or power management system that keeps the system frequency constant as load changes?
When it's really hot and a lot of people are running their air conditioners there are a lot of inductive loads on the system (the air-handler fans and the refrigerant compressor motors)--and that's just the residential air conditioners. That causes the power factor of the system to decrease and the system voltage to decrease with it.
For a smaller system like yours, as inductive loads increase (as motors are started and run) the power factor of the system will decrease, because those motors are "consuming" VArs (induction motors won't run without VArs--it's what makes them turn). And some source has to supply those VArs or the lights dim (brown-out) because the voltage dips, and the power factor decreases. To produce VArs it is necessary to use some of the diesel fuel to make more excitation on the generator, and that's what makes the fuel consumption go up.
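To put a rough number on that VAr demand, here's a minimal sketch. The 0.85 running power factor for one of your 160 kW pump motors is an assumed, typical figure--not taken from your nameplates:

```python
import math

def reactive_power_kvar(p_kw: float, pf: float) -> float:
    """Reactive power a load draws, given its real power and power factor."""
    phi = math.acos(pf)          # phase angle between voltage and current
    return p_kw * math.tan(phi)  # Q = P * tan(phi)

# Assumed figures: one 160 kW pump motor running at 0.85 pf.
q = reactive_power_kvar(160.0, 0.85)
print(f"Each motor draws about {q:.0f} kVAr")          # ~99 kVAr
print(f"Six motors together: about {6 * q:.0f} kVAr")  # ~595 kVAr
```

That ~600 kVAr for one pump station has to come from somewhere--in your case, generator excitation, paid for in diesel.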
Utilities actually charge large users of VArs by installing VAr-hour meters in addition to watt-hour meters (watt-hour meters are all most homes and small businesses have). Because the utility has to supply the VArs those large users require to keep the system stable, it makes them pay not only for watts (kW; MW) but also for VArs. Some very large users actually have ways of supplying their own VArs to reduce their VAr consumption from the utility.
A small grid is really no different. It has its own power factor (overall sum of all the loads on the system) and in order for the voltage of the system to remain constant as the reactive loads (induction motors and such) increase it's going to require more excitation which is going to require more diesel fuel. Reactive power is often called "imaginary" power--but it's really not. Or, rather, the effects of reactive power are not imaginary; they shift the voltage- and current sine waves of the system out of phase with each other, and that makes the system less efficient. By "producing" or supplying reactive power to the system one can actually shift the sine waves back towards being in phase with each other which helps the system--but it requires real energy: in your case, diesel fuel.
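To put numbers on "shifting the sine waves back": the sketch below estimates how many kVAr a local source (a capacitor bank, say) would have to supply to correct a load from 0.85 to 0.95 power factor. The 2000 kW of real load and both power factors are illustrative assumptions, not your measured figures:

```python
import math

def compensation_kvar(p_kw: float, pf_now: float, pf_target: float) -> float:
    """kVAr a local source must supply to move a load of p_kw real power
    from pf_now up to pf_target (standard tan-phi correction formula)."""
    tan_now = math.tan(math.acos(pf_now))
    tan_target = math.tan(math.acos(pf_target))
    return p_kw * (tan_now - tan_target)

# Assumed example: 2000 kW of real load, corrected from 0.85 to 0.95 pf.
q_c = compensation_kvar(2000.0, 0.85, 0.95)
print(f"About {q_c:.0f} kVAr of local compensation needed")  # ~582 kVAr
```

Supplying that locally is exactly the trick the very large utility customers use: the generators then no longer have to make those VArs with excitation.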
I don't know whether you're measuring the power factor at each individual generator or with an overall system power factor meter. And I don't know why the units are tripping--but I have a suspicion.
When induction motors start they draw a LOT of current--called inrush current--to get up to rated speed quickly. That inrush causes the system frequency to dip briefly, and something should make the diesels work harder for a short time to keep the generators spinning at rated speed (frequency). Once the induction motors reach rated speed and the inrush current subsides, the diesels back off a little--but they will still be working harder than before the motors were started.
So, that's part of what's going on. I'm sure if you're standing near the diesels when these induction motors are starting you will hear the diesels work really hard for a brief time and then seemingly back off a little bit but still be working harder than before the motors were started. That's the effect of the in-rush current.
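For a feel for the size of that inrush, here's a rough direct-on-line starting estimate for one of your 160 kW motors. The 400 V supply, 0.85 running power factor, 95% efficiency, and 6x locked-rotor multiplier are all assumptions--check them against your nameplates:

```python
import math

# All figures are illustrative assumptions for one 160 kW pump motor.
V_LL, P_KW, PF, EFF, LR_MULT = 400.0, 160.0, 0.85, 0.95, 6.0

i_full = P_KW * 1000 / (math.sqrt(3) * V_LL * PF * EFF)  # full-load amps
i_start = LR_MULT * i_full                                # inrush amps
s_start = math.sqrt(3) * V_LL * i_start / 1000            # starting kVA

print(f"Full-load current ~{i_full:.0f} A, inrush ~{i_start:.0f} A")
print(f"Starting demand ~{s_start:.0f} kVA per motor")
```

On those assumptions each start briefly demands on the order of 1200 kVA--a big slug against your roughly 8400 kVA of installed capacity, and much worse if several motors start in quick succession.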
Now, when large loads are removed from the system the net effect is that the system frequency will tend to increase--and the diesels have to work less hard. If the load removed exceeds the load a particular generator was carrying, it's conceivable that generator will trip on reverse power, to protect the diesel engine. (Reverse power means the generator actually becomes a motor, drawing power from the other generators to keep the diesel spinning at rated speed--and it is NOT good for the diesel to be "driven" by the generator when it should only be "driving" the generator! So, the reverse power relay opens the generator breaker to protect the diesel from being damaged.)
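A typical reverse-power element is just a definite-time check on measured power. This sketch shows the idea--the 5% pickup and five-scan delay are assumed, typical-looking settings, not your relays' actual settings (those come from the engine maker's limits):

```python
def reverse_power_trip(p_kw_samples, rated_kw, pct=5.0, delay_samples=5):
    """Trip when measured power stays below -pct% of rating for the
    full delay; any sample back above the pickup resets the timer."""
    threshold = -rated_kw * pct / 100.0
    run = 0
    for p in p_kw_samples:       # one sample per relay scan
        run = run + 1 if p < threshold else 0
        if run >= delay_samples:
            return True          # open the generator breaker
    return False

# A 1600 kW set motoring at -120 kW (below the -80 kW pickup) long enough:
print(reverse_power_trip([-120] * 6, rated_kw=1600))   # True
print(reverse_power_trip([-120, 50, -120, 50], 1600))  # False (timer resets)
```

The time delay is why a brief swing during load sharing doesn't trip the set, but a sustained motoring condition does.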
There's a lot going on, and you've done a pretty good job of describing most of it. Yes; you can save diesel if you don't provide a lot of VArs (which means the generators operate at a lower power factor). But, that means the system voltage is "suffering" and the efficiency of the system will be less than optimal.
Power factor is really a measure of efficiency. When it's 1.0, then the system is 100% efficient--meaning all of the energy is being used to do "real" work (torque; heat). When it's less than 1.0, say, 0.95, then only 95% of the energy is being used for real work (torque; heat)--and the other 5% is being used for "reactive" power requirements (the "induction" of induction motors, for example). As the power factor decreases, the efficiency decreases--less of the diesel fuel is being used for torque/heat and more is being used for reactive power ("imaginary" power--which isn't really imaginary, because it consumes real power!).
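Here's a quick way to see what the generators carry at different power factors. The 3000 kW of real load is an assumed figure for illustration:

```python
def apparent_kva(p_kw: float, pf: float) -> float:
    """Apparent power the generators must carry for a given real load."""
    return p_kw / pf

# Assumed example: 3000 kW of real (working) load on the plant.
for pf in (1.0, 0.95, 0.85):
    s = apparent_kva(3000.0, pf)
    print(f"pf {pf:.2f}: {s:.0f} kVA carried, {3000.0 / s:.0%} of it real")
```

At 0.85 pf the generators (and cables, and breakers) carry roughly 370 kVA more current-carrying burden than at 0.95 for the same real work--that's the "suffering" mentioned above.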
Induction motor nameplates have a power factor rating on them--the ratio of real power to the total (apparent) power the motor draws at rated load. All induction motors have a power factor less than 1.0--usually more like 0.85! The "better" motors have a higher power factor--but they also cost more!!!
Finally, starting motors in quick succession keeps the inrush currents high for a long period of time, which compounds the voltage and frequency dips.
Reverse power is when the diesel is "too" lightly loaded--it isn't producing any power (watts; kW). And that's because the system load is such that there are too many generators running for the amount of motors and lights and computers and computer monitors and televisions and tea kettles that are on. To protect the diesels the reverse power relays operate. That's not always a bad thing. But, it's kind of the opposite of what you are describing when motors are starting.
There "must" be some kind of load management system for all of the engines--either that or those gensets have some VERY good governors!!!
Hope this helps!