can anyone tell me why we should try to increase the power factor in an alternating-current circuit as much as possible, to reduce total power consumption?

It may appear that decreasing the power factor, cos(phi), would decrease power consumption, because

P_avg=V I cos(phi).

Then, for a given rms voltage, V, and rms current, I, the closer phi is to ±90°, the less power is consumed. However, in that case the work done per unit time by the power source would be equally small.
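To make that concrete, here is a short sketch (the voltage, current, and phase angles are made-up illustration values) showing how the average power delivered falls off as phi approaches 90° while V and I stay fixed:

```python
import math

V = 120.0  # rms volts (assumed for illustration)
I = 10.0   # rms amps (assumed for illustration)

# P_avg = V * I * cos(phi): at phi near 90°, almost no power is
# delivered, even though the source still carries the full current.
for phi_deg in (0, 60, 89):
    p_avg = V * I * math.cos(math.radians(phi_deg))
    print(f"phi = {phi_deg:2d}°  P_avg = {p_avg:7.1f} W")
```

At phi = 0° this gives the full 1200 W; at phi = 60° only half of that; at phi = 89° barely 21 W, so the load gets almost no useful work per unit time.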

The power company keeps the voltage at some fairly constant level. To get work done at a particular rate, it takes that particular amount of power. If cos(phi) is close to zero, then the rms current, I, must be proportionately larger to make up for the power factor. However, the power lost to resistive elements in the circuit, including the supply lines, transformers, etc., is governed by I^2 R losses. Moreover, the I^2 R losses get no benefit from the power factor, because the power factor for a purely resistive element is one (voltage and current are in phase), so the full I^2 R is dissipated. Those losses will therefore be MORE than proportionately higher, due to the squaring of I.
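The I^2 R argument above can be checked numerically. This sketch uses hypothetical values (load power, supply voltage, and line resistance are assumptions, not from the original) and compares the line loss at unity power factor with the loss at cos(phi) = 0.5:

```python
P = 10_000.0  # W, power the load must actually receive (assumed)
V = 240.0     # V rms, held constant by the supplier (assumed)
R = 0.5       # ohms of line/transformer resistance (assumed)

def line_loss(power_factor):
    # rms current needed to deliver P at this power factor
    i = P / (V * power_factor)
    # the resistive loss is the full I^2 R, with no cos(phi) benefit
    return i ** 2 * R

loss_unity = line_loss(1.0)  # cos(phi) = 1
loss_half = line_loss(0.5)   # cos(phi) = 0.5: current doubles...

print(f"loss at pf=1.0: {loss_unity:7.1f} W")
print(f"loss at pf=0.5: {loss_half:7.1f} W")  # ...so the loss quadruples
```

Halving the power factor doubles the current but quadruples the resistive loss, which is exactly the "more than proportionately higher" point.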