I've been doing quite a lot of welding lately, but have not noticed any appreciable increase in my electric bill over what we normally pay. I wonder if it's because of the way the "smart" utility meter on my house measures energy (I don't know what's smart about it; in any case it's a meter just like all my neighbors have).
I use a Lincoln AC stick welder, which is basically a large iron-core transformer. On the grid side there is a primary winding connected across the 240V split-phase grid supply. On the weld electrode side there are secondary (high current ranges) and reactor (low current ranges) windings connected in series. I don't know how close to saturation this iron core operates, but I don't see any DC control winding that would bias the core and reduce the inductance.
So, to the grid and to the meter, this welder should look like a large inductive load (voltage leading current). The primary winding has some resistance of course, so the load is not 100% inductive. But the PF may be close to zero when the welder operates, so the average power it draws from the grid should be small (P = V * I * PF).
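To put rough numbers on that reasoning, here is a quick sketch of apparent vs. real power. The voltage, current, and power factor are made-up values for illustration only, not measurements of my welder:

```python
# Rough illustration of apparent vs. real power for a low-PF load.
# All numbers below are assumptions, not measurements.
import math

V = 240.0    # grid voltage (split-phase), volts RMS
I = 40.0     # assumed primary current, amps RMS
PF = 0.10    # assumed power factor (mostly inductive)

S = V * I                        # apparent power, VA
P = S * PF                       # real power, W -- what a meter normally bills
Q = S * math.sin(math.acos(PF))  # reactive power, var

print(f"Apparent power: {S:.0f} VA")   # 9600 VA
print(f"Real power:     {P:.0f} W")    # 960 W -> ~0.96 kWh billed per hour
print(f"Reactive power: {Q:.0f} var")
```

With those assumed numbers, an hour of operation would only register about 1 kWh of billable energy even though nearly 10 kVA is circulating.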
The question is how the utility meter measures all this. I recall someone in another thread here saying that the meters do not bill for reactive power (energy cycling back and forth between the grid and a reactive load), and that no penalty for low PF is assessed in residential service.
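My understanding of the principle (just a sketch, not the firmware of any particular meter) is that a watt-hour meter effectively averages the product of instantaneous voltage and current, so the reactive component cancels out over each cycle and only the real power is accumulated:

```python
# Sketch of real-power metering: integrate v(t)*i(t) over one cycle.
# The phase lag and current amplitude are assumptions for illustration.
import math

f = 60.0                        # line frequency, Hz
V_peak = 240.0 * math.sqrt(2)   # 240 V RMS
I_peak = 40.0 * math.sqrt(2)    # assumed 40 A RMS
phase = math.radians(84)        # assumed current lag; cos(84 deg) ~ 0.10 PF

N = 10000
dt = (1.0 / f) / N              # step size over one full cycle
energy = 0.0
for n in range(N):
    t = n * dt
    v = V_peak * math.sin(2 * math.pi * f * t)
    i = I_peak * math.sin(2 * math.pi * f * t - phase)
    energy += v * i * dt        # instantaneous power integrated over time

avg_power = energy * f          # average real power over the cycle
print(f"Average (billed) power: {avg_power:.0f} W")  # ~ V*I*cos(phase)
```

At a 90-degree lag (a purely inductive load) the same calculation averages to essentially zero, which is why reactive power by itself would not show up on the bill.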
Is it possible that the utility meter does not bill for the reactive part of the mostly-reactive power drawn by a transformer-based welder? That would explain my low electric bill.
Has anyone observed this before while using their welder?