So, just looking at my little watt meter when I start up the microwave with a cup of water in it, I see:
Watts, in this case 1200 or so
Amps, in this case 5.6 or so (230 V power)
Power factor (the meter labels it PFC), in this case from 80% to 93%
If I were to leave the thing plugged in I would get a meaningful kWh reading. But I don't have to; I can work it out from the figures above. At full power it would take 50 minutes (0.833 of an hour) to use one whole kWh.
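That working-out can be sketched in a few lines. This is just the arithmetic from the meter reading above; the variable names are my own.

```python
# Time to consume 1 kWh at a steady power draw.
power_w = 1200                     # microwave draw from the watt meter, in watts

hours_per_kwh = 1000 / power_w     # hours of full-power running per 1 kWh
minutes_per_kwh = hours_per_kwh * 60

print(f"{hours_per_kwh:.3f} h ({minutes_per_kwh:.0f} min) to use 1 kWh")
# -> 0.833 h (50 min) to use 1 kWh
```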
The power factor is a red herring in the example above, as a normal consumer you don't pay for it on grid power. However, you do have to supply it if you make your own power (one of the few freebies left).
So, the first step is to work out your daily usage.
The second step is to break down your usage by appliance. So, in the example above, even though the microwave is a 1200 W device, I only use it twice in the morning for about 15 seconds each, 30 seconds total. 1.2 kW / 120 (30 seconds is 1/120 of an hour) = 0.01 kWh, or 0.01 of a "unit", used in a day for me to zap my coffee twice.
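The same per-appliance calculation, sketched with the microwave figures from above (rating and run time are the post's numbers; the names are mine):

```python
# Daily kWh for an appliance used only briefly.
power_kw = 1.2            # microwave rating in kW
seconds_per_day = 30      # two ~15-second coffee zaps

kwh_per_day = power_kw * seconds_per_day / 3600   # 3600 seconds in an hour
print(f"{kwh_per_day:.2f} kWh/day")
# -> 0.01 kWh/day
```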
AC fridge/freezers have a massive start-up current but taper off. Our fridge here does that: a couple of thousand watts at start-up, then about 240 W while running. We are about to replace it with a 36 W DC fridge!
Once you've done that for all of your appliances (you don't have to calculate each one, just leave the meter zeroed on each appliance for 24 hours), add up the kWh for each appliance. The total should be close to what your power board meter shows you after 24 hours.
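The adding-up step might look like this. Every figure below is a made-up placeholder, not an actual reading; the point is just the cross-check against the board meter.

```python
# Sum per-appliance 24-hour readings and compare to the board meter.
readings_kwh = {          # hypothetical 24 h meter readings per appliance
    "fridge": 2.0,
    "microwave": 0.01,
    "lighting": 0.5,
}
meter_kwh = 2.6           # hypothetical board-meter total for the same 24 h

total = sum(readings_kwh.values())
print(f"appliances: {total:.2f} kWh, meter: {meter_kwh:.2f} kWh, "
      f"unaccounted: {meter_kwh - total:.2f} kWh")
```

If the gap is large, something you didn't meter (or didn't think of) is drawing power.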
Where I live we have our hot water on electricity. It is a new cylinder, small and very well insulated. It is also on a separate meter and heated at a low night-time rate (half the daytime rate). That said, we use 300 units a month: 10 a day, 10 kWh of electricity. On average we are very low electricity users. Two thirds of that usage is the hot water cylinder. Ideally that would be solar hot water and/or a wetback on a fire, but the lime in the water makes solar hot water, or even instant gas-heated water, problematic. That leaves 3.3 kWh of electricity a day for everything else.
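The hot-water split above, as arithmetic (all figures are from the post; the names are mine):

```python
# Break the daily usage into hot water vs everything else.
units_per_month = 300
daily_units = units_per_month / 30          # ~10 kWh/day
hot_water = daily_units * 2 / 3             # cylinder is about 2/3 of usage
everything_else = daily_units - hot_water

print(f"hot water: {hot_water:.1f} kWh/day, "
      f"everything else: {everything_else:.1f} kWh/day")
# -> hot water: 6.7 kWh/day, everything else: 3.3 kWh/day
```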
So, that is usage..
If you are generating your own power, you have to make all of it, and there are many losses along the way: losses in the efficiency of storing the electricity, losses in converting the electricity from a DC battery bank, losses in wiring.
But simply, if you have a 1200 W wind generator that averages 200 W (0.2 kWh per hour) over 24 hours, then you have to run for 5 hours to make 1 kWh of electricity. However, that energy has to go into a battery bank, say a 90% efficient lead-acid (PbA) battery, and then through an inverter with 85% average efficiency, so that 200 W has turned into just over 150 W of usable energy per hour to break even. Once this is factored in you have to pay for all of it too, so the microwave being about 93% efficient (93% power factor) is another loss. There are other losses as well: wiring between a low-voltage battery bank and the inverter can be significant. The battery will lose capacity every day, maybe 4% for good PbA. And batteries die! Wind generators die! From what I can gather from a poll here on fieldlines, the average life expectancy for a home-built genny is just over 1 year!
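The loss chain above can be sketched by multiplying the efficiencies together. The efficiency figures are the ones from the paragraph; the names are mine.

```python
# Usable power after storage and conversion losses.
gen_avg_w = 200        # wind generator average output, watts
battery_eff = 0.90     # lead-acid (PbA) round-trip efficiency
inverter_eff = 0.85    # inverter average efficiency

usable_w = gen_avg_w * battery_eff * inverter_eff
print(f"{usable_w:.0f} W usable out of {gen_avg_w} W generated")
# -> 153 W usable out of 200 W generated
```

Each extra stage (wiring, battery self-discharge, the appliance's own power factor) multiplies in the same way, so the losses compound quickly.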
It is always cheaper to use grid electricity than to make it. Here we pay NZ23.8c for a unit of electricity. It's a lot. Our night-time rate is NZ13c. For me to seriously think about making electricity (with solar) cheaper than grid, the unit rate would have to be 10x the price it is now.
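One crude way to see why: amortize the system cost over its lifetime output. Every figure below is a made-up placeholder for illustration, not a quote, and real systems have ongoing costs (replacement batteries especially) that make this optimistic.

```python
# Crude amortized cost per kWh for a hypothetical home system.
system_cost_nzd = 5000            # hypothetical all-in system cost
avg_output_kw = 0.153             # usable average output after losses (from above)
lifetime_years = 5                # hypothetical system lifetime

lifetime_kwh = avg_output_kw * 24 * 365 * lifetime_years
cost_per_kwh = system_cost_nzd / lifetime_kwh
print(f"~NZ${cost_per_kwh:.2f} per kWh vs NZ$0.238 on the grid")
```

With numbers like these the home-made unit comes out several times the grid rate, which is the point the paragraph is making.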
Just figuring out what is using what is the best place to start, and many $$ can be saved this way.