Hi All,
I've got my lawnmower/alternator battery charger up and running! For starters I used a simple power pot for control, but of course the voltage climbs as the batteries charge, and testing it on a small battery bank meant frequent monitoring. I'm picking up ~1100 Ah worth of golf cart batteries in the next week or so.
Anyhoo, I whipped up a simple PIC-based controller for it. It reads the battery voltage and sets the field voltage on the alternator by PWM'ing a PNP transistor. Works great, and the current ramps down as the batteries charge. The alternator can only put out ~50 amps, so I won't need to worry about exceeding C/20, let alone C/10. In fact I'm sure I will wind up scrounging a larger alternator. I also added a current sensor today, and I'm going to add a temp. sensor too.
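In case it's useful to anyone, the control loop amounts to this. A minimal C sketch, not the actual PIC code; the simulated battery and the read_battery_volts()/set_field_duty() helpers are just stand-ins for my ADC read and the PWM channel that drives the PNP on the field:

/* Minimal sketch of the field-control loop -- not the real PIC code.
 * read_battery_volts() and set_field_duty() are hypothetical stand-ins
 * for the ADC read and the PWM output on the alternator field; here
 * they just fake a battery so the sketch runs on a PC. */
#include <stdio.h>

static double sim_volts = 12.2;                 /* pretend battery */
static double read_battery_volts(void) { return sim_volts; }
static void   set_field_duty(double d)  { sim_volts += 0.01 * d; }

int main(void)
{
    const double v_target = 14.4;  /* absorption setpoint for a 12 V bank */
    const double gain     = 0.5;   /* duty step per volt of error */
    double duty = 0.0;             /* field PWM duty, 0.0 .. 1.0 */

    for (int tick = 0; tick < 300; tick++) {
        double err = v_target - read_battery_volts();
        duty += gain * err;            /* crude integrating step toward setpoint */
        if (duty < 0.0) duty = 0.0;    /* clamp the duty cycle */
        if (duty > 1.0) duty = 1.0;
        set_field_duty(duty);
    }
    printf("ended at %.2f V, field duty %.2f\n", read_battery_volts(), duty);
    return 0;
}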
So, my questions are:
Do I need to ramp up the current initially, or is it ok to hit it with 50 amps right off the bat?
With smaller banks, I should set the controller to limit current to a max of C/10 (and C/20 is gentler), right? I'm sure I will use it to charge batteries other than my large bank.
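What I had in mind for the limit is to run the same voltage loop but let a current error override it whenever the measured amps approach the configured max. Rough shape, meant to slot into the loop above; the gain and the 220 Ah bank in the comment are just example numbers:

/* Current-limited field step: whichever error (voltage or current)
 * calls for LESS field wins, so the charger holds the current limit
 * until the voltage setpoint takes over near full. */
double field_step(double duty, double volts, double amps,
                  double v_target, double i_max)
{
    double v_err = v_target - volts;       /* volts below setpoint */
    double i_err = i_max - amps;           /* amps below the limit  */
    double err   = (i_err < v_err) ? i_err : v_err;  /* take the smaller */

    duty += 0.5 * err;                     /* same crude step as before */
    if (duty < 0.0) duty = 0.0;
    if (duty > 1.0) duty = 1.0;
    return duty;
}
/* e.g. for a 220 Ah bank: i_max = 220.0 / 10.0 for C/10, or / 20.0 for C/20 */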
I know from our previous discussions that there is no really good way to use voltage to determine SOC, and that tracking Ah in and out is probably the best way to go. Eventually I will make such a system (rough idea sketched below). However, for now, I need to get a simple controller going and move on to other pressing projects to complete my move off grid. I want this controller to figure out when I am ~80% charged and shut down the motor. With voltage and current readings available, what is the best way to determine when to shut down the generator?
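For reference, the Ah tracking I eventually want is little more than integrating the current-sensor reading over time. A toy version; the one-second tick and the tapering current are made up:

/* Toy amp-hour counter: integrate the current reading over time.
 * A real version would read the current sensor each tick and also
 * account for charge efficiency. */
#include <stdio.h>

int main(void)
{
    const double dt_hours = 1.0 / 3600.0;  /* one-second tick */
    double amps  = 50.0;                   /* pretend sensor reading */
    double ah_in = 0.0;                    /* running amp-hour total */

    for (long tick = 0; tick < 4 * 3600; tick++) {
        ah_in += amps * dt_hours;          /* Ah += A * h */
        amps  *= 0.99993;                  /* fake taper as the bank fills */
    }
    printf("accepted %.1f Ah over 4 hours\n", ah_in);
    return 0;
}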
I have read that the best place to put a temp. sensor is on a negative battery terminal. I'm assuming it doesn't matter which battery in the bank, as long as all of the connections are of sufficient gauge. Sound right?
Jonathan