I remember Midnite being a brand that came up routinely for off-grid solar charge controllers. Just for comparison, I went over to their product listings to check the specs against what I have, and now I am really confused.
As an example for discussion: in the power-usage chart I've been working on for a few years for my cabins, I have 6 kW of panel wattage in the calculations.
The Midnite Classic 150 shows 86 amps of charge current @ 48 V nominal bank voltage. Actually, I think lining up the numbers so they're accurate for this thread has partially answered my question...
With the above numbers - 86 amps @ 48 V nominal battery voltage (actual charge voltage around 58 V for lithium) = 86 × 58 ≈ 4,988 W, just under 5,000 W.
5,000 W isn't 6,000 W. So a system above 5,000 W would require multiple charge controllers?
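Just to sanity-check my own arithmetic, here is the quick calc I am doing - a rough sketch in Python, where the 86 A figure is from the Classic 150 spec and the 58 V charge voltage is my own assumption for a lithium bank:

```python
import math

# Rough sanity check of my own numbers - nothing official from Midnite.
CONTROLLER_MAX_A = 86    # Classic 150 rated output current @ 48 V nominal
CHARGE_VOLTAGE_V = 58    # actual charge voltage of my lithium bank (my assumption)
ARRAY_W = 6_000          # panel wattage from my chart

max_output_w = CONTROLLER_MAX_A * CHARGE_VOLTAGE_V      # 4,988 W
controllers_needed = math.ceil(ARRAY_W / max_output_w)  # 2

print(f"Max output per controller: {max_output_w} W")
print(f"Controllers for a {ARRAY_W} W array: {controllers_needed}")
```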
Is there a maximum input panel wattage? Or does the controller take what it gets and only use what it can push out when the input power exceeds its output limit?
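To make that question concrete, here is my mental model of "take what it gets" - purely my assumption about MPPT behavior, not anything confirmed in Midnite's documentation:

```python
def controller_output_w(pv_available_w: float, max_output_a: float, battery_v: float) -> float:
    """My guess: the controller harvests up to its output-current
    limit and simply leaves the rest of the PV power unharvested."""
    max_output_w = max_output_a * battery_v
    return min(pv_available_w, max_output_w)

# A 6,000 W array on a single Classic 150 at 58 V charge voltage:
print(controller_output_w(6_000, 86, 58))  # 4988.0 - so ~1 kW goes unused?
```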
How do you set up, say, a 20-30 kW off-grid system?
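If my reading of the spec is right, scaling up would look something like this (again assuming ~4,988 W usable per Classic 150, which may be wrong):

```python
import math

per_controller_w = 86 * 58  # ~4,988 W per Classic 150 (my assumption)

for system_w in (20_000, 30_000):
    n = math.ceil(system_w / per_controller_w)
    print(f"{system_w:,} W array -> {n} controllers?")
# 20,000 W array -> 5 controllers?
# 30,000 W array -> 7 controllers?
```

That seems like a lot of hardware, which is why I suspect I am missing something.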
I am familiar with grid-tie inverters, where the solar power, once it reaches enough to "start" the inverter, is pushed directly out to the AC circuit. This is not what I am referring to - with that setup there is no storage. In order to store the power you need batteries, and in order to safely charge batteries you need a charge controller. Therein lies my question.
For installs where it is sunny all the time and power is only consumed while the sun is shining, I can see where grid-tie systems are good. That does not cover nighttime, and it does not cover periods of no solar production - like when it is cloudy and rainy for a week. If you don't have stored power to get through the low periods, you're hosed. So how do you get the stored power on higher-wattage setups?