at the risk of stirring up all sorts of vigorous argument: while i understand the theory behind mppt (maximum power point tracking), i am somewhat at a loss as to the difference between the theoretical definition and the as-deployed definition.
i am left to wonder if the definition is so broad that what the industry actually builds and sells isn't mppt in a broad application sense, but at best a booster that works within very narrow parameters.
allow me to explain what i think i know, and then i will follow with a question.
theoretically, mppt ought to take a higher voltage source and, via a buck converter and some programmable control, reduce the voltage while increasing the amperage, with the result being more power available to charge the battery bank.
this higher voltage can in some cases be as high as ~150 volts dc from the source, while the battery bank could be anything less, typically 12, 24, or 48 volts nominal delivered to the batteries.
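to put some rough numbers on that idea (illustrative only, assuming a 95% efficient converter, not figures from any particular product):

```python
# back-of-envelope math for the buck conversion described above.
# the 95% efficiency figure is an assumption for illustration.

def buck_output_current(v_in, i_in, v_out, efficiency=0.95):
    """power in, minus losses, equals power out; since the output
    voltage is lower, the output current is higher than the input."""
    p_out = v_in * i_in * efficiency
    return p_out / v_out

# example: a 100 v string delivering 5 a (500 w) into a 12.8 v battery bank
print(round(buck_output_current(100.0, 5.0, 12.8), 1))  # prints 37.1
```

so a string feeding only 5 amps at high voltage can push better than 37 amps into the battery, which is the whole attraction of the scheme.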
while the products sold by the big companies will operate within these parameters, it appears they are optimized for use with normal nominal panel voltages, so that maximum efficiency is seen when using a 12 volt panel with an open circuit voltage of perhaps 17 volts (24 and 48 volt panels at their correspondingly higher open circuit voltages). or in some cases they reach best efficiency at 24 or 48 volts nominal, or whatever. the point being, they are often not as efficient if fed 150vdc as they would be if fed 17vdc (for a 12 volt system).
i am using 12volt system as an example only, not a suggestion.
what i am wondering is this:
do the manufacturers engineer and tailor their products to provide maximum efficiency for what they perceive as the systems most in demand? that would make sense from an economic standpoint, but when applied to systems made for higher voltage inputs, why would we tolerate lower efficiencies across the wider spread (between open circuit and battery nominal voltages)?
i thought at first it was a product of the buck converter section, but this is clearly not the problem; a converter can be made as efficient at a wide spread (between supply and load voltages) as it can at a narrow one. so... what is the culprit?
is it within the programming, wherein the effort to maximize efficiency within a narrow window around a specific nominal voltage results in compromises that mandate lower efficiency at other supply and load voltages?
hope i am making myself clear
if this is the case (which i now firmly believe), then there is a real good argument to be made for either of two things to happen:
1. the oem ought to allow entry into the programming of their mppt controller, so that the user or installer can adjust it to optimize efficiency for other battery bank voltages and other input voltages.... or
2. the drive to design and build a controller that is open source on the programming side of things is certainly high in my opinion, as difficult as that might well be.
yes, building high power converters is a tough prospect, so that is a significant hurdle; once crossed, though, the cost to produce would be reasonable, relatively speaking.
the programming side of the thing, while complex, is probably the most doable, in that programming generally does not eat up parts, just lots of man hours, and there are lots more programmers out there than power electronics guys. probably 10k programmers for every qualified power electronics engineer.
having said that, it would appear to me that one (or a group) ought to clearly define the operation of an mppt controller, specifically what parameters it is to operate under, then lay out some basic code architecture, enough to get an idea of what needs to be monitored, what needs to be controlled, and how best to do each, both separately and together.
then start work on the power electronics end of things.
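for what it's worth, the most common textbook answer to "what specifically is mppt" is the perturb and observe algorithm. here is a minimal sketch in python; the toy panel curve and all the numbers are made-up assumptions for illustration, and real firmware would run this loop on the micro against live voltage and current readings:

```python
# a minimal perturb-and-observe mppt sketch. the panel model, numbers,
# and function names are illustrative assumptions, not any vendor's api.

def panel_power(v):
    """toy panel: current falls off linearly with voltage, giving a
    single power peak (at half the open circuit voltage here)."""
    i_sc, v_oc = 8.0, 21.0              # short-circuit amps, open-circuit volts
    i = max(i_sc * (1.0 - v / v_oc), 0.0)
    return v * i

def perturb_and_observe(measure, v_start=12.0, step=0.2, iters=200):
    """classic p&o: nudge the operating voltage, keep going the same
    way if power rose, reverse direction if it fell."""
    v, direction = v_start, 1
    last_p = measure(v)
    for _ in range(iters):
        v += direction * step
        p = measure(v)
        if p < last_p:
            direction = -direction      # stepped past the peak, turn around
        last_p = p
    return v

v_mpp = perturb_and_observe(panel_power)
print(round(v_mpp, 1))  # settles near 10.5 v, the true peak of this toy curve
```

the point of the sketch is that the algorithm itself is dead simple; the hard questions are the ones raised above, i.e. how the oem tunes step sizes, loop rates, and converter operating ranges around the voltages they expect to see.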
i am thinking that, for instance, it might be possible to do the following:
start with a switch mode power supply, such as a heavy-duty puter supply, some of which take 90-270 volts ac input and have outputs of 30 or more amps at 12 volts dc...
hack into the control circuitry of the converter and, instead of a flat dead-on regulator, insert a microprocessor that could take various inputs such as input voltage, temperature, windspeed (if for wind) and whatever else, crunch the numbers, and vary the circuit to
make the output voltage anything one wants to charge a 12 volt battery.
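as a rough illustration of that hack: many of these supplies regulate via a tl431 shunt reference fed by a resistor divider off the output, so swapping the lower divider resistor for a digital pot would let the micro steer the setpoint. the math is just a voltage divider; the component values below are assumptions for illustration, not taken from any real supply:

```python
# sketch of steering a supply's output setpoint via its feedback divider.
# V_REF and R_TOP are assumed values, not from any particular supply.

V_REF = 2.5           # tl431 internal reference, volts
R_TOP = 10_000.0      # fixed upper divider resistor, ohms

def lower_leg_ohms(v_out):
    """resistance the micro (via a digital pot) would set in the lower
    divider leg so the tap sits at V_REF when the output hits v_out:
    v_out * r / (R_TOP + r) = V_REF  ->  r = V_REF * R_TOP / (v_out - V_REF)"""
    return V_REF * R_TOP / (v_out - V_REF)

# example: push a nominal 12 v rail up to 14.4 v for absorption charging
print(round(lower_leg_ohms(14.4)))  # prints 2101
```

a smaller lower-leg resistance raises the output, a larger one lowers it, so the micro gets a continuously adjustable charging voltage out of an otherwise fixed supply.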
puter supplies are mass produced, most are about bullet proof to start with, and if bought in quantity they are relatively inexpensive, so the converter section is about 90% there...
which leaves the control side of things: the microcontroller and its coding. i can do the bs2 stamp, but there are lots of folks that can do much better with a pic chip for a fraction of the cost of a bs2... and there are lots of folks that could program one to do just about anything we can imagine...
so why not?
but first we have to define the problem: what specifically is mppt? under what parameters will it be asked to operate (input and output voltages)? and can it be made so that it can be optimized for specific installations, with the goal being the ability to take full advantage of the converter?
so there you are, any thoughts guys?
bob g