I'm using an APM2.6 board and a 3DR voltage/current sensor board. I had to adjust the voltage multiplier upward in order to get the voltage readings correct.
If a 4-cell LiPO has measured internal resistances of 4, 4, 7, 6 milliohms (respectively), that is a total of 21 milliohms. If I am drawing 30 A (for example), the battery's internal resistance drops 30 × 0.021 = 630 millivolts. So, under that kind of load (typical for me), my battery voltage reading will be 0.630 V too low. That is, when my battery has an actual voltage of 14 V, I will measure only 13.37 V. This is significant, since it amounts to a 0.158 V/cell error. A fixed offset won't work, since the error depends on the exact current draw.
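For concreteness, here's a small Python sketch of that calculation, using the example numbers above (the variable names are just mine for illustration):

```python
# Estimate the voltage sag caused by battery internal resistance (ESR).
cell_esr_mohm = [4, 4, 7, 6]    # measured per-cell ESRs, milliohms
load_current_a = 30.0           # typical load current, amps

total_esr_ohm = sum(cell_esr_mohm) / 1000.0    # 0.021 ohm
sag_v = load_current_a * total_esr_ohm         # I * R drop across the pack

actual_v = 14.0                 # true (unloaded) pack voltage
measured_v = actual_v - sag_v   # what the sensor reports under load

print(f"total ESR: {total_esr_ohm:.3f} ohm")
print(f"sag under {load_current_a:.0f} A: {sag_v * 1000:.0f} mV")
print(f"measured: {measured_v:.2f} V, "
      f"per-cell error: {sag_v / 4 * 1000:.0f} mV")
```

Running it reproduces the numbers above: 630 mV of sag, a 13.37 V reading, and about 158 mV of error per cell.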
Does anyone build a voltage/current monitor that does this adjustment automatically? If not, is there a need for such a device (hint: I'm a hardware engineer).
Also, Mission Planner tells me the battery remaining and gives me verbal warnings. Where does the 'trigger' for those warnings originate - in the APM itself? In Mission Planner? If it is in Mission Planner, can I change the threshold?
People are interested in the actual voltage under load, not some "would-be" voltage compensated for load.
Also, each pack has a slightly different internal resistance, and there's an especially big difference between older and newer packs.
In my experience, the voltage in Mission Planner is not that accurate; I've easily seen discrepancies of 0.1-0.5 V or more. It's been there for years but hasn't had much dev love for a long time. Hardware-wise, the only new stuff has been 6S-capable power modules.
The verbal and visual warnings are triggered in MP. Check the "MP Alert on Low Battery" check box in Initial Setup > Battery Monitor, at which point you'll get a number of options. There's a box for battery capacity.
I have changed the voltage multiplier to 10.3, which gives me a pretty accurate reading.
I have several new 4-cell LiPOs and none of them have a total ESR of less than .025 ohm (X 30A = 750mV).
It all depends on what you mean by ACTUAL. A LiPO with an ACTUAL cell voltage of 4.00 V and an ESR of 0.006 ohms will read 4.00 V when unloaded, but 3.82 V under a 30 A load. So even with a 30 A load, the cell voltage is still 4.00 V; the 180 mV 'drop' is due to resistance in the cell plus packaging.
In a 6-cell battery, the differences can be really significant.
Yes, older batteries have higher ESRs, but I generally throw them away (or use them to power my FPV equipment) when the ESR of any cell gets larger than 10 milliohms.
I can build a compensator that would show very close to the actual cell voltage - regardless of load, as long as the ESR didn't exceed 12 milliohms/cell. I was just wondering if anyone else would be interested in it.
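The core of such a compensator is just adding the I×R drop back onto the loaded reading. A minimal Python sketch of the idea (the function name and the per-cell ESR table are my own illustration, not an existing product):

```python
# Reconstruct the unloaded ("actual") pack voltage from a loaded reading
# by adding back the I*R drop across the measured cell ESRs.
def compensate_voltage(measured_v, current_a, cell_esr_mohm):
    """Return the estimated open-circuit pack voltage.

    measured_v    -- pack voltage read under load (V)
    current_a     -- load current at the moment of the reading (A)
    cell_esr_mohm -- per-cell ESRs in milliohms, measured beforehand
    """
    total_esr_ohm = sum(cell_esr_mohm) / 1000.0
    return measured_v + current_a * total_esr_ohm

# Example: the 3.82 V/cell reading from the earlier post, scaled to a
# 4-cell pack: 4 x 3.82 V = 15.28 V measured, 6 milliohms/cell, 30 A load.
print(compensate_voltage(15.28, 30.0, [6, 6, 6, 6]))  # ~16.00 V, i.e. 4.00 V/cell
```

In hardware this would sample current and voltage simultaneously, since the correction is only right for the current flowing at the instant of the voltage reading.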