I've been dialing in the calibration of my 3DR current sensor by changing the VOLT_DIVIDER value in the Advanced Parameters list. I've found that for my quad, with no load, a value of 10.45 gives me a reading through Mission Planner (and the OSD) that matches what's displayed on my very nice Tektronix voltmeter hooked up to the quad's power bus. This is all good.
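(For anyone repeating this: the displayed voltage scales linearly with VOLT_DIVIDER, so the adjustment is just new value = old value × (meter volts / displayed volts). For example, starting from 10.00 with the meter at 16.80 V and Mission Planner showing 16.50 V, the corrected value would be 10.00 × 16.80 / 16.50 ≈ 10.18 — numbers purely for illustration.)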
The problem surfaces when I start applying throttle. As the load increases, the voltage drops, as expected. But the 3DR power module starts reading lower than actual. At about half throttle it's reading almost 0.25 volts lower than the actual voltage.
Is this about as good as I can expect? If so, I guess I need to have another look at the battery failsafe voltages I set.
Replies
Hi,
Some people don't have confidence in using voltage to estimate remaining capacity. And it's true enough that some brand-new LiPos have an extremely low internal resistance, so their voltage doesn't drop much during discharge. That's probably what they have seen, leading them to conclude that voltage is no good and you need to integrate current instead.
However, after a few cycles, the voltage of every battery I have seen drops quite linearly with discharge, except at the extreme ends of the curve.
Others have argued against voltage-based estimation because the reading goes up and down with the throttle setting. True enough, but one can just make a habit of setting the throttle back to about the same level for each battery voltage reading.
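Just to illustrate what I mean — a rough sketch, not anything from the actual firmware, and the per-cell endpoints are assumptions you'd tune to your own packs:

    // Rough remaining-capacity estimate from a loaded pack voltage,
    // assuming the roughly linear mid-curve described above.
    // FULL and EMPTY are per-cell voltages under load - assumed values.
    float capacityPercent(float packVolts, int cells) {
      const float FULL = 4.10f, EMPTY = 3.60f;
      float frac = (packVolts / cells - EMPTY) / (FULL - EMPTY);
      if (frac < 0.0f) frac = 0.0f;     // clamp the extreme ends,
      if (frac > 1.0f) frac = 1.0f;     // where the curve isn't linear
      return 100.0f * frac;
    }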
Regards
Soren
Randy,
Why would Vcc sag under battery load? Vcc is downstream of the voltage regulator in the power module or the BEC, whichever the user chose.
If the BEC is used for APM power and there is lots of servo activity, that's a different story (BTW, nobody should do that).
And if Vcc is used as a reference, shouldn't the measured battery voltage appear to go up?
And finally (sorry :) doesn't Ardu* firmware now use the internal reference to calibrate the Vcc value, so the battery voltmeter should really (apart from response time and noise) be immune to reasonably small Vcc changes?
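(For reference, I believe the trick is along the lines of the well-known Arduino readVcc snippet below, usable inside any Arduino sketch — shown with ATmega328-style register names; the APM's ATmega2560 uses a different MUX selection, and this isn't the actual ArduPilot code.)

    // Estimate Vcc in millivolts by sampling the internal 1.1 V bandgap
    // with AVcc as the ADC reference. Since the bandgap is (nominally)
    // fixed, the reading tells you what Vcc actually is.
    long readVcc() {
      ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1); // bandgap channel
      delay(2);                         // let the reference settle
      ADCSRA |= _BV(ADSC);              // start a conversion
      while (bit_is_set(ADCSRA, ADSC))  // wait for it to finish
        ;
      return 1125300L / ADC;            // 1.1 V * 1023 * 1000 / reading
    }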
Regards
Soren
Thank you, Bill. So if I understand this right, I can adjust VOLT_DIVIDER to give the correct voltage at some load, but it won't track accurately across a range of loads.
My concern is that if I set my battery failsafe for, say, 13.8 volts on a 4S setup, it could trigger at an actual 14.1 volts while the APM thinks it's at the failsafe level. I probably don't want to do a lot of electronics work to make this more accurate; as long as I know what the differential is, I can adjust my failsafe to accommodate it. When I used VOLT_DIVIDER to adjust the resting voltage, I just assumed it was going to be equally accurate at other voltages. I was really surprised to find it wasn't, and wondered if I was missing something.
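(Concretely: if the PM consistently reads about 0.25 V low under load and I want the failsafe to fire at a true 13.8 V, I'd set the threshold to 13.8 − 0.25 = 13.55 V. That's all I mean by adjusting for the differential.)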
I use the MinimOSD with goggles to monitor my battery level, which comes from the power module. This explains why I end up having more battery left after flying than I expected.
Changing the volt divider will not help with the 'calibration' of the divider. The divider ratio is precisely 10.1333 repeating, using 1% resistors.
I usually see a 0.2 V difference on my setup measured against my multimeter (which costs as much as a 15" MacBook Pro).
The key to getting better accuracy is to measure Vcc accurately. The newer firmware calibrates the Vcc value against an internal voltage reference, and that reference is itself only ±10% accurate according to the datasheet.
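To make the dependence on Vcc explicit — illustrative math only, not the actual ArduPilot source:

    // The ADC is ratiometric to Vcc, so the computed battery voltage
    // scales directly with whatever Vcc the firmware believes.
    const float DIVIDER = 10.1333f;     // (R1 + R2) / R2 on the module

    float battVolts(int adcCount, float assumedVcc) {
      float pinVolts = (adcCount / 1023.0f) * assumedVcc;
      return pinVolts * DIVIDER;
    }
    // Example: a count of 300 gives battVolts(300, 5.00) of about 14.9 V.
    // If Vcc is really 4.90 V, the true pack voltage is about 14.6 V,
    // so a 2% Vcc error becomes a ~0.3 V error on a 4S pack.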
You could build your own voltage divider network, scaled so that the maximum output corresponds to 20 V. This would give more resolution on the voltage being read and would hopefully improve accuracy across the range.
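(Worked numbers, with example resistor values of my own choosing: to put 20 V full scale onto a 5 V ADC input you want a ratio of (R1 + R2) / R2 = 4, e.g. R1 = 15k and R2 = 5k. That works out to 20 V / 1024 ≈ 19.5 mV per ADC count, versus roughly 5 V × 10.1333 / 1024 ≈ 49.5 mV per count with the stock divider — about 2.5× finer resolution.)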
The other thing is that as items powered by the input rail draw more current, Vcc will drop slightly (due to the protective solid-state fuse). This could be a cause of it reading lower as well; the current draw on the APM side needs to be fairly constant.
The newer APM 2.6 has a new fuse with less internal resistance as more current flows, so that should be less of an issue.