All,
I had my first flight test last weekend with the battery monitoring turned on. Here is my configuration.
Gas powered configuration - therefore the battery just powers the receiver, the board, and the servos.
One (1) 4.8 V NiCad battery
Board is powered through the servo inputs
One (1) 3.9 kΩ resistor across channel 1 for voltage monitoring
Battery + and - split right before it powers the receiver and run into the AN0 and ground pins for voltage monitoring.
I know my voltages are lower than those of you who are using the 14 or 18V batteries, but in theory everything should work the same.
Now for what happened in practice. Right after we charged the battery, I measured the voltage coming off it with my in-home voltmeter.
It read 5.44V.
When I fired up the GCS it was reporting 5.14V.
No problem; I figured it was just a scaling issue and I could fine-tune it later. I expected the voltage to then drop from 5.14 down into the 4s.
Instead what I saw was the voltage climb to 5.28V before I finished flying.
I left the power on after I landed to see what happened as the battery drained.
The voltage kept climbing, reaching around 5.35V before the receiver/board could no longer get enough power.
I then measured the voltage of the battery again using my in-home voltmeter, and it read around 4.8V (I can't remember exactly).
So the question is - Why would the voltage meter on the board register a drop from 5.44V to 4.8V and report it as a climb from 5.14V to 5.35V?
Any thoughts?
Replies
I did some more measuring with the board on last night and here is what I found.
I am supplying the board with power by using the voltage going through the receiver along the servo lines.
I am also bypassing the receiver and sending that voltage into the AN0 input.
The APM board has a 5V regulator on board.
When I measure the voltage between ground and AN0 it measures the voltage the battery is currently delivering. This starts around 5.45V and drops from there as the battery discharges.
When I measure the voltage between the ground pin and the 5V pin going to the telemetry, it ranges between 4.9 and 5V regardless of the voltage on the inputs. So it appears that the onboard regulator is doing its job and outputting a voltage around 5V.
Questions remain:
Why does APM still consistently report around 5.15V when the onboard regulated voltage is steady but the actual voltage at AN0 is slowly dropping?
Is there a better place to measure the voltage on the board than the pins going to the telemetry?
I'm not using a BEC. I've never needed one on this system. My engine is gas powered, and (as I understand it) a BEC is used for electric-powered motors.
However - that may be my problem. If the voltage powering the board is the same voltage that is being measured by the board, then is it likely that the board will not detect a voltage change?
Would that make sense?
Is using a BEC to keep the input voltage constant the only way to resolve this?
If the power supply to your board is *dropping* while the battery voltage is holding steady (or dropping at a lower rate), then your voltage reading will seem to go up, because the ADC measures relative to its supply rail. Make sure your BEC gives a rock-steady input to your board.
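To make this concrete, here is a small sketch. The 10-bit ratiometric ADC model, the 10k-series / 3.9k-to-ground divider orientation, and the supply-rail values are all my assumptions (the rail numbers are guesses chosen to sag faster than the battery), but they roughly reproduce the 5.15V-to-5.35V "climb" described above:

```python
# Hypothetical model: 10-bit ADC referenced to the board's (unregulated) supply
# rail, a 10k series / 3.9k-to-ground divider ahead of the ADC input, and a GCS
# that converts counts back to volts assuming a fixed 5.0 V reference.
DIVIDER = 3.9 / (10.0 + 3.9)  # fraction of battery voltage seen at the ADC input
BITS = 10

def reported_volts(v_battery, v_rail, assumed_ref=5.0):
    v_adc = v_battery * DIVIDER                     # divider output at the ADC input
    counts = round(v_adc / v_rail * (2**BITS - 1))  # reading is relative to the rail
    return counts / (2**BITS - 1) * assumed_ref / DIVIDER

# Battery sags 5.44 -> 4.80 V; the rail values are guesses that sag faster.
for v_bat, v_rail in [(5.44, 5.28), (5.10, 4.85), (4.80, 4.50)]:
    print(f"battery={v_bat:.2f} V  rail={v_rail:.2f} V  "
          f"reported={reported_volts(v_bat, v_rail):.2f} V")
```

Note that the divider ratio cancels out of the reported value; what drives the apparent climb is purely the battery-to-rail ratio.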
More info:
I am using the default setup from here. I have it set up for option 3 "Measuring the total battery voltage only."
I used the packaged 3.9k resistor and soldered it across the top voltage divider pads. The 10k resistor is already on the board. The manual seems to indicate that the 3.9k resistor can be used for anything under 17.9 volts.
I then chose option 3 in the GCS.
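As a sanity check on the 17.9-volt figure from the manual, the divider arithmetic works out if we assume the on-board 10k is the series resistor and the packaged 3.9k sits between the analog input and ground (my assumption about the orientation):

```python
# Back-of-envelope check of the manual's ~17.9 V limit for the 3.9k option.
# Assumed orientation: 10k in series from the battery, 3.9k to ground.
R_TOP = 10_000.0      # ohms, already on the board
R_BOTTOM = 3_900.0    # ohms, the packaged resistor
ADC_FULL_SCALE = 5.0  # volts at the analog input

ratio = R_BOTTOM / (R_TOP + R_BOTTOM)
v_max = ADC_FULL_SCALE / ratio
print(f"divider ratio = {ratio:.3f}, max measurable battery voltage = {v_max:.1f} V")
```

That comes out to about 17.8 V, which lines up with the manual's "anything under 17.9 volts."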
As I understand it, in order to know that the battery is running low, we do expect the measured voltage to drop off as the battery nears the end of its charge. If it didn't drop off until the very end, what good would the voltage sensor be? It would just fall to zero and the plane would crash.
The 100% reference is the 4.8 to 5 volt supply that the electronics run on. It is expected to remain constant; if a BEC were used, it would be. Power the electronics with a DC-to-DC converter (BEC) and feed the battery voltage to the scaled (divider) input.
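A short sketch of the fixed setup, assuming an ideal BEC holding the rail at exactly 5 V and the same hypothetical 10-bit ratiometric ADC with a 10k/3.9k divider: with a steady reference, the reported value now tracks the real battery voltage as it discharges.

```python
# Same hypothetical ADC model as before, but the BEC pins the rail at 5.0 V,
# so the reading is no longer distorted by a sagging reference.
DIVIDER = 3.9 / (10.0 + 3.9)  # assumed 10k series / 3.9k-to-ground divider
BITS = 10

def reported_volts(v_battery, v_rail=5.0, assumed_ref=5.0):
    counts = round(v_battery * DIVIDER / v_rail * (2**BITS - 1))
    return counts / (2**BITS - 1) * assumed_ref / DIVIDER

# The reported value follows the discharging battery to within quantization error.
for v_bat in (5.44, 5.10, 4.80):
    print(f"battery={v_bat:.2f} V  reported={reported_volts(v_bat):.2f} V")
```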
Just to be sure I follow, when you say there's a 3.9k resistor across ch1... what does that mean?
Normally if your battery voltage exceeds the MCU's Vcc (or analog supply, anyway) you'd want a voltage divider made of two (precision) resistors to step the voltage down into a measurable range.
You can use an online voltage divider calculator to figure out reasonable values. For a maximum input of 5.5V and an ADC range of 5V, a divider with 1 kΩ on top and 10 kΩ to ground scales 5.5V down to exactly 5.0V.
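For reference, the calculator math is just the standard two-resistor formula (a generic sketch, not tied to any particular board):

```python
# Standard voltage divider: output tapped between the top and bottom resistors.
def divider_out(v_in, r_top, r_bottom):
    """Voltage at the tap of a two-resistor divider."""
    return v_in * r_bottom / (r_top + r_bottom)

# 1k on top, 10k to ground: a 5.5 V input lands right at the 5 V ADC limit.
print(divider_out(5.5, r_top=1_000, r_bottom=10_000))  # 5.0
```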