Finally got her to fly with the lights on. There are a lot of problems with lightbulbs reflecting off shiny surfaces. These problems never happened with IR, even though IR shouldn't be any different.
Detecting the XYZ of a spinning object with 1 camera in high shutter speed mode is really hard. It would be easier with 2 cameras.
There is a new takeoff algorithm, relying on hard coded throttle for a given battery voltage to instantly leap off the stand. It looks like a very unstable system, but technically it should be more stable than the previous throttle ramping algorithm.
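A minimal sketch of the battery-compensated takeoff idea: a hard coded table maps battery voltage to a fixed throttle, interpolated linearly. The voltages and throttle fractions below are made up for illustration; the real numbers would come from bench testing.

```python
# Hypothetical voltage -> throttle table (values are illustrative only).
# Lower battery voltage needs more throttle to leap off the stand.
TAKEOFF_TABLE = [
    (3.5, 0.95),   # (battery volts, throttle fraction)
    (3.7, 0.90),
    (4.0, 0.85),
    (4.2, 0.80),
]

def takeoff_throttle(volts):
    """Linearly interpolate a fixed takeoff throttle from battery voltage."""
    table = sorted(TAKEOFF_TABLE)
    if volts <= table[0][0]:
        return table[0][1]
    if volts >= table[-1][0]:
        return table[-1][1]
    for (v0, t0), (v1, t1) in zip(table, table[1:]):
        if v0 <= volts <= v1:
            frac = (volts - v0) / (v1 - v0)
            return t0 + frac * (t1 - t0)
```

The appeal over throttle ramping is that the vehicle spends no time hovering at marginal thrust near the stand.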
So the 70fps framerate, the lack of need for a USB hub, & the lower computational load made the board cam irresistible. Despite everything flying perfectly, it was time to rebuild it again.
The board cam immediately had new problems. The radio & camera don't always initialize. It helps to leave the board powered off for a while before restarting it, but that's a real problem if it is to automatically boot on the raspberry pi without a command line to drop kick it. The only recovery would be power cycling after observing the failure on a tablet. This didn't happen with the last build of exactly the same board, but the last build had 4 more PWM's & no camera.
Also, there are a lot of tiny cables flexing. There wasn't any notable improvement in flight. The picture had a much more refined oval & probably more accurate coordinates, but the flight was equally unstable.
With all the knowledge gleaned over the last year, the ultimate vision system would now use visible light, dual board cameras on separate turrets & separate USB connections producing 320x240 at 70fps. That would give the best velocity measurements. It would be nice if an IR board cam was easily obtained, but the lack of such a camera & the reduced power needs of visible LED's make IR impractical.
Without the instant velocity measurement of doppler shift that GPS provided, all indoor vehicles have suffered from delayed velocity measurement. The only solution is to increase the framerate to make the velocity measurements as close to realtime as possible, but never as good as doppler shift.
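The latency problem above comes from velocity being a finite difference of camera positions, so it always lags by roughly one frame period, while doppler shift measures velocity directly. A sketch of the relationship, with made-up coordinates:

```python
# Velocity from camera fixes is a finite difference between frames,
# so its latency is bounded below by the frame period (1/fps).
def velocity_estimate(prev_xy, curr_xy, fps):
    """Finite-difference velocity in units/sec from 2 consecutive frames."""
    dt = 1.0 / fps
    return tuple((c - p) / dt for p, c in zip(prev_xy, curr_xy))
```

At 70fps the estimate is at best ~14ms stale, which is why raising the framerate is the only lever short of a doppler-style sensor.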
There were 2 major software changes:
Since 2009, the autopilot exclusively used a binary integral. It would add all or none of the feedback constant, regardless of the error. That produced very fast response to changing weather, but created lots of oscillation. Changing it back to a proportional integral which scaled the feedback constant based on the error greatly reduced the oscillation.
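The difference between the two integrator styles can be sketched like this. The gain constant is illustrative, not the actual autopilot value.

```python
KI = 0.01  # illustrative feedback constant added per update

def binary_integral(accum, error):
    """Add all or none of the feedback constant, regardless of error size."""
    if error > 0:
        return accum + KI
    if error < 0:
        return accum - KI
    return accum

def proportional_integral(accum, error):
    """Scale the feedback constant by the error."""
    return accum + KI * error
```

The binary version slews at full rate even for tiny errors, which is why it responds fast to changing weather but overshoots & oscillates; the proportional version slows down as the error shrinks.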
The autopilot has always accumulated cyclic trim in world frame & translated it to copter frame. That compensated for wind as the copter turned, but indoors there is no wind & the trim is entirely due to vehicle balancing. For the 1st time, the cyclic trim was stored only in copter frame & the turns on the Syma X1 got a lot more stable.
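The frame change amounts to whether the trim vector gets rotated by heading on every update. A sketch of the world-to-copter rotation (one sign convention; the real autopilot may use the opposite):

```python
import math

def world_to_copter(trim_x, trim_y, heading):
    """Rotate a world frame trim vector into copter frame.

    heading is in radians.  World frame trim must be re-rotated every
    time the copter turns; copter frame trim is applied directly.
    """
    c, s = math.cos(heading), math.sin(heading)
    return (c * trim_x + s * trim_y,
            -s * trim_x + c * trim_y)
```

Indoors, skipping this rotation & keeping the trim fixed in copter frame means a balance offset stays pointed at the same part of the airframe no matter which way she's facing, which is exactly what balancing needs.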
The Syma X1 has a problem of gyro drift & uneven motor heating causing massive trim changes. It has always needed more aft pitch as the flight wore on. Normally, you want the vehicle as balanced as possible, so the trim is only due to wind.
There is a case for cyclic trim in copter frame on an outdoor vehicle, to compensate for balancing, but there's no way to differentiate between balance & wind.
The 1st flight using the 70fps 320x240 board cam wasn't tuned & oscillated. The velocity measurement needs to be at least 7Hz before the oscillation becomes bearable.
Looking up from below in slow motion gives the impression of a long gone vehicle, but the reality is she's never been more than a week from being flyable, since 2011. There are plenty of other vehicles which will never fly again. With the flight software now working on raspberry pi, there is every intention of having a self contained flying system that always works.
The current vehicle was mostly developed in April, 2011. Only the takeoff stand was improved in Jan 2012. The machine vision system saw development in Summer 2011, Jan 2012, & the last week with conversion to a board cam.
Since monocopter development began, in 2010, there has never been an onboard camera. The plan is now to stick a basic 320x240 wifi camera board on her, in addition to the existing flight computer board & ESC. The previous plan was to replace the flight computer with the camera board & have all data on wifi, but the camera board needed to be more modular. Wifi will now just carry video & somehow automatically associate with the raspberry pi access point.
The 2nd Marcy 1 took to the air. This one is loaded down with the POV LEDs.
Visually the same, but mostly redesigned. The POV processor & mane processor now read the radio directly instead of the mane processor passing on data to the POV processor. It was 1 more wire but much simpler code. Even though it was built in July, this was the 1st time this airframe did POV. The motor is immediately getting too hot, melting through the propeller.
The fancy blob detection had to go for POV to work. It's back to a simple threshold, exactly like the nighttime only version. IR vision would allow it to use blob detection.
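The simple threshold approach can be sketched as averaging the coordinates of every pixel above a brightness cutoff, with no connected-component grouping at all. This is a generic illustration of thresholding, not the actual flight code:

```python
def threshold_centroid(frame, cutoff):
    """Return the (x, y) centroid of pixels brighter than cutoff, or None.

    frame is a list of rows of brightness values.  Unlike blob
    detection, every bright pixel counts, so stray reflections
    pull the centroid off the real target.
    """
    sum_x, sum_y, count = 0, 0, 0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value > cutoff:
                sum_x += x
                sum_y += y
                count += 1
    if count == 0:
        return None
    return (sum_x / count, sum_y / count)
```

That's the tradeoff with visible light: thresholding is cheap enough to coexist with POV decoding, but any bright reflection corrupts the fix, where IR would have let blob detection sort the target from the clutter.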
There was supposed to be a 3rd Marcy 1, with an onboard camera. It's possible to get a very small, wireless camera, but the only fabrication possible in the apartment is a large wireless camera.
Also, if it has 802.11 on it, it's a drag to have to carry around a ground station instead of controlling it directly from a tablet. The ground station is still necessary, because 802.11 isn't reliable enough for control & having 1 radio system that supports both the autopilot & manual control is easier than having 2 unique systems for autopilot & manual control.
The trick with Marcy 1 is once she's flying with the lights on, the thrill from such a strange device hovering subsides, & she gets real boring.