Hi,

I have been following this site for a while now; I guess it is time to post something myself.
Above is a picture of my latest scratch-built quadcopter running an STM32 ARM processor board with home-brew software. This is my second quad; on the first one (you can sort of see it in my profile picture) I used a home-made, dual ATmega328 board.
So far, the quad features the following setup:

  • Aluminium frame, "Home Depot" style
  • Alpha 370 size motors (HobbyPartz)
  • Salvaged 6-DOF sensor board from a Walkera XUFO 5 (or 4?)
  • 2-axis magnetometer (I know, 3 axes would make my life a little easier)
  • Maxbotix sonar range finder for low-altitude hold
  • Mediatek LS20031 GPS
  • Modified ADNS2610 optical mouse sensor for low-altitude position-hold
  • Custom software, using ChibiOS RT OS (http://chibios.sourceforge.net)
  • 72 MHz RC gear (Futaba TX, generic RX)
  • XBee telemetry downlink
  • Onboard data logging to SD/MMC card
  • Oh, yes: a taped-on keychain camera for onboard video ;-)

Here is a picture of the (rather untidy) electronics setup (the XBee is normally mounted on the inside of the canopy; the mouse sensor and ultrasound sensor are underneath the copter for obvious reasons):



The software features working so far are:
  • Quaternion-based attitude representation - tried Kalman filters at first, switched to a more DCM-like approach later (a rough sketch of that update step follows this list)
  • Attitude-hold and aerobatic flight modes (can be switched in flight)
  • GPS position hold (activated by channel 6 switch on TX)
  • Or, alternatively, at the moment: optical-flow position hold via a downward-facing, modified optical mouse sensor with custom optics (NOTE: both GPS position hold and mouse-sensor position hold work, but only on a good day - both definitely require more tweaking/PID tuning)
  • Data logging to SD card (not very reliable at the moment - I do not really use this much since I have telemetry via XBee)
  • "Ground station software" for data logging and displaying the state of the quad in various graphical ways

The software is written pretty much in straight C, making use of ChibiOS' multithreading. The ground-station software is a collection of Perl/Tk scripts - all my development happens on Linux.
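
To give a feel for the structure, each task (attitude loop, telemetry, logging, GPS) basically lives in its own static thread. This is only a bare-bones illustration using the ChibiOS/RT 2.x-style API (the WORKING_AREA/msg_t macros were renamed in later versions), not an excerpt from my actual code:

#include "ch.h"
#include "hal.h"

/* Working area and thread for the attitude loop (illustration only). */
static WORKING_AREA(waAttitude, 512);
static msg_t attitude_thread(void *arg) {
  (void)arg;
  while (TRUE) {
    /* read gyro/accel, run the quaternion update, run the PID loops ... */
    chThdSleepMilliseconds(2);          /* roughly a 500 Hz loop */
  }
  return 0;
}

int main(void) {
  halInit();
  chSysInit();
  chThdCreateStatic(waAttitude, sizeof(waAttitude),
                    NORMALPRIO + 1, attitude_thread, NULL);
  /* telemetry, SD logging and GPS threads are created the same way ... */
  while (TRUE)
    chThdSleepMilliseconds(500);
  return 0;
}
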
Below is a video of the quad in "mouse sensor" position-hold mode in my driveway - apologies for the terrible video; I will try to shoot a better one soon...

If somebody finds any of this useful or interesting, I am glad to share some more details...
Best,
Marko


Comments

  • @Marko... the ChibiOS quadcopter code. I know, I'm using a Maple Mini board (STM32F103), but I'm running into problems using their I2C HAL driver.

  • @Wagner: Sure, which source are we talking about? The actual quadcopter code or the code that runs on the ATtiny13 which controls the mouse sensor? Keep in mind that I am running on hardware that has no relation or resemblance to ArduCopter...

    Marko

  • Will you share the source code?

  • Developer

    The issue is that objects that are close will lead to high optical-flow values as they move, while objects of the same size a long way away will lead to very small optical-flow values.  If you're always in the same environment where, say, the walls are the same distance from the optical-flow sensor, then you can likely tune the alt control so that you know how high you are, but otherwise it'll be tough, I think.

     

    Anyway, good luck!

  • You're right, Randy.  I think, for the side sensor, I'll have to add a pan and tilt. 

    By doing a scan before take-off, we should be able to determine all the boundaries.  Then, in the air, it can just correct for any deviations.

  • Developer

    Ellison,

    The problem with using two optical-flow sensors for 3D velocity is that you can only really determine the velocity if you know the distance to the objects you're looking at.  So for the downward-facing sensor it's OK, because you know your altitude from the sonar/barometer, but when looking out to the side you really don't know how far away the objects are.  You could aim a sonar out the side, but that's only good up to 10m.
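
    To put rough numbers on it (purely illustrative - the counts-per-radian figure below is an assumed lens/sensor constant, not a measured value): the sensor only sees angular motion, so ground velocity is roughly the flow rate times the distance, and the same flow reading means very different speeds at different ranges.

    /* Illustrative only: turning optical-flow counts into a velocity estimate.
       COUNTS_PER_RADIAN is an assumed sensor/optics constant. */
    #define COUNTS_PER_RADIAN 500.0f

    float flow_to_velocity(float delta_counts, float dt, float distance_m) {
      float angular_rate = (delta_counts / COUNTS_PER_RADIAN) / dt; /* rad/s */
      return angular_rate * distance_m;                             /* m/s   */
    }

    /* Same flow reading, different ranges:
       flow_to_velocity(10.0f, 0.02f, 1.0f)  -> about 1 m/s at 1 m
       flow_to_velocity(10.0f, 0.02f, 10.0f) -> about 10 m/s at 10 m */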

  • Thanks for the tip, Randy.  I'll be sure to use a level converter.  I've got my 3080s, but I'm still waiting on some other parts, like those 24 MHz resonators.  My plan is to use two 3080s, one on the x-y plane and one on the z-axis.  That way I can get velocities in all three axes.  Seems like the hard part about this is getting the optics aligned properly.

  • Developer

    Hi there.  Marko definitely helped me out a lot, especially with the first version using the ADNS2620 from SparkFun.  I owe him a bag of the 3080s when they're finally out (and I don't think it will be as long as a few months).

     

    In my 3080, I did put the 4.7uF caps there, but I have no idea if they are absolutely necessary.  You will definitely need some 1k resistors on the communication lines between the APM and the 3080 to protect the mouse sensor, as it's only 3.3V.

  • Thanks, Marko.  Randy says they're probably a few months away.

    I like playing with hardware and know Arduino programming pretty well.  Just a question you might be able to answer: just like the ADNS2610, the ADNS3080 also needs a couple of 4.7uF caps at the two Vdd pins.  Just wondering, if I get regulated 3.3V from the IMU board, do you think that these caps are really necessary?

  • Hi Ellison,

    I presented a prototype of an optical-flow sensor on my quad in this blog post. Neither my quad nor the software I wrote for the sensor is related to the ArduCopter project, so I cannot comment on the state of support in the AC code. I had a few email exchanges with Randy of the AC team when he ported the idea to the AC project, so he can probably tell you more...

    Marko
