The ArduIMU v3 is a very interesting little device for minimalists. For my own project, I needed a device that could keep the plane level, maintain heading and altitude, and occasionally steer to a particular waypoint. Most of the time I intend to control the plane myself, but I need the autopilot to take over at certain times so I can perform other minor tasks. This is a video showing a full 'hardware-in-the-loop' demonstration of the IMU, with only the platform replaced by the FlightGear simulator. Manual control is effected with a Wii Motion Plus, which is held at the angle I want the plane to fly. Pressing the "B" (actuator) button means I take over control manually; letting it go allows the IMU to take over and maintain course and altitude.
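The handover logic described above can be sketched as a small arbitration function. This is a minimal illustration, not the actual code running in the demo: the function and field names are hypothetical, and the real system would feed the chosen setpoint into the IMU's stabilization loops.

```python
from dataclasses import dataclass

@dataclass
class Setpoint:
    roll: float    # degrees, positive = right wing down
    pitch: float   # degrees, positive = nose up
    source: str    # "manual" or "autopilot"

def arbitrate(b_pressed: bool, wii_roll: float, wii_pitch: float) -> Setpoint:
    """Pick the attitude setpoint for the inner control loop.

    While the Wii remote's B button is held, the pilot's commanded
    attitude (the angle the remote is held at) passes straight through.
    On release the autopilot takes over; here it simply commands wings
    level, standing in for the course/altitude hold loops.
    """
    if b_pressed:
        return Setpoint(wii_roll, wii_pitch, "manual")
    # Autopilot mode: hold level flight; in the real system the
    # heading and altitude loops would trim roll/pitch around this.
    return Setpoint(0.0, 0.0, "autopilot")
```

Calling `arbitrate(True, 15.0, -5.0)` would pass the pilot's 15° bank through, while `arbitrate(False, 15.0, -5.0)` ignores the remote and commands level flight.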
The ArduIMU v3 ticks all the boxes for my requirements: inputs, outputs, size and price.
Instead of a fully autonomous system, this project explores the requirements for a system where the pilot is kept closer in the loop and where the role of this single operator changes frequently during the flight. Piloting, navigation and mission execution are treated as three distinct, hierarchically related concerns. The hope is to smooth out the cognitive cost of the changes in context and situation that occur when the operator switches between these roles, and to work out what is needed in terms of technology and displays to help the operator make that transition. The more fluid this transition feels, the better the system is perceived to be.
The other very interesting benefit is that, if the assumption of a guaranteed datalink holds, some very interesting applications become possible. This is especially the case where a lot of data is available on the ground that cannot easily be sent to, or stored on, the platform (realtime data about ground movements, collaborative searches, etc.). It paves the way for a rich augmented-reality display in which positions of interest, movements of other mobile objects, home locations, antenna beam limits and landing tunnels can be visualized virtually, supporting the pilot's flight operations. The assumption is that merging a symbolic, graphical display with actual video is significantly better than showing numbers and arrows that require interpretation.
I personally foresee that it will eventually make more sense to move the navigation and piloting code back from the ground to the plane, but this is a great starting point for systems where the navigation and mission-control tasks could blend in more with the plane's actual environment.