Hello, 

I'm building a quad as my graduation project for electronics engineering. The idea is to use image processing to track and follow a pattern, all happening onboard using a Raspberry Pi 2.

It's important for the quad to be able to work indoors, which is why we'd like to use an optical flow sensor and 6 sonars.

We have an APM 2.5 and an ADNS3080 optical flow sensor.

I have a couple of questions for the developers:

Question #1

Reading the code (3.2.1), I can see that the optical flow sensor is supposedly read in the function update_optical_flow() in ArduCopter.ino:

static void update_optical_flow(void)
{
    ...

    // updates internal lon and lat with estimation based on optical flow
    optflow.update_position(ahrs.roll, ahrs.pitch, ahrs.sin_yaw(), ahrs.cos_yaw(), current_loc.alt);

    ...
}

The function update_position() of AP_OpticalFlow.cpp goes like this:

void AP_OpticalFlow::update_position(float roll, float pitch, float sin_yaw, float cos_yaw, float altitude)
{
    float diff_roll  = roll - _last_roll;
    float diff_pitch = pitch - _last_pitch;

    // only update position if surface quality is good and angle is not
    // over 45 degrees
    if( surface_quality >= 10 && fabsf(roll) <= FORTYFIVE_DEGREES &&
        fabsf(pitch) <= FORTYFIVE_DEGREES ) {

        altitude = max(altitude, 0);

        // calculate expected x,y diff due to roll and pitch change
        exp_change_x = diff_roll * radians_to_pixels;
        exp_change_y = -diff_pitch * radians_to_pixels;

        // real estimated raw change from mouse
        change_x = dx - exp_change_x;
        change_y = dy - exp_change_y;

        float avg_altitude = (altitude + _last_altitude) * 0.5f;

        // convert raw change to horizontal movement in cm
        // perhaps this altitude should actually be the distance to the
        // ground? i.e. if we are very rolled over it should be longer?
        x_cm = -change_x * avg_altitude * conv_factor;
        // for example if you are leaned over at 45 deg the ground will
        // appear farther away and motion from opt flow sensor will be less
        y_cm = -change_y * avg_altitude * conv_factor;
    }

    _last_altitude = altitude;
    _last_roll     = roll;
    _last_pitch    = pitch;
}

As you can see, this function updates x_cm and y_cm based on dx and dy, but it never reads the optical flow sensor registers!

The function that reads the registers is in AP_OpticalFlow_ADNS3080.cpp:

void AP_OpticalFlow_ADNS3080::update(void)
{
    uint8_t motion_reg;
    surface_quality = read_register(ADNS3080_SQUAL);
    hal.scheduler->delay_microseconds(50);

    // check for movement, update x,y values
    motion_reg = read_register(ADNS3080_MOTION);
    if ((motion_reg & 0x80) != 0) {
        raw_dx = ((int8_t)read_register(ADNS3080_DELTA_X));
        hal.scheduler->delay_microseconds(50);
        raw_dy = ((int8_t)read_register(ADNS3080_DELTA_Y));
    } else {
        raw_dx = 0;
        raw_dy = 0;
    }

    last_update = hal.scheduler->millis();

    Vector3f rot_vector(raw_dx, raw_dy, 0);

    // rotate dx and dy
    rot_vector.rotate(_orientation);
    dx = rot_vector.x;
    dy = rot_vector.y;
}

I think this function should be called from update_optical_flow() before calling update_position(); otherwise the sensor would never be read.

Now, what puzzles me is that the test sketch kind of works. The function that displays the motion goes like this:

void display_motion()
{
    bool first_time = true;

    // display instructions on how to exit
    hal.console->println("press x to return to menu..");
    hal.scheduler->delay(1000);

    while (!hal.console->available()) {
        optflow.update_position(0, 0, 0, 1, 100);

        // x,y,squal
        hal.console->printf_P(PSTR("x_cm/dx:%d/%d\ty_cm/dy:%d/%d\tsqual:%d\n"),
                              (int)optflow.x_cm, (int)optflow.dx,
                              (int)optflow.y_cm, (int)optflow.dy,
                              (int)optflow.surface_quality);
        first_time = false;

        // short delay
        hal.scheduler->delay(100);
    }
}

Again, neither optflow.update() nor optflow.read() is ever called, yet the console shows movement! (Not always; it's somewhat random.)

I found that previous versions of Arducopter (2.7 and older) used to call optflow.update(), both in the test sketch and in update_optical_flow(). I wonder what changed since then.

Question #2

Another thing that's been bothering me is that, in ArduCopter 2.7, there were longitude and latitude variables in the AP_OpticalFlow class, calculated from x_cm and y_cm:

vlon = x_cm * sin_yaw + y_cm * cos_yaw;
vlat = y_cm * sin_yaw - x_cm * cos_yaw;

That doesn't make sense to me; I drew the axes and did the trig, and I think it should be like this:

vlon = x_cm * cos_yaw + y_cm * sin_yaw;
vlat = y_cm * cos_yaw - x_cm * sin_yaw;

Is that right? It looks like the version in 2.7 considers the yaw angle to be between the y axis and East, instead of North.

I need to get the body-to-earth frame transformation right, because I would like to modify AP_InertialNav to use the optical flow sensor instead of the GPS when indoors, and to modify the OF_Loiter mode to behave more like Loiter.

That way, I can use the complementary filter in InertialNav to fuse the data from the accelerometers and the optical flow sensor, lowering the time constants (since the optical flow has much better resolution than the GPS).

I think that would be a better approach than the current one, which is just a PID that modifies the roll and pitch based on the movements of x_cm and y_cm.

Question #3

I don't understand why dx and dy are inverted in update_position():

change_x = dx - exp_change_x;
change_y = dy - exp_change_y;

x_cm = -change_x * avg_altitude * conv_factor;
y_cm = -change_y * avg_altitude * conv_factor;

I thought the sensor had to be oriented so that the pins point towards the nose of the quad. In that case, the ADNS3080 is oriented as per the datasheet (http://people.ece.cornell.edu/land/courses/ece4760/FinalProjects/s2009/ncr6_wjw27/ncr6_wjw27/docs/adns_3080.pdf), which means the x and y axes of the UAV and of the sensor should match, so no inversion should be needed. Is that right?

Or maybe the sensor reports a negative change when it moves to the right? (The background moves to the left?)

I hope some developer can help me out with these questions, especially Randy, who I believe wrote most if not all of the optical flow code.

Cheers!

Juan Manuel


Replies

  • OK, I'll answer the first question myself:

    I hadn't seen this line in void AP_OpticalFlow_ADNS3080::init(): 

    hal.scheduler->register_timer_process(AP_HAL_MEMBERPROC(&AP_OpticalFlow_ADNS3080::read));

    I feel dumb.
