This is Jan from http://www.diy-streetview.org.
I do streetview as a hobby.
Recently I noticed diydrones.com while searching for a solution to a leveling challenge. My 5-camera rig is attached to a backpack. I take images while walking, without stopping. Therefore the rig is never level, and the resulting panoramas are not straight.
Illustration:
http://www.diy-streetview.org/data/development/20110126a/Screenshot-Fast_Panorama_preview.png
Now, if I just knew the roll and pitch at the exact moment the image was taken, I could automatically level the panoramas.
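For example, the correction itself could look something like this untested sketch (just my assumption of one possible approach, in Python with numpy and OpenCV, for an equirectangular panorama; the axis and sign conventions would have to be matched to whatever sensor provides the angles):

```python
import numpy as np
import cv2

def level_panorama(pano, roll_deg, pitch_deg):
    """Re-project an equirectangular panorama so the horizon comes out level."""
    h, w = pano.shape[:2]
    roll, pitch = np.radians(roll_deg), np.radians(pitch_deg)

    # Tilt of the rig at capture time as a rotation matrix (x forward, y right,
    # z up assumed here; real IMU axes and signs may differ).
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(roll), -np.sin(roll)],
                   [0, np.sin(roll),  np.cos(roll)]])
    Ry = np.array([[ np.cos(pitch), 0, np.sin(pitch)],
                   [0, 1, 0],
                   [-np.sin(pitch), 0, np.cos(pitch)]])
    R = Rx @ Ry

    # For every pixel of the leveled output, find the direction it looks at,
    # rotate that direction into the tilted rig frame, and sample the input there.
    lon = (np.arange(w) / w) * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(h) / h) * np.pi
    lon, lat = np.meshgrid(lon, lat)
    dirs = np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)
    tilted = dirs @ R.T

    src_lon = np.arctan2(tilted[..., 1], tilted[..., 0])
    src_lat = np.arcsin(np.clip(tilted[..., 2], -1.0, 1.0))
    map_x = (((src_lon + np.pi) / (2 * np.pi) * w) % w).astype(np.float32)
    map_y = np.clip((np.pi / 2 - src_lat) / np.pi * h, 0, h - 1).astype(np.float32)
    return cv2.remap(pano, map_x, map_y, cv2.INTER_LINEAR)  # ignores the wrap seam

# pano = cv2.imread("pano_000123.jpg")   # placeholder filename
# cv2.imwrite("pano_000123_leveled.jpg",
#             level_panorama(pano, roll_deg=4.2, pitch_deg=-2.7))
```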
Your thoughts please?
How would you go at this?
Thanks,
Jan
janmartin AT diy-streetview DOT org
Replies
@Jan, I ran some tests late last night on Traditional Dead Reckoning, and as expected the results are not good for TDR. The IMU is not currently designed for that.
After 90 seconds the IMU has an error in the X axis (East) of 400 meters.
So one would have to try to use the accelerometers to gauge the walker's average step size when outside (while GPS is available), and then detect and use the same step pattern when the user is inside the supermarket. I'm not sure yet whether that is possible.
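To make that concrete, the step-detection part could look roughly like this (an untested sketch; the 50 Hz sample rate, peak height and spacing are only guesses that would need tuning against real accelerometer data):

```python
import numpy as np
from scipy.signal import find_peaks

def count_steps(accel_xyz, sample_rate_hz=50.0):
    """Count steps as peaks in the total acceleration magnitude."""
    mag = np.linalg.norm(np.asarray(accel_xyz, dtype=float), axis=1)  # includes gravity
    mag -= mag.mean()                                                 # remove the 1 g offset
    # A walker takes roughly 1.5-2.5 steps per second, so require a minimum
    # spacing of ~0.35 s between peaks and a modest amplitude threshold.
    peaks, _ = find_peaks(mag, height=0.8, distance=int(0.35 * sample_rate_hz))
    return len(peaks)

# Outdoors: average_stride_m = gps_distance_m / count_steps(outdoor_accel)
# Indoors:  estimated_distance_m = count_steps(indoor_accel) * average_stride_m
```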
The accelerometer function on the IMU is primarily to find the gravity vector and use that to calibrate the roll and pitch of the gyros. The accelerometers will have to be more sensitive in order to spot the walking pattern of the user. It would require more testing.
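For reference, getting roll and pitch from the gravity vector is just trigonometry on an accelerometer sample, along these lines (valid only while the rig is not otherwise accelerating; axis conventions are assumptions and may need sign flips for a particular board orientation):

```python
import math

def roll_pitch_from_gravity(ax, ay, az):
    """Roll and pitch in degrees from one accelerometer sample (any consistent unit)."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return math.degrees(roll), math.degrees(pitch)

# roll_pitch_from_gravity(0.0, 0.0, 9.81) -> (0.0, 0.0), i.e. rig sitting level
```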
The good news is that the yaw gyro in the above test drifted linearly by about 15 degrees over the first 60 seconds, and then stayed at 15 degrees for the next 3 minutes.
So all we can say at the moment is that there is possibly a solution for the supermarket diy-street-view, but it would all require more investigation.
Best wishes, Pete
A UAV DevBoard running MatrixPilot and set to provide SERIAL_OUTPUT_ARDUSTATION will also log the roll, pitch and GPS position of the backpack to the OpenLog from SparkFun on a removable SD card.
It might be possible to use the DEADRECKONING information, and/or the acceleration data, to time pictures so they are taken at moments of slower movement (or when the user is still). I'm wondering whether, for good clear pictures, the cameras should be still for a moment.
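As a rough illustration of the "shoot while still" idea, one could watch how much the acceleration magnitude varies over a short window, something like this (the threshold and window length are guesses to be tuned):

```python
import numpy as np

def is_still(recent_accel_xyz, threshold_mps2=0.15):
    """True when |acceleration| barely varies over the recent window of samples."""
    mag = np.linalg.norm(np.asarray(recent_accel_xyz, dtype=float), axis=1)
    return mag.std() < threshold_mps2

# e.g. keep the last ~0.5 s of samples in a deque and only fire the cameras
# while is_still(list(window)) holds.
```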
Have you decided how accurate you need the pitch and roll measurements to be?
The synchronization of the pitch/roll log with the pictures is the main issue: not just timing, but ensuring that the picture and the log entry share the same unique handle (usually a number, e.g. 212344).
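One way to do it might be to match each picture's EXIF capture time against the nearest log entry, along these lines (untested; the CSV column names, the camera clock offset and the availability of EXIF tag 0x9003 are my assumptions, not an actual MatrixPilot log format):

```python
import bisect
import csv
from datetime import datetime
from PIL import Image

def load_attitude_log(path):
    """(unix_time, roll_deg, pitch_deg) tuples, sorted by time, from an assumed CSV log."""
    rows = []
    with open(path, newline="") as f:
        for r in csv.DictReader(f):
            rows.append((float(r["unix_time"]), float(r["roll"]), float(r["pitch"])))
    return sorted(rows)

def exif_capture_time(image_path, camera_clock_offset_s=0.0):
    """Capture time as a unix timestamp from EXIF DateTimeOriginal (tag 0x9003)."""
    exif = Image.open(image_path)._getexif()
    stamp = datetime.strptime(exif[0x9003], "%Y:%m:%d %H:%M:%S")
    return stamp.timestamp() + camera_clock_offset_s

def attitude_for_image(image_path, log_rows, max_gap_s=0.5):
    """(roll, pitch) from the log entry closest in time to the picture, or None."""
    t = exif_capture_time(image_path)
    times = [row[0] for row in log_rows]
    i = bisect.bisect_left(times, t)
    candidates = log_rows[max(0, i - 1):i + 1]
    best = min(candidates, key=lambda row: abs(row[0] - t), default=None)
    if best is None or abs(best[0] - t) > max_gap_s:
        return None  # no log entry close enough; flag for manual handling
    return best[1], best[2]
```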
Or use ArduPilotMega with the IMU shield and the onboard datalogger (a config file allows you to choose what data fields to log). Onboard memory is probably only enough for a day, however.
Ultimately, I suspect the best solution is a phone app. An iPhone 4 or Nexus S has all the sensors you need and tons of datalogging memory. If you google around a bit you may already find an app that does what you need. Just remember to strap your phone to your backpack, not keep it in your pocket ;-)