I recently purchased a Raspberry Pi from Adafruit.  Without much difficulty I was able to get a demo Python script for OpenCV running (see here).  It was able to detect my face as a face, and that was kind of exciting.  It was a little choppy, obviously, but still pretty fantastic.  I used an old PS2 EyeToy, which you can pick up at a GameStop for about $2.  You'll get what you pay for.  One trouble with the Rpi is that its official camera hasn't been released yet, and it doesn't play well with a lot of USB cameras.

I was also given a GoPro Hero3 for Christmas.  This is the Silver edition, which uses different hardware from the other editions.  The GoPro is an incredible little machine.  It isn't a webcam in the traditional sense, except that it is.  It charges via USB, but it doesn't stream images that way; it uses a webpage and a private access point (AP).  It can stream images to an iPhone app (and to a third-party Windows shareware program), which raises the question of how.  To see this work you have to connect your iPhone to the Hero3 as an access point.  My Rpi isn't going to be running Windows, so I'm not interested in the shareware unless I can port it to UNIX.  Looking at the network settings on the iPhone, the Hero3 is the router and the iPhone is the client.  A quick Google search for GoPro streaming shows that someone else has already figured out which port to use: 80.  When I browse to the camera's address on my iPhone I get a screen that looks like this:
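Since the camera is just serving ordinary web pages over its AP, you can poke at it from Python with nothing but the standard library. This is a sketch, not GoPro's API: the camera's address is a placeholder (check your phone's wifi settings for the address your Hero3 actually hands out), and `fetch_index` is a helper name I made up.

```python
# Sketch: list the links on the GoPro's directory-listing page from Python.
# CAMERA_IP below is a placeholder, not a documented address.
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkLister(HTMLParser):
    """Collect the href of every <a> tag in a directory-listing page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def list_links(html_text):
    """Parse an HTML page and return every link target found in it."""
    parser = LinkLister()
    parser.feed(html_text)
    return parser.links


def fetch_index(base_url):
    # Example (hypothetical): fetch_index("http://CAMERA_IP/")
    page = urlopen(base_url).read().decode("utf-8", "replace")
    return list_links(page)
```

With the Hero3's AP joined, calling `fetch_index` against the camera's root page should return the same four directory names the browser shows.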


This is really exciting, because I've already seen people using Eye-Fi cards to upload their info.  I have an Eye-Fi and I hate it.  I hate the fact that it won't just let me share over Samba/Windows networking.  That's probably unfair of me.  I'm sure it is, but I already have enough wing-ding widgets trying to run in the taskbar, and I didn't want to have to leave a computer on just to transfer files.  Nor did I want to post thousands of pictures of my yard to Facebook.  I felt really cheated of my money.


There are four directories available from this little UNIX machine (the GoPro).  DCIM is what you'd expect: a directory for your pics and videos, just like on most cards and cameras.  live is the interesting one (I don't really know what mjpeg and shutter are for, as both were empty).  I have an Axis network camera that uses MJPEG, which I gather is Motion JPEG, perhaps for embedding.


However, in the live directory are two .m3u8 files which can be used to view the "live" video feed from the GoPro.  The link above says this is an Apple file type for streaming to iOS devices.  The files aaba.m3u8 and amba.m3u8 both stream, but amba seems to sustain the video feed.  "Live" is in quotes because there are a few seconds of delay, just as there is with the GoPro app.  But still: ta-da!  Now, in theory, all I have to do is have my Rpi log in to my GoPro and I have a video feed.  I still don't know how to get Python to talk to a URL rather than a USB device, but I think I can overcome this.  Everything in UNIX is treated as a file, so I doubt it will matter.  I'm not sure I can overcome the lag, though.  My robot is pretty fast; perhaps not the fastest, but fast enough to wreck during the lag interval.  Note to self: the next time I get it running, I should give my phone to the test driver (my daughter) and track the top speed with GPS.  I would like to stream the video without requiring two wireless cards.  My networking skills aren't awesome, and UNIX does such a great job with multiple NICs (network interface cards) that I'll probably just use two.
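An .m3u8 playlist is just a text file: comment/tag lines start with `#`, and every other non-empty line is the URI of a media segment. So "talking to a URL from Python" mostly reduces to fetching text and following the segment links. A minimal sketch, assuming the playlist lives wherever amba.m3u8 is served on your camera (the URL in the comment is a placeholder):

```python
# Sketch: read an HLS (.m3u8) playlist and pull out the media segment URIs.
from urllib.request import urlopen


def segment_uris(playlist_text):
    """Return the media URIs from an .m3u8 playlist: every non-comment line."""
    uris = []
    for line in playlist_text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            uris.append(line)
    return uris


def fetch_playlist(url):
    # Example (hypothetical): fetch_playlist("http://CAMERA_IP/live/amba.m3u8")
    return urlopen(url).read().decode("utf-8", "replace")
```

As a shortcut, OpenCV's `cv2.VideoCapture` also accepts a URL string in place of a USB device index (when its build includes ffmpeg support), so the "everything is a file" hunch largely holds.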


One more piece of curiously juicy information: Googling "amba.m3u8" and "GoPro" returns a .pdf of patched GoPro code.  And, just to confirm that all the coolest stuff is open source, the original tarballs are available from the GoPro website.  I don't really have the skills necessary to tackle a project like this; I'm just hoping my bird-dogging helps the next person barking up the same tree.  It's encouraging to know that the API might be accessible enough to make remote APM2.5 or Rpi control possible.  More likely, the people with the skills to decode an API don't need my help.


My intention is to be able to VNC into the Raspberry Pi to program my Arduino.  This way I could easily upload completely new sketches wirelessly.  The addition of computer vision is even more exciting.  There are a couple of snags so far; I haven't been able to connect to the Rpi with x11vnc from my Oracle Ubuntu VM.  This is possibly because I forgot to connect to a broadcast SSID.  The Rpi will only connect to WiFi networks which broadcast their SSID.  This is sort of lame because my whole precariously cobbled-together network, which is built from repurposed DSL routers used as APs, doesn't broadcast.  I realize it doesn't make a difference; it's just a preference.
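The hidden-SSID snag has a standard workaround if wpa_supplicant is managing the Pi's wifi: `scan_ssid=1` in the network block tells it to probe for the network by name instead of waiting for a beacon. A sketch of the relevant `/etc/wpa_supplicant/wpa_supplicant.conf` entry, with placeholder SSID and passphrase:

```
network={
    ssid="MyHiddenNetwork"
    psk="passphrase-here"
    scan_ssid=1
}
```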

There really isn't anything new under the sun in the Google-centric universe.  I found a group of four engineering students who did the exact same project, five years ago, that I am attempting now (http://www.eecs.ucf.edu/seniordesign/fa2008sp2009/g11/pictures.html).  They even used the exact same servo for the steering, and they struggled with the same problems in implementation (the knuckle is at an inefficient angle).  One document they posted described how many hundreds of hours it took them to build, which was sobering.  It also made me stand a little taller.  I'm glad I can buy an Arduino for $30 and not have to build it from scratch; I'm pretty sure that saved me a lot of time.  There are probably a lot of things engineering students should have to do for their senior project that a political science student shouldn't have to do.  I have several old GPSs lying around: in old iPhones, a TomTom, a Garmin.  I don't want to have to solder in a new one and reinvent the wheel.  I hope to keep my design modular enough that I'll be able to plug in an autopilot from DIY Drones at some point (when they go back to, or below, their low of $179!), but that is still a long way off.  In the interim, may Moore's law march on.




Comment by eduardo on March 31, 2013 at 10:37pm

Hello, I have a question about the Eye-Fi.

Can you read the photos while the camera is still taking pictures?

For example:

if I take a shot, can I read the file from the card right after the picture is taken?

And when I read the card over wifi, what happens with the camera?

Comment by Joshua Johnson on March 31, 2013 at 10:53pm

I hear that the Raspberry Pi is very easy to mess around with and learn.  I've been thinking about investing in some hardware but haven't been able to figure out whether it will be useful for my project!  Are you someone I could reach out to with more questions about the Raspberry Pi?

Comment by Liam Honecker on March 31, 2013 at 11:12pm

Hi Eduardo,

     I'm sorry, it's been so long that I don't really remember much except the disappointment.  I would discourage you from using it for anything too creative or too far of a departure from their site.  The file will be read, but maybe not how you want; you don't have much control as to the how.  It can upload to a few services, and to your computer.  I remember it being very attention- and processor-intensive.  The camera has to be on, and most cameras had no idea that the card was anything different.  I'm looking at the manual for mine and it says it was printed in 2008, so it may have been improved since then.  It was not very open source in spirit, and not very hackable.  The cam-do site has a how-to that you may find more promising.  It depends what you're trying to do with it.  -Liam

Comment by Liam Honecker on March 31, 2013 at 11:29pm

Joshua, that would definitely be a case of "the blind leading the blind", but I'm glad to help.  The best advice I have is to get one and check out the tutorials at Adafruit.  They're so inexpensive ($40) that I think it will find a place pretty quickly.  I think I saw your post about the voice-activated helicopter?  The Rpi has an audio jack (out only, I think), but a webcam or an external sound card via USB might work.

Comment by Liam Honecker on March 31, 2013 at 11:48pm

/me thanks the moderator/admin for automagically adding the nifty image of a prefab GoPro AV pin out!

3D Robotics
Comment by Chris Anderson on March 31, 2013 at 11:50pm

I wish it were automagic, but you're welcome ;-) Next time, pls start your posts with an image and we won't have to be so random in filling the gap!

Comment by Jack Crossfire on April 1, 2013 at 12:56pm

About time someone finally found a way to view recordings over wifi.  When mine arrived, there was nothing but the limited GoPro app.  Now, if only the app didn't constantly crash.

Comment by Simon Wunderlin on April 1, 2013 at 2:28pm

I don't own a Raspi; however, if I had wifi problems on Linux I'd check the wpa_supplicant configuration.



Comment by Jack Crossfire on April 1, 2013 at 4:06pm

There is a live stream but still no way to view a recording without taking the camera out of its waterproof case & moving the SD card to a card reader.  The camera needs to be powered on & you have to run the gopro app to get it to initialize the web server.  This despite the blue LED flashing when it's off.

Comment by Justin on April 1, 2013 at 8:56pm

If you're willing to still have a PC available, you can stream your APM over Ethernet with ser2net.  The only issue I have is that it's TCP, but it works extremely well.  I currently don't have a transmitter that is not on the 2.4 GHz band, so I can't use it to its full potential.
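For anyone trying this, the classic ser2net.conf one-liner for such a bridge looks like the following. The TCP port and device node are placeholders, and 57600 baud is an assumption (a common APM telemetry rate); adjust all three for your setup:

```
# <tcp-port>:<state>:<timeout-seconds>:<serial-device>:<options>
9600:raw:600:/dev/ttyUSB0:57600 8DATABITS NONE 1STOPBIT
```

A ground station on the same network can then connect to the PC's address on that TCP port instead of a local serial port.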


© 2018   Created by Chris Anderson.