I recently purchased a Raspberry Pi from Adafruit. Without much difficulty I was able to get a demo Python script for OpenCV running (see here). It detected my face as a face, and that was kind of exciting. It was a little choppy, obviously, but still pretty fantastic. I used an old PS2 EyeToy, which you can pick up at a GameStop for about $2. You get what you pay for. One trouble with the Rpi is that its camera module hasn't been released yet, and it doesn't play well with a lot of USB cameras.

I was also given a GoPro Hero 3 for Christmas. This is the Silver edition, which uses different hardware from the other editions. The GoPro is an incredible little machine. It isn't a webcam in the traditional sense, except that it is. It charges via USB, but it doesn't stream images that way; it serves a webpage over its own private access point (AP). It can stream images to an iPhone app (and a third-party Windows shareware program), which raises the question of how. To see this work you have to connect your iPhone to the Hero3's access point. My Rpi isn't going to be running Windows, so I'm not interested in the shareware unless I can port it to UNIX. Looking at the settings on the iPhone, the Hero3 is the router at 10.5.5.9 and the iPhone is the client at 10.5.5.119. So I googled "10.5.5.9" and "GoPro" and found that someone else had already figured out which port to use: 8080. When I go to http://10.5.5.9:8080 on my iPhone I get a screen that looks like this:
This is really exciting because I've already seen people using Eye-Fi cards to upload their info. I have an Eye-Fi and I hate it. I hate the fact that it won't just let me share over Samba/Windows networking. That's probably unfair of me. I'm sure it is, but I already have enough wing-ding widgets trying to run in the taskbar, and I didn't want to have to leave a computer on just to transfer files. Nor did I want to post thousands of pictures of my yard to Facebook. I felt really cheated of my money.
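As an aside, for anyone who wants to reproduce that OpenCV face-detection demo from the top of this post, here's roughly what such a script boils down to. This is a sketch rather than the exact script I ran: the camera index, the stock Haar cascade file, and the detector parameters are all assumptions you may need to adjust for your setup.

```python
def largest_face(faces):
    """Pick the biggest (x, y, w, h) rectangle by area -- useful when the
    detector returns several candidates and you only care about one face."""
    return max(faces, key=lambda f: f[2] * f[3]) if len(faces) else None

def run_demo(camera_index=0):
    # cv2 is imported here so largest_face() stays usable without OpenCV.
    import cv2
    # The stock frontal-face Haar cascade ships with opencv-python;
    # cv2.data.haarcascades is the directory it installs to.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(camera_index)  # 0 is where my EyeToy showed up
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("faces", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
    cap.release()
    cv2.destroyAllWindows()

# run_demo()  # uncomment on a machine with a camera attached
```

On the Rpi the frame rate is what makes it choppy; shrinking each frame before handing it to detectMultiScale helps a lot.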
There are four directories available from this little UNIX machine (the GoPro). DCIM is what you'd expect: a directory for your pics and videos, just like on most cards and cameras. Live is the interesting one (I don't really know what mjpeg and shutter are for, as both were empty). I have an Axis network camera that uses MJPEG, which is Motion JPEG (essentially a stream of JPEG frames), perhaps for embedding.
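Poking at those directories doesn't require a browser. Here's a stdlib-only Python sketch for pulling the links out of the camera's index pages; the URL and the assumption that the listing is plain HTML anchors are mine, so adjust as needed.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkLister(HTMLParser):
    """Collect href attributes from the anchors in an HTML directory listing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def list_links(html):
    parser = LinkLister()
    parser.feed(html)
    return parser.links

# To poke at the camera itself (assumes you've joined the GoPro's AP):
#   html = urlopen("http://10.5.5.9:8080/").read().decode("utf-8", "replace")
#   print(list_links(html))
```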
However, in the live directory are two .m3u8 files which can be used to view the "live" video feed from the GoPro. The link above says this is an Apple playlist format for streaming to iOS devices (HTTP Live Streaming). The files aaba.m3u8 and amba.m3u8 both stream, but amba seems to sustain the video feed. "Live" is in quotes because there are a few seconds of delay, just as there is with the GoPro app. But still: ta-da! Now, in theory, all I have to do is have my Rpi log in to my GoPro and I have a video feed. I still don't know how to get Python to talk to a URL rather than a USB device, but I think I can overcome this. Everything in UNIX is treated as a file, so I doubt it will matter. I'm not sure I can overcome the lag, though. My robot is pretty fast, perhaps not the fastest, but fast enough to wreck within the lag interval. Note to self: the next time I get it running, I should give my phone to the test driver (my daughter) and track the top speed with GPS. I would also like to stream the video without requiring two wireless cards; since the GoPro insists on being its own access point, the Rpi needs one card to talk to the camera and a second to stay on my home network. My networking skills aren't awesome, but UNIX does such a great job with multiple NICs (network interface cards) that I'll probably just use two.
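The nice thing is that an .m3u8 playlist is just text: lines starting with "#" are tags, and everything else is a media segment URI, usually relative to the playlist's own URL. So getting Python to talk to that URL can be sketched with nothing but the standard library. The playlist address is the one from above; the rest is my assumption about how the GoPro lays its playlist out.

```python
from urllib.parse import urljoin
from urllib.request import urlopen

PLAYLIST = "http://10.5.5.9:8080/live/amba.m3u8"  # the feed that sustains

def segment_urls(playlist_text, base_url):
    """Extract media segment URLs from an .m3u8 playlist.

    '#'-prefixed lines are tags/comments; every other non-blank line is a
    segment URI, resolved against the playlist's URL if it's relative."""
    urls = []
    for line in playlist_text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            urls.append(urljoin(base_url, line))
    return urls

def fetch_segments():
    # Assumes the Rpi has already joined the GoPro's access point.
    text = urlopen(PLAYLIST).read().decode("utf-8", "replace")
    return segment_urls(text, PLAYLIST)
```

And if your OpenCV build includes FFmpeg, cv2.VideoCapture("http://10.5.5.9:8080/live/amba.m3u8") may even open the stream directly, which would make the whole URL-versus-USB question moot: the URL really is just another file name.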
One more piece of curiously juicy information: googling "amba.m3u8" and "GoPro" returns a PDF of patched GoPro code. And, just to confirm that all the coolest stuff is open source, the original tarballs are available from the GoPro website. I don't really have the skills necessary to tackle a project like this; I'm just hoping my bird-dogging helps the next person barking up the same tree. It's encouraging to know that the API might be accessible enough to make remote APM2.5 or Rpi control possible. More likely, the people with the skills to decode an API don't need my help.
My intention is to be able to VNC into the Raspberry Pi to program my Arduino. This way I could easily upload completely new sketches wirelessly. The addition of computer vision (http://www.eecs.ucf.edu/seniordesign/fa2008sp2009/g11/pictures.html) is even more exciting. There are a couple of snags so far: I haven't been able to connect to the Rpi with x11vnc from my Ubuntu VM (in Oracle VirtualBox). This is possibly because the Rpi will only connect to WiFi networks that broadcast their SSID, and I forgot to connect it to one that does. That's sort of lame, because my whole precariously cobbled-together network, built from repurposed DSL routers acting as APs, doesn't broadcast. I realize it doesn't make a difference; it's just a preference.
There really isn't anything new under the sun in the Google-centric universe. I've found a group of four engineering students who did, five years ago, the exact same project I'm attempting now. They even used the exact same servo for the steering, and they struggled with the same implementation problems (the steering knuckle is at an inefficient angle). One document they posted described how many hundreds of hours it took them to build, which was sobering. It made me stand a little taller. I'm glad I can buy an Arduino for $30 and not have to build it from scratch; I'm pretty sure that saved me a lot of time. There are probably a lot of things engineering students should have to do for their senior project that a political science student shouldn't have to do. I have several old GPS units lying around: in old iPhones, a TomTom, a Garmin. I don't want to have to solder in a new one and reinvent the wheel. I hope to keep my design modular enough that I'll be able to plug in an autopilot from DIY Drones at some point (when they go back to, or below, their low of $179!), but that is still a long way off. In the interim, may Moore's law march on.