Comments

  • You are not allowed to fly high enough in the Outback Challenge...

  • I wonder how hard it would be to actually build something similar with phone camera modules: take one picture from altitude and process it to find Joe - my main idea was the Outback Challenge :)

  • I agree that the algorithms have been around for a while. From an engineering standpoint, they still have to process around 1 Tbit/s in real time.

    On the bandwidth, I do not understand your option 1. The system assumes that you may have up to 64 video feeds active at a time, so streaming just those feeds is 4 Gb/s; streaming the total image would be around 1 Tb/s! (more like 0.1 Ebit per day than 1 EB)
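
    A rough back-of-the-envelope check of those figures in Python, using my own assumptions (8-bit grayscale pixels for the 64 feeds, 24-bit colour at 25 fps for the full 1.8-gigapixel mosaic - nothing official):

        # raw bit rates under the assumptions above
        feeds   = 64 * 640 * 480 * 8 * 25      # 64 feeds, 8 bits/pixel, 25 fps
        mosaic  = 1_800_000_000 * 24 * 25      # 1.8 Gpixel, 24 bits/pixel, 25 fps
        per_day = mosaic * 86400               # seconds in a day

        print(feeds / 1e9)      # ~3.9 Gb/s  -> the "4 Gb/s" figure
        print(mosaic / 1e12)    # ~1.1 Tb/s  -> the "1 Tb/s" figure
        print(per_day / 1e18)   # ~0.09 Ebit -> the "0.1 Ebit per day" figure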

  • Developer

    @Frederic, you are correct. But the principle is the same. The difference is that the source of the stitch is multiple video frames from different cameras instead of still images from a single camera being panned. So the usage in the news report is innovative, but not "declassified".

    Regarding the bandwidth, I see a couple of options.

    1. Video from each camera is used to stitch onboard the UAV, and the resulting image is then transmitted to the ground. If you point and click to select an area of the map, the camera covering that portion of the map is streamed live in addition to the image. This way you only need the bandwidth for periodic updates of the large-resolution map, plus streaming video for a few cameras (rough sketch below).

    2. Some kind of heavy compression (H.264) is used to stream all the video cameras live to the ground for stitching there. But I have a hard time believing they have that kind of bandwidth.
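
    To make option 1 a little more concrete, here is a minimal Python/OpenCV sketch of stitching frames from a few cameras into one mosaic. This is only my guess at the general pipeline using off-the-shelf OpenCV, not anything from the actual system; the camera indices and file name are made up.

        import cv2

        # grab one frame from each of a handful of cameras (indices are placeholders)
        frames = []
        for cam_index in range(4):
            cap = cv2.VideoCapture(cam_index)
            ok, frame = cap.read()
            cap.release()
            if ok:
                frames.append(frame)

        # let OpenCV's high-level stitcher register and blend the overlapping views
        stitcher = cv2.Stitcher_create()
        status, mosaic = stitcher.stitch(frames)

        if status == cv2.Stitcher_OK:
            cv2.imwrite("mosaic.jpg", mosaic)  # this is the image you would downlink
        else:
            print("stitching failed, status", status)

    On a fixed multi-camera rig you would calibrate once and reuse the homographies instead of re-estimating them every frame, but the idea is the same.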

  • @John the link you give is for a system where a single sensor scans an area and then stitches the shots to create a larger image, like in GigaPan. Here they assemble hundreds of sensors to create a larger one, so if each sensor is recording at 30 fps you have created a 30 fps gigapixel sensor. Putting that together requires some engineering effort but does not seem out of reach. Not knowing anything about military communications, I am more impressed by the implications of 64 real-time independent video feeds going through the communication channel. Even with only 640x480 resolution and 25 fps that is 4 gigabits/s before compression!

  • One EB per day? I'm skeptical of that. If the images are uncompressed at 8 bits per channel, you get something a little over 5 GB per frame. If shooting at 30 FPS, that adds up to 12-13 PB per day. That's a lot of data, but not anywhere near what they said. Even with 16 bits per channel and 60 FPS it's a small fraction of what they said. Some numbers from Google calculator (love that thing!).

    (Hours * Minutes * Seconds * FPS * Pixels * Depth * Channels)

    (24 * 60 * 60 * 30 * 1800000000 * 8 * 3) bits = 12.4317 PB

    (24 * 60 * 60 * 60 * 1800000000 * 16 * 3) bits = 49.7266 PB

    With slightly lossy compression, or even lossless compression, those become manageable data sets. If you drop the frame rate down to something like 10 FPS it becomes even more manageable. By manageable I mean possible, certainly not easy! Even though there were some cheesy parts of the documentary, I thought it was really interesting to watch overall.
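
    The same arithmetic in Python, for anyone who wants to fiddle with the assumptions (1.8 gigapixels, 3 channels, uncompressed frames, as above):

        # daily raw data volume for an uncompressed 1.8-gigapixel, 3-channel sensor
        PIXELS   = 1_800_000_000
        CHANNELS = 3

        def bytes_per_day(fps, bits_per_channel):
            bits = 24 * 60 * 60 * fps * PIXELS * bits_per_channel * CHANNELS
            return bits / 8

        print(bytes_per_day(30, 8)  / 2**50)  # ~12.4, matches the 12.4317 figure above (so those look like binary petabytes)
        print(bytes_per_day(60, 16) / 2**50)  # ~49.7
        print(bytes_per_day(30, 8)  / 1e18)   # ~0.014 EB/day - nowhere near "a million terabytes a day"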

  • Developer

    Where all this data is stored is a good question. Google, Apple, AT&T, and others would like a fat government contract!

  • Not sure if I got this correct, but the UAV was able to store the videos locally? "A million terabytes a day" (at 2:50)

    In August last year, Facebook worked through 500+ terabytes each day... (http://techcrunch.com/2012/08/22/how-big-is-facebooks-data-2-5-bill...)
    So this drone should be able to store all the data entered into Facebook for approx. 5.5 years (1,000,000 TB / 500 TB ≈ 2,000 days)...

     

    I suggest selling one drone to Facebook, as Facebook would be able to pay incredible sums of cash for it, since it could replace some of their server farms...

     

    And the datalink must be better than the 3DR radios ;-)

  • Developer

    I would like to add to my own comment. First off, don't get me wrong: the science is impressive and valid, and this is probably the first use of gigapixel technology in a UAV. My knee-jerk reaction has more to do with the whole secret "declassified" technology angle of the story. Everything you see in the video is based on publicly available research. The scientist even indirectly says so himself, when he speaks about how they lowered cost by exploiting existing smartphone camera technology.

  • Developer

    I'm gonna come straight out and just say it: what a load of bull. This whole feature tried so hard to be more than it really is. First off, what was the deal with the dark "evil laboratory" setting and the scientist's face lit from below? That's just silly theatrics, and totally wrong for this kind of news feature.

    And now for the "declassified" science. What was shown was just your average research-level multi-sensor gigapixel camera, using map stitching and basic, well-known video-analytics algorithms to detect moving objects. The math has been around since the early '90s, and while the gigapixel stuff isn't a dime a dozen yet, it has been a couple of years since it was considered "bleeding edge" in research.

    There are even commercial systems on the market already: http://www.ipconfigure.com/products/gwas/index.html

    And here is some reading material and a link to OpenCV if you want to experiment with this type of stuff yourself (tiny example below the links).

    http://graphics.uni-konstanz.de/publikationen/2012/gigapixel_videos...

    http://opencv.willowgarage.com/wiki/
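
    If you want to play with the moving-object-detection part specifically, here is a very bare-bones starting point using OpenCV's Python bindings (recent OpenCV 4 API; plain background subtraction on a single clip, with a placeholder file name), nothing resembling the real analytics:

        import cv2

        cap = cv2.VideoCapture("aerial_clip.mp4")            # placeholder file name
        subtractor = cv2.createBackgroundSubtractorMOG2()     # per-pixel background model

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = subtractor.apply(frame)                    # foreground = moving pixels
            # group foreground pixels into blobs and box the larger ones
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            for c in contours:
                if cv2.contourArea(c) > 200:
                    x, y, w, h = cv2.boundingRect(c)
                    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.imshow("moving objects", frame)
            if cv2.waitKey(1) == 27:                          # Esc to quit
                break

        cap.release()
        cv2.destroyAllWindows()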
