Phill Scott's Posts (2)


MAAXX Europe UAV 'Racing'


I wanted to give this website some visibility so that:

  • It might create some interest within the community
  • More people are likely to enter
  • We can help define some rules a little more clearly

I'm an alumnus of this university (graduated 12 years ago), and the academics and students are doing as much as they possibly can with autonomous systems. This competition is the next step.

  • It isn't FPV racing - hence the inverted commas in the title; the vehicles have to be in autonomous mode.
  • It's about creating a system that can follow lines placed on the ground (a 26m x 2m racetrack with some additional features).
  • Your UAV has to weigh less than 1kg, fit within a sphere 2m in diameter, and fly no higher than 1.5m.
  • You can use off-the-shelf kit or a complete DIY build.
  • You can use any config or type of vehicle you want.
  • Control once on the course is fully automatic only - no manual intervention.
  • The rules say you can have a ground station - for reporting back, and to make the calculations - but why not do all the calcs onboard, broadcast video, and listen out only for normal ground station commands? That doesn't seem to be ruled out.
  • Program it in whatever language you want, and use whatever kit you want to follow the tape.
  • You can do this as an individual (£6 total entry fee) or a team (£60 total entry fee).

I guess a lot of people (myself included) did projects like this at university, but with line-following rovers and IR sensors or similar. Doing the same thing with UAVs is nowhere near as trivial, judging by the prototyping I have seen!
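For anyone wondering what the core vision task looks like, here's a minimal sketch (Python with OpenCV) of one way to pick a coloured tape line out of a downward-facing camera frame and turn it into a lateral offset and heading error. The HSV colour range, camera index and pixel threshold are my own assumptions for illustration, not anything from the MAAXX rules - tune them for your own tape and camera.

```python
# Minimal sketch (not a competition entry): find a coloured tape line in a
# downward-facing camera frame and derive a lateral offset / heading error.
# The HSV colour range, camera index and thresholds are assumptions.
import cv2
import numpy as np

LOWER_HSV = np.array([0, 120, 80])     # assumed red-ish tape, low end
UPPER_HSV = np.array([10, 255, 255])   # assumed red-ish tape, high end

def line_offset_and_angle(frame):
    """Return (lateral offset in pixels from image centre, line angle in degrees),
    or None if no line is visible in this frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    points = cv2.findNonZero(mask)
    if points is None or len(points) < 50:   # not enough tape pixels
        return None

    # Fit a straight line through the tape pixels.
    vx, vy, x0, y0 = cv2.fitLine(points, cv2.DIST_L2, 0, 0.01, 0.01).flatten()
    angle = np.degrees(np.arctan2(vy, vx))

    # Lateral offset of the tape from the image centre line.
    offset = x0 - frame.shape[1] / 2.0
    return offset, angle

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)              # assumed downward-facing camera
    ok, frame = cap.read()
    if ok:
        result = line_offset_and_angle(frame)
        print("No line found" if result is None else
              "offset=%.1f px, angle=%.1f deg" % result)
    cap.release()
```

That offset/angle pair is the sort of thing you would feed into whatever controller sends velocity or yaw setpoints to your autopilot - that part is entirely up to you under the rules.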

Here are a couple of random YouTube videos of attempts at this kind of thing:

https://www.youtube.com/watch?v=DGX93PVhuoM

https://www.youtube.com/watch?v=ZZ3sAy_2eNs

I'm hoping that this post will stir some discussion, and maybe point the way to any rule changes or specific items of interest required (e.g. I'd like to have a figure of eight rather than a straightforward racetrack...).

I know there are lots of people who couldn't / wouldn't come to Bristol for this - so - I wonder if we could arrange to do this virtually for those people.  Webcast, use a standard drone / GCS set-up and so on...?

Thanks, Phill


First mission for UCLan's aeroSee project


(image taken from the aeroSee website)

The aeroSee project had its inaugural trial mission on 25th July 2013. I signed up and spent an hour reviewing pictures as they came in.

I had read the BBC article about the project, posted on the 16th of July, and thought it would be a good activity to get involved in - especially as I had time off from work on the day of the first mission.

The involvement of 'the crowd' is very interesting: the UAS can generally be activated and en route to the search area faster than a mountain rescue team, so it can start the search and provide coverage of large areas quickly. Any agents logged on to the search system can be provided with a brief (e.g. we are looking for two men and two women, 2 in red, 1 in blue and 1 in black, all with black rucksacks) and can then start reviewing still images sent from the UAS to the ground station, and then on to the search servers.

The search agents can add tags to the pictures in areas where they think there is something of interest, along with a brief note to say what they've tagged and why (the location accuracy of the tags needs some refinement, but that shouldn't really be a problem). The responses are then analysed and down-selected to be sent to the Mountain Rescue HQ for them to decide on a response plan. I don't know what the analysis is, but I should think it is possible to analyse the locations of tags on images, look at the frequency and concentration of tags, and decide which images have raised the most interest in the searcher group. Hopefully those down-selected images will show the target of interest!
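I don't know how aeroSee actually does that analysis, but the sort of thing I'm imagining is easy to sketch: group the tags by image, count how often different searchers' tags land close together, and rank the images by that. The record format, agreement radius and numbers below are entirely my own assumptions for illustration, not anything from the project:

```python
# Back-of-envelope sketch of the kind of tag analysis I'm imagining (not the
# project's actual method): rank images by searcher interest, giving extra
# weight to tags that different searchers roughly agree on.
from collections import defaultdict
from math import hypot

# Assumed input format: one record per tag a searcher placed on an image.
tags = [
    {"image_id": "img_0042", "x": 310, "y": 205, "searcher": "a"},
    {"image_id": "img_0042", "x": 318, "y": 199, "searcher": "b"},
    {"image_id": "img_0042", "x": 640, "y": 480, "searcher": "c"},
    {"image_id": "img_0107", "x": 120, "y": 350, "searcher": "a"},
]

AGREEMENT_RADIUS = 25  # pixels - assumed; how close two tags must be to "agree"

def score_images(tags):
    """Return {image_id: (agreed tag pairs, total tags)} for each image."""
    by_image = defaultdict(list)
    for t in tags:
        by_image[t["image_id"]].append(t)

    scores = {}
    for image_id, image_tags in by_image.items():
        agreed = 0
        for i, a in enumerate(image_tags):
            for b in image_tags[i + 1:]:
                if (a["searcher"] != b["searcher"] and
                        hypot(a["x"] - b["x"], a["y"] - b["y"]) <= AGREEMENT_RADIUS):
                    agreed += 1
        scores[image_id] = (agreed, len(image_tags))
    return scores

# Rank images: agreed tag pairs first, then raw tag count.
ranked = sorted(score_images(tags).items(), key=lambda kv: kv[1], reverse=True)
print(ranked)   # e.g. [('img_0042', (1, 3)), ('img_0107', (0, 1))]
```

Even something this simple would separate images that several searchers independently tagged in the same spot from images carrying a single speculative tag.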
I did this on my home desktop over an ADSL / Wi-Fi link and the service was very quick - I could page through images as fast as I needed. Quality of the images was variable, and there were IR shots thrown into the mix too; however, you could make out people, and spotting them wasn't that hard - the day the images were taken was nice and sunny, so there were good long shadows and good shape and feature definition. Quite surprisingly, you could also recognise the shape of a face from relatively few pixels. Unfortunately I didn't save any example pictures this time around - if there's another trial I'll try to remember.

If there is another trial flight I think I'll use my mobile phone and its 3G link to see if there's a noticeable difference in performance. I'm pretty sure the site isn't optimised for it, but if it works it would mean you could help out in your lunch break, sat at your desk or in your car, or as a passenger on a long journey with nothing better to do.
I think that consistently getting a good level of searcher coverage for a crowd-sourced initiative could be an issue, considering there could easily be 10,000 images to search through without much effort on the part of the recording device - 1 image every 0.5 seconds is 7,200 images an hour. It seems that this could be key to realising the savings and efficiencies possible with such a system.
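To put some rough numbers on that, here's the back-of-envelope sum I'm doing; the capture rate is from above, but the review time per image and the number of searchers online are purely my own guesses:

```python
# Rough back-of-envelope numbers for the searcher-load question above.
# The capture rate is from the post; the per-image review time and the
# number of searchers online are my own assumptions.
CAPTURE_INTERVAL_S = 0.5          # one image every 0.5 s
FLIGHT_TIME_H = 1.0               # assumed one-hour search flight
SECONDS_PER_IMAGE_REVIEW = 5.0    # assumed time a volunteer spends per image
SEARCHERS_ONLINE = 50             # assumed number of volunteers logged on

images_per_hour = 3600 / CAPTURE_INTERVAL_S                    # 7200
total_images = images_per_hour * FLIGHT_TIME_H
review_capacity = SEARCHERS_ONLINE * 3600 / SECONDS_PER_IMAGE_REVIEW

print(f"{total_images:.0f} images captured per hour of flight")
print(f"{review_capacity:.0f} image reviews per hour with {SEARCHERS_ONLINE} searchers")
print(f"each image seen ~{review_capacity / total_images:.1f} times")
```

On those guesses each image gets looked at about five times, which sounds healthy - but halve the number of searchers or double the capture rate and that redundancy disappears quickly.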

To sum up - this is really interesting and I'm looking forward to seeing where it goes. I'm already wondering how it could help local lowland rescue teams when called upon by the police to look for missing persons late at night.
