I posted the results of my research into using a small UAV for land cover classification for habitat monitoring a couple of months ago.
I've created a short video as part of my research presentation that details the workflow used to analyse the image data and thought it may be of interest to the community. I'm more than happy to have a conversation about the project in the comments :)
These papers loosely describe eCognition's nearest-neighbor supervised classification algorithm with fuzzy rules (link and link). It is based on the feature space of the sample objects. eCognition provides a rich toolset for nearest-neighbor classification, including spatial autocorrelation and other contextual descriptors; in other words, it is principally object-based. There are infinitely many ways one could mathematically define a nearest neighbor, but here it refers to distance in multivariate feature space, measured as either Euclidean or Mahalanobis distance, in what is likely a variation on the commonly used k-nearest neighbors (kNN) algorithm.
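To make that concrete, here's a minimal sketch of nearest-neighbor classification in multivariate feature space with either distance metric. This is not eCognition's implementation, just an illustration of the idea; the feature values and class names are made up.

```python
import numpy as np

def nn_classify(samples, labels, query, metric="euclidean"):
    """Assign `query` the label of its nearest sample in feature space.

    samples: (n, d) array of per-object feature vectors (e.g. mean band
    values of segmented image objects); labels: n class labels.
    """
    diffs = samples - query
    if metric == "mahalanobis":
        # Mahalanobis distance rescales by the feature covariance, so
        # correlated or high-variance features don't dominate
        inv_cov = np.linalg.inv(np.cov(samples, rowvar=False))
        d2 = np.einsum("ij,jk,ik->i", diffs, inv_cov, diffs)
    else:
        d2 = np.einsum("ij,ij->i", diffs, diffs)  # squared Euclidean
    return labels[int(np.argmin(d2))]

# Toy example: two classes separable in a 2-D feature space
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = np.array(["water", "vegetation", "vegetation", "vegetation"][:0] if False else ["water", "water", "vegetation", "vegetation"])
print(nn_classify(X, y, np.array([0.85, 0.85])))  # -> vegetation
```

kNN generalizes this by taking a majority vote over the k closest samples instead of just the single nearest one.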
Thanks Mark, the person in that particular video does not seem too fond of the user interface though.
Could you go into some more detail on how you're using eCognition? Maybe some screenshots or a video.
I'm using nearest neighbour classification in eCognition, which is a learning algorithm of sorts.
Nice, very nice
I read your previous post and you said you used ImageJ for the false-color image. I think it's a much better idea to use GIS software like QGIS for this purpose; all the georeferencing data is preserved that way.
Could you talk about the automation part of your work? Are you using a learning algorithm for classification?