UAV Generated Point Cloud Classification

Ok, so this may be a more advanced topic than normally found on this forum, but I am struggling to classify the point clouds generated with a UAV. After Pix4D, Agisoft, or the others, you are left with an unclassified point cloud. I want to classify the building, vegetation, and ground points so I can then generate a DTM. How is everyone doing this? What software have you used, and with what settings?

The main problem I have found is that most point-cloud manipulation software is designed for LiDAR, and LiDAR has one thing we do not have: multiple returns. For those tools, classifying the ground is a matter of taking the last return and then just cleaning up the buildings. Most of them don't work so well with the point clouds from Pix4D, Agisoft, etc. I have tried LAStools, ESRI, and recently DTMaster. Any other suggestions, or suggestions on the settings to use?
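To illustrate the multiple-returns point: with LiDAR, the first cut at ground classification really is just a mask on the return attributes. Here is a minimal numpy sketch of that logic, using toy arrays in place of the per-point fields a real LAS reader (e.g. laspy's `return_number` and `number_of_returns` dimensions) would give you:

```python
import numpy as np

# Toy stand-ins for per-point LAS attributes; with a real file these
# would come from a LAS reader such as laspy.
return_number     = np.array([1, 1, 2, 1, 2, 3])
number_of_returns = np.array([1, 2, 2, 3, 3, 3])
z                 = np.array([10.0, 12.0, 3.1, 15.0, 8.0, 3.0])

# Last-return points: the final echo of each pulse, the usual starting
# set of ground candidates before cleaning up buildings.
last_return = return_number == number_of_returns
ground_candidate_z = z[last_return]
```

A photogrammetric cloud has exactly one "return" per point, so this shortcut just hands you the whole cloud back, which is why the LiDAR-oriented tools fall back on geometric filters that need careful tuning.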



          • I agree, Tazmoe.

            Nonetheless, people here want fast results with limited knowledge of photogrammetry's limits. You essentially create an artificial DTM out of your imagination, with no real data behind it. As for vegetation, you have no idea where the ground is, only a best guess. For man-made structures or objects, you can at least make assumptions. But in the end you always finish with a terrain model that is not quite real. Unless some better technology comes along, we end up with virtual models, not real ones. LiDAR has its limitations too, as you said, but it's the best we have so far.

            Btw, very good topic and discussion here! I will try my own tests with Photoscan and DTM extraction too. But I assume I will end up with some LiDAR tool in the end anyway.

            • Hi Mike

              I do not completely agree with you. I have done several tests and found that the models I got were very close to reality. Several random points were picked from the model and measured on the ground with dGPS, and I was within 5-15 cm. For earthworks this is excellent, in my opinion. If I were designing a high-rise, yeah, this is not accurate enough. I think the key is that you need to have the data referenced accurately, with GCPs spread well enough over the area. The main challenge I have seen is with dense vegetation cover: it is simply not possible to get the data cleaned and keep it accurate. This is where LiDAR has the upper hand.

              Yes it is a good discussion!

            • Not exactly "fast results with limited knowledge" — I think it is more a matter of sharing your experience in DTM extraction using photogrammetry only. Due to the lack of specialized software (most software is designed for LiDAR data), this process is pretty complicated.

    • Have you tried Global Mapper's LiDAR module yet?  I've got it...  Tried it, but with limited results.  Just about every engineer I've talked to says to get it done in India.

  • Here is a test site I did: 56 - 14 GB photos from a Nex-5 alpha, processed in Photoscan to generate the point cloud and ortho. The red arrow is pointing to a bulldozer to give a sense of scale. This is roughly 35 - 45 acres. Ground points were classified in TerraScan and contours were generated in TerraModel. I'm pretty impressed with Photoscan's abilities.


    • Filter the ground points in Photoscan, then export the point cloud (ground only) to LAS format.

      Then, from the Whitebox GAT tools, run the following sequence: Bare-Earth DEM (usually at 0.5 m resolution) > Remove Off-Terrain Objects > Fill Missing Data Holes.

      I'm getting good results using this method.

      Of course, you have to adjust the parameters according to the terrain type and DSM resolution to get the best out of it.
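For anyone curious what the "Remove Off-Terrain Objects > Fill Missing Data Holes" steps are doing conceptually, here is a rough numpy/scipy sketch of the same idea on a DEM grid. It is a toy morphological version, not the Whitebox implementation, and the window size is an assumption you would tune like the Whitebox parameters:

```python
import numpy as np
from scipy import ndimage

def clean_dem(dem, window=5):
    """Toy analogue of 'Remove Off-Terrain Objects' + 'Fill Missing Data
    Holes': a grey-scale opening flattens raised objects narrower than the
    window (cars, shrubs, small roofs), then NaN holes are filled from the
    nearest valid cell."""
    holes = np.isnan(dem)
    # Placeholder values so the morphological filter can run over holes.
    filled = np.where(holes, np.nanmax(dem), dem)
    # Opening = local-min erosion then local-max dilation: bumps narrower
    # than the window get pulled down to the surrounding terrain.
    opened = ndimage.grey_opening(filled, size=(window, window))
    # Fill the original holes from the nearest valid neighbour.
    idx = ndimage.distance_transform_edt(
        holes, return_distances=False, return_indices=True)
    opened[holes] = opened[tuple(i[holes] for i in idx)]
    return opened
```

The trade-off is the same one the Whitebox parameters expose: a window large enough to remove buildings will also shave off narrow terrain features like ridgelines, so terrain type matters.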

      • I'm using Recap 360 as well, with a GoPro for now, to learn how to do this and how to fly the drone without too much damage to a DSLR.  Last week we put the XYZ coordinates in the upload and downloaded the result to an RCS.  I don't have that file on this computer; I gave it to one of the engineers at my husband's soils engineering company.

        Here's at least the rendering of a little less than 200 photos.  I'm hoping to upgrade equipment in the next few weeks, since the flying is going well and I'm getting very close with the photogrammetry.  These are of a landslide repair.


  • I thought I would share my experience with Agisoft. Here is what Agisoft looks like when you are classifying the dense point cloud. The software takes all the points of the dense cloud and divides them into cells of a user-defined cell size. You then specify, measuring from the lowest point (z) in each cell, how far up to come from that point (max distance) and what the angle (max angle) may be from the lowest point to any other point included in the DTM surface. Everything that is brown is now in the newly classified ground surface, and everything that is white is separated out.

    As you can see from the screenshot, I am not filtering out the cars properly. I am getting the roofs separated out, but not the hoods and lower parts. I ended up having to remove them manually.
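The cell / max distance / max angle scheme described above is simple enough to sketch directly. Here is a toy numpy re-implementation of that logic as I read the description (my own reading, not Agisoft's actual code, and the default parameter values are only illustrative); it also shows why roofs get separated: points must rise at a shallow angle from the cell's low point to count as ground.

```python
import numpy as np

def classify_ground(points, cell_size=10.0, max_angle_deg=15.0, max_dist=1.0):
    """Toy version of the grid-based classifier: split the cloud into
    cells, take each cell's lowest point as provisional ground, then keep
    a point as ground if it sits no more than max_dist above that low
    point and the line from the low point to it rises no more steeply
    than max_angle_deg. points is an (N, 3) array of x, y, z."""
    xy = np.floor(points[:, :2] / cell_size).astype(int)
    ground = np.zeros(len(points), dtype=bool)
    max_angle = np.radians(max_angle_deg)
    for cell in np.unique(xy, axis=0):
        in_cell = np.all(xy == cell, axis=1)
        cell_pts = points[in_cell]
        low = cell_pts[np.argmin(cell_pts[:, 2])]   # lowest point in cell
        dz = cell_pts[:, 2] - low[2]
        horiz = np.linalg.norm(cell_pts[:, :2] - low[:2], axis=1)
        angle = np.arctan2(dz, horiz)               # 0 for the low point itself
        ground[in_cell] = (dz <= max_dist) & (angle <= max_angle)
    return ground
```

A point directly above the low point (a wall or a car body) has a near-vertical angle and is rejected, but a hood only ~0.5 m up and a couple of metres away can slip under both thresholds, which matches the behaviour described above.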

    • The top is a point cloud generated in Autodesk Recap 360; the lower image is what Civil 3D 2015 can do with a tool it has to extract a surface from a point cloud.  There are three different modes you can use; two of them will attempt to remove the peaks, i.e. trees, cars, buildings, etc., while the third does nothing and just gives you the raw surface.  As you can see from the surface, most of the peaks are pretty much gone.  It doesn't do much for grass, but I doubt any software can.

      Bill Neuhauser P.E.

      • Sorry, here is the original point cloud generated in Recap 360.


