Hi,

Has anyone tried a setup using a u-blox M8N to feed GPS data into a Velodyne LiDAR? I made all the wiring connections as shown in my diagram. The LiDAR shows PPS Locked, but no GPS coordinates.

Has anyone already made this work?

Conectando GPS M8N ao LiDAR Velodyne.pdf


Replies

  • Hi Pedro, why was it necessary to invert the TX output from the u-blox? I understand the level shifting, but why the inversion? Do you have a theory?

  • Hi, to those of you coming to this topic, this document/thesis might be of interest:

    Construction of 3D Point Clouds Using LiDAR Technology

    (it describes using the Velodyne Puck as well)

    https://support.dce.felk.cvut.cz/mediawiki/images/d/d3/Bp_2016_trafina_tomas.pdf
  • I connected the GPS to the PC via USB and, using u-center, changed the type of message sent by the GPS to the one the LiDAR needs. I can't remember exactly where I made this modification, but it was pretty straightforward; I just had to find where the supported GNSS types were defined.

    The inverter was a suggestion from the Velodyne support team to match the signal format.

    Rohan R Paleja said:

    Hi, 

    Could you explain a little more about how you disabled everything in u-center and why you needed an inverter?

    Thank you

    Pedro Henrique R. P. B. Silva said:

    Hello, Matej

    After many tries, I got it working.

    I had to do exactly as you said: reprogram the u-blox using u-center and disable every GNSS constellation except GPS. I also had to tap pin 3 of the u-blox M8N to get the PPS signal.

    Besides that, I had to connect an inverter chip (7404) between the TX of the GPS and the RX of the LiDAR. I will send the configuration file and the connection diagram; it might help someone in the future.

    Thanks for your help!



    Matej Miljko said:

    Hi Pedro,

    I have no experience with this but after some google-fu I think this might be the answer:

    First I saw this in Velodyne's manual:

    The user must configure their GPS device to issue a once-a-second synchronization pulse (PPS, 0-5V, rising edge), typically output over a dedicated wire, and issue a once-a-second NMEA standard $GPRMC sentence. No other output message from the GPS will be accepted by the VLP-16.

    Did you do that? If not, I think you should be able to do it in the u-blox u-center software.

    You can connect the GPS to program it using one of these methods.

    This thread might explain your problem.

    Or the GPS had no lock.

    Best of luck
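For readers who want to script this instead of clicking through u-center: the same on/off switches are plain UBX CFG-MSG frames on the receiver's UART. A minimal sketch of building those frames in Python (the class/ID constants are from the u-blox M8 protocol description; how you send the bytes to the receiver is up to you):

```python
import struct

def ubx_frame(msg_class, msg_id, payload):
    """Build a UBX frame: sync chars + class/id/length + payload +
    the 8-bit Fletcher checksum computed over class..payload."""
    body = struct.pack("<BBH", msg_class, msg_id, len(payload)) + payload
    ck_a = ck_b = 0
    for b in body:
        ck_a = (ck_a + b) & 0xFF
        ck_b = (ck_b + ck_a) & 0xFF
    return b"\xb5\x62" + body + bytes([ck_a, ck_b])

# CFG-MSG (0x06 0x01), 8-byte payload: NMEA class 0xF0, message id,
# then the output rate on each of the 6 I/O ports.
# Standard NMEA ids: GGA=0x00 GLL=0x01 GSA=0x02 GSV=0x03 RMC=0x04 VTG=0x05
def set_nmea_rate(msg_id, rate):
    return ubx_frame(0x06, 0x01, bytes([0xF0, msg_id]) + bytes([rate] * 6))

# Disable everything except RMC, which the VLP-16 needs once per second.
frames = [set_nmea_rate(i, 0) for i in (0x00, 0x01, 0x02, 0x03, 0x05)]
frames.append(set_nmea_rate(0x04, 1))
```

Each frame would then be written to the GPS serial port (e.g. with pyserial), and the configuration saved with a CFG-CFG frame if it should survive a power cycle.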

  • Hi, 

    Could you explain a little more about how you disabled everything in u-center and why you needed an inverter?

    Thank you

    Pedro Henrique R. P. B. Silva said:

    Hello, Matej

    After many tries, I got it working.

    I had to do exactly as you said: reprogram the u-blox using u-center and disable every GNSS constellation except GPS. I also had to tap pin 3 of the u-blox M8N to get the PPS signal.

    Besides that, I had to connect an inverter chip (7404) between the TX of the GPS and the RX of the LiDAR. I will send the configuration file and the connection diagram; it might help someone in the future.

    Thanks for your help!



    Matej Miljko said:

    Hi Pedro,

    I have no experience with this but after some google-fu I think this might be the answer:

    First I saw this in Velodyne's manual:

    The user must configure their GPS device to issue a once-a-second synchronization pulse (PPS, 0-5V, rising edge), typically output over a dedicated wire, and issue a once-a-second NMEA standard $GPRMC sentence. No other output message from the GPS will be accepted by the VLP-16.

    Did you do that? If not, I think you should be able to do it in the u-blox u-center software.

    You can connect the GPS to program it using one of these methods.

    This thread might explain your problem.

    Or the GPS had no lock.

    Best of luck
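One quick way to rule out the receiver before suspecting the wiring is to watch the serial stream and confirm that exactly one valid $GPRMC sentence arrives per second. The NMEA checksum is just the XOR of the characters between '$' and '*'; a small sketch (the example sentence is a textbook one, not from this setup):

```python
def nmea_checksum_ok(sentence):
    """Validate an NMEA sentence: XOR of the chars between '$' and '*'
    must equal the two-digit hex suffix."""
    if not sentence.startswith("$") or "*" not in sentence:
        return False
    body, _, given = sentence[1:].partition("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return calc == int(given.strip(), 16)

def is_gprmc(sentence):
    return sentence.startswith("$GPRMC,") and nmea_checksum_ok(sentence)

# Widely used textbook example sentence:
line = "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A"
```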

  • Matej,

    I'm sending 3 images. The first one is the best I could achieve so far, but for this cloud I didn't use any GPS or IMU data; it was created using only linear distances between the laser frames.

    For the second, I used only one axis of the GPS; the other two were zeroed. You can see that I'm having huge problems with the spacing between the frames, and using the other two axes would make it even worse.

    For the third, I only tried to test the IMU: there's no GPS data, only the yaw angle. The cloud is pretty bad and the objects near the center are rounded.

    So I think there's no point in sending a picture with the full GPS and IMU axes; they are just a huge mess with barely anything recognisable.

    I was looking at some hardware now. I saw the AP20, APX-15 and NovAtel SPAN; they look like good options. Do you know anything about the APX-15 and SPAN? I just emailed Applanix and NovAtel to ask for the prices of these sensors. I'm looking at a medium price range now; I've had too many problems with the low-cost ones. This thread was created 2 months ago, I've worked on it every day since then, and I've achieved barely anything useful.


    Matej Miljko said:

    Hi Pedro,

    That is really good. Mind sharing a photo of the pointcloud or the file itself?

    For cheap hardware you can start with something like this and an Arduino to log IMU data, plus a GNSS that can capture raw data (the M8T and M8P can, and are fairly affordable, though I don't know how to log it). The raw GNSS data can be post-processed to get centimeter accuracy. Once that is working as well as it can, you can substitute the IMU for another one.

    The STIM300 requires a synchronization signal, while the AP20 sends out its data with a timestamp.

    Pedro Henrique R. P. B. Silva said:

    Matej,

    I just did some research on the hardware you mentioned, because I didn't know it. I'm not sure I understood correctly, so correct me if I'm wrong: the Trimble AP20 and STIM300 have an internal IMU, I can connect a GNSS to them, and they internally sync the data?

    My hardware is so simple that it's much more a proof of concept than anything else. I have an M8N connected to the VLP-16 interface box, working just as a clock, and a Pixhawk with another M8N attached to the top of the VLP-16.

    I turn everything on, connect the LiDAR to VeloView to save the data, and walk around scanning the environment. When I finish, I export the LiDAR data from VeloView (downloading the frames as .CSV files) and remove the SD card from the Pixhawk to get the log.

    With the points and the log, I have a simple C# program that compares the time from the LiDAR with the time in the log and syncs the points with the GPS and IMU data.

    Now, with a single file structured as X,Y,Z,intensity,lat,lon,alt,roll,pitch,yaw, I use MATLAB to implement the formula I just showed you.

    And then, finally, I have an X,Y,Z,intensity file that I open in CloudCompare to see my map. It is always pretty useless due to the poor hardware I have.

    We can surely collaborate; I have pretty little knowledge of good hardware. I see the hardware from Phoenix LiDAR as the top of the market, but it's way out of reach ($$$). So if you have any hint for hardware that could produce a useful map, I'd love to hear it.

    1.png

    2.png

    3.png
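The sync step Pedro describes (matching each LiDAR timestamp to the nearest Pixhawk log entry) can be sketched roughly like this; the tuple layouts are assumptions, not his actual C# code:

```python
import bisect

def sync_points(points, log):
    """For each point tuple starting with a timestamp, append the fields of
    the log entry (also starting with a timestamp) nearest to it in time."""
    log = sorted(log)
    times = [entry[0] for entry in log]
    merged = []
    for pt in points:
        i = bisect.bisect_left(times, pt[0])
        # pick the closer of the two neighbouring log entries
        if i > 0 and (i == len(times) or pt[0] - times[i - 1] < times[i] - pt[0]):
            i -= 1
        merged.append(pt + log[i][1:])
    return merged
```

With a point tuple like (t, x, y, z, intensity) and a log entry like (t, lat, lon, alt, roll, pitch, yaw), the merged rows have exactly the X,Y,Z,intensity,lat,lon,alt,roll,pitch,yaw layout mentioned above.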

  • Hi Pedro,

    That is really good. Mind sharing a photo of the pointcloud or the file itself?

    For cheap hardware you can start with something like this and an Arduino to log IMU data, plus a GNSS that can capture raw data (the M8T and M8P can, and are fairly affordable, though I don't know how to log it). The raw GNSS data can be post-processed to get centimeter accuracy. Once that is working as well as it can, you can substitute the IMU for another one.

    The STIM300 requires a synchronization signal, while the AP20 sends out its data with a timestamp.

    Pedro Henrique R. P. B. Silva said:

    Matej,

    I just did some research on the hardware you mentioned, because I didn't know it. I'm not sure I understood correctly, so correct me if I'm wrong: the Trimble AP20 and STIM300 have an internal IMU, I can connect a GNSS to them, and they internally sync the data?

    My hardware is so simple that it's much more a proof of concept than anything else. I have an M8N connected to the VLP-16 interface box, working just as a clock, and a Pixhawk with another M8N attached to the top of the VLP-16.

    I turn everything on, connect the LiDAR to VeloView to save the data, and walk around scanning the environment. When I finish, I export the LiDAR data from VeloView (downloading the frames as .CSV files) and remove the SD card from the Pixhawk to get the log.

    With the points and the log, I have a simple C# program that compares the time from the LiDAR with the time in the log and syncs the points with the GPS and IMU data.

    Now, with a single file structured as X,Y,Z,intensity,lat,lon,alt,roll,pitch,yaw, I use MATLAB to implement the formula I just showed you.

    And then, finally, I have an X,Y,Z,intensity file that I open in CloudCompare to see my map. It is always pretty useless due to the poor hardware I have.

    We can surely collaborate; I have pretty little knowledge of good hardware. I see the hardware from Phoenix LiDAR as the top of the market, but it's way out of reach ($$$). So if you have any hint for hardware that could produce a useful map, I'd love to hear it.

    SparkFun 9DoF Sensor Stick
    The SparkFun 9DoF Sensor Stick is an easy-to-use 9 Degrees of Freedom IMU. The Sensor Stick deftly utilizes the LSM9DS1 motion-sensing system-in-a-chip.
  • Matej,

    I just did some research on the hardware you mentioned, because I didn't know it. I'm not sure I understood correctly, so correct me if I'm wrong: the Trimble AP20 and STIM300 have an internal IMU, I can connect a GNSS to them, and they internally sync the data?

    My hardware is so simple that it's much more a proof of concept than anything else. I have an M8N connected to the VLP-16 interface box, working just as a clock, and a Pixhawk with another M8N attached to the top of the VLP-16.

    I turn everything on, connect the LiDAR to VeloView to save the data, and walk around scanning the environment. When I finish, I export the LiDAR data from VeloView (downloading the frames as .CSV files) and remove the SD card from the Pixhawk to get the log.

    With the points and the log, I have a simple C# program that compares the time from the LiDAR with the time in the log and syncs the points with the GPS and IMU data.

    Now, with a single file structured as X,Y,Z,intensity,lat,lon,alt,roll,pitch,yaw, I use MATLAB to implement the formula I just showed you.

    And then, finally, I have an X,Y,Z,intensity file that I open in CloudCompare to see my map. It is always pretty useless due to the poor hardware I have.

    We can surely collaborate; I have pretty little knowledge of good hardware. I see the hardware from Phoenix LiDAR as the top of the market, but it's way out of reach ($$$). So if you have any hint for hardware that could produce a useful map, I'd love to hear it.


    Matej Miljko said:

    Hi Pedro,

    What I think will be the biggest problem is the refresh rate of the GNSS: you can change course/speed in fractions of a second, and that can give you a huge error.

    What I'm guessing the formula does is add the relative points from the LiDAR+IMU to the GNSS positions and compensate for the antenna offset; that should be it, then.

    The parameters he is using are all you need, but getting those parameters is hard. You use an M8N, right? With GPS+GLONASS you get a 5 Hz refresh rate, that is one location every 0.2 s, and even then it is ±2.5 m. You might need a better one; the M8P is cheap and RTK-enabled, and it's available in a D-RTK solution from ProfiCNC called HERE+ (it also refreshes at 5 Hz but is ±2.5 cm relative).

    Have you decided what hardware you'll be using? If you are interested, I am making an in-house solution for the company I work for, as other solutions are way too damn expensive; we can collaborate if you'd like. I was thinking of using either a Trimble AP20 or a STIM300 and a PPP/PPK-enabled GNSS. My biggest problem is that I don't have the required hardware; gathering the data is what I think the problem will be.

  • Hi Pedro,

    What I think will be the biggest problem is the refresh rate of the GNSS: you can change course/speed in fractions of a second, and that can give you a huge error.

    What I'm guessing the formula does is add the relative points from the LiDAR+IMU to the GNSS positions and compensate for the antenna offset; that should be it, then.

    The parameters he is using are all you need, but getting those parameters is hard. You use an M8N, right? With GPS+GLONASS you get a 5 Hz refresh rate, that is one location every 0.2 s, and even then it is ±2.5 m. You might need a better one; the M8P is cheap and RTK-enabled, and it's available in a D-RTK solution from ProfiCNC called HERE+ (it also refreshes at 5 Hz but is ±2.5 cm relative).

    Have you decided what hardware you'll be using? If you are interested, I am making an in-house solution for the company I work for, as other solutions are way too damn expensive; we can collaborate if you'd like. I was thinking of using either a Trimble AP20 or a STIM300 and a PPP/PPK-enabled GNSS. My biggest problem is that I don't have the required hardware; gathering the data is what I think the problem will be.
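Since a 5 Hz receiver only gives a fix every 0.2 s, positions in between have to be interpolated to each point's timestamp. A linear interpolation sketch (reasonable at walking speed; the (t, lat, lon, alt) layout is an assumption):

```python
def interp_position(fixes, t):
    """Linearly interpolate (t, lat, lon, alt) GNSS fixes to time t.
    fixes must be sorted by time; t must lie within their span."""
    for (t0, *p0), (t1, *p1) in zip(fixes, fixes[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return tuple(a + w * (b - a) for a, b in zip(p0, p1))
    raise ValueError("t outside fix span")
```

At higher platform speeds (or with abrupt turns, as noted above) linear interpolation between 0.2 s fixes is exactly where the error creeps in, which is the argument for a tightly coupled GNSS/IMU unit.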

  • Hi, Matej

    Do you know the math? I'm using Craig Glennie's papers to develop mine. 

    Do you think this is accurate enough? I only use the general formula, not the Jacobians.

    Matej Miljko said:

    Hi Rohan,

    I think you got it wrong. VeloView does not map; it either displays or captures data. The LiDAR does not combine GPS+IMU+LiDAR data by itself. You will need an RTK GNSS, a good IMU, a computer, and the LiDAR to be able to map, and then write your own software to do it, as there is no software available for that.

    Or you can use SLAM, but you still need an onboard computer to capture the data, and it will not be geotagged.

    The precision of SLAM depends on the area you are scanning and the speed you scan it at, while the precision of IMU+GNSS scanning will depend on their precision plus the LiDAR's.

    If you are interested in making it, it isn't as simple as you may think; sure, it is just math, but you need to capture the data accurately.

    The Velodyne sends its data in UDP packets; you need to parse them to extract the points. You are moving while it is scanning (the scans aren't instant), which causes warping if you treat a full rotation as a single instant (many SLAM systems do it like that, so they can never be as precise for airborne mapping; indoors they'll be fine).

    After capturing all the data, you need to sync it. The Velodyne syncs to the GPS timestamp, but your IMU probably doesn't, unless you are using a combined GPS+IMU board (there are a few), so there is that.

    Rigorous 3D error analysis of kinematic scanning LIDAR systems.pdf
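For anyone else reading along, the "general formula" in Glennie's paper is the standard direct-georeferencing equation: rotate the scanner point into the navigation frame using the platform attitude, add the lever arm, then add the GNSS position. A pure-Python sketch (boresight rotation omitted, i.e. assumed identity; angles in radians; the lever-arm default is a placeholder):

```python
import math

def rot_zyx(roll, pitch, yaw):
    """Body-to-nav rotation matrix for the Z(yaw)*Y(pitch)*X(roll) convention."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def apply(R, v):
    """Matrix-vector product for a 3x3 matrix."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def georeference(p_scan, roll, pitch, yaw, p_gnss, lever_arm=(0.0, 0.0, 0.0)):
    """X_map = X_gnss + R_body->nav * (p_scan + lever_arm)."""
    body = [a + b for a, b in zip(p_scan, lever_arm)]
    nav = apply(rot_zyx(roll, pitch, yaw), body)
    return [g + n for g, n in zip(p_gnss, nav)]
```

The Jacobians in the paper only matter for the rigorous error propagation; the mapping itself is just this rotate-and-translate per point.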

  • Hi Rohan,

    I think you got it wrong. VeloView does not map; it either displays or captures data. The LiDAR does not combine GPS+IMU+LiDAR data by itself. You will need an RTK GNSS, a good IMU, a computer, and the LiDAR to be able to map, and then write your own software to do it, as there is no software available for that.

    Or you can use SLAM, but you still need an onboard computer to capture the data, and it will not be geotagged.

    The precision of SLAM depends on the area you are scanning and the speed you scan it at, while the precision of IMU+GNSS scanning will depend on their precision plus the LiDAR's.

    If you are interested in making it, it isn't as simple as you may think; sure, it is just math, but you need to capture the data accurately.

    The Velodyne sends its data in UDP packets; you need to parse them to extract the points. You are moving while it is scanning (the scans aren't instant), which causes warping if you treat a full rotation as a single instant (many SLAM systems do it like that, so they can never be as precise for airborne mapping; indoors they'll be fine).

    After capturing all the data, you need to sync it. The Velodyne syncs to the GPS timestamp, but your IMU probably doesn't, unless you are using a combined GPS+IMU board (there are a few), so there is that.
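To illustrate the parsing step: per the VLP-16 manual, each data packet is 1206 bytes, made up of 12 hundred-byte blocks (a two-byte 0xFFEE flag, a 2-byte azimuth in hundredths of a degree, then 32 three-byte returns at 2 mm resolution) followed by a 4-byte microsecond timestamp and 2 factory bytes. A rough parsing sketch (untested against real hardware):

```python
import struct

def parse_vlp16_packet(data):
    """Parse one 1206-byte VLP-16 data packet into (timestamp_us, blocks),
    where each block is (azimuth_deg, [(distance_m, reflectivity), ...])."""
    assert len(data) == 1206
    blocks = []
    for b in range(12):
        off = b * 100
        # flag bytes 0xFF,0xEE read little-endian as 0xEEFF
        flag, azi = struct.unpack_from("<HH", data, off)
        assert flag == 0xEEFF
        returns = []
        for ch in range(32):
            dist, refl = struct.unpack_from("<HB", data, off + 4 + 3 * ch)
            returns.append((dist * 0.002, refl))  # distance unit is 2 mm
        blocks.append((azi / 100.0, returns))
    # 4-byte microsecond timestamp sits at offset 1200, before 2 factory bytes
    timestamp_us, = struct.unpack_from("<I", data, 1200)
    return timestamp_us, blocks
```

De-warping then means interpolating the platform pose to each block's (or firing's) timestamp before georeferencing, rather than using one pose for the whole rotation.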
