Any idea if collision avoidance using IR or sonar sensors will be implemented in the ArduCopter firmware, now that the Pixhawk has come out with much higher processing capability?
The final version will have two forward-facing sensors, left and right sensors, a rear sensor, and one looking up. It's being used for indoor inspections, and there is a roof to avoid.
Replies
Hi Kabir,
I'll definitely be interested to hear how your optical flow solution works out.
The PX4Flow could be very good for this, I think: relatively high resolution, built-in binning, and an on-board micro capable of doing all the image processing.
Of course, the primary problem with optical flow is that in its normal use it isn't much good for determining absolute distance; rather, it excels at determining horizontal movement. Even on the PX4Flow they use a sonar to determine distance, which makes the horizontal-movement calculation easier.
Certainly, when you are approaching something you can tell by the increase in the rate of motion away from center, but it is still hard to know how far away it is, as the measurement varies with surface "texture".
I would very much like to know how you are dealing with this.
Best Regards,
Gary
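The scale ambiguity Gary describes can be shown with a toy sketch (illustrative values only, not actual PX4Flow firmware code): an optical-flow sensor measures an angular rate, and only a range measurement such as the sonar's turns that into metric velocity.

```python
# Sketch: converting raw optical flow to metric velocity.
# Values are made up for illustration; not PX4Flow firmware code.

def flow_to_velocity(flow_rad_s: float, distance_m: float) -> float:
    """Angular flow (rad/s) times distance to the scene (m)
    gives translational velocity (m/s). Without the distance,
    the same flow could mean slow-and-near or fast-and-far."""
    return flow_rad_s * distance_m

# The same measured flow of 0.5 rad/s at two different heights:
v_near = flow_to_velocity(0.5, 1.0)  # hovering 1 m above ground
v_far = flow_to_velocity(0.5, 4.0)   # hovering 4 m above ground
print(v_near, v_far)  # 0.5 m/s vs 2.0 m/s from identical flow
```

This is why the PX4Flow pairs its camera with a sonar: the flow alone fixes only the ratio of speed to distance, not either quantity on its own.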
Damn, my comment is getting deleted for some reason... second time I'm writing this.
I will be doing the avoidance by trying to balance the flow on the left and right sides of the frame. Time to contact will be calculated from the ratio of the apparent scale of the tracked feature points to its time derivative, i.e. from their scale change between frames. But that is just a theory; I haven't tried it in practice yet. Will keep you updated on how it goes :)
You are right, the PX4Flow is a very good piece of hardware for this purpose. If I had the money, I would have gotten a PX4Flow plus a Neato XV11 rangefinder: the PX4Flow pointed down, fusing its positional information into Lynxpilot's main navigation controller, while the XV11 lidar generates depth maps for avoiding obstacles in the 60 cm to 5 m range. Onboard ROS would handle the high-level SLAM and path planning, and the exchange of all data between the lidar, the PX4Flow, and the Lynxpilot positional/navigational controller. That would be my optimal low-cost setup for navigation in GPS-denied environments.
Another approach could be to use stereo cameras to generate point clouds, in combination with the PX4Flow.
Kabir
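The time-to-contact scheme Kabir outlines (feature scale over its rate of change) is the standard tau relation; here is a toy illustration with invented numbers, not a tested implementation.

```python
# Sketch of the time-to-contact idea: tau = s / (ds/dt), where s is
# the apparent scale of tracked feature points. Values are invented.

def time_to_contact(scale: float, scale_rate: float) -> float:
    """Seconds until contact, assuming constant closing speed.
    scale: current apparent size of the tracked features
    scale_rate: d(scale)/dt, positive while approaching."""
    if scale_rate <= 0:
        return float("inf")  # not closing on the obstacle
    return scale / scale_rate

# Tracked features grew from 100 px to 110 px over 0.5 s:
s = 110.0
ds_dt = (110.0 - 100.0) / 0.5  # 20 px/s
print(time_to_contact(s, ds_dt))  # 5.5 s until contact
```

Note that this gives time until contact without ever knowing the metric distance, which is exactly why it pairs well with flow-balancing for avoidance.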
Do you think using two flow sensors is a good idea? For example, use one for horizontal-movement calculations while the second gauges depth, then combine the sonar and flow data with an estimator for theoretically more accurate depth sensing? Kind of like the new HTC One M8 camera, whereby a second camera reads the depth of the environment?
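For reference, the two-camera depth idea mentioned here (as in the HTC One M8) rests on stereo triangulation; a minimal sketch of the pinhole-stereo relation, with made-up focal length and baseline:

```python
# Sketch of depth from a stereo pair: two cameras separated by a
# baseline see the same point at slightly different pixel positions
# (disparity). Parameters below are illustrative only.

def stereo_depth(focal_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    """Classic pinhole stereo relation: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # point at infinity (or a matching failure)
    return focal_px * baseline_m / disparity_px

# f = 400 px, 10 cm baseline, point shifted 20 px between views:
print(stereo_depth(400.0, 0.10, 20.0))  # 2.0 m
```

The catch for flow sensors is that two of them side by side would need synchronized frames and feature matching between the two views, which is more than the PX4Flow's on-board processing is designed for.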
Any reason why you didn't want a downward-facing sensor for altitude hold?
I don't think so. Neither of those sensors is reliable enough; they can barely be used to detect the ground reliably when hovering. Trying to use sonar would probably result in the UAV "shadow boxing" an awful lot, picking up false returns. And IR is not very effective outdoors.
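The false returns described above are often tamed with simple robust filtering before the reading ever reaches the controller; a sketch of a sliding median over recent sonar samples (invented readings, not an ArduCopter routine):

```python
from statistics import median

# Sketch: rejecting spurious sonar returns with a sliding median.
# A single false echo barely moves the median, unlike a raw reading.
# Sample values are invented for illustration.

def median_filter(samples, window=5):
    """Return the median of the last `window` samples."""
    return median(samples[-window:])

readings = [2.01, 1.98, 0.35, 2.02, 1.99]  # 0.35 m is a false echo
print(median_filter(readings))  # 1.99 -- the glitch is suppressed
```

Filtering like this helps with isolated glitches, but it cannot fix the underlying physics: soft or angled surfaces simply return too little energy for sonar to be trusted as a primary collision sensor.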