Here are a couple of screenshots of a little experiment I did today. This is a slightly reduced mesh from yesterday's point cloud example. In this work I'm using the Blender game engine for path planning of an autonomous robot. My "robot" is that green cube over there, and it's trying to get to the purple sphere. I'm using a navigation mesh that keeps the robot off the parts where I don't want it to go at all, for example the grass, the hedge or that mountain thing. Since my environment is georeferenced, I can simply take the Blender coordinates from the waypoint list that the planner generates and, in theory, feed them to an actual robot navigating the actual real-world environment. If the environment is static and nothing has changed, this works well. It's no guarantee for dealing with dynamic objects yet, but even the Google car makes a full plan to get to a destination first, then tunes this for the actual situation based on sensors that sense the immediate environment.
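To make that hand-off concrete, here's a minimal sketch of what it could look like in a game engine Python controller. It assumes a scene with objects named "Robot", "Goal" and "NavMesh" (placeholder names, not the ones in my scene), uses the BGE navmesh's findPath() call, and pretends the georeference is a simple fixed offset; a real setup would go through a proper map projection.

```python
# A minimal sketch of the waypoint hand-off, assuming a BGE scene with
# objects named "Robot", "Goal" and "NavMesh" (placeholder names), and a
# georeference that is a simple fixed offset from Blender coordinates.
import bge

# Hypothetical origin of the georeferenced scene (easting, northing, altitude)
GEO_ORIGIN = (594321.0, 5780123.0, 42.0)

def plan_route(controller):
    scene = bge.logic.getCurrentScene()
    robot = scene.objects["Robot"]
    goal = scene.objects["Goal"]
    navmesh = scene.objects["NavMesh"]

    # KX_NavMeshObject.findPath() returns the planned path as a list of points
    path = navmesh.findPath(robot.worldPosition, goal.worldPosition)

    # Shift every Blender-space waypoint into world coordinates before
    # sending it to the real robot (a real setup would use a map projection)
    return [(x + GEO_ORIGIN[0], y + GEO_ORIGIN[1], z + GEO_ORIGIN[2])
            for x, y, z in path]
```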

I already used the words "navigation mesh". A relatively recent and better technology for AI path planning of game characters is the navigation mesh, assuming you're working with characters that touch the ground. It's just a set of polygons that determines where a character/NPC is allowed to go and where it's not. For an impressive demo of how this works, check out this link. (Some background here: earlier algorithms used lists of waypoints. If you've ever seen NPCs in a game stuck in a running position against an object, those were typically using such waypoint lists. They can't deviate from the path because they can't make assumptions about the space around it, so they get stuck when there's something in the way. The same limitation makes waypoint-following robots collide with, or fail to avoid, dynamic objects in their path.)
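To illustrate the principle (this is a toy, not how Blender's built-in navigation, which is based on Recast/Detour, actually implements it): path planning over a navmesh boils down to graph search over which polygons share an edge. All names below are made up.

```python
# Toy navmesh search: walkable space is a set of convex polygons, and
# planning is A* over which polygons share an edge. All names are made up.
import heapq

def navmesh_path(centers, neighbors, start, goal):
    """A* over polygon centers. `centers`: polygon id -> (x, y);
    `neighbors`: polygon id -> list of adjacent polygon ids."""
    def dist(a, b):
        (ax, ay), (bx, by) = centers[a], centers[b]
        return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

    frontier = [(0.0, start, [start])]   # (estimated total cost, polygon, path)
    best = {start: 0.0}                  # cheapest known cost to reach a polygon
    while frontier:
        _, poly, path = heapq.heappop(frontier)
        if poly == goal:
            return path                  # sequence of polygons to cross
        for nxt in neighbors[poly]:
            cost = best[poly] + dist(poly, nxt)
            if cost < best.get(nxt, float("inf")):
                best[nxt] = cost
                heapq.heappush(frontier, (cost + dist(nxt, goal), nxt, path + [nxt]))
    return None                          # goal not reachable
```

A real engine then smooths that polygon sequence into the straight runs you actually see the character take (the "string pulling" or funnel step), and because the character may stand anywhere inside a polygon, it can locally dodge obstacles without abandoning the plan.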

Setting up a little AI environment for experimenting is surprisingly easy in Blender. A very nice tutorial video showing how to do that is here.
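For reference, the Python side of such a setup can be as small as this (you can even do it with logic bricks alone). This sketch assumes a cube with a Python controller wired to a steering actuator named "Steering" set to path-following mode, plus "Goal" and "NavMesh" objects in the scene; the names are placeholders.

```python
# The Python side of a minimal setup: a controller wired to a steering
# actuator named "Steering" (in path-following mode), with placeholder
# "Goal" and "NavMesh" objects in the scene.
import bge

def steer(controller):
    scene = bge.logic.getCurrentScene()
    actuator = controller.actuators["Steering"]
    actuator.target = scene.objects["Goal"]      # where the robot should go
    actuator.navmesh = scene.objects["NavMesh"]  # mesh it must stay on
    actuator.velocity = 5.0                      # cruise speed, Blender units/s
    controller.activate(actuator)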

This is all relevant because mainstream drone technology right now relies on operators to do the actual path planning. I think that's going to change in the future, when you hook up databases with more specific information, have to deal with a large number of constraints, execute dynamic scenarios, or are unsure of the vehicle's capabilities. The idea is to have the computer propose a route (or a list of routes), just like my Waze does, instead of me plotting out my road waypoint by waypoint.

What Blender thus allows you to do is set up a simulation environment and explore the algorithms without wasting precious time on the visualizations. Since it's all from within Blender, which offers great support for primitives, meshes, lines and points, it's pretty simple to add visual cues to the simulation. The biggest benefit is how easy it is to model a testing environment. Because it's so easy to work with Python in the game engine, it should also be easy to 'drive' the Blender simulation from an external simulator, for example a flight simulator that has a better flight model for UAVs.
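As a sketch of that last idea: a Python controller that polls a UDP socket for state packets from the external simulator and applies them to the vehicle object each logic tick. The port and the packet format (six comma-separated floats: position plus Euler angles) are assumptions of mine, not any particular simulator's protocol.

```python
# A controller that polls a UDP socket for state packets from an external
# simulator and applies them to the vehicle each logic tick. The port and
# packet format (six comma-separated floats: x, y, z, roll, pitch, yaw in
# radians) are assumptions, not any particular simulator's protocol.
import socket
import bge
from mathutils import Euler

PORT = 49000  # hypothetical port the external simulator sends to

# Create the non-blocking socket once, on the first tick this script runs
if not hasattr(bge.logic, "uav_socket"):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", PORT))
    sock.setblocking(False)
    bge.logic.uav_socket = sock

def update(controller):
    vehicle = controller.owner
    try:
        data, _ = bge.logic.uav_socket.recvfrom(1024)
    except BlockingIOError:
        return  # no new state this tick; keep the last pose
    x, y, z, roll, pitch, yaw = (float(v) for v in data.decode().split(","))
    vehicle.worldPosition = (x, y, z)
    vehicle.worldOrientation = Euler((roll, pitch, yaw), 'XYZ').to_matrix()
```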
