Hi, guys,
A few of our engineers from the UgCS team recently built a project in their off-time that we thought some of you might be interested in seeing.
Since kids with lots of Lego bricks can create quite a bit of chaos, the guys on our team built an automated Lego sorter to deal with the problem. It uses a vibratory bowl feeder and an artificial neural network (ANN) to feed the bricks up a path, identify them, and sort them. The current prototype only handles brick sizes up to 4x2, but with some modifications it can be made to accept a wider range of Lego bricks.
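The post doesn't include the sorter's code, but as a rough idea of what the identification step could look like, here is a minimal sketch assuming a small PyTorch CNN that classifies a cropped camera image of a single brick and returns the index of the bin it should be routed to. The network layout, image size, and number of bins are illustrative guesses, not details of the actual device.

```python
import torch
import torch.nn as nn


class BrickClassifier(nn.Module):
    """A small CNN that maps a 64x64 RGB crop of a brick to one of N classes."""

    def __init__(self, num_classes: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # After two 2x pools a 64x64 input becomes 16x16 with 32 channels.
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))


def classify_brick(model: nn.Module, crop: torch.Tensor) -> int:
    """Return the predicted bin index for a single 3x64x64 image tensor."""
    model.eval()
    with torch.no_grad():
        logits = model(crop.unsqueeze(0))  # add a batch dimension
    return int(logits.argmax(dim=1).item())


if __name__ == "__main__":
    model = BrickClassifier(num_classes=8)  # untrained, for illustration only
    dummy_crop = torch.rand(3, 64, 64)      # stand-in for a cropped camera frame
    bin_index = classify_brick(model, dummy_crop)
    print(f"Route brick to bin {bin_index}")
```

In a real device the predicted bin index would then drive whatever actuator diverts the brick, and the model would of course need to be trained on labelled images of the bricks first.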
Here's the video of it in action:
We're curious to hear your thoughts on it, and maybe some ideas on where it could be used or how it could be improved.
If you are interested in developing this prototype into a more advanced device, feel free to drop us an e-mail at ugcs@ugcs.com
Get the newest version of UgCS here: https://www.ugcs.com/
Wishing you a happy Holiday Season and drone-ful 2018,
UgCS Team
Comments
$375 would buy a vibratory bowl feeder or a new Lego set. But it does have the words "neural network" in it.
The ideal Lego sorter should be integrated with a vacuum cleaner... We will work in that direction :)
You can probably sort them out mechanically, but there is only one way to be sure you have collected them all from the floor: ...walking barefoot ;-)
Just kidding. My son, with whom I shared the post, is already excited! Great project!
It's the first prototype built to have all components connected and working. As you can see in the video, we have implemented a simple, linear overall control algorithm. We think we can reach a throughput of 2-5 bricks per second.
A neural network is, of course, overkill for sorting identical bricks by color :) - a simple color-distance check, like the sketch below, would be enough for that.
But the target is a truly universal sorter that can be used for any shapes and colors, not only for Lego bricks. That is where the AI will play a significant role...
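For reference, the color-only sorting mentioned above could be as simple as a nearest-reference-color lookup. The sketch below is illustrative only: the reference colors and bin names are made up, and a real setup would calibrate them against the actual camera and lighting.

```python
import numpy as np

# Hypothetical reference colors (RGB) for the bins; real values would be calibrated.
REFERENCE_COLORS = {
    "red": (200, 30, 30),
    "blue": (20, 60, 190),
    "yellow": (230, 200, 30),
    "green": (30, 160, 70),
}


def average_color(crop: np.ndarray) -> np.ndarray:
    """Mean RGB of an HxWx3 image crop containing a single brick."""
    return crop.reshape(-1, 3).mean(axis=0)


def sort_by_color(crop: np.ndarray) -> str:
    """Assign the crop to the bin whose reference color is closest (Euclidean distance)."""
    mean_rgb = average_color(crop)
    distances = {
        name: np.linalg.norm(mean_rgb - np.array(ref))
        for name, ref in REFERENCE_COLORS.items()
    }
    return min(distances, key=distances.get)


if __name__ == "__main__":
    fake_crop = np.zeros((64, 64, 3), dtype=np.uint8)
    fake_crop[:] = (25, 55, 180)  # a uniform bluish patch as a stand-in crop
    print(sort_by_color(fake_crop))  # -> "blue"
```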
Great project, guys! But why is the separation time so slow?
At my university I saw a similar project, but with a much faster separation time.
It was based on a color sensor that measured RGB values and a reflection coefficient, not on a standard camera (a rough sketch of that sensor-based classification idea is included below). The processing was done on an ARM Cortex-M4, and the major components such as the motor, gears, and solenoid were modelled and driven with MPC (model predictive control) for high accuracy and fast response.
Also, using a neural network might be overkill for this purpose?
But all in all, the mechanical part looks good.
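To illustrate the commenter's sensor-based approach, here is a rough, hardware-agnostic sketch of how normalized RGB readings plus a reflectance value might be mapped to a color class. The read_sensor() stub and all thresholds are hypothetical placeholders; the MPC part of the university project is not reproduced here.

```python
from typing import Tuple


def read_sensor() -> Tuple[float, float, float, float]:
    """Stub for a hardware driver: returns normalized (r, g, b, reflectance) in [0, 1].
    On a real board this would poll the color sensor over I2C/SPI."""
    return 0.12, 0.18, 0.71, 0.55  # fabricated reading for demonstration


def classify_reading(r: float, g: float, b: float, reflectance: float) -> str:
    """Very rough rule-based classification from normalized channel values.
    Thresholds are illustrative and would need calibration against real bricks."""
    if reflectance < 0.15:
        return "black"
    if r > g and r > b:
        return "red"
    if g > r and g > b:
        return "green"
    if b > r and b > g:
        return "blue"
    return "unknown"


if __name__ == "__main__":
    print(classify_reading(*read_sensor()))  # -> "blue" with the stub reading
```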