First DIY Zoom Gimbal


Interested in adding optical zoom to your DIY UAV system? It is now possible to build a field-tested zoom gimbal yourself. The Vertigo DIY kit gives you an opportunity to enter specialized drone-service fields such as emergency response, surveillance, monitoring, agriculture, or computer vision development. 

We prepared 3 buying options: Full kit + .STL files, Electronics kit, and .STL files only.


This is the best-priced solution on the market right now. It brings most of the functionality offered by professional UAV systems to any drone capable of lifting 398 g | 0.87 lbs of payload, and it will work fine on most DIY multirotors and some larger airplane frames.

If you have access to a 3D printer, you will be able to produce several pieces for yourself and start larger-scale missions quite quickly. 


We are excited to see what upgrades are introduced in your versions of the gimbal, so please share your builds with us. We hope it will help develop better solutions in the future for all of us in the DIY community.

We recorded a full assembly tutorial, so the build should go very smoothly. If you need more assistance, please feel free to contact us directly as well.

The first 4 sets sold will also include pre-printed parts for your convenience.  

We hope you have a great time building your own zoom gimbal. 

More information:




  • Hey @OlliW,

    we have code that will work on most companion computers, like the RPi, Odroid, Orange Pi, Banana Pi, or Nvidia TX2, with Ubuntu, Python, and OpenCV for Python installed, as long as they have at least one USB 3.0 port. 

    As for the EIS, I will take a closer look at the approach you proposed. I will have to take the zoom factor into consideration, but the information about the zoom position is possible to obtain. In fact, we already use it, as we slow down the gimbal movement when the camera is zoomed in.

    Hi @Josip Nunth,

    We tested OpenCV on Odroid and RPi. On Ubuntu it works without any drivers, because it uses UVC (USB Video Class). We also provide drivers for Windows, as well as the viewing software.

    If I were to use a Jetson TX2, I would leave it for stabilisation and other practical applications, and use a Hacklink or similar link, which converts the video to h264 for you.

  • Hello Greg, have you tested OpenCV on any embedded system? Do you think a Jetson TX2 could do stabilization, tracking, and h264 encoding at the same time? 

  • Hey Greg

    many thx for the explanations

    I'm a bit confused now as regards what is what, too many keywords (2-axis, Sightline, Snapdragon, ...) from which I can't piece together what you are doing (as opposed to what other options there are) ... your answer makes me think you are using your own software solution on whatever companion computer

    anyways, this was the basics of my question: many cameras use EIS based on gyro signals, and, as far as I can tell, with much success. I must say the principle makes total sense to me: one doesn't have to first analyze the video with whatever smart or non-smart, resource-hungry or non-hungry code to retrospectively infer what vibrations/shakes might be there, but can act directly on the gyro data in a numerically much cheaper and yet effective way. The point is, the STorM32 NT gives you that data for free, so it should be "simple" to do. I've been carrying this idea with me for more than 2 years but never had the time myself; I believe it could be highly effective, and for someone who is already into OpenCV, fairly simple to implement. There are some interesting papers on the principle out there on the web.

    Cheers, Olli
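
The core of the gyro-based approach Olli describes is a one-line projection: a rotation of Δθ about an axis perpendicular to the optical axis moves the image by roughly f·tan(Δθ) pixels, where f is the focal length in pixels (which grows with the zoom factor, tying this back to the zoom-position discussion above). A sketch with made-up numbers:

```python
import math

def gyro_to_pixel_shift(omega_rad_s, dt_s, focal_px):
    """Pixel shift caused by one gyro sample: a rotation of omega*dt rad
    about an axis perpendicular to the optical axis moves the image by
    roughly focal_px * tan(omega * dt) pixels (pinhole model)."""
    return focal_px * math.tan(omega_rad_s * dt_s)

# Example: 0.1 rad/s of shake over one 30 fps frame, focal length 1000 px.
# focal_px scales with the zoom factor, hence zoomed-in shots shake more.
shift = gyro_to_pixel_shift(0.1, 1.0 / 30.0, 1000.0)  # ~3.3 px
```

This is why the method is numerically cheap: one multiplication per axis per frame, versus a full feature search on the video.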

  • Hi OlliW,

    we are using OpenCV to capture good features to track in the feed and then shift the video matrix so that the picture is cleaned of the rapid shakes visible at full zoom. This takes some computing power but is effective. Some video processors have that built into their API, Snapdragon for example. The addition of IMU data from the controller could be helpful in the process, but I am not sure this would work as the single source of data, as I have never tried it before. It seems logical, but in reality some additional factors may enter the equation. But it could definitely improve the current method at a small computing cost.

    The video that Jose attached is from a 2-axis gimbal whose roll axis is managed in software. So the software part stabilises roll and additionally cleans the frame of turbulent movements while zoomed in. I understand the need for feedback from the software to the gimbal controller, as it could optimise the current attitude so that the software could display a larger portion of the scene.

    Here is a sample video:


  • Hey Jose

    oh, sorry, I was not looking properly at who was actually answering and just assumed it was who I asked, my apologies

    the sightline link is quite instructive, thx

    I'm not sure I have your rcgroups user name, but I do have Greg's I think, and will send him a PM about a point I'd like to make

    if you need different/better/other commands for controlling the gimbal or serving your needs, please do not hesitate to tell me, I'm genuinely interested in improving the controllability

    all the best


  • I am not Greg!

    Yes, we talk about the same, stabilize the image by software.

    I mean, when you are at full zoom you could need the stabilization algorithm to control the gimbal, but it is the same algorithm, yes, to compensate yaw drifts, like tracking, to keep pointing at the same scene.

    something like this:

    I was talking with the people from Sightline at Xponential; they do image processing for gimbals, stabilization, tracking... they need to control the gimbal for the tracking function.

  • Hey Greg

    thx for the quick response

    I'm not sure I understand what you're saying though, maybe because we mean different things by "stabilization". I was thinking in terms of video stabilization, i.e., where e.g. vibrations, jello and similar artefacts are removed by software. Reading your answer, it sounds a bit like you mean removing horizon drifts, yaw drifts, and that kind of artefact. What do you mean?

    yes, the latest v3.3 STorM32 controller supports UAVCAN, and it works great, and the v3.3 board might even become available soonish ... there is however no official flight controller software which supports it, and, frankly but somewhat sadly, I doubt this will change anytime soon ... you might add it to your companion though (or use a fork)

    BTW: if it is indeed horizon and yaw drift compensation, you might want to check out the STorM32-Link feature, which is available since 1/2 year or so:

    BTW2: just in case what your stabilizer does would be better placed inside the STorM32 controller, I'll be happy to consider this

    cheers, Olli

  • OlliW, maybe even in the opposite direction: the stabilizer acts on the gimbal to maintain the frame and compensate for the drift of the sensors. 

    You are doing a great job with the STorM32. I would like to take on the development of a camera for our drone, and we are sure that we will go with the STorM32. I have seen that you are supporting UAVCAN in your new controller!!!

  • hey Greg

    > We have a digital stabilisation code that works,

    now this is interesting. Maybe I can ask: does your stabilization work only on the video feed, or does it also use data from some sensors (a gyro, e.g.)?

    cheers, Olli

  • Hi Jose,

    we provide a USB 3.0 variant of the 30XHD V3 gimbal. It allows a very straightforward connection to an RPi, Odroid or TX2. We have digital stabilisation code that works, but it is not yet implemented in the gimbals.
