404Warehouse's Posts (5)


CanSats are can-sized satellites used to teach students how to design and implement systems. They are launched on small rockets that release the CanSat at an altitude of roughly 100 m to 3 km; from deployment onward, the satellite is operated using the same procedures as a real satellite. However, in densely populated countries or in city regions, rocket launches are restricted by the area available: small rocket trajectories are usually uncontrollable, requiring a large safety radius for launch (~5 km for M-class rockets). A way to drop a CanSat while keeping a small safety radius is needed, whether to deploy CanSats directly or to test the deployment before a real rocket launch. This project proposes deploying a CanSat at high altitude from a GPS-guided quadcopter.
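As a minimal sketch of the guidance logic involved, the check below decides whether the quadcopter has reached the commanded drop point before releasing the CanSat. This is an illustration, not the project's actual code; the function name, tolerances, and the flat-earth distance approximation are assumptions made here for clarity.

```python
import math

def at_drop_point(lat, lon, alt, drop_lat, drop_lon, drop_alt,
                  horiz_tol_m=10.0, alt_tol_m=5.0):
    """Return True once the vehicle is within tolerance of the drop point.

    Uses a flat-earth approximation for the horizontal distance, which is
    adequate for the short ranges involved here (well under 5 km).
    """
    r = 6371000.0  # mean Earth radius, meters
    dlat = math.radians(drop_lat - lat)
    dlon = math.radians(drop_lon - lon)
    dx = r * dlon * math.cos(math.radians(lat))  # east-west offset, meters
    dy = r * dlat                                # north-south offset, meters
    horiz = math.hypot(dx, dy)
    return horiz <= horiz_tol_m and abs(alt - drop_alt) <= alt_tol_m
```

In a real system this check would gate a servo or latch command that releases the CanSat once the GPS solution settles inside the tolerance sphere.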

For further information, visit:


Read more…

I have been developing a flight mode for the quadcopter that stabilizes at a constant yaw rate. The goal is a flight mode that can simultaneously scan and localize indoor environments or outdoor landmarks while the lidar/camera sensors are mounted rigidly on the quadrotor structure to save weight.

The video below is a simple proof of concept of the new flight mode. The quad spins at a constant yaw rate while the pilot controls it by activating 'Absolute Mode'.

However, as I spin the quad at faster yaw rates, the Absolute Mode reference seems to drift and change direction, making it hard to control. The video shows a 180 deg/s yaw rate; I would like to increase this to around 500 deg/s. I suspect the control loop may be running at too low a bandwidth, but I don't see how to improve the performance.
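For readers unfamiliar with the idea, here is a minimal sketch of how an 'Absolute Mode' of this kind could work: the pilot's sticks are interpreted in the earth frame and rotated into the spinning body frame using the current yaw estimate. The function name, sign conventions, and the optional latency term are assumptions made here, not ArduPilot's implementation; in particular, extrapolating the yaw by yaw_rate × latency is only one hedged hypothesis for why the reference might drift at high spin rates.

```python
import math

def absolute_mode_sticks(roll_stick, pitch_stick, yaw, yaw_rate=0.0, latency_s=0.0):
    """Rotate earth-frame pilot stick inputs into the spinning body frame.

    yaw is the current yaw estimate in radians. If there is a delay between
    the yaw measurement and the applied command, the effective rotation lags
    by yaw_rate * latency_s, which grows with spin rate; yaw_pred compensates
    for that (a possible explanation for reference drift at high yaw rates).
    """
    yaw_pred = yaw + yaw_rate * latency_s
    c, s = math.cos(yaw_pred), math.sin(yaw_pred)
    roll_cmd = c * roll_stick + s * pitch_stick    # illustrative sign convention
    pitch_cmd = -s * roll_stick + c * pitch_stick
    return roll_cmd, pitch_cmd
```

At 500 deg/s, even a 10 ms delay corresponds to a 5-degree rotation error in the commanded direction, so small latencies matter much more than at low spin rates.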

Moreover, moving in one direction works fine, but when I change direction while in translational motion, the quadrotor does a kind of swinging movement, as shown in the video. What should be done to increase the controllability of this concept?

Any ideas or comments would be a big help in resolving these issues.

Read more…

We are trying to build an FPV rover by combining a Logitech racing wheel with an RC car as an FPV platform.

However, we are having some problems with the latency and tracking of the steering-wheel input.

- I checked the RC inputs using MAVLink and confirmed that the RC override was working fine (the racing wheel appears to be in sync with the RC inputs).

- The steering does respond, but the response is rather slow and the servo does not travel to the endpoints.

- I checked the same steering with my Turnigy 9X transmitter and there was no problem; the steering responds well.

To conclude: the RC inputs change properly when commanded by the racing wheel, but when steering via MAVLink, latency and tracking problems appear. There is no problem when controlling with the RC transmitter.
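As an illustration of the kind of mapping involved (not the project's actual code), the function below converts a wheel axis value into an RC PWM value for an override channel. The function name and ranges are assumptions; one thing worth checking is joystick calibration, since an axis that never reaches ±1 would produce PWM values that never reach the endpoints.

```python
def wheel_to_pwm(axis, pwm_min=1000, pwm_max=2000):
    """Map a racing-wheel axis value in [-1, 1] to an RC PWM value in microseconds.

    If the reported axis never spans the full [-1, 1] range (a common
    joystick-calibration issue), the override PWM never reaches pwm_min or
    pwm_max, which would match steering that 'does not move to the ends'.
    """
    axis = max(-1.0, min(1.0, axis))  # clamp to the valid axis range
    return int(round(pwm_min + (axis + 1.0) * 0.5 * (pwm_max - pwm_min)))
```

The resulting PWM would then be sent with pymavlink's `rc_channels_override_send` at a steady rate (e.g. 10 to 50 Hz); a low send rate or serial-link latency is a usual suspect for sluggish MAVLink steering compared with a direct RC transmitter.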

Does anybody have an idea of how to fix or work around this problem?


- ArduRover v2.47

- HKPilot Mega 2.7

Ground Station:

- Mission Planner

- Logitech WingMan Formula Force GP

Read more…

FPV Series: FPV Driving

Project Overview

This project was done to understand the basic idea of first-person view (FPV). FPV is also referred to as remote-person view (RPV), or simply video piloting, and has recently become popular in the radio-control hobby community.

Jaeyoung Lim, Mechanical and Aerospace Engineering, Seoul National University

Dongho Kang, Mechanical and Aerospace Engineering, Seoul National University

Specific components and discussions can be found on the 404warehouse blog.


- Camera gimbals or head-tracking modules were not integrated into the system. This resulted in the feeling of a very narrow field of view. The feeling improves if you fix your head to a fixture to constrain head movement, so it can be concluded that the constrained field of view came from the camera view not responding to the user's own head movement.

- The sound was confusing: when driving near myself, the sound of the car was heard from where I was standing rather than from the driver's viewpoint. Driving near myself, it was hard to decide which way to turn to avoid myself, even though it was obvious on the screen.

Read more…

Vision Guidance Filming Drone


The Vision Guidance Filming Drone is a research project studying the feasibility of a general-purpose autonomous flying quadcopter.

Sanghyun Hong, Electrical Engineering, Seoul National University

Jaemin Cho, Electrical Engineering, Seoul National University 

Jaeyoung Lim, Mechanical and Aerospace Engineering, Seoul National University

Dongho Kang, Mechanical and Aerospace Engineering, Seoul National University


Sharing videos of outdoor activities has become increasingly popular. The Vision Guidance Filming Drone is a research project studying the feasibility of an autonomous guided quadcopter designed for the mass market.
The drone tracks and follows the protagonist of the video so that the quadcopter can film them. Currently, people use poles or helmet cams to hold the camera and take selfies; the Vision Guidance Filming Drone automates the process by following the protagonist and filming the 'selfie' (which is not exactly a selfie, as it is the drone filming the person rather than the person filming themselves).

This can be used for various applications, such as outdoor filming (skiing, surfing, driving) or indoor filming.

The project was funded by and displayed at the Creative Design Fair held by Seoul National University, the Capstone International Design Competition, and the Electrical Engineering Fair.

The source code of the flight software and a project overview are available on the 404warehouse blog.

The graphic above demonstrates the flight status, which can be read at a glance: the number of mountains shows the altitude, the mountains flow in the direction of flight, and the diagonal lines indicate the four corners of the marker that the drone is tracking.
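To illustrate how the marker corners drive the guidance, the sketch below turns four detected corner pixels into a normalized offset from the image center, which a controller could use to steer the drone toward the protagonist. The function name, image size, and corner format are assumptions made here for illustration, not the project's actual code.

```python
def tracking_error(corners, img_w=640, img_h=480):
    """Compute the normalized offset of a marker from the image center.

    corners: list of four (x, y) pixel coordinates of the detected marker.
    Returns (ex, ey) in [-1, 1]; (0, 0) means the marker is centered,
    so a simple controller can command yaw/pitch to drive the error to zero.
    """
    cx = sum(x for x, _ in corners) / 4.0  # marker center x, pixels
    cy = sum(y for _, y in corners) / 4.0  # marker center y, pixels
    ex = (cx - img_w / 2.0) / (img_w / 2.0)
    ey = (cy - img_h / 2.0) / (img_h / 2.0)
    return ex, ey
```

The apparent size of the marker (e.g. the mean side length of the corner quadrilateral) could similarly be used to hold a constant filming distance.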

Read more…