Achal Negi's Posts (18)


FlytCloud is an enterprise-grade cloud platform for automating and scaling your commercial drone operations, enabling you to remotely deploy a fleet of intelligent drones.

FlytCloud provides real-time access to control, telemetry and video/payload data from your drone fleet, over a secure and reliable interface.

An AI engine and integrations with various third-party services further extend its capabilities.

Visit https://flytbase.com/flytcloud/ to learn more.

Read more…


Drones are already in use in the security and surveillance industry and have significantly changed how operations are carried out. However, most current aerial security and surveillance systems are either tied to particular drone hardware or need significant manual intervention during operation. These solutions lack critical software capabilities such as AI and machine learning for automated alerts, automatic mission scheduling, and compatibility with a wide range of drone hardware. This makes it expensive, and often infeasible, to deploy drone-based security and surveillance solutions at scale.

Drones have already established the value they bring in terms of mobility, an unrestricted bird’s-eye view and accessibility. The focus is now on efficiency and realising a meaningful return on investment for wide commercial adoption. This calls for integrating “intelligence” and “connectivity” with drones to build completely automated, integrated workflows.


FlytSecurity offers a plug-and-play, drone-agnostic SaaS platform to quickly deploy and scale automated, drone-based security operations. This significantly cuts down development cost and time to market, translating into an attractive ROI for drone security service providers. With features such as 4G/LTE connectivity over unlimited range, live video, control and telemetry, fleet management (for simultaneous coverage of large, distributed facilities), AI/ML for automated alerts, and automated mission schedules, FlytSecurity enables fully automated 24×7 operations at scale. Compatibility with any drone hardware makes FlytSecurity easy to adapt to a variety of customer requirements (large/small drones, long/short endurance, quad-planes/multicopters, thermal/RGB sensors, etc.) and makes it easy to upgrade hardware at any time.

For early access to FlytSecurity, please visit: https://flytsecurity.ai


Read how FlytSecurity is transforming drone security & surveillance operations

Read more…

Precision landing is a critical requirement for a large number of commercial drone applications, be
it autonomous routine patrols for security & surveillance, package delivery at multiple locations,
remote inspections using docking stations or a GPS-denied environment like a warehouse. It is
one of the key components for automating and deploying drone operations at scale.

GPS alone is not accurate enough; IR beacons are affected by surrounding conditions and require power at the landing site; and RTK-GPS is complex to set up, requires additional infrastructure, and still does not give the desired results.


FlytBase, the company bringing intelligence and connectivity to drones, today announced the release of an automated precision-landing solution: FlytDock, the world’s smartest visual target landing solution, compatible with the widest range of drones.


FlytDock enables the drone to precisely align and land itself on the site with centimeter-level accuracy. It works across conditions: day or night, outdoor or indoor (GPS-denied) environments, ground-level or elevated platforms, and even moving or floating (on water) platforms. Powered by FlytOS, this intelligent plugin uses computer vision techniques and dedicated landing algorithms to precisely align, approach and land the multirotor on a visual marker on the ground. No infrastructure or electronics is required on the landing site, making it easy to deploy at scale. Further, the system can be remotely managed and controlled over the cloud (4G/LTE).
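
FlytDock's own landing pipeline is not public, but the basic idea of visual-marker alignment can be sketched generically. The Python snippet below is a minimal illustration, assuming opencv-contrib-python (the classic cv2.aruco API) and a downward-facing camera; it detects an ArUco marker and computes the pixel offset a landing controller would try to drive to zero. It is not FlytDock's implementation.

# Generic visual-marker alignment sketch (illustrative only, not FlytDock's code).
# Assumes opencv-contrib-python with the classic cv2.aruco API and a downward-facing camera.
import cv2

dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)

def marker_offset(frame):
    """Return the (x, y) offset of the marker centre from the image centre, in pixels."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return None                      # no marker in view; keep searching or descend slowly
    centre = corners[0][0].mean(axis=0)  # average of the first marker's four corner points
    h, w = gray.shape
    return centre[0] - w / 2.0, centre[1] - h / 2.0

# A landing controller would convert this pixel offset into lateral velocity setpoints
# (for example with a simple proportional law) and descend once the offset is small.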

FlytDock is readily compatible with DJI Enterprise, Ardupilot, and PX4 based drones.

To learn more visit https://flytbase.com/flytdock

Read the full blog.

Read more…


We are excited to announce the release of FlytOS v1.5-5. Please update your FlytOS installation using the OTA automatic update feature. This version of FlytOS introduces RPi camera support for video streaming over FlytCloud, FlytOS beta support for DJI, and other minor updates and bug fixes.

Please refer to the release notes for the full list of changes: https://goo.gl/AhhzCv

Update Instructions: https://goo.gl/YdH5W9

What’s new in FlytOS v1.5-5

Released on 25th January 2018

Key Updates

  • added RPi camera video streaming API via FlytCloud
  • released FlytOS beta version for DJI

Bug Fixes

  • FlytOS UI would freeze during boot-up on the Nvidia TX1
  • position_set_global API for APM sometimes sent the drone to an incorrect location
  • position_set_global API would return success=true even if the command terminated due to a timeout
  • FlytOS would, in a few cases, delete saved values of user-created params during boot-up
  • exec_script API would sometimes fail when used to trigger a complicated Python script
  • FlytOS would not allow the RC to switch out of OFFBOARD mode for PX4 in some cases

Minor Updates

  • FlytSim now runs the latest ArduCopter SITL firmware (v3.5.4)
  • FlytOS now also checks PX4's newer FLTMODE feature to verify the RC's API (OFFBOARD mode) switch position
  • Navigation APIs now detect valid position data even when position sensors other than GPS are used (PX4 only)
  • added support for MaxBotix MB1242 (I2C) sensors, enabling obstacle avoidance for APM
  • added support for JeVois cameras (up to 2 cameras supported) and enabled their ARTag detection API
  • running the stop_flytOS.sh script now prints relevant info

FlytOS continues to evolve with more features, wider compatibility with hardware options, more sample code to assist developers, bug fixes and richer documentation. We request users to keep their drones updated at all times. Instructions for installing the latest update are available here.

To learn more about FlytOS visit https://flytbase.com/flytos

Source

Read more…

Get a headstart with FlytPi Kit at just $149. Learn more: https://goo.gl/kyw7AP

This video demonstrates how to set up your drone with FlytBase Cloud. You can send real-time navigation commands and get telemetry and payload data using the FlytConsole LIVE dashboard. FlytBase Cloud helps you connect your drones to the cloud: set UAV missions and get telemetry and payload data remotely over 4G/LTE.
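
To give a feel for what sending a navigation command over the cloud looks like in code, here is a minimal Python sketch following the FlytAPI REST pattern. The base URL, endpoint path, payload fields and headers are assumptions for illustration only; consult the FlytBase Cloud API documentation for the exact routes and authentication scheme.

# Illustrative only: base URL, endpoint, payload fields and auth headers are assumed,
# not taken from the official FlytBase Cloud API reference.
import requests

BASE = "https://dev.flytbase.com/rest/ros/flytpod"        # assumed cloud endpoint + vehicle namespace
HEADERS = {
    "Authorization": "Token <personal-access-token>",     # assumed auth scheme
    "VehicleID": "<vehicle-id>",
}

# Command a take-off to 5 m and check the API's success flag.
resp = requests.post(BASE + "/navigation/take_off",
                     json={"takeoff_alt": 5.0}, headers=HEADERS, timeout=30)
result = resp.json()
if not result.get("success", False):
    raise RuntimeError("take_off rejected: %s" % result)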


FlytBase Cloud supports all major drones available today. To learn more, visit: https://flytbase.com/cloud
 or schedule a free consultation with our application expert here: https://goo.gl/SwUHRp

Developing applications for delivery, public safety, construction, inspection, autonomous drone docking stations, emergency response, etc.? We can help; write to us at letstalk@flytbase.com

Read more…

Hey everyone, we are creating this tutorial series to help you connect your drones to the cloud. (You can use your existing setup with a companion computer and a 4G/LTE dongle.)

FlytBase Cloud helps you connect your drones to the cloud: set UAV missions and get telemetry and payload data remotely over the Internet.
This video shows how to set up a virtual drone, which gives drone developers one-click access to a cloud simulator instance. You can test applications in a safe and secure environment, resolve issues and accelerate your development process.

You can sign up and request beta access at my.flytbase.com/cloud

Support us by hitting subscribe on our YouTube channel:
https://youtu.be/lNCA13mgbdw

Let us know your thoughts in comments. Thanks.

--

Developing applications for delivery, public safety, construction, inspection, docking stations, etc.? We can help; write to us at letstalk@flytbase.com

Read more…


FlytBase will be releasing its AI platform for drone applications at the Drone World Expo, San Jose, on 3rd October 2017. FlytBase has built the world’s first IoT platform for commercial drones, the “Internet of Drones” (IoD) platform. Continuing on its mission to bring intelligence and connectivity to commercial drones, FlytBase is now extending its cloud and edge compute platforms to incorporate AI and machine learning.

Drones generate vast amounts of data, which is usually in the form of images or video streams. Identification of objects of interest, counting them, or detecting change over time, are some of the tasks that are monotonous and labor intensive. FlytBase AI platform offers a complete solution to automate such tasks. It has been designed and optimised specifically for drone applications. The cloud-based training system leverages the scalability of the cloud to accelerate the training of models, to suit various customer requirements. Based on the use-case, the trained model can be deployed in the cloud (for post-processing of data) or on the edge (for real-time analysis).

FlytBase AI platform is optimised for interpretation of drone data, and it seamlessly integrates with the rest of FlytBase platform to offer connectivity with your business applications.

Be the first to know more about the FlytBase AI Platform; sign up to stay tuned at https://flytbase.com/ai


————

You can join Nitin Gupta, CEO of FlytBase Inc., in a panel discussion with other industry leaders on “The Role of IoT, Software & Platforms in the Commercial Drone Ecosystem”

at Drone World Expo, October 3, 2017 | Conference Room No. 1, San Jose Convention Center, San Jose, CA

Event Link: http://www.droneworldexpo.com/sessions_detail.asp?id=4425

Source

Read more…


Heart disease remains the No. 1 global cause of death with 17.3 million deaths each year, according to “Heart Disease and Stroke Statistics — 2015 Update: A Report From the American Heart Association.” That number is expected to rise to more than 23.6 million by 2030, the report found.

A minuscule number of these patients get defibrillation in time and survive. The time between the cardiac arrest and defibrillation of the patient is the most crucial factor in achieving higher survival rates.

  



FlyPulse, a Sweden-based company, is building next-generation technology, based on its specially designed drones, to address this challenge. LifeDrone AED is a transportation drone equipped with an automated external defibrillator (AED), a portable electronic device that automatically diagnoses life-threatening cardiac arrhythmias and is able to treat them through defibrillation, the application of electrical therapy.

“FlyPulse develops, sells and provides services with LifeDrone AED to medical and rescue service providers. Our system can improve the survival rate for people with cardiac arrest where it is difficult to arrive with an AED in time by ambulance or other transportation,” explains Sebastian Wallman, Co-Founder and CEO of FlyPulse.

LifeDrone AED (FlyPulse)

To reach their goal, FlyPulse needs to build and integrate a number of complex technologies. This includes the mechanical drone system, its payload, electronics, and the software to manage complete operations with a high level of automation. Of course, they also have to build a viable business model around this technology and make sure it addresses the requirements of all key stakeholders.

Software plays a critical role in the success of this application. They need to integrate the following major modules:

  • Flight control and safety
  • Payload management
  • Cloud connectivity for remote telemetry and control
  • Fleet management through a centralized dashboard
  • Live video streaming for situational awareness and remote patient monitoring
  • Collision avoidance and airspace management
  • Integration with first responders’ applications, emergency help lines, messaging platforms
  • Authentication, security and fail-safe management
  • Custom user and operator mobile/web apps with status updates

Manage a fleet of drones (dashboard screenshot)

After careful evaluation of their options, FlyPulse decided to partner with FlytBase, and leverage its products and expertise, to accelerate their drone application development.

Sebastian Wallman, CEO of FlyPulse, is excited to announce the partnership: “We get a very fast pace in our development with the help of the FlytBase team and their products. Partnership with companies like FlytBase gives us the development speed we need.”

FlytBase offers the world’s most advanced drone software development platform, with “intelligence” and “connectivity” as its core features.

FlytOS, its operating system for drones, is compatible with the widest range of drones and computer hardware platforms. This enables developers to make their applications agnostic to hardware. It provides a number of other capabilities, such as navigation and control, computer vision, payload management, authentication and security, machine learning, a simulator for testing, and SDKs for web/mobile.

FlytBase Cloud extends a number of these capabilities to the cloud. It helps developers manage large fleets of drones, connect over WiFi/4G networks, and easily integrate drones with other business applications.

FlytBase has deep expertise in drone technology and has won a number of startup awards for innovation in product and business. They recently graduated from the Cisco Launchpad Accelerator program. Their team of experts has an extensive background in aerospace, computer science, and robotics.

Sebastian comments, “very positive, helpful and skilled team. This is one of the most important reasons why we want to work with FlytBase.”

Nitin Gupta, CEO of FlytBase, adds, “FlytBase provides all the critical building blocks to assist in the quick assembly of complex drone applications. We look forward to working closely with the FlyPulse team to help them build a successful professional product in the shortest possible time.”

 

To learn more about FlyPulse, please visit: http://www.flypulse.se

To learn more about FlytBase, please visit: https://flytbase.com

————

Building a commercial drone application? Let FlytBase experts help you accelerate your development!

Schedule a call with our Drone application expert today.

Join our exclusive discussion group on Facebook

Source

Read more…


Introduction

The true success of any commercial product is defined by its customers: customers who believe in the product and use it to solve real-world problems. At FlytBase, we celebrate our customers who leverage our platform to build innovative commercial drone applications for their industry verticals.

One such customer is Shingo Matsuura. Shingo runs a research lab at Systems Nakashima Co., Ltd. in Japan. His most recent customer was a construction company looking to integrate drones into its regular workflow for improved efficiency. They approached Shingo with a requirement to develop a deep-learning solution, deployed on a drone, that could detect and recognize construction equipment on site.

After evaluating various available options, Shingo decided to use the FlytBase platform to implement his solution.


How did FlytBase help?

“FlytBase platform is compatible with different autopilots and allows you to work on companion computer of your choice. This independent nature of the FlytBase platform made it an obvious choice.” said Shingo, when asked about why he chose FlytBase.

Shingo made use of FlytBase’s deep learning and object recognition capabilities to automatically recognize various equipment and track its movement. He took inspiration from the Deep Learning Demo available with FlytOS on the Nvidia TX1 platform. FlytBase offers ready APIs and SDKs for quickly building complex drone applications, including deep learning capabilities.

Visualizing the data on the user interface, over a remote connection, is critical to the success of any drone application. Shingo used FlytConsole, a web-based mission control application bundled with FlytOS, to build his custom user interface. This greatly simplified visualizing the detected objects and the live video feed, and exposing other controls on the UI.

The FlytBase platform provided Shingo with all the tools he needed to assemble his drone application in under 10 days.

Results

Shingo showcased his solution at the Artificial Intelligence (AI) Expo held in Japan, demonstrating the power of the FlytBase platform. His work garnered a lot of attention and was very well received by the audience.

FlytBase congratulates Shingo Matsuura and everyone at Systems Nakashima on their success, and wishes them the best for their future work.

————

Building a commercial drone application? Let FlytBase experts help you accelerate your development!

Visit flytbase.com/applications

Source

Read more…


FlytSim offers a SITL (Software In The Loop) simulation environment for testing user apps without drone hardware. FlytSim simulates the drone and its world, programmatically generating the state variables, while the control algorithms applied are the same as those onboard the drone. The FlytAPIs are also available in FlytSim, so user apps built with these APIs can be tested on any computer running FlytSim.

With FlytSim available as a Docker app, FlytBase brings the power of Docker to FlytSim developers. This eases FlytSim’s deployment in any Docker-supported Linux, Windows or Mac environment.

Find detailed instructions here: https://goo.gl/hfJSMJ

 Visit https://flytbase.com/flytos/#flytsim to learn more

Read more…


Order your FlytPi Kit here


We at FlytBase are on a mission to empower drone developers and help them accelerate the development of drone applications. Over the last few months, we have introduced a suite of products, including FlytOS, FlytSIM, web/mobile SDKs, FlytBase Cloud and FlytConsole. We have been working hard to improve the user experience and further simplify the drone application development process.

Releasing the Internet of Drones platform is a step in that direction. By offering the industry’s first Drone-API-as-a-service, we have opened up a new world of exciting possibilities.

One of the key concerns of our users has been the complex initial setup. Both the software installation and the hardware setup are quite time-consuming. Simple things, like an adapter to power the companion computer on a drone, are not easily available. To address this pain point, we are introducing a ready-to-use companion computer kit for your custom drones: the FlytPi Kit.

FlytPi is a turnkey companion computer that comes pre-loaded with FlytOS Commercial Edition for Pixhawk/ Pixhawk Mini/ Pixhawk 2/ Cube Autopilot. With FlytPi, you can simply connect your drone autopilot with a specially designed cable and enhance the capabilities of your drone.


The kit also comes with a dedicated power module to power up FlytPi, Autopilot and Electronic Speed Controllers (ESC) using a single drone battery. It offers a hassle-free connection with the Autopilot, thus saving time and effort in getting your drone applications built and deployed.

Here is a list of features and specifications of FlytPi:

Features:

  • Plug & Play Companion Computer. Hassle-free connection with the Autopilot.
  • Pre-loaded FlytOS Commercial Edition.
  • Supports both Ardupilot and PX4 flight stacks.
  • Pre-Installed FlytOS Sample Apps (See description below)
  • FlytConsole – Web based GCS.
  • Access to extensive drone APIs for navigation, telemetry, and payloads.
  • FlytSDK for Mobile/Web applications.
  • Regular performance updates and bug fixes.
  • FlytSIM Simulator to test drone apps without drone hardware.
  • Priority Email and Chat support from FlytBase experts.

Add-on features:

  • Instantly connect your drone to FlytBase Cloud.
  • Real time access to drone navigation, telemetry and payload data over 4G/LTE.
  • Integration with 3rd Party Apps.
  • Cloud Simulator to test your applications.

You can order your FlytPi Kit on our store page.

As we continue to roll out exciting products for the drone community, your feedback is extremely valuable. It helps ensure that we understand your problems and build the right solutions tailored to your requirements. Please feel free to send us your queries, suggestions and comments at support@flytbase.com. You may also use our forum for your technical queries.


Source

Read more…


Drones have now matured as a technology and are better understood by developers and users alike. We at FlytBase have made an effort to provide you with some open-source drone applications. These drone apps provide a base platform for your app development and can be customized as needed. We invite interested developers to contribute to this repository and help us grow this resource. So, without further ado, here is a list of 10 open-source apps for drones you can use.

1. Joystick App (Download Mobile App | Source code)

This simple mobile/web app allows the user to control the drone using a joystick interface. You can take off or land your drone at the press of a button and fly it in any direction.

This app uses the FlytBase Drone Navigation APIs to send velocity setpoints to your drone, which translate into roll, pitch and yaw movements of the vehicle. You can find the source code and download the Joystick web app here.
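
As a rough sketch of the navigation call behind such a joystick, the Python snippet below sends one body-frame velocity setpoint over REST. The endpoint path and payload fields follow the FlytAPI pattern but are written from memory and should be treated as assumptions; refer to the linked source code and API docs for the exact interface.

# Illustrative only: the endpoint path and payload fields are assumed from the FlytAPI pattern.
import requests

FLYTOS = "http://<drone-ip>/ros/flytpod"   # FlytOS REST base; the 'flytpod' namespace is assumed

def send_velocity(vx, vy, vz, yaw_rate=0.0):
    """Push one body-frame velocity setpoint, e.g. from a joystick axis reading."""
    payload = {
        "twist": {"twist": {"linear": {"x": vx, "y": vy, "z": vz},
                            "angular": {"z": yaw_rate}}},
        "body_frame": True,    # interpret the setpoint in the drone's body frame
    }
    return requests.post(FLYTOS + "/navigation/velocity_set", json=payload, timeout=5).json()

# Joystick pushed forward: move ahead at 1 m/s.
print(send_velocity(1.0, 0.0, 0.0))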


2. Video Streaming App (Download app | Source code)

Live video streaming is a must-have feature in almost any drone application. This drone app allows the user to view the live video feed from the drone camera. The video is streamed over Wi-Fi from a camera attached to the companion computer mounted on your drone. You can find the source code and download the app here.


3. GPS Follow Me App (Download app | Source code)

GPS Follow Me mobile app enables your drone to follow you around.

This is a sample mobile app using GPS-based positioning. When follow-me mode is turned on, the GPS coordinates of the device (running the app) are used to send position commands to the drone, which then follows the person carrying the device.
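
Conceptually, the follow-me loop just reads the device's GPS fix and forwards it to the drone as a global position setpoint at a fixed rate. The Python sketch below illustrates that loop; the endpoint name and payload fields are assumptions rather than the app's actual code.

# Illustrative follow-me loop: endpoint and payload fields are assumptions, not the official API.
import time
import requests

FLYTOS = "http://<drone-ip>/ros/flytpod"   # namespace assumed

def follow(get_device_fix, follow_alt=10.0, rate_hz=1.0):
    """get_device_fix() should return the (latitude, longitude) of the device running the app."""
    while True:
        lat, lon = get_device_fix()
        requests.post(FLYTOS + "/navigation/position_set_global",
                      json={"lat_x": lat, "long_y": lon, "rel_alt_z": follow_alt,
                            "async": True},   # don't block until the setpoint is reached
                      timeout=5)
        time.sleep(1.0 / rate_hz)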

Visit here for detailed information and source code.


Check out this video to see how GPS Follow Me App works:

4. AprilTag Detection App (Source code)

AprilTags are 2D barcodes developed for robotics applications and are now being used in various drone applications. This project integrates AprilTag detection with FlytOS: FlytOS takes the input image from the camera and publishes a processed image with the detected AR tags. It can be used with any AprilTag family. You can find detailed information and source code here.


The video below shows a demo on detecting AprilTags with FlytOS:

5. Obstacle Detection using Sonar (Source code)

This obstacle detection app uses sonar sensors connected to an Arduino to detect obstacles in six directions and publishes the distance data into ROS. The app also gives you the option to visualize live data received from the distance sensors.


You can write your own custom onboard application that consumes the sonar data and produces an obstacle avoidance behavior. Download the app and check out the source code here.
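
As an example of such a consumer, the rospy sketch below subscribes to one sonar distance topic and logs a warning when an obstacle comes closer than a threshold, which is where an avoidance reaction would go. The topic name and the reaction are placeholders, since the actual topic layout depends on the app's Arduino bridge.

# Illustrative consumer of the sonar data: the topic name and reaction are placeholders.
import rospy
from sensor_msgs.msg import Range

SAFE_DISTANCE_M = 1.5

def on_range(msg):
    if 0.0 < msg.range < SAFE_DISTANCE_M:
        rospy.logwarn("Obstacle at %.2f m -- trigger avoidance (e.g. hold position)", msg.range)
        # Here you would call a FlytOS navigation API (for example, position hold) to react.

rospy.init_node("sonar_guard")
rospy.Subscriber("/sonar/front", Range, on_range)   # placeholder topic published by the app
rospy.spin()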

 

6. Visual Servoing (Source code)

This app uses the gimbal APIs to control a 3-axis gimbal in order to keep the camera (mounted on the gimbal) focused on an object of interest in the vicinity. FlytOS vision APIs are used to detect and track the object. Read the full blog on Drone API for Gimbal Control [How to do Visual Servoing].


Check this GitHub link for instructions and source code to setup this app for your drone.

Also, check out the video explaining how to do visual servoing using your companion computer:

Continue Reading

Read more…

Visual Servoing using Drone Gimbal API



Choosing the right camera and streaming video from a drone to your app has always been a frequently asked question, and there are several drone APIs and SDKs available to build awesome apps for your drones.

Next comes the selection of a suitable drone gimbal to stabilize and maneuver that camera. It becomes more challenging when you wish to gain control of the gimbal's roll, pitch and yaw movements through your code or app.

Whether you are trying to automate the drone gimbal to take that perfect cable-cam shot, or to implement visual servoing, optic flow, or any other machine vision application, controlling and integrating gimbal movements is necessary for your algorithm and hence your app.

The FlytBase Drone Gimbal API lets you take control of the roll, pitch and yaw movements of your gimbal through simple API calls. You can send commands from your companion computer to the gimbal connected to an autopilot.

Drone APIs are available in REST (JS), WebSocket (JS), ROS C++, ROS Python, C++ and Python, for building onboard as well as mobile/web applications.

You can download FlytSIM, the drone app simulator, to test the drone APIs and your apps.

Visual Servoing Demo and Tutorial

The demo below shows how a complex drone application can be built quickly and easily using the available FlytBase Drone APIs and mobile/web SDKs.

 

We implemented a visual servoing application using a drone gimbal. Visual servoing, also known as vision-based robot/actuator control, is a technique that uses feedback information extracted from a vision sensor (in this case, a camera) to control the motion of a robot or actuator (in this case, the gimbal). The architecture of the visual servoing application is illustrated below.

Architecture of the visual servoing application (diagram)

Using this application, you can select and lock onto a desired object, and the gimbal will keep tracking that object (as long as it stays within the gimbal's range of motion).

The application uses the following APIs and SDK (a minimal control-loop sketch follows the list):

– Vision API to detect and track object

– Gimbal API to control attitude setpoint of gimbal (Roll, Pitch, Yaw)

– Web SDK to build a web-based application interface.

– Video Streaming API to stream the video to the app to see and select the object in the video.

(You can also use the Navigation APIs to give setpoints and control drone navigation from your app.)
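
The sketch below illustrates the control loop that ties these pieces together: it converts the tracked object's pixel offset from the image centre into new pitch/yaw setpoints for the gimbal. The image size, field of view, gain and the gimbal-set call are illustrative assumptions, not the app's actual code.

# Illustrative visual-servoing step: resolution, FOV, gain and the gimbal API call are assumptions.
IMG_W, IMG_H = 1280, 720
HFOV_DEG, VFOV_DEG = 90.0, 60.0   # assumed camera field of view
KP = 0.5                          # proportional gain on the angular error

def servo_step(obj_cx, obj_cy, gimbal_pitch_deg, gimbal_yaw_deg):
    """One control step: tracked object centre (pixels) -> new gimbal pitch/yaw setpoints (degrees)."""
    err_yaw = (obj_cx - IMG_W / 2.0) / IMG_W * HFOV_DEG     # positive: object is to the right
    err_pitch = (obj_cy - IMG_H / 2.0) / IMG_H * VFOV_DEG   # positive: object is below centre
    new_yaw = gimbal_yaw_deg + KP * err_yaw
    new_pitch = gimbal_pitch_deg - KP * err_pitch
    # These setpoints would then be sent through the Gimbal API (attitude setpoint call).
    return new_pitch, new_yaw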

You can find the source code for the app on GitHub in the FlytBase Drone Sample Apps repo.

Follow the README to learn how to do visual servoing on your drone.

For any queries, please reach out to us on the Forums, and join the FlytOS Developers group to engage in interesting discussions with fellow developers. You can also follow us on Twitter.


Sign up today to download FlytOS and start building powerful drone applications.

For any queries you can reach out to us on Forums or FlytOS Developers Group

Source

Read more…



Introduction to Deep Learning on drones

Getting a deep learning application working flawlessly on a desktop is nontrivial; when that application must run on a single-board computer controlling a drone, the task becomes quite challenging.

FlytOS offers a framework that eases these issues by helping you easily integrate your deep learning software with drones. FlytOS allows a deep learning application (like the one shown in the video tutorial below) to be up and running on an Nvidia TX1 quickly.

In the previous tutorial, we built Caffe and set up a catkin workspace for our ROS Caffe node.

In this tutorial we will write an ROS node that takes in images from a USB camera topic that FlytOS publishes, detects and localizes objects in the images, draws bounding boxes on the detected objects and finally, publishes the output on an ROS image topic.

Using the inbuilt video streaming capabilities of FlytOS, the video stream can be viewed in the web browser of any device that is connected to the same network the TX1 is connected to.

For object detection and localization, we use a popular convolutional neural network framework called the Single Shot MultiBox Detector (SSD). It is one of the fastest localization networks available and is suitable for applications where object localization must be performed in real time. In the context of drones, SSD can be used for applications like surveillance, tracking and following objects of interest, counting cattle, etc.

The code for this tutorial is available at https://github.com/flytbase/flytos_tx1. We have taken the C++ example provided in the Single Shot MultiBox Detector repository and converted it into an ROS node.

The example originally takes input in the form of image files or videos and prints the detected bounding boxes coordinates in the terminal. We modified the input to be ROS image messages being published in FlytOS.

The output is in the form of ROS image messages with the bounding boxes drawn on each image. Images below illustrate example output images from the node.

Running the Code

  • For this example, you will have to download a model and pre-trained weights available from this link. If you want to read up on the dataset that this model was trained on and the different classes that it can classify, visit this link. There are more models available at https://github.com/weiliu89/caffe/t… which you can use to experiment. Extract the downloaded zip file and place the deploy.prototxt file and the .caffemodel file in a folder named model in your home folder.

 

  • You should also have FlytOS running on your Nvidia TX1. You can follow this tutorial if it is not the case.

 

  • We use a USB webcam attached to the TX1 as the video source. If you are using the TX1 Development Board, the USB webcam will probably show up with the device name /dev/video1. You will have to edit the file /flyt/flytos/flytcore/share/vision_apps/launch/cam_api.launch and change the line
    <param name="video_device" value="/dev/video0"/> to
    <param name="video_device" value="/dev/video1"/>
  • You will need superuser permission to do so. Reboot FlytOS for the changes to take effect.

 

  • If you have followed the previous tutorial you will have the node ready to run. Make sure you have sourced your catkin workspace in your terminal. You can do that by typing:

source ~/flytos_tx1/devel/setup.bash

  • Then launch the SSD node by typing the following command in your terminal (the node takes .prototxt and .caffemodel file paths as arguments):

rosrun ssd_caffe ssd_all_bbox ~/model/deploy.prototxt ~/model/VGG_VOC0712_SSD_300x300_iter_60000.caffemodel

  • The model will take a few seconds to load, and then the node will start publishing on the /flytpod/flytcam/detected_objects topic.

 

  • You can view the image stream in your TX1’s browser by opening the link http://localhost/flytconsole, then clicking the video tab and selecting the /flytpod/flytcam/detected_objects topic in the drop-down list that is shown.

This stream can also be seen on any other computer on the same network connection. Just open the following address in your browser:

http://<ip-address-of-TX1>/flytconsole

The live stream, with bounding boxes drawn on the detected objects, can be viewed in FlytConsole.


Code explained

We will now explain the modifications we applied to the ssd_caffe.cpp example to convert it into an ROS node.

  • Including ROS related header files

//ROS related includes
#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <sensor_msgs/image_encodings.h>
#include <std_msgs/Header.h>
#include <cv_bridge/cv_bridge.h>

  • The following array holds the names of all the output classes that the network can classify.

std::string class_labels[] = {"__background__","Aeroplane","Bicycle","Bird","Boat","Bottle", "Bus", "Car", "Cat", "Chair","Cow", "Diningtable", "Dog", "Horse","Motorbike", "Person", "Foliage","Sheep", "Sofa", "Train", "Tvmonitor"};

  • Creating a subscriber and a callback function for the images on the /flytpod/flytcam/image_raw topic:

image_transport::ImageTransport it(nh);
image_transport::Subscriber sub = it.subscribe("/flytpod/flytcam/image_raw", 1, imageCallback);

void imageCallback(const sensor_msgs::ImageConstPtr& msg)
{
  try
  {
    new_img_header = msg->header;
    new_img_ptr = cv_bridge::toCvCopy(msg, sensor_msgs::image_encodings::BGR8);
  }
  catch (cv_bridge::Exception& e)
  {
    ROS_ERROR("Could not convert from '%s' to 'bgr8'.", msg->encoding.c_str());
  }
}

  • Cropping the image to a square size and then passing it to the detector

cv::Mat img = img_uncropped(cv::Rect((int(img_uncropped.cols - img_uncropped.rows) / 2), 0,
                                     img_uncropped.rows, img_uncropped.rows));
std::vector<std::vector<float> > detections = detector.Detect(img);

  • Drawing the bounding box and writing the class name along with the confidence

cv::addWeighted(color, 0.5, roi, 0.5, 0.0, roi);
cv::putText(img, class_labels[int(d[1])],
            cv::Point(static_cast<int>(d[3] * img.cols), static_cast<int>(d[4] * img.rows) + 25),
            cv::FONT_HERSHEY_TRIPLEX, 0.8, white, 1, 8);
cv::putText(img, confidence_text,
            cv::Point(static_cast<int>(d[3] * img.cols), static_cast<int>(d[4] * img.rows) + 50),
            cv::FONT_HERSHEY_TRIPLEX, 0.8, white, 1, 8);

  • Publishing the final image

sensor_msgs::ImagePtr pub_msg = cv_bridge::CvImage(std_msgs::Header(),"bgr8", img).toImageMsg();
image_pub.publish(pub_msg);

You are good to go.

For any queries, you can reach out to us on Forums or FlytOS Developers Group

Source

Read more…


Commercial drone applications require significant autonomy and intelligence, which cannot be achieved using conventional computer vision algorithms alone. Deep learning and vision algorithms are very promising for enabling drones to take on more advanced and complex commercial applications.

Getting a deep learning application working perfectly on a desktop is nontrivial, and when that application has to run on a single board computer aka Companion Computer controlling a drone, the task becomes quite challenging.

FlytOS provides a framework to alleviate these challenges by enabling easy integration of your deep learning application with drones. It provides the right set of drone APIs and SDKs to build advanced drone applications.

We will post a series of tutorials to get a simple deep learning application, like the one in this video, up and working on an Nvidia TX1 board running FlytOS.

In this tutorial, we will integrate Caffe, a popular deep learning framework widely used for deep vision.

For the demo, we will use a famous object detection and localizing network known as the Single Shot Multibox detector or SSD in short (https://github.com/weiliu89/caffe/tree/ssd).

A little knowledge about ROS will be helpful in understanding the tutorial better.

Making a catkin_ws:

We will have to make a catkin workspace for the ROS package we are going to write. If you already have a catkin workspace set up on your TX1, skip this step. In our case, the workspace name is flytos_tx1. Clone the following repository as a reference for this tutorial: https://github.com/flytbase/flytos_tx1.

Making Caffe:

Change your directory to ~/flytos_tx1/src/ssd_caffe/ and clone the Caffe repository there by typing the following commands:

git clone https://github.com/weiliu89/caffe.git
cd caffe
git checkout ssd

Now we have to make and install Caffe. You can refer to the following guide, which is a nice resource on building Caffe for the TX1: http://www.jetsonhacks.com/2016/09/18/caffe-deep-learning-framework-64-bit-nvidia-jetson-tx1/. Or follow these instructions:

Install some dependencies by typing the following commands:

sudo apt-get install cmake libprotobuf-dev libleveldb-dev libsnappy-dev libhdf5-serial-dev protobuf-compiler libatlas-base-dev libgflags-dev libgoogle-glog-dev liblmdb-dev python-dev python-numpy

sudo apt-get install --no-install-recommends libboost-all-dev

Then copy the example Makefile.config.example into a file named Makefile.config:

cp Makefile.config.example Makefile.config

If you want to enable the use of CuDNN libraries for accelerated performance, uncomment the line USE_CUDNN in Makefile.config. Then enter the following command:

cmake -DCUDA_USE_STATIC_CUDA_RUNTIME=OFF

Next, add the following line at the end of Makefile.config to solve some issues related to making Caffe on Ubuntu 16.04:

INCLUDE_DIRS += /usr/include/hdf5/serial/

You can finally make Caffe by typing these two commands:

make -j4 all

make install

Compiling nodes in our ROS package:

If you look at the CMake file of our package ssd_caffe (~/flytos_tx1/src/ssd_caffe/CMakeLists.txt), you can see the two lines that help catkin find the required Caffe libraries to link against your ROS nodes.

set(CAFFE_INCLUDEDIR caffe/include /usr/local/cuda/include)
set(CAFFE_LINK_LIBRARY caffe/lib /usr/local/cuda/lib64)

Change your directory back to the catkin workspace and enter the following command to compile your code:

cd ~/flytos_tx1
catkin_make

Right now there is just an empty file in the source directory.

In the next tutorial, you will learn how to make your first ROS node with imported Caffe libraries. Stay tuned!



Thanks to Zubin and Pradeep for preparing this demo.

For any queries, you can reach out to us on Forums or FlytOS Developers Group.

Source

Read more…

FlytBase Now Also Supports Ardupilot


FlytBase now supports Ardupilot.

ArduPilot companion computer users can now install FlytOS on the Intel Edison, NVIDIA Jetson TX1, Intel Aero, Qualcomm Snapdragon Flight, Raspberry Pi 3, ODROID XU4, etc., and get access to extensive drone APIs and drone SDKs to build custom drone applications.

Visit https://flytbase.com to get started.

Extensive drone API documentation is available for multiple programming languages and technologies, including ROS, C++, Python, REST and WebSocket.
For ArduPilot users flying Pixhawk or Pixhawk 2, FlytBase adds many other capabilities, such as integration with ROS and OpenCV for building intelligent applications. It also provides RESTful and WebSocket endpoints, and comes with ready SDKs for building custom web/mobile apps. In short, it helps you leverage the true potential of your companion computer with Pixhawk or Pixhawk 2.

Those who already have an activated FlytOS running on their devices only need to connect to the internet; FlytOS will auto-update itself.

Compatibility for Intel Joule and NVIDIA TX2 is on its way.

You can follow the easy step-by-step guide to get started with FlytOS and a companion computer. Install FlytOS on your CC and try out the available sample drone apps.
See the Developers section to start building your own drone apps.

For discussions, you can join us on our Forums or FlytOS Developers Group.

We look forward to working with other members on building interesting drone applications!

Read more…


FlytSIM is now available! Learn more and download FlytSIM here:

FlytSIM is a 3D simulator for drone apps. It helps developers develop and test drone applications from the comfort of their laptop or desktop.

Developers can get access to the FlytOS drone APIs, SDKs and FlytSIM, and start building applications in ROS, C++, Python, REST or WebSocket.

FlytSIM offers a SITL simulation environment for testing user apps without drone hardware. The drone and a virtual world are simulated using a ROS/Gazebo-based setup. The simulated autopilot uses the same control algorithms as the real flight controller/companion computer (FC/CC), making its behavior close to real flight.
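
Because the same FlytAPIs are exposed by the simulator, an app can usually be pointed at FlytSIM or at a real vehicle simply by changing the base URL. A minimal Python illustration follows; the REST path, payload and namespaces are assumptions based on the FlytAPI pattern, not taken from the FlytSIM docs.

# Illustrative only: the REST path, payload and namespaces are assumed, not authoritative.
import requests

SIM_BASE = "http://localhost/ros/flytsim"      # FlytSIM on the local machine (namespace assumed)
DRONE_BASE = "http://<drone-ip>/ros/flytpod"   # real vehicle (namespace assumed)

def take_off(base_url, alt_m=3.0):
    return requests.post(base_url + "/navigation/take_off",
                         json={"takeoff_alt": alt_m}, timeout=30).json()

print(take_off(SIM_BASE))       # test the app logic in simulation first...
# print(take_off(DRONE_BASE))   # ...then run the exact same call against the hardware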

This offers developers a pleasant drone application development experience while keeping it safe, and saving them a huge amount of time and effort.
Start building your drone application. Download FlytSIM today!

You can also try out our demo code/apps in FlytSIM

Refer to the documentation to install and learn more about FlytSIM

Read more…

From basic configuration and GCS utilities to planning complex drone missions and visualizing data analysis reports, mobile and web application interfaces have an important role to play in any commercial drone application today.

A rich UI/UX helps drone operators execute tasks efficiently and visualize the data captured by the drone in the right form, enabling them to make better decisions on and off the field.

FlytBase Drone SDK (aka FlytSDK) for Mobile/Web Applications provides the right tools for developers to jumpstart their Drone App development process. Using the Drone APIs (aka FlytAPIs) developers can quickly build and customize their drone applications.

The SDKs have Web and Android libraries pre-integrated. So, developers don’t need to worry about including related libraries or initializing the socket connection. A basic framework/template is readily made available for the REST or Socket calls.

Start editing the app screen directly, and add your REST calls and telemetry data subscriptions to build your application.




You can try out prebuilt sample apps, or use these as templates to build your own apps by following these simple steps.

Developers can get started by referring to the documentation.

Extensive documentation and sample codes are available here: http://docs.flytbase.com, https://github.com/flytbase/flytsamples

For any technical queries/assistance, please use the forums: http://forums.flytbase.com/

Several other demos are available on our GitHub repo and YouTube channel: https://goo.gl/yLkuT3

Read more…