
Introduction

This project contains instructions, code, and other artifacts that allow users to build mobile robots (drones, rovers) that can autonomously navigate through highly unstructured environments such as forest trails. Our components use deep learning-based AI running on the NVIDIA Jetson embedded platform. The code implements the ideas discussed in the arXiv paper; see the References section.

The project has two major parts: modeling and platform implementation.

Modeling

The project's AI that enables autonomous navigation is based on a deep neural network (DNN) that can be trained from scratch using publicly available data. A few pre-trained DNNs are also available as part of this project. If you want to train the TrailNet DNN from scratch, follow the steps on this page.
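
For a rough idea of what training involves, here is a minimal sketch of fitting a three-way trail-orientation classifier (facing left / center / right) with PyTorch. The framework choice, dataset layout, input size, and hyperparameters are illustrative assumptions only; the actual TrailNet training workflow is described on the linked page.

```python
# Minimal sketch of training a 3-way trail-orientation classifier.
# Assumptions (not from the project): PyTorch is used, and the dataset is
# arranged as ImageFolder directories (left/, center/, right/).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((180, 320)),   # hypothetical input size
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("data/trails/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True, num_workers=4)

model = models.resnet18(num_classes=3)   # left / center / right
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```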

Platforms

The following platforms are currently supported:

In general, any platform that uses the Pixhawk autopilot should work too.

There is also experimental support for the APM Rover:

Getting started

Building a complete autonomous drone platform requires proper hardware and software configuration.

Jetson setup

The NVIDIA Jetson platform is used to run most of the components, such as DNN inference, the controller, and video streaming. The Jetson setup guide describes steps to install all required software and dependencies.
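
As a quick sanity check after following that guide, a short script like the one below can confirm that the main tool chains are on the PATH. The tools checked here (nvcc for CUDA, rosversion for ROS, gst-launch-1.0 for GStreamer video streaming) are common choices, not an exhaustive or official list.

```python
# Quick sanity check that key Jetson dependencies are installed and on PATH.
import shutil

required = {
    "CUDA compiler": "nvcc",
    "ROS": "rosversion",
    "GStreamer (video streaming)": "gst-launch-1.0",
}

for name, tool in required.items():
    path = shutil.which(tool)
    status = path if path else "NOT FOUND"
    print(f"{name:30s} {tool:18s} {status}")
```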

Drone setup

Depending on the drone platform, some additional steps might be required. Follow the steps in the documentation for your particular platform:

GCS (Ground Control Station) setup

A laptop is a convenient way to run GCS software such as QGroundControl, as well as to control the drone with an NVIDIA Shield or Xbox controller. Follow these steps to set up the GCS machine.
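
As an illustration of how the GCS laptop can read a gamepad, here is a minimal pygame sketch. The axis mapping is an assumption and differs between NVIDIA Shield and Xbox controllers; the joystick handling actually used by the project is covered in the GCS setup steps.

```python
# Minimal sketch of reading gamepad axes on the GCS laptop with pygame.
import pygame

pygame.init()
pygame.joystick.init()
if pygame.joystick.get_count() == 0:
    raise SystemExit("No gamepad detected")

stick = pygame.joystick.Joystick(0)
stick.init()
print("Using controller:", stick.get_name())

clock = pygame.time.Clock()
while True:
    pygame.event.pump()              # refresh joystick state
    roll = stick.get_axis(0)         # left stick, horizontal (assumed mapping)
    pitch = stick.get_axis(1)        # left stick, vertical (assumed mapping)
    print(f"roll={roll:+.2f} pitch={pitch:+.2f}", end="\r")
    clock.tick(20)                   # poll at ~20 Hz
```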

Simulation

It is usually a good idea to test your code in a simulator. Follow these steps to run simulations using Gazebo.
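
As a rough example of exercising a simulated vehicle from Python, the sketch below arms a PX4 SITL instance and streams velocity setpoints through MAVROS. It assumes Gazebo, PX4 SITL, and MAVROS are already running; the topic and service names are standard MAVROS ones rather than anything project-specific.

```python
# Minimal sketch: arm a simulated PX4 vehicle and stream velocity setpoints
# via MAVROS. Assumes Gazebo + PX4 SITL + MAVROS are already up.
import rospy
from geometry_msgs.msg import TwistStamped
from mavros_msgs.srv import CommandBool, SetMode

rospy.init_node("sim_smoke_test")

setpoint_pub = rospy.Publisher("/mavros/setpoint_velocity/cmd_vel",
                               TwistStamped, queue_size=10)
rospy.wait_for_service("/mavros/cmd/arming")
rospy.wait_for_service("/mavros/set_mode")
arm = rospy.ServiceProxy("/mavros/cmd/arming", CommandBool)
set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)

rate = rospy.Rate(20)
msg = TwistStamped()

# Stream a few setpoints first; PX4 requires this before accepting OFFBOARD.
for _ in range(40):
    setpoint_pub.publish(msg)
    rate.sleep()

set_mode(custom_mode="OFFBOARD")
arm(value=True)

# Command a gentle forward velocity for a few seconds.
msg.twist.linear.x = 0.5
for _ in range(100):
    setpoint_pub.publish(msg)
    rate.sleep()
```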

Flying

Once the hardware and software setup steps are complete, it's time to take off! Follow these steps to fly the drone.

References