Project Overview

As the project lead, I am developing an open-source quadcopter platform for advancing research in drone autonomy. The project encompasses implementing several deep learning and computer vision algorithms, including person tracking, gesture control using human pose estimation, optical flow stabilization, obstacle avoidance, and depth estimation from monocular vision.

Objectives and Contributions

The primary objectives of this project include:

  • Building an open-source quadcopter platform for research in drone autonomy.
  • Implementing deep learning and computer vision algorithms for person tracking, gesture control, optical flow stabilization, obstacle avoidance, and depth estimation.
  • Utilizing a Pixhawk flight controller with a Raspberry Pi as a companion computer, separating low-level flight control from high-level processing.
  • Employing a DJI Flame Wheel F450 quadcopter frame, customized with additional mounts for the extra components.
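To illustrate how a companion computer can turn a vision result into control commands, here is a minimal sketch of mapping a person-tracker bounding box to velocity setpoints. Every name, gain, and the camera resolution below is an illustrative assumption, not the project's actual implementation; in practice the setpoints would be forwarded to the Pixhawk (e.g. via MAVLink).

```python
# Hypothetical sketch: proportional mapping from a tracked bounding box
# to (yaw rate, forward velocity) setpoints. Gains and frame size are
# assumptions for illustration only.

FRAME_W, FRAME_H = 640, 480   # camera resolution (assumed)
K_YAW = 1.5                   # rad/s per unit of normalized x-offset
K_FWD = 2.0                   # m/s per unit of normalized size error
TARGET_BOX_H = 0.4            # desired box height as a fraction of frame height

def track_command(box):
    """box = (x, y, w, h) in pixels; returns (yaw_rate, forward_vel)."""
    x, y, w, h = box
    cx = x + w / 2.0
    # Normalized horizontal offset in [-1, 1]: positive => person is to the right.
    x_err = (cx - FRAME_W / 2.0) / (FRAME_W / 2.0)
    # Size error: a small box means the person is far away, so command forward motion.
    size_err = TARGET_BOX_H - h / FRAME_H
    return K_YAW * x_err, K_FWD * size_err
```

A centered box at the target size yields zero commands; a box drifting left produces a negative yaw rate to turn the quadcopter toward the person.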

Technology Stack

The technology stack for this project includes:

  • Pixhawk Flight Controller
  • Raspberry Pi
  • ROS (Robot Operating System)
  • Gazebo Simulation
  • Docker Containers

Project Implementation

The Raspberry Pi runs a ROS node that communicates with another ROS node on the host PC to stream video over Wi-Fi. To keep the project open source and easy to develop against, the simulation environment is containerized with Docker. Ongoing development focuses on implementing and testing the algorithms in the Gazebo simulation.
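The ROS transport itself is not shown here, but the underlying idea of moving camera frames over Wi-Fi can be sketched as a length-prefixed byte stream. Everything below (the framing scheme, the helper names) is an illustrative assumption using only the Python standard library, not the project's actual ROS topic setup.

```python
import socket
import struct

def send_frame(sock, frame_bytes):
    """Prefix each frame with a 4-byte big-endian length, then send the payload."""
    sock.sendall(struct.pack(">I", len(frame_bytes)) + frame_bytes)

def recv_frame(sock):
    """Read one length-prefixed frame; returns None if the peer closed the socket."""
    header = _recv_exact(sock, 4)
    if header is None:
        return None
    (length,) = struct.unpack(">I", header)
    return _recv_exact(sock, length)

def _recv_exact(sock, n):
    """Keep reading until exactly n bytes arrive (TCP may deliver partial chunks)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            return None
        buf += chunk
    return buf
```

Length-prefixing matters because TCP is a byte stream with no message boundaries; without it, a receiver cannot tell where one compressed frame ends and the next begins.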

Conclusion

This project contributes to drone autonomy research by providing an open-source platform with advanced features such as person tracking, gesture control, optical flow stabilization, obstacle avoidance, and depth estimation. Building on widely used components such as the Pixhawk and Raspberry Pi keeps the platform accessible and reproducible.

Quick Links