This repository holds the PX4 Pro flight control solution for drones, with the main applications located in the src/modules directory. It also contains the PX4 Drone Middleware Platform, which provides drivers and middleware to run drones. This Developer Guide is for software developers who want to modify the flight stack and middleware (e.g. to add new flight modes), hardware integrators who want to support new flight controller boards and peripherals, and anyone who wants to get PX4 working on a new (unsupported) airframe/vehicle.
px4 pixhawk uav uas dronecode dronekit autopilot ros drone mavlink multicopter fixed-wing ugv dronelink mavros qgroundcontrol dds dronecore fast-rtps

'Openpose' for human pose estimation has been implemented using TensorFlow. It also provides several variants with changes to the network structure for real-time processing on the CPU or on low-power embedded devices. 2018.5.21: The post-processing part is implemented in C++ and must be compiled; see https://github.com/ildoonet/tf-pose-estimation/tree/master/src/pafprocess. 2018.2.7: Arguments in the run.py script changed; dynamic input size is now supported.
deep-learning openpose tensorflow mobilenet pose-estimation convolutional-neural-networks neural-network image-processing human-pose-estimation embedded realtime cnn mobile ros robotics catkin

Open Hardware scanning laser rangefinder. It is really cheap - its components cost less than $35.
lidar stm32 ros diy

Mission Planner Ground Control Station (C# .NET)
gcs ardupilot uav autopilot planner mission ros open-source missionplanner c-sharp autonomous pixhawk cube ground control station

🤖 Places where you can learn robotics (and stuff like that) online 🤖
learning computer-science udacity algorithm university algorithms robotics coursera ros edx self-learning moocs

Check out our latest news and subscribe to our mailing list to get the latest updates. LG Electronics America R&D Lab has developed an HDRP Unity-based multi-robot simulator for autonomous vehicle developers. We provide an out-of-the-box solution which can meet the needs of developers wishing to focus on testing their autonomous vehicle algorithms. It currently has integration with The Autoware Foundation's Autoware.auto and Baidu's Apollo platforms, can generate HD maps, and can be used immediately for testing and validation of a whole system with little need for custom integrations. We hope to build a collaborative community among robotics and autonomous vehicle developers by open-sourcing our efforts.
api machine-learning simulator reinforcement-learning computer-vision deep-learning game-engine unity tensorflow artificial-intelligence ros autonomous self-driving-car unreal-engine baidu 3d airsim autoware carla

This is a ROS package developed for object detection in camera images. You only look once (YOLO) is a state-of-the-art, real-time object detection system. With this ROS package you can run YOLO (v3) on GPU and CPU. The pre-trained model of the convolutional neural network is able to detect pre-trained classes, including those from the VOC and COCO data sets, or you can create a network with your own detection objects. For more information about YOLO, Darknet, available training data and training YOLO, see the following link: YOLO: Real-Time Object Detection. The YOLO packages have been tested under ROS Noetic and Ubuntu 20.04. Note: We also provide branches that work under ROS Melodic, ROS Foxy and ROS2.
computer-vision deep-learning ros yolo object-detection darknet human-detection darknet-ros
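
As a rough illustration of how the detections from this package might be consumed, here is a minimal subscriber sketch; the topic name /darknet_ros/bounding_boxes and the darknet_ros_msgs::BoundingBoxes field names are assumptions based on the package's default configuration and should be checked against your own setup.

```cpp
// Sketch: listen to darknet_ros detections and print each bounding box.
// Topic name and message fields are assumed defaults; verify them against
// your darknet_ros configuration.
#include <ros/ros.h>
#include <darknet_ros_msgs/BoundingBoxes.h>

void boxesCallback(const darknet_ros_msgs::BoundingBoxes::ConstPtr& msg)
{
  for (const auto& box : msg->bounding_boxes)
  {
    ROS_INFO("Detected %s (p=%.2f) at [%ld, %ld, %ld, %ld]",
             box.Class.c_str(), box.probability,
             static_cast<long>(box.xmin), static_cast<long>(box.ymin),
             static_cast<long>(box.xmax), static_cast<long>(box.ymax));
  }
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "detection_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub =
      nh.subscribe("/darknet_ros/bounding_boxes", 10, boxesCallback);
  ros::spin();
  return 0;
}
```
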

Best practices for ROS2 in the making. In the meantime, see the Foxy branch. This is a loose collection of best practices, conventions, and tricks for using the Robot Operating System (ROS). It builds on the official ROS documentation and other resources and is meant as a summary and overview.
robotics best-practices conventions ros

This package provides executables and a small library for handling, evaluating and comparing the trajectory output of odometry and SLAM algorithms. See here for more information about the formats.
benchmark robotics tum mapping metrics evaluation ros slam odometry trajectory kitti euroc

This is a C++ library with a ROS interface to manage two-dimensional grid maps with multiple data layers. It is designed for mobile robotic mapping to store data such as elevation, variance, color, friction coefficient, foothold quality, surface normal, traversability, etc. It is used in the Robot-Centric Elevation Mapping package designed for rough terrain navigation. This is research code; expect it to change often, and any fitness for a particular purpose is disclaimed.
opencv cpp navigation mapping terrain ros elevation pcl rviz height-map costmap grid-map occupancy octomap
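
To give a feel for the layered-map API, here is a small, self-contained sketch based on the library's basic usage pattern; treat the exact header and layer name as assumptions to verify against the grid_map documentation.

```cpp
// Sketch: create a 2D grid map with one layer and fill it cell by cell.
#include <grid_map_core/grid_map_core.hpp>
#include <cmath>

int main()
{
  grid_map::GridMap map({"elevation"});  // one data layer named "elevation"
  map.setFrameId("map");
  map.setGeometry(grid_map::Length(1.2, 2.0), 0.03);  // 1.2 x 2.0 m, 3 cm cells

  for (grid_map::GridMapIterator it(map); !it.isPastEnd(); ++it)
  {
    grid_map::Position position;
    map.getPosition(*it, position);
    // Store a simple height function as the elevation of this cell.
    map.at("elevation", *it) = 0.1 * std::sin(3.0 * position.x()) * position.y();
  }
  return 0;
}
```

Additional layers (e.g. variance or color) can be added alongside "elevation" and are indexed by the same cell iterator, which is the main convenience the layered design provides.
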

To stop reinventing the wheel you need to know about the wheel. This list is an attempt to show the variety of open and free tools in software and hardware development which are useful in professional robotic development. Your contribution is necessary to keep this list alive, increase its quality and expand it. You can read more about its origin and how you can participate in the contribution guide and the related blog post. All new project entries will have a tweet from protontypes.
machine-learning awesome robot cplusplus cpp robotics mapping aerospace point-cloud artificial-intelligence ros lidar self-driving-car awesome-list automotive slam autonomous-driving robotic ros2

This is a fast and robust algorithm to segment point clouds taken with a Velodyne sensor into objects. It works with all available Velodyne sensors, i.e. the 16-, 32- and 64-beam ones. I recommend using a virtual environment in your catkin workspace (<catkin_ws> in this readme) and will assume that you have it set up throughout this readme. Please update your commands accordingly if needed. I will be using pipenv, which you can install with pip.
fast real-time clustering point-cloud range ros lidar depth segmentation pcl catkin velodyne-sensor velodyne depth-image range-image depth-clustering

An updated lidar-inertial odometry package, LIO-SAM, has been open-sourced and is available for testing. You can use the following commands to download and compile the package.
mapping ros isam imu lidar ieee slam velodyne ugv odometry jackal gtsam loam iros lidar-odometry

This repository holds the PX4 flight control solution for drones, with the main applications located in the src/modules directory. It also contains the PX4 Drone Middleware Platform, which provides drivers and middleware to run drones. PX4 is highly portable, OS-independent and supports Linux, NuttX and QuRT out of the box.
uav drone ros px4 pixhawk uas dronecode autopilot mavlink autonomous drones dds hacktoberfest ugv mavros multicopter qgroundcontrol fixed-wing fast-rtps avoidance

Webots is an open-source robot simulator released under the terms of the Apache 2.0 license. It provides a complete development environment to model, program and simulate robots, vehicles and biomechanical systems. You can download pre-compiled binaries for Windows, macOS and Linux of the latest release, as well as older releases and nightly builds.
open-source multi-platform simulator robot ai computer-vision robotics simulation physics-engine ros robots autonomous-vehicles fluid-dynamics 3d-engine robot-simulator webots robotics-simulation simulated-robots
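
As a sketch of what programming a robot in this environment looks like, the controller below steps the simulation and drives a single motor using the standard Webots C++ controller interface; the device name "left wheel motor" is hypothetical and must match a motor defined in your robot model.

```cpp
// Sketch of a minimal Webots C++ controller: step the simulation and
// spin one motor at a constant velocity. The device name is hypothetical.
#include <webots/Robot.hpp>
#include <webots/Motor.hpp>
#include <cmath>  // INFINITY

int main()
{
  webots::Robot robot;
  const int timeStep = static_cast<int>(robot.getBasicTimeStep());

  webots::Motor* motor = robot.getMotor("left wheel motor");
  motor->setPosition(INFINITY);  // switch to velocity control
  motor->setVelocity(2.0);       // rad/s

  while (robot.step(timeStep) != -1)
  {
    // Sensor reads and control logic would go here, once per time step.
  }
  return 0;
}
```
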

Ask questions here. Issues #71 and #7 address this problem. The current known solution is to build the same version of PCL that you have on your system from source, and set the CMAKE_PREFIX_PATH accordingly so that catkin can find it. See this issue for more details.
computer-vision mapping ros lidar pcl slam velodyne 3d pointcloud loam-velodyne loam

This C++14 library provides a framework to create Behavior Trees. It was designed to be flexible, easy to use, reactive and fast. Even though our main use case is robotics, you can use this library to build AI for games, or to replace finite state machines in your application.
games ai state-machine robotics coordination ros behaviortree
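
The sketch below shows the general flavor of the API, assuming a recent 3.x release (behaviortree_cpp_v3 headers); the action name and the XML tree layout are made up for illustration.

```cpp
// Sketch of BehaviorTree.CPP usage (assuming the v3 API): register a
// simple action, build a tree from an XML description, and tick it once.
#include <behaviortree_cpp_v3/bt_factory.h>
#include <iostream>

static const char* xml_text = R"(
<root main_tree_to_execute="MainTree">
  <BehaviorTree ID="MainTree">
    <Sequence name="root_sequence">
      <SaySomething/>
      <SaySomething/>
    </Sequence>
  </BehaviorTree>
</root>
)";

int main()
{
  BT::BehaviorTreeFactory factory;

  // A "simple action" wraps a plain callable as a leaf node.
  factory.registerSimpleAction("SaySomething", [](BT::TreeNode&) {
    std::cout << "Hello from the tree" << std::endl;
    return BT::NodeStatus::SUCCESS;
  });

  auto tree = factory.createTreeFromText(xml_text);
  tree.tickRoot();  // tick the root node once
  return 0;
}
```

Defining the tree structure in XML while registering node implementations in code is what makes the library easy to rearrange without recompiling the leaf logic.
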

Autoware is the world's first "all-in-one" open-source software for self-driving vehicles. The capabilities of Autoware are primarily well-suited for urban cities, but highways, freeways, mesomountaineous regions, and geofenced areas can also be covered. The code base of Autoware is protected by the Apache 2 License. Please use it at your own discretion. For safe use, we provide a ROSBAG-based simulation environment for those who do not own real autonomous vehicles. If you plan to use Autoware with real autonomous vehicles, please formulate safety measures and an assessment of risk before field testing. You may refer to the Autoware Wiki for the Users Guide and Developers Guide.
planner detection ros calibration autonomous-vehicles 3d-map autoware autoware-developers tier-iv

The MSCKF_VIO package is a stereo version of MSCKF. The software takes in synchronized stereo images and IMU messages and generates real-time 6DOF pose estimation of the IMU frame. The software is tested on Ubuntu 16.04 with ROS Kinetic.
ros visual-inertial-odometry stereo-vision