burro - Platform for small-scale self-driving vehicles.


Burro is a platform for small-scale self-driving cars. Using Burro you can build either an RC-style Ackermann-steering car or a differential-steering car (like a three-wheel robot). Depending on your hardware, Burro will automatically select and set up the right kind of vehicle each time you run it, so you can share the same SD card among different vehicles without any changes.

https://github.com/yconst/burro
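The auto-configuration idea above can be summarized in a short sketch. The probe and class names below are hypothetical illustrations, not Burro's actual API: the point is only that a hardware check at startup decides which vehicle class gets built.

```python
# A minimal, hypothetical sketch of hardware-based vehicle selection in the spirit of
# Burro's auto-configuration; the probe and class names are not Burro's actual API.

class AckermannVehicle:
    """RC-style car: one steering servo plus a throttle ESC."""
    def drive(self, throttle, steering):
        print(f"servo steering {steering:+.2f}, throttle {throttle:.2f}")

class DifferentialVehicle:
    """Three-wheel-style robot: independent left/right wheel motors."""
    def drive(self, throttle, steering):
        left, right = throttle - steering, throttle + steering
        print(f"left wheel {left:.2f}, right wheel {right:.2f}")

def detect_steering_hardware():
    # Placeholder probe: a real implementation would inspect the I2C bus or GPIO
    # wiring to tell a steering servo controller from a dual H-bridge.
    return "ackermann"

def build_vehicle():
    kind = detect_steering_hardware()
    return AckermannVehicle() if kind == "ackermann" else DifferentialVehicle()

if __name__ == "__main__":
    build_vehicle().drive(throttle=0.3, steering=0.1)
```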

Related Projects

autonomous-rc-car - Autonomous RC car using Raspberry Pi and ANN

  •    Python

This project aims to build an autonomous RC car using supervised learning of a neural network with a single hidden layer. We have not used any machine-learning libraries, since we wanted to implement the neural network from scratch to understand the concepts better. We have modified a remote-controlled car to remove the dependency on the RF remote controller. A Raspberry Pi controls the DC motors via an L293D motor driver IC. You can find a post explaining this project in detail here. Here's a video of the car in action. We will refer to the DC motor controlling the left/right direction as the front motor and the motor controlling the forward/reverse direction as the back motor. Connect the BACK_MOTOR_DATA_ONE and BACK_MOTOR_DATA_TWO GPIO pins (GPIO17 and GPIO27) of the Raspberry Pi to the input pins for Motor 1 (Input 1, Input 2), and the BACK_MOTOR_ENABLE_PIN GPIO pin (GPIO22) to the enable pin for Motor 1 (Enable 1,2) on the L293D motor driver IC. Connect the output pins for Motor 1 (Output 1, Output 2) of the IC to the back motor.
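As a hedged sketch of the wiring just described, the snippet below drives the back motor through the L293D using the RPi.GPIO library. The pin numbers and names follow the project's description (BCM numbering); the actual control logic in the repository may differ.

```python
# Hedged sketch: drive the back motor via the L293D inputs described above.
import RPi.GPIO as GPIO

BACK_MOTOR_DATA_ONE = 17    # L293D Input 1
BACK_MOTOR_DATA_TWO = 27    # L293D Input 2
BACK_MOTOR_ENABLE_PIN = 22  # L293D Enable 1,2

GPIO.setmode(GPIO.BCM)
GPIO.setup([BACK_MOTOR_DATA_ONE, BACK_MOTOR_DATA_TWO, BACK_MOTOR_ENABLE_PIN], GPIO.OUT)

def forward():
    """Spin the back motor forward: one input high, the other low, enable on."""
    GPIO.output(BACK_MOTOR_DATA_ONE, GPIO.HIGH)
    GPIO.output(BACK_MOTOR_DATA_TWO, GPIO.LOW)
    GPIO.output(BACK_MOTOR_ENABLE_PIN, GPIO.HIGH)

def stop():
    """Disable the H-bridge so the motor coasts."""
    GPIO.output(BACK_MOTOR_ENABLE_PIN, GPIO.LOW)
```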

self-driving-toy-car - A self driving toy car using end-to-end learning

  •    Jupyter

The goal is to make a lane follower based on a standard RC car using a Raspberry Pi and a camera. The software is a simple convolutional network, which takes in the image fetched from the camera and outputs the steering angle. During data collection, we simply hook the steering PWM of the car to pin GPIO17. The script raspberry_pi/collect_data.py records the values of the steering PWM and the associated images. The data from each trial are stored together in driving_trial_*; the trial folders are automatically numbered.
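A minimal sketch of the end-to-end idea follows: a small convolutional network that maps a camera frame to a single steering value. The layer sizes and input resolution are illustrative assumptions, not the repository's exact architecture.

```python
# Illustrative end-to-end model: camera frame in, steering value out.
import tensorflow as tf

def build_model(input_shape=(66, 200, 3)):
    return tf.keras.Sequential([
        tf.keras.layers.Rescaling(1.0 / 255, input_shape=input_shape),
        tf.keras.layers.Conv2D(24, 5, strides=2, activation="relu"),
        tf.keras.layers.Conv2D(36, 5, strides=2, activation="relu"),
        tf.keras.layers.Conv2D(48, 5, strides=2, activation="relu"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),  # predicted steering (e.g. a normalized PWM value)
    ])

model = build_model()
model.compile(optimizer="adam", loss="mse")
# model.fit(images, steering_values, epochs=10)  # pairs read from the driving_trial_* folders
```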

self-driving-car - The Udacity open source self-driving car project

  •    Jupyter

At Udacity, we believe in democratizing education. How can we provide opportunity to everyone on the planet? We also believe in teaching really amazing and useful subject matter. When we decided to build the Self-Driving Car Nanodegree program, to teach the world to build autonomous vehicles, we instantly knew we had to tackle our own self-driving car too. You can read more about our plans for this project.

carla - Open-source simulator for autonomous driving research.

  •    C++

CARLA is an open-source simulator for autonomous driving research. CARLA has been developed from the ground up to support development, training, and validation of autonomous urban driving systems. In addition to open-source code and protocols, CARLA provides open digital assets (urban layouts, buildings, vehicles) that were created for this purpose and can be used freely. The simulation platform supports flexible specification of sensor suites and environmental conditions. If you want to benchmark your model in the same conditions as in our CoRL’17 paper, check out Benchmarking.
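The "flexible specification of sensor suites" is done through CARLA's Python client. The hedged sketch below connects to a running simulator, spawns a vehicle, and attaches an RGB camera; the call names follow the published carla API, but camera placement and autopilot use are illustrative and details vary by release.

```python
# Hedged sketch of the CARLA Python client: spawn a vehicle and attach a camera.
import carla

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

blueprints = world.get_blueprint_library()
vehicle_bp = blueprints.filter("vehicle.*")[0]
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(vehicle_bp, spawn_point)
vehicle.set_autopilot(True)

camera_bp = blueprints.find("sensor.camera.rgb")
camera = world.spawn_actor(camera_bp,
                           carla.Transform(carla.Location(x=1.5, z=2.4)),
                           attach_to=vehicle)
camera.listen(lambda image: image.save_to_disk("out/%06d.png" % image.frame))
```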

apollo - An open autonomous driving platform

  •    C++

Apollo is a high-performance, flexible architecture that accelerates the development, testing, and deployment of autonomous vehicles. For business and partnership inquiries, please visit our website.


AirSim - Open source simulator based on Unreal Engine for autonomous vehicles from Microsoft AI & Research

  •    C++

AirSim is a simulator for drones (and soon other vehicles) built on Unreal Engine. It is open-source, cross-platform, and supports hardware-in-the-loop with popular flight controllers such as PX4 for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment you want.
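For ground vehicles, AirSim also exposes a Python client. The hedged sketch below sends throttle/steering commands to a simulated car; field names follow the published `airsim` client API, but check the repository's docs for the version you run.

```python
# Hedged sketch: drive an AirSim car from Python (requires an Unreal env running AirSim).
import airsim

client = airsim.CarClient()
client.confirmConnection()
client.enableApiControl(True)

controls = airsim.CarControls()
controls.throttle = 0.4   # gentle forward throttle
controls.steering = 0.0   # straight ahead
client.setCarControls(controls)

state = client.getCarState()
print("speed (m/s):", state.speed)
```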

donkey - self driving car

  •    Python

Donkeycar is a minimalist and modular self-driving library for Python. It is developed for hobbyists and students, with a focus on allowing fast experimentation and easy community contributions. After building a Donkey2 you can turn on your car and go to http://localhost:8887 to drive.
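Donkeycar's modularity comes from a part/vehicle loop: parts are added to a vehicle with named inputs and outputs, and the loop runs them at a fixed rate. The sketch below follows that documented pattern, but the ConstantThrottle part is a made-up example, not one that ships with donkeycar.

```python
# Hedged sketch of Donkeycar's part/vehicle loop with a toy custom part.
import donkeycar as dk

class ConstantThrottle:
    """Toy part: emits a fixed throttle value on every loop iteration."""
    def run(self):
        return 0.2

V = dk.vehicle.Vehicle()
V.add(ConstantThrottle(), outputs=["throttle"])
V.start(rate_hz=20, max_loop_count=100)  # run the drive loop at 20 Hz, then stop
```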

How_to_simulate_a_self_driving_car - This is the code for "How to Simulate a Self-Driving Car" by Siraj Raval on Youtube

  •    Python

This is the code for this video on YouTube by Siraj Raval. We're going to use Udacity's self-driving car simulator as a testbed for training an autonomous car. You need Anaconda or Miniconda to use the provided environment settings.

LightNet - LightNet: Light-weight Networks for Semantic Image Segmentation (Cityscapes and Mapillary Vistas Dataset)

  •    Python

This repository contains the code (in PyTorch) for "LightNet: Light-weight Networks for Semantic Image Segmentation" (underway) by Huijun Liu @ TU Braunschweig. Semantic segmentation is a significant part of a modern autonomous driving system, as an exact understanding of the surrounding scene is very important for the navigation and driving decisions of a self-driving car. Nowadays, deep fully convolutional networks (FCNs) have a very significant effect on semantic segmentation, but most of the relevant research has focused on improving segmentation accuracy rather than model computation efficiency. However, autonomous driving systems are often based on embedded devices, where computing and storage resources are relatively limited. In this paper we describe several light-weight networks based on MobileNetV2, ShuffleNet, and Mixed-scale DenseNet for the semantic image segmentation task. Additionally, we introduce GAN-based data augmentation [17] (pix2pixHD), concurrent Spatial-Channel Squeeze & Excitation (SCSE), and Receptive Field Block (RFB) to the proposed network. We measure our performance on Cityscapes pixel-level segmentation, and achieve up to 70.72% class mIoU and 88.27% category mIoU. We evaluate the trade-offs between mIoU and the number of operations measured by multiply-adds (MAdd), as well as the number of parameters.
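One of the components mentioned above, the concurrent Spatial-and-Channel Squeeze & Excitation (SCSE) block, is small enough to sketch directly. The PyTorch snippet below is a generic SCSE implementation; the channel count and reduction ratio are illustrative, not LightNet's exact configuration.

```python
# Hedged PyTorch sketch of a concurrent Spatial-and-Channel Squeeze & Excitation block.
import torch
import torch.nn as nn

class SCSEBlock(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        # Channel SE: global pool -> bottleneck MLP -> per-channel gate.
        self.cse = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial SE: 1x1 conv -> per-pixel gate.
        self.sse = nn.Sequential(nn.Conv2d(channels, 1, kernel_size=1), nn.Sigmoid())

    def forward(self, x):
        return x * self.cse(x) + x * self.sse(x)

features = torch.randn(1, 64, 32, 32)
print(SCSEBlock(64)(features).shape)  # torch.Size([1, 64, 32, 32])
```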

self-driving-car-sim - A self-driving car simulator built with Unity

  •    CSharp

This simulator was built for Udacity's Self-Driving Car Nanodegree, to teach students how to train cars to navigate road courses using deep learning. See more project details here. All the assets in this repository require Unity. Please follow the instructions below for the full setup.

CarND-Path-Planning-Project - Create a path planner that is able to navigate a car safely around a virtual highway

  •    C++

You can download the Term3 Simulator, which contains the Path Planning Project, from the [releases tab](https://github.com/udacity/self-driving-car-sim/releases/tag/T3_v1.2). In this project your goal is to safely navigate around a virtual highway with other traffic that is driving within +-10 MPH of the 50 MPH speed limit. You will be provided with the car's localization and sensor fusion data, as well as a sparse map list of waypoints around the highway. The car should try to go as close as possible to the 50 MPH speed limit, which means passing slower traffic when possible; note that other cars will try to change lanes too. The car should avoid hitting other cars at all costs, and should drive inside the marked road lanes at all times unless going from one lane to another. The car should be able to make one complete loop around the 6946m highway. Since the car is trying to go 50 MPH, it should take a little over 5 minutes to complete one loop. The car should also not experience total acceleration over 10 m/s^2 or jerk greater than 10 m/s^3.
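The numbers above can be sanity-checked, and the acceleration budget suggests the usual incremental speed update seen in student solutions. The sketch below is illustrative arithmetic only, not the project's planner; the 0.02 s cycle time and the half-budget heuristic are assumptions.

```python
# Back-of-the-envelope check of the loop time, plus a toy acceleration-limited speed step.
TRACK_LENGTH_M = 6946.0
SPEED_LIMIT_MPH = 50.0
MPH_TO_MPS = 0.44704

loop_time_s = TRACK_LENGTH_M / (SPEED_LIMIT_MPH * MPH_TO_MPS)
print(f"one loop at the limit: {loop_time_s / 60:.1f} minutes")  # ~5.2 minutes

def step_speed(current_mps, target_mps, dt=0.02, max_accel=10.0):
    """Move toward the target speed while staying well inside the 10 m/s^2 budget."""
    max_delta = max_accel * dt * 0.5          # use only half the budget per cycle
    delta = max(-max_delta, min(max_delta, target_mps - current_mps))
    return current_mps + delta

speed = 0.0
for _ in range(10):
    speed = step_speed(speed, SPEED_LIMIT_MPH * MPH_TO_MPS)
print(f"speed after 10 planner cycles: {speed:.2f} m/s")
```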

CarND-LaneLines-P1 - Lane Finding Project for Self-Driving Car ND

  •    Jupyter

When we drive, we use our eyes to decide where to go. The lines on the road that show us where the lanes are act as our constant reference for where to steer the vehicle. Naturally, one of the first things we would like to do in developing a self-driving car is to automatically detect lane lines using an algorithm. In this project you will detect lane lines in images using Python and OpenCV. OpenCV stands for "Open-Source Computer Vision", a package that has many useful tools for analyzing images.
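A hedged sketch of the classic pipeline this project teaches is shown below: grayscale, blur, Canny edges, a region-of-interest mask, then a probabilistic Hough transform. The parameter values are typical starting points, not the project's tuned values.

```python
# Illustrative lane-line pipeline with OpenCV; parameters are starting points only.
import cv2
import numpy as np

def find_lane_lines(bgr_image):
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)

    # Keep only a trapezoidal region in front of the car.
    h, w = edges.shape
    mask = np.zeros_like(edges)
    region = np.array([[(0, h), (w // 2 - 50, h // 2 + 50),
                        (w // 2 + 50, h // 2 + 50), (w, h)]], dtype=np.int32)
    cv2.fillPoly(mask, region, 255)
    masked = cv2.bitwise_and(edges, mask)

    # Probabilistic Hough transform returns line segments as (x1, y1, x2, y2).
    return cv2.HoughLinesP(masked, 2, np.pi / 180, threshold=50,
                           minLineLength=40, maxLineGap=20)
```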

CarND-Capstone

  •    CMake

This is the project repo for the final project of the Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car. For more information about the project, see the project introduction here. Please use one of the two installation options, either native or docker installation.

Applying_EANNs - A 2D Unity simulation in which cars learn to navigate themselves through different courses

  •    ASP

Cars have to navigate through a course without touching the walls or any other obstacles of the course. A car has five front-facing sensors which measure the distance to obstacles in a given direction. The readings of these sensors serve as the input of the car's neural network. Each sensor points into a different direction, covering a front facing range of approximately 90 degrees. The maximum range of a sensor is 10 unity units. The output of the Neural Network then determines the car’s current engine and turning force. If you would like to tinker with the parameters of the simulation, you can do so in the Unity Editor. If you would simply like to run the simulation with default parameters, you can start the built file [Builds/Applying EANNs.exe](Builds/Applying EANNs.exe).
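To make the sensor layout concrete, here is an illustrative sketch (in Python rather than the project's C#) of five rays spread over roughly 90 degrees whose clipped, normalized distance readings become the network's inputs. The exact angles and normalization are assumptions.

```python
# Illustrative sensor layout and input normalization for the car's controller network.
import numpy as np

NUM_SENSORS = 5
FOV_DEG = 90.0
MAX_RANGE = 10.0  # "unity units"

# Evenly spaced ray angles from -45 to +45 degrees relative to the car's heading.
sensor_angles = np.linspace(-FOV_DEG / 2, FOV_DEG / 2, NUM_SENSORS)

def network_inputs(distances):
    """Normalize raw hit distances into [0, 1] inputs for the controller network."""
    return np.clip(np.asarray(distances) / MAX_RANGE, 0.0, 1.0)

print(sensor_angles)                              # [-45.  -22.5   0.   22.5  45. ]
print(network_inputs([3.0, 10.0, 7.5, 12.0, 1.0]))
```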

CarND-Vehicle-Detection - Vehicle detection using YOLO in Keras runs at 21FPS

  •    Jupyter

This is a project for the Udacity Self-Driving Car Nanodegree program. The aim of this project is to detect the vehicles in a dash camera video. The implementation of the project is in the file vehicle_detection.ipynb. This implementation is able to achieve 21 FPS without batch processing. The final video output is here. In this README, each step in the pipeline will be explained in detail.
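The throughput claim boils down to a frame-by-frame inference loop. The hedged sketch below times a Keras model over a video; the model path, video path, and input size are placeholders rather than the notebook's actual values, and box decoding is omitted.

```python
# Hedged sketch: measure per-frame inference throughput of a Keras detection model.
import time
import cv2
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("yolo_model.h5")  # hypothetical filename
cap = cv2.VideoCapture("project_video.mp4")          # hypothetical filename

frames, start = 0, time.time()
while True:
    ok, frame = cap.read()
    if not ok:
        break
    inp = cv2.resize(frame, (448, 448))[np.newaxis].astype(np.float32) / 255.0
    model.predict(inp, verbose=0)                    # boxes would be decoded from this output
    frames += 1
print(f"{frames / (time.time() - start):.1f} FPS")
```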

CarND-TensorFlow-Lab - TensorFlow Lab for Self-Driving Car ND

  •    Jupyter

We've prepared a Jupyter notebook that will guide you through the process of creating a single layer neural network in TensorFlow. If you don't have Docker already, download and install Docker from here.
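For orientation, a single-layer network of the kind the notebook builds can be written in a few lines of modern TensorFlow. The dataset and sizes below (a 784-dimensional input and 10 classes) are assumptions for illustration; the lab itself uses lower-level TensorFlow ops.

```python
# Minimal sketch: one dense softmax layer trained with cross-entropy.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,)),
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=5, batch_size=64)
```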

oscc - Open Source Car Control 💻🚗🙌

  •    C++

Open Source Car Control (OSCC) is an assemblage of software and hardware designs that enable computer control of modern cars in order to facilitate the development of autonomous vehicle technology. It is a modular and stable way of using software to interface with a vehicle's communications network and control systems. OSCC enables developers to send control commands to the vehicle, read control messages from the vehicle's OBD-II CAN network, and forward reports of the current vehicle control state, such as steering angle and wheel speeds. Control commands are issued to the vehicle component ECUs via the steering wheel torque sensor, throttle position sensor, and brake position sensor. (Because the gas-powered Kia Soul isn't brake-by-wire capable, an auxiliary actuator is added to enable braking.) This low-level interface means that OSCC offers full-range control of the vehicle without altering the factory safety case, spoofing CAN messages, or hacking ADAS features.
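Reading that OBD-II CAN traffic from a computer is typically a matter of listening on a CAN interface. The hedged sketch below uses the python-can library with a SocketCAN adapter; it does no OSCC-specific decoding, since OSCC's message IDs and payload layouts are defined in its own firmware/API headers.

```python
# Hedged sketch: dump raw CAN frames from a SocketCAN interface (no OSCC decoding).
import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")  # SocketCAN adapter assumed

for _ in range(10):
    frame = bus.recv(timeout=1.0)
    if frame is None:
        continue  # no traffic within the timeout window
    print(f"id=0x{frame.arbitration_id:03X} data={frame.data.hex()}")
```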

Car World

  •    C++

Car World is a car driving simulation/demo using simple mechanics. It uses SDL as a multi-platform windowing toolkit and OpenGL for rendering.

CarND-Extended-Kalman-Filter-Project - Self-Driving Car Nanodegree Program Starter Code for the Extended Kalman Filter Project

  •    C++

In this project you will utilize a Kalman filter to estimate the state of a moving object of interest with noisy lidar and radar measurements. Passing the project requires obtaining RMSE values that are lower than the tolerance outlined in the project rubric. This repository includes two files that can be used to set up and install uWebSocketIO for either Linux or Mac systems. For Windows you can use Docker, VMware, or even Windows 10 Bash on Ubuntu to install uWebSocketIO. Please see this concept in the classroom for the required version and installation scripts.
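Two pieces the project grades you on are the predict/update cycle and the RMSE metric against ground truth. The hedged sketch below shows the linear (lidar-style) case in Python for orientation; the project itself is in C++ and adds the nonlinear radar update that makes the filter "extended", and the matrices here are illustrative.

```python
# Hedged sketch: linear Kalman predict/update steps and the RMSE metric.
import numpy as np

def predict(x, P, F, Q):
    """Propagate state and covariance through the motion model F with process noise Q."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    """Fuse a measurement z with measurement model H and noise R."""
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def rmse(estimates, ground_truth):
    """Root-mean-square error per state component, as used in the project rubric."""
    err = np.asarray(estimates) - np.asarray(ground_truth)
    return np.sqrt((err ** 2).mean(axis=0))
```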