Donkeycar is a minimalist and modular self-driving library for Python. It is developed for hobbyists and students with a focus on fast experimentation and easy community contributions. After building a Donkey2, you can turn on your car and go to http://localhost:8887 to drive.
self-driving-car raspberry-pi tensorflow keras

Apollo is a high-performance, flexible architecture that accelerates the development, testing, and deployment of autonomous vehicles. For business and partnership inquiries, please visit our website.
apollo autonomous-vehicles autonomous-driving autonomy self-driving-car

CARLA is an open-source simulator for autonomous driving research. CARLA has been developed from the ground up to support development, training, and validation of autonomous urban driving systems. In addition to open-source code and protocols, CARLA provides open digital assets (urban layouts, buildings, vehicles) that were created for this purpose and can be used freely. The simulation platform supports flexible specification of sensor suites and environmental conditions. If you want to benchmark your model in the same conditions as in our CoRL'17 paper, check out Benchmarking.
simulator autonomous-vehicles autonomous-driving research ai artificial-intelligence computer-vision deep-learning deep-reinforcement-learning imitation-learning self-driving-car ue4 unreal-engine-4 cross-platform

AirSim is a simulator for drones (and soon other vehicles) built on Unreal Engine. It is open-source, cross-platform, and supports hardware-in-the-loop with popular flight controllers such as PX4 for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment you want.
drones ai self-driving-car autonomous-vehicles autonomous-quadcoptor research computer-vision artificial-intelligence deeplearning deep-reinforcement-learning control-systems pixhawk cross-platform platform-independent airsim unreal-engine simulator

This project builds a lane follower from a standard RC car using a Raspberry Pi and a camera. The software is a simple convolutional network that takes in the image fetched from the camera and outputs the steering angle. During data collection, the steering PWM of the car is simply hooked to pin GPIO17. The script raspberry_pi/collect_data.py records the values of the steering PWM and the associated images. The data of each trial are stored collectively in driving_trial_*; the trial folders are automatically numbered.
raspberry-pi cnn-keras deep-learning machine-learning servo self-driving-car convolutional-networks

XVIZ is a protocol for real-time transfer and visualization of autonomy data. Learn more in the docs and specification. You need Node.js and yarn to run the examples.
3d robotics self-driving-car protocol

Burro is a platform for small-scale self-driving cars. Using Burro you can build either an RC-style Ackermann-steering car or a differential-steering car (like a three-wheel robot). Depending on your hardware, Burro will automatically select and set up the right kind of vehicle each time you run it, so you may share the same SD card among different vehicles without any changes.
self-driving-car autonomous-vehicles autonomous-car self-driving rc-car raspberry-pi neural-network

This project is developed and maintained by the Microsoft Deep Learning and Robotics Garage Chapter. It is currently a work in progress; we will continue to add more tutorials and scenarios based on requests from our users and the availability of our collaborators.

Autonomous driving has transcended far beyond being a crazy moonshot idea over the last half decade or so. It is quickly becoming the biggest technology of today, one that promises to shape our tomorrow, not unlike when cars first came into existence. Almost every car manufacturer, every big technology company, and a number of very promising startups are working on different aspects of autonomous driving to help shape this revolution. Some of the biggest drivers of this change have been recent advances in software (robotics and deep learning techniques), hardware (GPUs, FPGAs, etc.), and cloud computing. Cloud platforms like Azure have enabled the ingestion and processing of large amounts of data, making it possible for companies to push for levels 4 and 5 of AD autonomy.
deep-learning self-driving-car autonomous-vehicles tensorflow cntk keras airsim autonomous-driving autonomous-driving-cookbook

Welcome to my blog 听雨居. It contains a detailed description of the code here.
machine-learning self-driving-car

The purpose of this project is to build and train a deep neural network that can mimic the behavior of a human driving a car. A simple simulator is used for this purpose. When we drive the car, the simulator stores images and steering angles for training. You can then train the neural network on the stored data and check the results in the simulator's autonomous mode. When you first run the simulator, you'll see a configuration screen asking what size and graphical quality you would like.
deep-learning self-driving-car behavioral-cloning

This is a project for the Udacity Self-Driving Car Nanodegree program. The aim of this project is to control a car in a simulator using a neural network. This implementation uses a convolutional neural network (CNN) with only 63 parameters, yet it performs well on both the training and test tracks. The implementation is in the files drive.py and model.py, and the explanation is in project-3.ipynb. Videos of this neural network in action are here: Track 1, Track 2. A post about this solution is at "Self-driving car in a simulator with a tiny neural network". The connection between the simulator and the controlling neural network is handled automatically.
self-driving-car neural-network keras simulator

This project uses computer vision to label the lanes in a driving video, calculate the curvature of the lane, and estimate the distance of the vehicle from the center of the lane. If you don't already have tools like Jupyter and OpenCV installed, follow the Udacity instructions to configure an Anaconda environment that provides them.
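The curvature and center-offset calculations mentioned above are typically done by fitting a second-order polynomial x = Ay² + By + C to the detected lane pixels (after converting to meters) and evaluating the radius of curvature R = (1 + (2Ay + B)²)^(3/2) / |2A|. A sketch of those two computations (function names and the camera-at-image-center assumption are mine, not the project's):

```python
import numpy as np

def lane_curvature(ys, xs, y_eval):
    """Fit x = A*y^2 + B*y + C to lane pixels (in meters) and return the
    radius of curvature R = (1 + (2*A*y + B)^2)**1.5 / |2*A| at y = y_eval."""
    A, B, _ = np.polyfit(ys, xs, 2)
    return (1 + (2 * A * y_eval + B) ** 2) ** 1.5 / abs(2 * A)

def center_offset(left_x, right_x, image_center_x, meters_per_pixel):
    """Distance of the camera (assumed mounted at the image center) from
    the lane midpoint, in meters; the sign indicates left/right of center."""
    lane_center = (left_x + right_x) / 2.0
    return (image_center_x - lane_center) * meters_per_pixel
```

For a straight lane A approaches zero and R grows without bound, so implementations usually report curvature only when the fit is meaningfully curved.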
udacity lane-finding self-driving-car computer-vision cv2

This is a Keras implementation of the paper "3D Bounding Box Estimation Using Deep Learning and Geometry" (https://arxiv.org/abs/1612.00496).
self-driving-car cnn-keras convolutional-neural-networks deep-learning bounding-boxes regression kitti-dataset

Enhance! A neural network for image super-resolution. This project hit the front page of Hacker News and Trending on GitHub.
artificial-intelligence deep-learning self-driving-car generative-adversarial-network bitcoin blockchain reinforcement-learning evolution-strategies

Applications based on OpenDLV are grouped into UDP multicast sessions belonging to the IPv4 address 225.0.0.X, where X is in the range [1, 254]. All microservices belonging to the same UDP multicast group can communicate with each other; two applications running in different UDP multicast sessions do not see each other and are completely separated. The UDP multicast session is selected using the command-line parameter --cid=111, where 111 would define the UDP multicast address 225.0.0.111. Microservices exchange data using the message Envelope, which contains, besides the actual message to send, further meta-information such as the sent and received timestamps and the point in time when the contained message was actually sampled. All messages are encoded in Google's Protobuf data format (example), adjusted to preserve forwards and backwards compatibility using libcluon's native implementation of Protobuf.
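The --cid to 225.0.0.X mapping and the session separation described above can be illustrated with plain UDP multicast sockets. This is a generic Python stdlib sketch of the session concept, not libcluon's API, and the port number is an arbitrary placeholder:

```python
import socket
import struct

def cid_to_group(cid):
    """Map an OpenDLV conference id (--cid) to its multicast address,
    e.g. cid 111 -> 225.0.0.111; X must lie in [1, 254]."""
    if not 1 <= cid <= 254:
        raise ValueError("cid must be in [1, 254]")
    return "225.0.0.%d" % cid

def join_session(cid, port=12175):
    """Open a UDP socket subscribed to the session's multicast group.
    Plain-sockets illustration only; libcluon handles this (plus the
    Envelope encoding) internally. The port is a hypothetical choice."""
    group = cid_to_group(cid)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    mreq = struct.pack("4sl", socket.inet_aton(group), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock
```

Because each session is a distinct multicast group, processes on different --cid values never receive each other's datagrams, which is the isolation property the text describes.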
opendlv libcluon cpp14 microservice docker self-driving-car autonomous-driving amd64 armhf aarch64 trimble-gps gps nmea oxts-gps velodyne lidar applanix video4linux openh264 h264

Udacity's Self-Driving Car Nanodegree project files and notes. This repository is a compilation of project files and lecture notes for Udacity's Self-Driving Car Engineer Nanodegree program, which I started working on January 19, 2017.
computer-vision sensors localisation udacity car self-driving-car sensor-fusion deep-learning

Metacar is a 2D reinforcement learning environment for autonomous vehicles running in the browser. The project aims to make reinforcement learning more accessible to everyone through solving fun problems. Metacar comes with a set of predefined levels, some harder to address than others. More levels and possible scenarios will be added soon (pedestrians, bikes...). Furthermore, the library lets you create your own levels and personalize the environment to create your desired scenario. You can also take a look at the online demo.
reinforcement-learning self-driving-car autonomous-vehicles pixijs tensorflowjs browser

Teach an AI how to drive.
artificial-intelligence convnetsharp skiasharp machine-learning self-driving-car