A curated list of awesome quantum machine learning algorithms, study materials, libraries and software (by language).
quantum quantum-computing quantum-programming-language machine-learning artificial-intelligence artificial-neural-networks tensorflow awesome-list awesome machine-learning-algorithms knn-classification fcm kmeans hmm-model qubits ant-colony-optimization ai quantum-ai qml

A Python library built to empower developers to build applications and systems with self-contained Deep Learning and Computer Vision capabilities using just a few simple lines of code. Built with simplicity in mind, ImageAI supports a list of state-of-the-art Machine Learning algorithms for image prediction, custom image prediction, object detection, video detection, video object tracking and image prediction training. ImageAI currently supports image prediction and training using 4 different Machine Learning algorithms trained on the ImageNet-1000 dataset. ImageAI also supports object detection, video detection and object tracking using RetinaNet, YOLOv3 and TinyYOLOv3 trained on the COCO dataset. Eventually, ImageAI will provide support for wider and more specialized aspects of Computer Vision, including but not limited to image recognition in special environments and special fields.
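As a taste of what ImageAI's "few lines of code" looks like, here is a minimal image-prediction sketch. It assumes the classic ImageAI 2.x prediction API (class and method names may differ between versions), and the weights file path and image name are placeholders:

```python
# Hypothetical minimal ImageAI prediction sketch (assumes the ImageAI 2.x API).
from imageai.Prediction import ImagePrediction

predictor = ImagePrediction()
predictor.setModelTypeAsResNet()                 # one of the supported ImageNet-trained models
predictor.setModelPath("resnet50_imagenet.h5")   # placeholder path to the downloaded weights
predictor.loadModel()

# Returns the top-5 predicted ImageNet classes with their probabilities.
predictions, probabilities = predictor.predictImage("sample.jpg", result_count=5)
for name, prob in zip(predictions, probabilities):
    print(name, prob)
```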
artificial-intelligence machine-learning prediction image-prediction python3 offline-capable imageai artificial-neural-networks algorithm image-recognition object-detection squeezenet densenet video inceptionv3 detection gpu ai-practice-recommendations

Genann is a minimal, well-tested library for training and using feedforward artificial neural networks (ANN) in C. Its primary focus is on being simple, fast, reliable, and hackable. It achieves this by providing only the necessary functions and little extra. Genann is self-contained in two files: genann.c and genann.h. To use Genann, simply add those two files to your project.
backpropagation genetic-algorithm artificial-neural-networks ann neurons hidden-layers neural-network neural-networks neural ansi tiny

Trending deep learning Github repositories can be found here. Hint: This will be updated regularly.
deep-learning deep-neural-networks deep-reinforcement-learning convolutional-neural-networks recurrent-neural-networks stargazers-count artificial-neural-networks artificial-intelligence machine-learning top-repositories

Cars have to navigate through a course without touching the walls or any other obstacles of the course. A car has five front-facing sensors which measure the distance to obstacles in a given direction. The readings of these sensors serve as the input of the car's neural network. Each sensor points in a different direction, covering a front-facing range of approximately 90 degrees. The maximum range of a sensor is 10 Unity units. The output of the neural network then determines the car's current engine and turning force. If you would like to tinker with the parameters of the simulation, you can do so in the Unity Editor. If you would simply like to run the simulation with default parameters, you can start the built file [Builds/Applying EANNs.exe](Builds/Applying EANNs.exe).
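The project itself is a Unity/C# simulation; purely as an illustrative sketch (not the repository's code), the sensor-to-control mapping it describes looks roughly like the following, where the layer sizes and activation are assumptions:

```python
import numpy as np

# Illustrative sketch only: a tiny feedforward network mapping the car's
# 5 sensor distances to 2 control outputs (engine force, turning force).
rng = np.random.default_rng(0)

W1 = rng.normal(size=(5, 8))   # input layer -> hidden layer (sizes are assumptions)
W2 = rng.normal(size=(8, 2))   # hidden layer -> [engine force, turning force]

def drive(sensor_distances):
    x = np.asarray(sensor_distances) / 10.0   # normalize by the 10-unit sensor range
    hidden = np.tanh(x @ W1)
    engine, turning = np.tanh(hidden @ W2)    # outputs in [-1, 1]
    return engine, turning

print(drive([3.2, 7.5, 10.0, 6.1, 2.8]))
```

In the actual project the weights are not trained by backpropagation but evolved with a genetic algorithm, which is why the tags mention evolutionary algorithms.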
artificial-neural-networks neural-networks evolutionary-algorithms genetic-algorithm deep-learning machine-learning self-driving-cars

About a year ago, it was officially announced that support for the Theano library would be discontinued. New features are no longer being added, and soon bug fixes will stop as well. NeuPy cannot evolve with a large number of features that depend on a dead library. For this reason, NeuPy has been moved to TensorFlow. All the Theano-based code has been fully migrated to TensorFlow and can be tested from the release/v0.7.0 branch.
deep-learning deep-neural-networks deeplearning neural-network artificial-neural-networks neupy theano

DLL is a library that aims to provide a C++ implementation of Restricted Boltzmann Machines (RBM) and Deep Belief Networks (DBN), as well as their convolutional versions. It also has support for some more standard neural networks. Note: When you clone the library, you need to clone the submodules as well, using the --recursive option.
c-plus-plus cpp cpp11 cpp14 performance machine-learning deep-learning artificial-neural-networks gpu rbm cpu convolutional-neural-networks

A curated list of awesome awesomeness about artificial intelligence (AI). If you want to contribute to this list (please do), send me a pull request.
natural-language-processing computer-vision deep-learning artificial-intelligence neural-networks artificial-neural-networks machine-l

This project is an experimental wrapper of the TensorFlow C API which enables Machine Learning in Server Side Swift. This package builds with Swift Package Manager and is part of the Perfect project, but can also be used as an independent module.
machine-learning artificial-intelligence artificial-neural-networks swift perfect matrix-library tensorflow deep-learning

Training AI machine learning models on the Fashion MNIST dataset. Fashion-MNIST is a dataset consisting of 70,000 images (60k training and 10k test) of clothing objects, such as shirts, pants, shoes, and more. Each example is a 28x28 grayscale image, associated with a label from 10 classes. The 10 classes are listed below.
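The repository itself experiments with several models (the tags mention R, SVM and XGBoost); purely as an illustrative sketch rather than the repository's code, the dataset can be loaded and a small classifier trained in a few lines of Keras. The class names in the comment are the dataset's standard labels:

```python
import tensorflow as tf

# Standard Fashion-MNIST labels: T-shirt/top, Trouser, Pullover, Dress, Coat,
# Sandal, Shirt, Sneaker, Bag, Ankle boot.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0    # scale pixel values to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),           # one 28x28 grayscale image per example
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"), # one output per clothing class
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```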
mnist fashion dataset fashion-mnist machine-learning artificial-intelligence artificial-neural-networks support-vector-machines svm xgboost data-science r supervised-learning classification image-recognition image-classification

NeuroEvolution of Augmenting Topologies (NEAT) implemented in JavaScript (with tests done in Mocha for verification). Can be used as a node module or in a browser.
neural-network nn artificial-neural-networks ann cppn neat hyperneat

Simple neural network examples in 8 lines of APL.
machine-learning artificial-neural-networks apl j

Repositories with 50,000 stars or more are excluded. Top deep learning GitHub repositories can be found here.
deep-learning deep-neural-networks trending-repositories convolutional-neural-networks recurrent-neural-networks deep-reinforcement-learning stargazers-count artificial-neural-networks artificial-intelligence machine-learning

During the time that I was writing my bachelor's thesis Sequence-to-Sequence Learning of Financial Time Series in Algorithmic Trading (in which I used LSTM-based RNNs for modeling the thesis problem), I became interested in natural language processing. After reading Andrej Karpathy's blog post titled The Unreasonable Effectiveness of Recurrent Neural Networks, I decided to give text generation using LSTMs for NLP a go. Although slightly trivial, the project still comprises an interesting program and demo, and gives really interesting (and sometimes very funny) results. I implemented the program over the course of a weekend in Hy (a LISP built on top of Python) using Keras and TensorFlow. You can train the model on any text sources you like. Remember to give it enough time to go over at least fifty epochs, otherwise the generated text will not be very interesting and will read as seemingly random garbage.
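Since the project is written in Hy but sits on top of Keras and TensorFlow, the core idea translates directly into plain Python. The following is a rough, illustrative character-level LSTM training sketch, not the author's code; the corpus file name, sequence length and layer sizes are assumptions:

```python
import numpy as np
import tensorflow as tf

# Illustrative character-level LSTM text-generation sketch in plain Keras
# (the original project is written in Hy).
text = open("corpus.txt", encoding="utf-8").read()      # placeholder text source
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

seq_len = 40                                             # assumed context window
X = np.array([[char_to_idx[c] for c in text[i:i + seq_len]]
              for i in range(len(text) - seq_len)])
y = np.array([char_to_idx[text[i + seq_len]] for i in range(len(text) - seq_len)])

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(seq_len,)),
    tf.keras.layers.Embedding(len(chars), 64),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, batch_size=128, epochs=50)   # the author suggests at least fifty epochs
```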
lstm lstm-neural-networks rnn tensorflow tensorflow-experiments keras text-generation natural-language-processing nlp-machine-learning machine-learning lisp hylang keras-neural-networks artificial-intelligence artificial-neural-networks recurrent-neural-networks

This is my bachelor's thesis that I wrote over the course of two months during my final year of studies, earning my Bachelor of Science in Computer Science degree. The thesis was co-authored by my good friend Tobias Ånhed. Click here for the revised edition on DiVA.
lstm-neural-networks research-paper bachelor-thesis sequence-to-sequence machine-learning finance trading forex algorithmic-trading recurrent-neural-networks forex-trading technical-analysis technical-indicators artificial-neural-networks keras time-series-analysis financial-analysis white-paper publication trading-algorithms

Because the different modules are standalone, you can use pyERA to build a SOM using only the som.py class. Feel free to fork the project and add your own stuff. Any feedback is appreciated. The Epigenetic Robotic Architecture (ERA) is a hybrid behavior-based robotics and neural architecture purposely built to implement embodied principles in cognitive development. This architecture has already been tested in a variety of cognitive and developmental tasks directly modeling child psychology data. The ERA architecture uses a behaviour-based subsumption mechanism to handle the integration of competing sensorimotor input. The learning system is based on an ensemble of pre-trained SOMs connected via Hebbian weights. The basic unit of the ERA architecture is formed by the structured association of multiple self-organizing maps. Each SOM receives a subset of the input available to that unit and is typically partially prestabilized using random input distributed across the appropriate ranges for those inputs. In the simplest case, the ERA architecture comprises multiple SOMs, each receiving input from a different sensory modality, and each with a single winning unit. Each of these winning units is then associated with the winning unit of a special "hub" SOM using a bidirectional connection weighted with positive Hebbian learning.
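To make the hub/Hebbian wiring concrete, here is a conceptual numpy sketch, not the pyERA API: two modality SOMs each pick a winning unit, and the link from each winner to the hub SOM's winner is strengthened by a positive Hebbian update. Map sizes, input dimensions and the learning rate are assumptions:

```python
import numpy as np

# Conceptual sketch (not the pyERA API): two modality SOMs whose winning units are
# associated with a hub SOM's winner through a positive Hebbian weight update.
rng = np.random.default_rng(0)

def winner(som_weights, x):
    """Index of the best-matching unit for input x."""
    return int(np.argmin(np.linalg.norm(som_weights - x, axis=1)))

vision_som = rng.random((25, 3))     # 25 units, 3-dimensional visual input (sizes assumed)
audio_som  = rng.random((25, 2))     # 25 units, 2-dimensional auditory input
hub_hebb   = np.zeros((2, 25, 25))   # Hebbian links: [modality, modality unit, hub unit]

def era_step(visual_x, audio_x, hub_winner, lr=0.1):
    """Strengthen the links between each modality's winner and the hub winner."""
    for m, (som, x) in enumerate([(vision_som, visual_x), (audio_som, audio_x)]):
        hub_hebb[m, winner(som, x), hub_winner] += lr   # positive Hebbian learning

era_step(rng.random(3), rng.random(2), hub_winner=7)
print(hub_hebb.sum())
```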
som artificial-neural-networks self-organizing-map hebbian-learning

The Neuron class provides LNU (Linear Neural Unit), QNU (Quadratic Neural Unit), RBF (Radial Basis Function), MLP (Multi Layer Perceptron) and MLP-ELM (Multi Layer Perceptron - Extreme Learning Machine) neurons trained with gradient descent or the Levenberg–Marquardt algorithm. This class is suitable for prediction on time series. The Neuron class needs pandas and numpy to work properly.
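As a flavor of the simplest of these units, here is an illustrative sketch, not the repository's Neuron class, of a Linear Neural Unit predicting the next sample of a time series from the previous n samples and trained sample-by-sample with plain gradient descent; the window size, learning rate and toy series are assumptions:

```python
import numpy as np

# Illustrative LNU sketch: one-step-ahead prediction of a time series,
# weights updated by gradient descent on the squared prediction error.
n, lr = 4, 0.05
series = np.sin(np.linspace(0, 20, 400))          # toy time series
w = np.zeros(n + 1)                               # weights, including a bias term

for k in range(n, len(series) - 1):
    x = np.concatenate(([1.0], series[k - n:k]))  # bias + last n samples
    y_hat = w @ x                                 # LNU output: a linear combination
    error = series[k] - y_hat
    w += lr * error * x                           # gradient-descent weight update

print("final one-step prediction error:", error)
```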
neurons mlp rbf lnu multi-layer-perceptron mlp-elm qnu artificial-intelligence time-series ann neural-network neural-networks artificial-neural-networks prediction

This repository provides an implementation of the NeuroEvolution of Augmenting Topologies (NEAT) method written in the Go language. Neuroevolution (NE) is the artificial evolution of Neural Networks (NN) using genetic algorithms in order to find optimal NN parameters and topology. Neuroevolution may involve a search for the optimal weights of the connections between NN nodes as well as a search for the optimal topology of the resulting NN. The NEAT method implemented in this work searches for both: optimal connection weights and topology for a given task (the number of NN nodes per layer and their interconnections).
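To illustrate just the weight-evolution half of that idea (this is neither the Go library nor full NEAT, which also evolves topology), a genetic algorithm can search for the weights of a small fixed network; population size, mutation scale and the XOR task below are assumptions:

```python
import numpy as np

# Conceptual sketch: evolve the weights of a fixed 2-2-1 network to solve XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

def fitness(genome):
    W1, b1 = genome[:4].reshape(2, 2), genome[4:6]
    W2, b2 = genome[6:8], genome[8]
    hidden = np.tanh(X @ W1 + b1)
    out = 1 / (1 + np.exp(-(hidden @ W2 + b2)))
    return -np.mean((out - y) ** 2)               # higher fitness = lower error

population = rng.normal(size=(50, 9))             # 9 weights/biases per genome
for generation in range(200):
    scores = np.array([fitness(g) for g in population])
    parents = population[np.argsort(scores)[-10:]]             # keep the 10 fittest genomes
    children = parents[rng.integers(0, 10, 40)] + rng.normal(scale=0.3, size=(40, 9))
    population = np.vstack([parents, children])                # elitism + mutated offspring

print("best fitness:", max(fitness(g) for g in population))
```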
artificial-neural-networks neuroevolution neat augmenting-topologies unsupervised-learning unsupervised-machine-learning neural-network reinforcement-learning-algorithms reinforcement-learning

This repository provides an implementation of Neuro-Evolution of Augmenting Topologies (NEAT) with Novelty Search optimization, implemented in GoLang. Neuro-Evolution (NE) is the artificial evolution of Neural Networks (NN) using genetic algorithms in order to find optimal NN parameters and topology. Neuro-evolution may involve a search for the optimal weights of the connections between NN nodes as well as a search for the optimal topology of the resulting NN. The NEAT method implemented in this work searches for both: optimal connection weights and topology for a given task (the number of NN nodes per layer and their interconnections).
neuroevolution neat novelty-search artificial-neural-networks augmenting-topologies unsupervised-learning unsupervised-machine-learning unsupervised-learning-algorithms reinforcement-learning-algorithms modular-ai explainable-ai explainable-artificial-intelligence

The Machine Learning Model Playgrounds is a project that is part of the dream of the team of Moses Olafenwa and John Olafenwa to bring current capabilities in machine learning and artificial intelligence into practical use for non-programmers and average computer users. This project is the first step in what we hope will become a mainstream application of modern technology, in which computers, smartphones, edge devices and systems will have built-in state-of-the-art Machine Learning and Artificial Intelligence capabilities without having to connect to cloud-based services. The Machine Learning Model Playgrounds is a series of Windows programs built using pure Python libraries and code. Each of the programs is a user-friendly demo of image classification powered by a specific image classification model from among popular Machine Learning algorithms trained on the ImageNet (1000 object classes) dataset. Each program provides a user interface where users can select a picture from their Windows system folders; the program processes the selected picture and gives the top 10 possible results for the objects detected, with a percentage probability for each result. This repository contains the source code, models and builds of each of the programs in the Model Playgrounds series. It is provided to allow other developers outside our team to adapt, modify or extend the code to produce more programs that may be specific to a social, business, economic or scientific need. The dependencies used for this project are listed below:
- Python 3.5.2
- Tensorflow 1.4.0
- Keras 2.0.8
- Numpy 1.13.1
- Scipy 0.19.1
- wxPython 4.0.0
Below you will find the details and pictures of each of the programs in the series. The ResNet Playground is powered by the ResNet50 model trained on the ImageNet dataset. You can find its source code in the resnet-playground folder of this repository or follow this link. You can also download the Windows installer for the program in the Release section of this project or follow this link. This program is 64-bit Windows software that can be installed on Windows 7 and later versions of the operating system. It has an installer size of 227 MB and an install size of 690 MB. The program was compiled using PyInstaller 3.3 for Python 3.5.
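For a sense of what each playground does under the hood, here is an illustrative sketch (not the playground's own code): classify a user-selected picture with an ImageNet-trained ResNet50 in Keras and report the top 10 results with probabilities. The image file name is a placeholder:

```python
import numpy as np
from tensorflow.keras.applications.resnet50 import ResNet50, preprocess_input, decode_predictions
from tensorflow.keras.preprocessing import image

# Illustrative sketch of the core of the ResNet Playground: top-10 ImageNet
# predictions for a selected picture, with a percentage probability per result.
model = ResNet50(weights="imagenet")

img = image.load_img("selected_picture.jpg", target_size=(224, 224))   # placeholder file name
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

preds = model.predict(x)
for _, label, prob in decode_predictions(preds, top=10)[0]:
    print(f"{label}: {prob:.1%}")
```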
machine-learning artificial-intelligence artificial-neural-networks inceptionv3 resnet resnet-50 squeezenet densenet playgrounds inbuilt-api model-playgrounds