ann-visualizer - A python library for visualizing Artificial Neural Networks (ANN)

  •    Python

A visualization library for Python, designed to work with Keras. It uses Python's graphviz library to create a presentable graph of the neural network you are building. This library is still unstable; please report all bugs to the issues section. It is currently tested with Python 3.5 and 3.6, but it should run fine on any Python 3.

genann - simple neural network library in ANSI C

  •    C

Genann is a minimal, well-tested library for training and using feedforward artificial neural networks (ANN) in C. Its primary focus is on being simple, fast, reliable, and hackable. It achieves this by providing only the necessary functions and little extra. Genann is self-contained in two files: genann.c and genann.h. To use Genann, simply add those two files to your project.

Haystack - Build a natural language interface for your data

  •    Python

Haystack is an end-to-end framework that enables you to build powerful and production-ready pipelines for different search use cases. Whether you want to perform Question Answering or semantic document search, you can use the State-of-the-Art NLP models in Haystack to provide unique search experiences and allow your users to query in natural language. Haystack is built in a modular fashion so that you can combine the best technology from other open-source projects like Huggingface's Transformers, Elasticsearch, or Milvus.

Hopfield Simulator

  •    DotNet

This is a Hopfield neural network simulator. It shows how a Hopfield neural network works as a recurrent NN. At the moment a GUI still needs to be developed for this project, and fans of AI algorithms are invited to contribute. The project is developed in C++ and C# using Visual Studio 2008.

cppnjs - Compositional Pattern Producing Networks, now in a javascript library near you!

  •    Javascript

Tests are run through Mocha and should.js. Activation functions and CPPNs are tested against the HyperNEAT C# code from Sebastian Risi's work. This library is used by the neatjs and winjs libraries as well, usually as an NPM module, but it can also be used inside HTML.

neatjs - A javascript implementation of Neuro Evolution of Augmenting Topologies

  •    Javascript

NeuroEvolution of Augmenting Topologies (NEAT) implemented in JavaScript, with tests written in Mocha for verification. It can be used as a Node module or in a browser.

SUDL - light deep neural network tools box(LSTM,GRU,RNN,CNN,Bi-LSTM,etc)

  •    C++

The network architecture is built from a proto file that you define, as shown in the examples.
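
As a rough illustration of this configuration style, a proto-style layer definition might look like the fragment below. The field names here are hypothetical, not SUDL's actual schema; consult the examples in the repository for the real format.

```
# Hypothetical proto-style net definition (field names are illustrative,
# not SUDL's real schema -- see the repository examples for the actual format).
layer {
  name: "lstm1"
  type: "LSTM"
  input_dim: 128
  hidden_dim: 64
}
layer {
  name: "fc1"
  type: "FullConnected"
  input_dim: 64
  output_dim: 10
}
```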

cerebrum - Artificial Neural Networks (ANN's) in Ruby (unmaintained, unfortunately)

  •    Ruby

Cerebrum is an implementation of ANNs in Ruby. There is no practical reason to train a neural network in Ruby; the author uses it to experiment and play around with the bare fundamentals of ANNs. The original project this work is based on is currently unmaintained, and the extensions on top of it are personal experimentation. Use Cerebrum#train to train the network with an array of training data.
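
Cerebrum's training data is an array of input/output hashes. As a self-contained sketch of that data format and of the kind of gradient-descent loop such a library runs (a single sigmoid neuron here, not Cerebrum's actual internals):

```ruby
# A single sigmoid neuron trained by gradient descent on Cerebrum-style
# training data (an array of { input:, output: } hashes).
# Illustrative sketch only -- not Cerebrum's real implementation.
class TinyNeuron
  def initialize(n_inputs)
    @weights = Array.new(n_inputs, 0.0)
    @bias = 0.0
  end

  def run(inputs)
    sum = @bias
    inputs.each_with_index { |x, i| sum += @weights[i] * x }
    1.0 / (1.0 + Math.exp(-sum))
  end

  def train(data, epochs: 5000, rate: 1.0)
    epochs.times do
      data.each do |example|
        out = run(example[:input])
        err = example[:output][0] - out
        delta = err * out * (1.0 - out)   # sigmoid derivative
        example[:input].each_with_index { |x, i| @weights[i] += rate * delta * x }
        @bias += rate * delta
      end
    end
  end
end

# Logical OR -- linearly separable, so a single neuron suffices.
data = [
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [1] },
]
neuron = TinyNeuron.new(2)
neuron.train(data)
```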

python-neuron - Neuron class provides LNU, QNU, RBF, MLP, MLP-ELM neurons

  •    Python

The Neuron class provides LNU (Linear Neural Unit), QNU (Quadratic Neural Unit), RBF (Radial Basis Function), MLP (Multi Layer Perceptron), and MLP-ELM (Multi Layer Perceptron - Extreme Learning Machine) neurons learned with gradient descent or the Levenberg–Marquardt algorithm. This class is suitable for prediction on time series. The Neuron class needs pandas and numpy to work properly.
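
As a minimal sketch of the simplest of these units, an LNU's output is just a weighted sum of inputs plus a bias, and gradient descent on squared error updates the weights. This plain-NumPy illustration is not the python-neuron class itself:

```python
import numpy as np

# Minimal LNU (Linear Neural Unit) sketch: output = w . x + b, trained by
# gradient descent on squared error. Illustrative only -- this is not the
# python-neuron Neuron class.
def train_lnu(X, y, epochs=2000, rate=0.05):
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = yi - (w @ xi + b)   # prediction error for this sample
            w += rate * err * xi      # gradient-descent update
            b += rate * err
    return w, b

# Learn a simple linear mapping y = 2*x1 - 3*x2 + 1.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = 2 * X[:, 0] - 3 * X[:, 1] + 1
w, b = train_lnu(X, y)
```

The QNU extends the same idea to quadratic terms of the inputs, and the Levenberg–Marquardt option replaces the plain gradient step with a damped least-squares update.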


  •    Matlab

This code helps you classify malignant and benign tumors using neural networks. The example code is in MATLAB (R2016 or higher).

ntnu-som - Using Self-Organizing Maps for Travelling Salesman Problem

  •    Python

Self-organizing maps (SOM), or Kohonen maps, are a type of artificial neural network (ANN) that mixes the concepts of competitive and cooperative neural networks in an interesting way. A SOM behaves as a typical competitive ANN, where the neurons fight for a case. The interesting twist added by Kohonen is that when a neuron wins a case, the prize is shared with its neighbors. Typically, the neighborhood is bigger at the beginning of the training and shrinks over time in order to let the system converge to a solution.

One of the most interesting applications of this technique is the Travelling Salesman Problem, in which we can use a coordinate map and trace a route using the neurons in the ANN. By defining weight vectors as positions in the map, we can iterate over the cities and treat each one as a case that can be won by a single neuron. The neuron that wins a case gets its weight vector updated to be closer to the city, and so do its neighbors. The neurons are placed in a 2D space, but they are only aware of a single dimension in their internal ANN, so they behave like an elastic ring that will eventually fit all the cities in the shortest distance possible.
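
The elastic-ring process described above can be sketched in a few lines of NumPy. This is a simplified illustration, not the ntnu-som implementation itself, and the parameter values are arbitrary:

```python
import numpy as np

# Simplified SOM for TSP: a ring of neurons is pulled toward the cities.
# Each city "case" is won by the nearest neuron, which moves toward the
# city along with its ring neighbours. Sketch only -- not ntnu-som's code.
def som_tsp(cities, n_neurons=None, iterations=2000, seed=0):
    rng = np.random.default_rng(seed)
    if n_neurons is None:
        n_neurons = 8 * len(cities)
    neurons = rng.random((n_neurons, 2))          # ring positions in 2D
    radius, rate = n_neurons / 10.0, 0.8
    for _ in range(iterations):
        city = cities[rng.integers(len(cities))]  # pick a random case
        winner = np.argmin(np.linalg.norm(neurons - city, axis=1))
        # Circular distance along the ring to the winning neuron.
        d = np.abs(np.arange(n_neurons) - winner)
        d = np.minimum(d, n_neurons - d)
        # Gaussian neighbourhood: the winner's prize is shared.
        g = np.exp(-d**2 / (2 * max(radius, 1) ** 2))
        neurons += rate * g[:, None] * (city - neurons)
        radius *= 0.999   # shrink the neighbourhood over time
        rate *= 0.9997    # decay the learning rate
    # Read the route off the ring: order cities by their winning neuron.
    order = np.argsort([np.argmin(np.linalg.norm(neurons - c, axis=1))
                        for c in cities])
    return order

cities = np.array([[0.1, 0.1], [0.9, 0.1], [0.9, 0.9], [0.1, 0.9]])
route = som_tsp(cities)
```

The returned `order` visits every city exactly once; as the neighbourhood shrinks, the ring tightens around the cities and the visiting order approximates a short tour.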
