gorgonia - Gorgonia is a library that helps facilitate machine learning in Go.

  •    Go

Gorgonia is a library that helps facilitate machine learning in Go. Write and evaluate mathematical equations involving multidimensional arrays easily. If this sounds like Theano or TensorFlow, that's because the idea is quite similar: the library is fairly low-level, like Theano, but has broader goals, like TensorFlow. The main reason to use Gorgonia is developer comfort: if you already use a Go stack extensively, you can build production-ready machine learning systems in an environment you are already familiar and comfortable with.
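Conceptually, a library like this builds a symbolic expression graph first and evaluates it with concrete values afterwards. A rough sketch of that define-then-run idea (in Python, not Gorgonia's actual Go API):

```python
# Toy expression graph: build the equation symbolically, evaluate it later.
# This only illustrates the concept; Gorgonia's real API is very different.

class Node:
    def __init__(self, op, *children, name=None):
        self.op, self.children, self.name = op, children, name

    def eval(self, env):
        if self.op == "var":
            return env[self.name]            # look up a concrete value
        left, right = (c.eval(env) for c in self.children)
        return left + right if self.op == "add" else left * right

x = Node("var", name="x")
y = Node("var", name="y")
z = Node("add", Node("mul", x, x), y)        # z = x*x + y

print(z.eval({"x": 3.0, "y": 2.0}))          # 11.0
```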

Regression - Multiple Regression Package for PHP

  •    PHP

This is a library for regression analysis of data. That is, it attempts to find the line of best fit to describe a relationship within the data. It takes in a series of training observations, each consisting of features and an outcome, and finds how much each feature contributes to the outcome. This library also handles logistic regression, in which the outcomes are booleans. In this case, the regression would give you how much each feature contributes to the probability of the outcome and the prediction process would give you the probability of the outcome for a given new example.
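The logistic case can be illustrated with a small sketch (plain Python, not this PHP library's API): train weights on the log loss with gradient descent, then predict the probability of the outcome for a new example.

```python
import math

# Toy logistic regression trained with per-example gradient descent.
# A concept sketch only; the library's own interface will differ.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(observations, outcomes, lr=0.5, epochs=2000):
    w = [0.0] * len(observations[0])         # one weight per feature
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(observations, outcomes):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y                      # gradient of the log loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Boolean outcome: 1 when the single feature is large.
X = [[0.1], [0.3], [0.7], [0.9]]
y = [0, 0, 1, 1]
w, b = train(X, y)
print(sigmoid(w[0] * 0.8 + b))               # probability for a new example
```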

descent - First-order optimization tools

  •    Python

Descent is a package for performing first-order optimization in Python. Documentation (work in progress) is available at descent.readthedocs.org.
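First-order methods use only gradient information. The simplest member of the family can be sketched as follows (plain gradient descent in generic Python; for descent's actual interface, see its documentation):

```python
# Minimal first-order optimization: step repeatedly against the gradient.
# An illustration of the idea, not descent's API.

def minimize(grad, x0, lr=0.1, steps=200):
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)     # move downhill along the negative gradient
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3).
x_min = minimize(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(round(x_min, 4))           # 3.0
```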

NeuralNetwork - Sequential and computation graph neural networks in C#

  •    CSharp

NeuralNetwork.NET is a .NET Standard 2.0 library that implements sequential and computation graph neural networks with customizable layers, built from scratch in C#. It provides simple APIs, designed for quick prototyping, to define and train models using stochastic gradient descent, as well as methods to save/load a network model and its metadata, and more. The library also exposes CUDA-accelerated layers with more advanced features that leverage the GPU and the cuDNN toolkit to greatly increase performance when training or using a neural network.
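The sequential-model idea is simple to sketch (in plain Python, not NeuralNetwork.NET's C# API): each layer is a function, and the model applies them in order, feeding each layer's output to the next.

```python
# Concept sketch of a sequential network; names and shapes are illustrative.

class Dense:
    def __init__(self, weights, bias):
        self.weights, self.bias = weights, bias
    def __call__(self, x):
        # One output per weight row: dot product plus bias.
        return [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(self.weights, self.bias)]

class ReLU:
    def __call__(self, x):
        return [max(0.0, v) for v in x]

class Sequential:
    def __init__(self, *layers):
        self.layers = layers
    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)         # chain the layers in order
        return x

model = Sequential(Dense([[1.0, -1.0]], [0.0]), ReLU())
print(model([3.0, 1.0]))         # [2.0]
```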

backprop - Heterogeneous automatic differentiation ("backpropagation") in Haskell

  •    Haskell

Automatic heterogeneous back-propagation. Write your functions to compute your result, and the library will automatically generate functions to compute your gradient.
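The core idea — write the forward computation, get the gradient for free — can be sketched with a tiny reverse-mode autodiff in Python (an illustration only; backprop's Haskell API is type-driven and far more general):

```python
# Minimal reverse-mode automatic differentiation on scalars.
# Each Var records its parents and the local derivative along each edge.

class Var:
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)    # chain rule, propagated backwards

x = Var(3.0)
y = Var(2.0)
z = x * x + x * y        # z = x^2 + x*y
z.backward()
print(x.grad, y.grad)    # dz/dx = 2x + y = 8.0, dz/dy = x = 3.0
```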

linear-regression - Linear regression implemented in Ruby.

  •    Ruby

An implementation of the linear regression machine learning algorithm in Ruby. More details about this example implementation can be found in this blog post.
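A minimal sketch of the underlying technique (ordinary least squares for one feature, in Python; the Ruby implementation will differ in detail):

```python
# Fit y = slope * x + intercept by ordinary least squares (closed form).

def fit(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Points lying exactly on the line y = 2x + 1.
slope, intercept = fit([1.0, 2.0, 3.0], [3.0, 5.0, 7.0])
print(slope, intercept)   # 2.0 1.0
```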

Deep-Learning-From-Scratch - Six snippets of code that made deep learning what it is today.

  •    Jupyter

Six snippets of code made deep learning what it is today. Coding the History of Deep Learning, on FloydHub's blog, covers the inventors and the background to their breakthroughs; in this repo you can find all the code samples from the story.

machine-learning-course - R code for the assignments of Coursera machine learning course

  •    R

This repository contains the R versions of the assignments from Prof. Andrew Ng's online machine learning course (MOOC) on Coursera. It provides the starter code to solve the assignments in R; the completed assignments are also available beside each exercise file.

linear-regression-gradient-descent - ⭐️ Linear Regression with Gradient Descent in JavaScript (Unvectorized, Visualized)

  •    Javascript

This example project demonstrates how the gradient descent algorithm may be used to solve a linear regression problem. Read more about it here.
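The unvectorized approach can be sketched as follows (in Python rather than JavaScript): the gradient is accumulated with an explicit loop over the training examples.

```python
# Unvectorized gradient descent for simple linear regression.
# A concept sketch of the project's approach, not its JavaScript code.

def gradient_descent(xs, ys, lr=0.1, iterations=1000):
    m, b = 0.0, 0.0                        # slope and intercept
    n = len(xs)
    for _ in range(iterations):
        grad_m, grad_b = 0.0, 0.0
        for x, y in zip(xs, ys):           # accumulate per-example gradients
            error = (m * x + b) - y
            grad_m += error * x
            grad_b += error
        m -= lr * grad_m / n               # step along the averaged gradient
        b -= lr * grad_b / n
    return m, b

# Data from y = 2x + 1; the fit converges close to those values.
m, b = gradient_descent([1.0, 2.0, 3.0], [3.0, 5.0, 7.0])
print(round(m, 3), round(b, 3))
```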

multivariate-linear-regression-gradient-descent-javascript - ⭐️ Multivariate Linear Regression with Gradient Descent in JavaScript (Vectorized)

  •    Javascript

This example project demonstrates how the gradient descent algorithm may be used to solve a multivariate linear regression problem. Read more about it here.
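The vectorized multivariate case can be sketched with NumPy (rather than JavaScript): the whole-batch gradient becomes a single matrix expression instead of a loop over examples.

```python
import numpy as np

# Vectorized batch gradient descent for multivariate linear regression.
# Illustrative only; the JavaScript project's code is organised differently.

def gradient_descent(X, y, lr=0.1, iterations=2000):
    X = np.hstack([np.ones((X.shape[0], 1)), X])   # prepend a bias column
    theta = np.zeros(X.shape[1])
    n = X.shape[0]
    for _ in range(iterations):
        gradient = X.T @ (X @ theta - y) / n       # whole batch in one expression
        theta -= lr * gradient
    return theta

# Data from y = 1 + 2*x1 + 3*x2; theta converges close to [1, 2, 3].
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 3.0, 4.0, 6.0])
theta = gradient_descent(X, y)
print(np.round(theta, 3))
```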

fmin_adam - Matlab implementation of the Adam stochastic gradient descent optimisation algorithm

  •    Matlab

fmin_adam is a Matlab implementation of the Adam optimisation algorithm from Kingma and Ba [1]: gradient descent with adaptive learning rates for each parameter, plus momentum. It maintains estimates of the first and second moments of the gradient independently for each parameter. Adam is designed for stochastic gradient descent problems, i.e. when only small batches of data are used to estimate the gradient on each iteration, or when stochastic dropout regularisation is used [2].
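The Adam update rule described above can be sketched as follows (plain Python on a scalar, not the fmin_adam Matlab API): per-parameter moment estimates with bias correction.

```python
import math

# Scalar sketch of the Adam update from Kingma and Ba; hyperparameter
# defaults follow the paper's suggestions, lr and steps are illustrative.

def adam(grad, x0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    x = x0
    m = v = 0.0                                  # first and second moments
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g          # momentum on the gradient
        v = beta2 * v + (1 - beta2) * g * g      # running mean of g^2
        m_hat = m / (1 - beta1 ** t)             # bias correction
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Minimize f(x) = (x - 3)^2; the iterate settles near the minimizer x = 3.
x_min = adam(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(x_min)
```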