Gorgonia is a library that helps facilitate machine learning in Go. Write and evaluate mathematical equations involving multidimensional arrays easily. If this sounds like Theano or TensorFlow, that's because the idea is quite similar: the library is fairly low-level, like Theano, but has broader goals, like TensorFlow. The main reason to use Gorgonia is developer comfort: if you already use a Go stack extensively, you can now create production-ready machine learning systems in an environment you are already familiar and comfortable with.
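Gorgonia's own API is Go; purely to illustrate the computation-graph idea it shares with Theano and TensorFlow (define an expression as a graph, then evaluate it and take gradients), here is a toy reverse-mode sketch in Python. Nothing below is Gorgonia's API; all names are invented.

```python
import math

# Toy reverse-mode autodiff: build an expression graph, evaluate it,
# then walk it backwards to accumulate gradients via the chain rule.
class Node:
    def __init__(self, value, parents=()):
        self.value = value        # result of the forward computation
        self.parents = parents    # (parent_node, local_gradient) pairs
        self.grad = 0.0           # filled in by backward()

def add(a, b):
    return Node(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    return Node(a.value * b.value, [(a, b.value), (b, a.value)])

def sin(a):
    return Node(math.sin(a.value), [(a, math.cos(a.value))])

def backward(out):
    # Simple traversal; correct here because only leaf nodes are shared.
    out.grad = 1.0
    stack = [out]
    while stack:
        node = stack.pop()
        for parent, local_grad in node.parents:
            parent.grad += node.grad * local_grad
            stack.append(parent)

x, y = Node(2.0), Node(3.0)
z = add(mul(x, y), sin(x))    # z = x*y + sin(x)
backward(z)
print(z.value)                # 6.909...
print(x.grad, y.grad)         # dz/dx = y + cos(x), dz/dy = x
```

Gorgonia applies the same idea to multidimensional arrays rather than scalars.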
machine-learning artificial-intelligence neural-network computation-graph differentiation gradient-descent gorgonia deep-learning deeplearning deep-neural-networks automatic-differentiation symbolic-differentiation go-library
This is a library for regression analysis of data. That is, it attempts to find the line of best fit to describe a relationship within the data. It takes in a series of training observations, each consisting of features and an outcome, and finds how much each feature contributes to the outcome. The library also handles logistic regression, in which the outcomes are booleans; in that case the regression gives you how much each feature contributes to the probability of the outcome, and prediction gives you the probability of the outcome for a new example.
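The library's own API isn't shown above; as a generic Python sketch of what fitting a logistic regression by gradient descent involves (the data, learning rate, and function names here are hypothetical):

```python
import math

# Logistic regression by batch gradient descent (generic sketch).
# Each observation: feature vector x, boolean outcome y (0 or 1).
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit(X, y, lr=0.1, epochs=1000):
    n_features = len(X[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        # Gradient of the negative log-likelihood over the whole batch.
        dw = [0.0] * n_features
        db = 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j in range(n_features):
                dw[j] += err * xi[j]
            db += err
        w = [wj - lr * dwj / len(X) for wj, dwj in zip(w, dw)]
        b -= lr * db / len(X)
    return w, b

# Tiny toy example: one feature, outcome flips around x = 2.
X = [[0.0], [1.0], [3.0], [4.0]]
y = [0, 0, 1, 1]
w, b = fit(X, y)
print(sigmoid(w[0] * 2.0 + b))  # probability near 0.5 at the boundary x = 2
```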
logistic-regression gradient-descent multiple regression

Input 2D data points and fit a simple linear regression model using gradient descent. Built with PureScript. Playable at lettier.com/simple-linear-regression/. For a full write-up, visit Let's make a Linear Regression Calculator with PureScript.
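For reference, the whole procedure the calculator animates fits in a few lines of any language; a Python sketch with made-up points and learning rate:

```python
# Simple linear regression (y = m*x + b) fitted to 2D points by
# gradient descent on the mean squared error.
points = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]

m, b = 0.0, 0.0
lr = 0.05
for _ in range(5000):
    # d(MSE)/dm and d(MSE)/db, averaged over all points
    grad_m = sum(2 * (m * x + b - y) * x for x, y in points) / len(points)
    grad_b = sum(2 * (m * x + b - y) for x, y in points) / len(points)
    m -= lr * grad_m
    b -= lr * grad_b

print(m, b)   # close to the least-squares line (~1.94, ~0.15)
```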
linear-regression gradient-descent data-science purescript functional-programming press-statistic machine-learning machine-learning-algorithms regression functional artificial-intelligence ai statistics web-development halogen nueral-networks purescript-halogen frontend

Descent is a package for performing first-order optimization in python. Documentation (work in progress) is available at descent.readthedocs.org.
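The blurb doesn't show descent's API; the "proximal" tag refers to methods like proximal gradient descent. A generic Python/numpy sketch of ISTA for an L1-regularised least-squares problem (all data and constants hypothetical, not descent's interface):

```python
import numpy as np

# Proximal gradient descent (ISTA) for the lasso:
#   minimize 0.5*||A x - b||^2 + lam*||x||_1
# Gradient step on the smooth part, then the soft-thresholding
# proximal operator for the L1 term.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 3.0]            # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(50)

lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const of grad

x = np.zeros(20)
for _ in range(500):
    grad = A.T @ (A @ x - b)             # gradient of the smooth part
    z = x - step * grad                  # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # prox step

print(np.round(x, 2))  # mostly zeros, large entries near 1, -2, 3
```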
optimization numerical-methods proximal gradient-descent

NeuralNetwork.NET is a .NET Standard 2.0 library that implements sequential and computation graph neural networks with customizable layers, built from scratch in C#. It provides simple APIs designed for quick prototyping, to define and train models using stochastic gradient descent, as well as methods to save/load a network model and its metadata, and more. The library also exposes CUDA-accelerated layers with more advanced features that leverage the GPU and the cuDNN toolkit to greatly increase performance when training or using a neural network.
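NeuralNetwork.NET's C# APIs aren't quoted above; as a language-neutral sketch of what "define a sequential model and train it with stochastic gradient descent" means, here is a minimal one-hidden-layer network in Python/numpy (architecture, data, and hyperparameters invented for illustration):

```python
import numpy as np

# Minimal sequential network (one tanh hidden layer, sigmoid output)
# trained with plain stochastic gradient descent on XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.standard_normal((2, 8)) * 0.5
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.5
b2 = np.zeros(1)
lr = 0.5

for epoch in range(2000):
    for i in rng.permutation(len(X)):        # one sample at a time: SGD
        x, t = X[i:i+1], y[i:i+1]
        # forward pass
        h = np.tanh(x @ W1 + b1)
        out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
        # backward pass (chain rule; cross-entropy loss with sigmoid
        # output gives the simple error term out - t)
        d_out = out - t
        d_W2 = h.T @ d_out
        d_h = (d_out @ W2.T) * (1 - h ** 2)  # derivative of tanh
        d_W1 = x.T @ d_h
        # SGD update
        W2 -= lr * d_W2; b2 -= lr * d_out.ravel()
        W1 -= lr * d_W1; b1 -= lr * d_h.ravel()

pred = 1 / (1 + np.exp(-(np.tanh(X @ W1 + b1) @ W2 + b2)))
print(np.round(pred.ravel(), 2))   # approximately [0, 1, 1, 0]
```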
neural-network convolutional-neural-networks backpropagation-algorithm gradient-descent machine-learning classification-algorithims cnn supervised-learning ai cuda gpu-acceleration netstandard net-framework visual-studio

The GDLibrary is a pure-Matlab library of a collection of unconstrained optimization algorithms. It solves unconstrained minimization problems of the form min f(x). Note that the SGDLibrary internally contains this GDLibrary.
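GDLibrary's Matlab interface isn't shown here; as a generic illustration of solving min f(x) with the kind of method such a library collects, plain gradient descent with Armijo backtracking on the Rosenbrock function (which also appears in the tags), in Python:

```python
import numpy as np

# Gradient descent with backtracking line search on the Rosenbrock
# function, a standard unconstrained test problem min f(x).
def f(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])

x = np.array([-1.2, 1.0])                  # classic starting point
for _ in range(20000):
    g = grad(x)
    if np.linalg.norm(g) < 1e-6:
        break
    step = 1.0
    # Armijo backtracking: shrink the step until f decreases enough.
    while f(x - step * g) > f(x) - 1e-4 * step * (g @ g):
        step *= 0.5
    x = x - step * g

print(x)   # converges towards the minimum at [1, 1]
```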
optimization optimization-algorithms machine-learning machine-learning-algorithms big-data gradient-descent gradient logistic-regression newton linear-regression svm lasso matrix-completion rosenbrock-problem softmax-regression multinomial-regression statistical-learning classification

The NMFLibrary is a pure-Matlab library of a collection of algorithms of non-negative matrix factorization (NMF).
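As a minimal illustration of what an NMF algorithm does, here is the classic Lee-Seung multiplicative update in Python/numpy; it is one of many algorithms a library like this collects, not NMFLibrary's API, and the data is made up:

```python
import numpy as np

# Non-negative matrix factorization with multiplicative updates:
# approximate a non-negative V by W @ H with W, H >= 0.
rng = np.random.default_rng(0)
V = rng.random((20, 15))      # hypothetical non-negative data
rank = 4

W = rng.random((20, rank))
H = rng.random((rank, 15))
eps = 1e-9                    # avoid division by zero

for _ in range(500):
    # Each update is guaranteed not to increase ||V - W H||_F^2.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

print(np.linalg.norm(V - W @ H))   # reconstruction error after fitting
```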
nmf optimization-algorithms machine-learning-algorithms factorization matrix-factorization big-data data-analysis constrained-optimization gradient-descent matlab matlab-toolbox online-learning stochastic-optimizers stochastic-gradient-descent robust-optimization bigdata sparse-representations orthogonal nonnegative-matrix-factorization nonnegativity-constraints

The SparseGDLibrary is a pure-Matlab library of a collection of unconstrained optimization algorithms for sparse modeling. Run run_me_first for path configurations.
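SparseGDLibrary's own interface isn't shown; the lasso and coordinate-descent tags point at solvers like the following generic Python sketch (data and regularisation strength made up):

```python
import numpy as np

# Cyclic coordinate descent for the lasso:
#   minimize 0.5*||A x - b||^2 + lam*||x||_1
# One coordinate is minimized exactly per step via soft-thresholding.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[[0, 3]] = [2.0, -1.5]        # sparse signal
b = A @ x_true + 0.01 * rng.standard_normal(40)

lam = 0.5
x = np.zeros(10)
col_sq = (A ** 2).sum(axis=0)       # ||a_j||^2 for each column
for _ in range(100):
    for j in range(10):
        # residual with coordinate j's contribution removed
        r = b - A @ x + A[:, j] * x[j]
        rho = A[:, j] @ r
        x[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]

print(np.round(x, 2))   # sparse, with large entries near 2.0 and -1.5
```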
optimization optimization-algorithms machine-learning-algorithms machine-learning big-data gradient-descent sparse-linear-solver sparse-regression lasso-regression lasso elasticnet solver algorithms admm proximal-algorithms proximal-operators logistic-regression matrix-completion coordinate-descent support-vector-machines

Automatic heterogeneous back-propagation. Write your functions to compute your result, and the library will automatically generate functions to compute your gradient.
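The Haskell API isn't quoted above; one standard, library-agnostic way to sanity-check any automatically generated gradient is a central finite-difference comparison, sketched here in Python with a made-up function:

```python
# Finite-difference check of an analytic gradient -- the usual way to
# verify that an automatically generated gradient function is correct.
def f(x, y):
    return x * x * y + y

def grad_f(x, y):                  # gradient we want to verify
    return (2 * x * y, x * x + 1)

def numeric_grad(fn, x, y, h=1e-6):
    return (
        (fn(x + h, y) - fn(x - h, y)) / (2 * h),   # central differences
        (fn(x, y + h) - fn(x, y - h)) / (2 * h),
    )

print(grad_f(1.5, -2.0))           # (-6.0, 3.25)
print(numeric_grad(f, 1.5, -2.0))  # matches to ~1e-6
```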
backprop graph backpropagation deep-learning neural-network gradient-descent automatic-differentiation

An implementation of the linear regression machine learning algorithm in Ruby. More details about this example implementation can be found in this blog post.
linear-regression gradient-descent machine-learning rubyml

There are six snippets of code that made deep learning what it is today. Coding the History of Deep Learning on FloydHub's blog covers the inventors and the background to their breakthroughs. In this repo, you can find all the code samples from the story.
deep-learning linear-regression mnist perceptron least-squares gradient-descent backpropagation

These are the R versions of the assignments from the online machine learning course (MOOC) on Coursera by Prof. Andrew Ng. This repository provides the starter code to solve the assignments in the R statistical software; the completed assignments are also available beside each exercise file.
machine-learning learning-curve pca linear-regression gradient-descent svm principal-component-analysis clustering neural-network k-means recommender-system classification regularization anomalydetection gh

This example project demonstrates how the gradient descent algorithm may be used to solve a linear regression problem. Read more about it here.
linear-regression gradient-descent machine-learning

This example project demonstrates how the gradient descent algorithm may be used to solve a multivariate linear regression problem. Read more about it here.
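As a sketch of what "multivariate" and "vectorization" mean here: the whole-batch gradient is computed with matrix operations instead of per-feature loops. Python/numpy, with invented data:

```python
import numpy as np

# Vectorized batch gradient descent for multivariate linear regression.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))          # 100 examples, 3 features
theta_true = np.array([2.0, -1.0, 0.5])
y = X @ theta_true + 4.0 + 0.1 * rng.standard_normal(100)

Xb = np.hstack([np.ones((100, 1)), X])     # prepend a bias column
theta = np.zeros(4)
lr = 0.1
for _ in range(2000):
    grad = Xb.T @ (Xb @ theta - y) / len(y)   # whole gradient in one line
    theta -= lr * grad

print(np.round(theta, 2))   # approximately [4.0, 2.0, -1.0, 0.5]
```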
linear-regression gradient-descent multivariate machine-learning vectorization

A curated list of awesome machine learning and deep learning mathematics and advanced mathematics descriptions, documents, concepts, study materials, videos, libraries and software (by language).
deep-learning machine-learning algorithm mathematics linear-algebra static-analysis probability gradient-descent machine-learning-mathematics deep-learning-mathematics approximation-algorithms advanced-probability

OptimLib is a lightweight C++ library of numerical optimization methods for nonlinear functions. The final command will install OptimLib into /usr/local.
optim evolutionary-algorithms differential-evolution particle-swarm-optimization bfgs cpp cpp11 lbfgs optimization optimization-algorithms openmp openmp-parallelization armadillo numerical-optimization-methods gradient-descent newton

fmin_adam is a Matlab implementation of the Adam optimiser from Kingma and Ba [1]: gradient descent with adaptive per-parameter learning rates and momentum. It maintains estimates of the moments of the gradient independently for each parameter. Adam is designed for stochastic gradient descent problems, i.e. when only small batches of data are used to estimate the gradient on each iteration, or when stochastic dropout regularisation is used [2].
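The update rule Adam prescribes is short enough to sketch. In Python, on a toy quadratic (the step size and data are invented; the moment-decay constants follow the defaults in Kingma and Ba [1]):

```python
import numpy as np

# The Adam update: per-parameter first and second moment estimates
# with bias correction, here minimizing f(x) = sum((x - target)**2).
alpha, beta1, beta2, eps = 0.01, 0.9, 0.999, 1e-8
target = np.array([1.0, -3.0, 0.5])
x = np.zeros(3)
m = np.zeros(3)                  # first-moment (mean) estimate
v = np.zeros(3)                  # second-moment (uncentred var) estimate

for t in range(1, 5001):
    g = 2 * (x - target)         # gradient of the objective
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)     # bias correction
    v_hat = v / (1 - beta2 ** t)
    x -= alpha * m_hat / (np.sqrt(v_hat) + eps)

print(x)    # converges to [1.0, -3.0, 0.5]
```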
matlab optimization optimization-algorithms gradient-descent stochastic-gradient-descent

Collection of machine learning research paper references.
deep-neural-networks deep-learning gradient-descent machine-learning research-paper