extreme-deconvolution - Density estimation using Gaussian mixtures in the presence of noisy, heterogeneous and incomplete data

  •    Python

Extreme-deconvolution (XD) is a general algorithm for inferring a d-dimensional distribution function from a set of heterogeneous, noisy observations or samples. It is fast, flexible, and properly accounts for each datum's individual uncertainty to obtain the best possible description of the underlying distribution. It performs well over the full range of density-estimation problems, from small data sets with only tens of samples per dimension to large data sets with millions of data points. The extreme-deconvolution algorithm is available as a dynamic C library that your programs can link against, or through Python, R, and IDL wrappers that let you call the fast underlying C code from high-level applications with minimal overhead.
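
A minimal sketch of how the Python wrapper might be used on noisy 1-D data. It assumes the wrapper exposes a function named extreme_deconvolution(ydata, ycovar, xamp, xmean, xcovar) that updates the mixture amplitudes, means, and covariances in place and returns the average log-likelihood; the array shapes and the diagonal-uncertainty convention below are assumptions, so consult the package documentation before relying on them.

# Sketch: deconvolve heteroscedastic, noisy 1-D samples into a 2-component Gaussian mixture.
# Assumes extreme_deconvolution(ydata, ycovar, xamp, xmean, xcovar) updates xamp/xmean/xcovar
# in place and returns the average log-likelihood (assumption, not a documented guarantee here).
import numpy as np
from extreme_deconvolution import extreme_deconvolution

rng = np.random.default_rng(0)
ndata, ngauss, dim = 2000, 2, 1

# True underlying distribution: an equal-weight mixture of two 1-D Gaussians.
x = np.where(rng.random(ndata) < 0.5,
             rng.normal(-2.0, 1.0, ndata),
             rng.normal(3.0, 0.5, ndata))

# Heteroscedastic observational noise: a different uncertainty for every point.
yerr = 0.3 + 0.7 * rng.random(ndata)
ydata = (x + rng.normal(0.0, yerr)).reshape(ndata, dim)
ycovar = (yerr ** 2).reshape(ndata, dim)   # per-point diagonal uncertainties (assumed layout)

# Initial guess for the mixture parameters; these arrays are refined in place.
xamp = np.ones(ngauss) / ngauss
xmean = rng.normal(0.0, 1.0, (ngauss, dim))
xcovar = np.ones((ngauss, dim, dim))

avg_loglike = extreme_deconvolution(ydata, ycovar, xamp, xmean, xcovar)
print(avg_loglike, xamp, xmean.ravel(), np.sqrt(xcovar.ravel()))

The point of the exercise is that the recovered component widths describe the underlying distribution, not the noise-broadened observed one, because each point's uncertainty is deconvolved away.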

jlearn - Machine Learning Library, written in J

  •    J

Work-in-progress machine learning library written in J, with implementations of various algorithms, including MLPClassifiers, MLPRegressors, Mixture Models, K-Means, KNN, RBF-Network, and Self-organizing Maps. Models can be serialized to text files using a mixture of text and binary packing. The size of the serialized file depends on the size of the model, but will probably range from about 10 MB upwards for NN models (including convnets and recurrent nets).

PyBGMM - Bayesian inference for Gaussian mixture model with some novel algorithms

  •    Python

Bayesian inference for Gaussian mixture models, aimed at reducing over-clustering via the powered Chinese restaurant process (pCRP). We use collapsed Gibbs sampling for posterior inference.
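
A small illustrative sketch of the pCRP seating prior, not PyBGMM's API: existing-cluster counts are raised to a power r, so larger clusters attract new points more strongly than under the ordinary CRP (r = 1), which discourages spurious small clusters. The function name and the values of r and alpha below are illustrative assumptions.

# Illustrative powered Chinese restaurant process (pCRP) prior over cluster assignments.
# Not PyBGMM's API; names and parameter values are hypothetical.
import numpy as np

def pcrp_assignment_probs(counts, alpha=1.0, r=1.5):
    """Prior probability of joining each existing cluster, or opening a new one.

    counts : current cluster sizes n_k
    alpha  : concentration parameter (weight of a new cluster)
    r      : power applied to the counts; r = 1 recovers the standard CRP
    """
    weights = np.append(np.asarray(counts, dtype=float) ** r, alpha)
    return weights / weights.sum()

# Example: three clusters of sizes 8, 3, 1.
print(pcrp_assignment_probs([8, 3, 1]))          # pCRP with r = 1.5
print(pcrp_assignment_probs([8, 3, 1], r=1.0))   # ordinary CRP for comparison

In a collapsed Gibbs sweep, prior weights like these would be multiplied by each cluster's posterior predictive likelihood for the data point before normalizing and resampling its assignment.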