
bayesian-machine-learning - Notebooks related to Bayesian methods for machine learning

  •    Jupyter

This repository is a collection of notebooks covering various topics in Bayesian methods for machine learning, including an introduction to Gaussian processes, with example implementations in plain NumPy/SciPy as well as with the scikit-learn and GPy libraries.
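The plain-NumPy style of Gaussian process regression the notebooks describe can be sketched as follows. This is an illustrative sketch of the standard closed-form GP posterior, not code from the repository; all function names are ours.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, sigma_f=1.0):
    # Squared-exponential (RBF) kernel matrix between two point sets.
    sqdist = np.sum(X1**2, 1).reshape(-1, 1) + np.sum(X2**2, 1) - 2 * X1 @ X2.T
    return sigma_f**2 * np.exp(-0.5 / length_scale**2 * sqdist)

def gp_posterior(X_train, y_train, X_test, noise=1e-8):
    # Closed-form GP regression posterior mean and covariance.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_train
    cov = K_ss - K_s.T @ K_inv @ K_s
    return mu, cov

X_train = np.array([[-2.0], [0.0], [2.0]])
y_train = np.sin(X_train).ravel()
# At a training input with near-zero noise, the posterior mean
# interpolates the observed target.
mu, cov = gp_posterior(X_train, y_train, np.array([[0.0]]))
```

In practice one would solve the linear system with a Cholesky factorization rather than forming `K_inv` explicitly, which is both cheaper and numerically safer.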

vae_cf - Variational autoencoders for collaborative filtering

  •    Jupyter

This notebook accompanies the paper "Variational Autoencoders for Collaborative Filtering" by Dawen Liang, Rahul G. Krishnan, Matthew D. Hoffman, and Tony Jebara, in The Web Conference (a.k.a. WWW) 2018. It shows a complete, self-contained example of training a variational autoencoder (as well as a denoising autoencoder) with a multinomial likelihood (described in the paper) on the public MovieLens-20M dataset, including both data preprocessing and model training.

video_prediction - Stochastic Adversarial Video Prediction

  •    Python

TensorFlow implementation for stochastic adversarial video prediction. Given a sequence of initial frames, the model predicts multiple plausible future frames. For example sequences, which show the ground-truth sequence alongside random predictions of the model (predicted frames indicated by a yellow bar at the bottom), visit the project page. Reference: Stochastic Adversarial Video Prediction, Alex X. Lee, Richard Zhang, Frederik Ebert, Pieter Abbeel, Chelsea Finn, Sergey Levine. arXiv preprint arXiv:1804.01523, 2018.

NeuralDialog-CVAE - Tensorflow Implementation of Knowledge-Guided CVAE for dialog generation

  •    Python

We provide a TensorFlow implementation of the CVAE-based dialog model described in "Learning Discourse-level Diversity for Neural Dialog Models using Conditional Variational Autoencoders", published as a long paper at ACL 2017. See the paper for more details. Outputs are printed to stdout, and generated responses are saved to test.txt in the test_path directory.

tybalt - Training and evaluating a variational autoencoder for pan-cancer gene expression data

  •    Jupyter

The repository stores scripts to train, evaluate, and extract knowledge from a variational autoencoder (VAE) trained on 33 cancer types from The Cancer Genome Atlas (TCGA). The specific VAE model is named Tybalt after an instigative, cat-like character in Shakespeare's "Romeo and Juliet". Just as the character Tybalt sets off the series of events in the play, the model Tybalt begins the foray of VAE manifold learning in transcriptomics. Also, deep unsupervised learning likes cats.

vae-pytorch - AE and VAE Playground in PyTorch

  •    Jupyter

Disclaimer: VAE coming soon... The last activation of the decoder layer, the loss function, and the normalization scheme used on the training data are crucial for obtaining good reconstructions and preventing exploding negative losses.
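A minimal illustration of that point, assuming data normalized to [0, 1], a sigmoid final activation, and a binary cross-entropy reconstruction loss (all names here are ours, not the repo's): clipping the decoder output keeps the log terms finite, which is one simple guard against the exploding losses mentioned above.

```python
import numpy as np

def sigmoid(z):
    # Decoder output activation; maps logits into (0, 1) to match
    # the range of the normalized training data.
    return 1.0 / (1.0 + np.exp(-z))

def bce(x, x_hat, eps=1e-7):
    # Binary cross-entropy reconstruction loss. Clipping x_hat away
    # from exactly 0 and 1 keeps log() finite.
    x_hat = np.clip(x_hat, eps, 1 - eps)
    return -np.mean(x * np.log(x_hat) + (1 - x) * np.log(1 - x_hat))

x = np.array([0.0, 0.5, 1.0])           # data normalized to [0, 1]
x_hat = sigmoid(np.array([-3.0, 0.0, 3.0]))
loss = bce(x, x_hat)                    # finite, well-behaved scalar
```

If the data were instead standardized to zero mean and unit variance, a sigmoid output could never reach the targets, so a linear output with a Gaussian (MSE) loss would be the matching choice: activation, loss, and normalization have to agree.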

VAE-Gumbel-Softmax - An implementation of a Variational Autoencoder using the Gumbel-Softmax reparametrization trick in TensorFlow (tested on r1

  •    Python

Also included is a Jupyter notebook which shows how the Gumbel-Max trick for sampling discrete variables relates to the Concrete distribution. Note: the current Dockerfile is for TensorFlow 1.5 CPU training.
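The relationship the notebook explores can be sketched in a few lines of NumPy. The Gumbel-Max trick draws an exact categorical sample via an argmax over perturbed logits; the Gumbel-Softmax (Concrete) relaxation replaces that argmax with a temperature-controlled softmax so the sample becomes differentiable. This is an illustrative sketch, not code from the repository:

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_noise(shape):
    # Standard Gumbel(0, 1) noise via inverse-CDF sampling.
    return -np.log(-np.log(rng.uniform(size=shape)))

def gumbel_max_sample(logits):
    # Gumbel-Max trick: argmax(logits + Gumbel noise) is an exact
    # sample from Categorical(softmax(logits)).
    return np.argmax(logits + gumbel_noise(logits.shape))

def gumbel_softmax_sample(logits, tau=0.5):
    # Concrete relaxation: same perturbed logits, but a softmax with
    # temperature tau instead of a hard argmax. As tau -> 0 the
    # output approaches a one-hot vector.
    y = (logits + gumbel_noise(logits.shape)) / tau
    y = y - y.max()
    e = np.exp(y)
    return e / e.sum()

logits = np.log(np.array([0.1, 0.2, 0.7]))
counts = np.bincount([gumbel_max_sample(logits) for _ in range(10000)],
                     minlength=3)
soft = gumbel_softmax_sample(logits)
```

Sampling 10,000 times and inspecting `counts / 10000` recovers the target probabilities (0.1, 0.2, 0.7) up to Monte Carlo noise, while `soft` is a point on the simplex that gradients can flow through.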

Synthesize3DviaDepthOrSil - [CVPR 2017] Generation and reconstruction of 3D shapes via modeling multi-view depth maps or silhouettes

  •    Lua

The following installs luaJIT and luarocks locally in $HOME/usr. If you want a system-wide installation, remove the -DCMAKE_INSTALL_PREFIX=$HOME/usr option. We assume luarocks and luajit are in $PATH. If they are not, and assuming you installed them locally in $HOME/usr, you can run ~/usr/bin/luarocks and ~/usr/bin/luajit instead.

lagvae - Lagrangian VAE

  •    Python

TensorFlow implementation for the paper A Lagrangian Perspective on Latent Variable Generative Models, UAI 2018 Oral. Lagrangian VAE provides a practical way to find the best trade-off between "consistency constraints" and "mutual information objectives", as opposed to performing extensive hyperparameter tuning. We demonstrate this on InfoVAE, a latent-variable generative model objective that otherwise requires tuning the strengths of the corresponding hyperparameters.
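The mechanism that replaces hyperparameter search is dual gradient ascent: a Lagrange multiplier is learned alongside the model, growing while a constraint is violated and shrinking once it is satisfied. The following is a toy convex sketch of that idea (our own example, not the paper's model or code):

```python
import numpy as np

def solve(step=0.05, dual_step=0.1, iters=2000, eps=1.0):
    # Toy constrained problem: minimize theta^2 subject to
    # (theta - 2)^2 <= eps, solved by alternating a gradient step on
    # the Lagrangian with a projected ascent step on the multiplier.
    theta, lam = 5.0, 0.0
    for _ in range(iters):
        grad = 2 * theta + lam * 2 * (theta - 2)   # d/dtheta of Lagrangian
        theta -= step * grad
        # Multiplier rises while the constraint is violated, is
        # clipped at zero once it is slack.
        lam = max(0.0, lam + dual_step * ((theta - 2) ** 2 - eps))
    return theta, lam

theta, lam = solve()
```

Here the iterates settle at the constrained optimum theta = 1 with an active multiplier lam = 1, with no grid search over penalty weights; LagVAE applies the same principle to trade consistency constraints against mutual-information objectives.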


  •    Python

Convolutional Autoencoder for Loop Closure 2.0. To get started, download the COCO dataset and the "stuff" annotations, then run dataset/gen_tfrecords.py; make sure to unzip the tar into the dataset directory first. This generates the sharded TFRecord files as well as loss_weights.txt.

vae-cnn-mnist - Conditional variational autoencoder applied to EMNIST + an interactive demo to explore the latent space

  •    Jupyter

Conditional variational autoencoder applied to EMNIST + an interactive demo to explore the latent space.
