word-rnn-tensorflow - Multi-layer Recurrent Neural Networks (LSTM, RNN) for word-level language models in Python using TensorFlow

Multi-layer Recurrent Neural Networks (LSTM, RNN) for word-level language models in Python using TensorFlow. Mostly reuses code from https://github.com/sherjilozair/char-rnn-tensorflow, which was inspired by Andrej Karpathy's char-rnn.

https://github.com/hunkim/word-rnn-tensorflow
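
The core recipe is short enough to sketch. Below is a minimal word-level language model in the same spirit, using tf.keras (the TensorFlow 2.x API, vocabulary size, and layer sizes here are illustrative assumptions, not the repository's actual code):

import tensorflow as tf

VOCAB_SIZE = 10000  # illustrative vocabulary size

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 128),        # word index -> vector
    tf.keras.layers.LSTM(256, return_sequences=True),  # stacked (multi-layer) LSTMs
    tf.keras.layers.LSTM(256, return_sequences=True),
    tf.keras.layers.Dense(VOCAB_SIZE),                 # logits over the next word
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)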

Related Projects

char-rnn-tensorflow - Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using Tensorflow

  •    Python

Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using Tensorflow. Inspired by Andrej Karpathy's char-rnn.

LSTM-Human-Activity-Recognition - Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM RNN (Deep Learning algo)

  •    Jupyter

Compared to a classical approach, using a Recurrent Neural Network (RNN) with Long Short-Term Memory cells (LSTMs) requires little to no feature engineering. Data can be fed directly into the neural network, which acts like a black box that models the problem correctly. Other research on this activity recognition dataset relies on a large amount of feature engineering, which is more of a signal processing approach combined with classical data science techniques. The approach here is very simple in terms of how much the data was preprocessed. Let's use Google's Deep Learning library, TensorFlow, to demonstrate the usage of an LSTM, a type of Artificial Neural Network that can process sequential data / time series.
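
As a rough illustration of the "no feature engineering" point, a classifier over raw sensor windows can be sketched as follows (tf.keras / TensorFlow 2.x assumed; the 128-timestep, 9-channel, 6-class shapes match the UCI HAR dataset this repo uses, but the layer sizes are illustrative):

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 9)),                  # 128 timesteps x 9 sensor channels
    tf.keras.layers.LSTM(32, return_sequences=True),
    tf.keras.layers.LSTM(32),                        # final state summarizes the window
    tf.keras.layers.Dense(6, activation="softmax"),  # 6 activity classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])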

crfasrnn_keras - CRF-RNN Keras/Tensorflow version

  •    Python

This repository contains Keras/Tensorflow code for the "CRF-RNN" semantic image segmentation method, published in the ICCV 2015 paper Conditional Random Fields as Recurrent Neural Networks. The paper was initially described in an arXiv tech report. The online demo of this project won the Best Demo Prize at ICCV 2015. The original Caffe-based code of this project can be found here. Results produced with this Keras/Tensorflow code are almost identical to those from the Caffe-based version. The root directory of the clone will be referred to as crfasrnn_keras hereafter.

sketch-rnn - Multilayer LSTM and Mixture Density Network for modelling path-level SVG Vector Graphics data in TensorFlow

  •    Python

This version of sketch-rnn has been deprecated. Please see the updated version of sketch-rnn, which is a full generative model for vector drawings. This is an implementation of a multi-layer recurrent neural network (RNN, LSTM, GRU) used to model and generate sketches stored in .svg vector graphics files. The methodology is to combine Mixture Density Networks with an RNN, along with modelling dynamic end-of-stroke and end-of-content probabilities learned from a large corpus of similar .svg files, to generate drawings that are similar to the vector training data.
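
To make the Mixture Density Network idea concrete, here is a NumPy sketch of how one RNN output vector can be split into 2-D Gaussian mixture parameters plus an end-of-stroke probability (the mdn_params function, its parameter layout, and the number of mixtures are hypothetical, not the repository's exact formulation):

import numpy as np

def mdn_params(rnn_out, n_mix=20):
    """Split a raw RNN output vector into mixture parameters (hypothetical layout)."""
    pi_hat = rnn_out[:n_mix]                                        # mixture logits
    mu = rnn_out[n_mix:3 * n_mix].reshape(n_mix, 2)                 # means for (dx, dy)
    sigma = np.exp(rnn_out[3 * n_mix:5 * n_mix]).reshape(n_mix, 2)  # stddevs > 0
    rho = np.tanh(rnn_out[5 * n_mix:6 * n_mix])                     # correlations in (-1, 1)
    eos = 1.0 / (1.0 + np.exp(-rnn_out[6 * n_mix]))                 # end-of-stroke probability
    pi = np.exp(pi_hat - pi_hat.max())
    pi /= pi.sum()                                                  # softmax over mixture weights
    return pi, mu, sigma, rho, eos

pi, mu, sigma, rho, eos = mdn_params(np.random.randn(6 * 20 + 1))   # 121-dim output vector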

advanced-tensorflow - Little More Advanced TensorFlow Implementations

  •    Jupyter

Collection of (Little More + Refactored) Advanced TensorFlow Implementations. I try my best to implement each algorithm in a single Jupyter Notebook.


Char-RNN-TensorFlow - Multi-language Char RNN for TensorFlow >= 1.2.

  •    Python

Multi-language Char RNN in TensorFlow. You can use this code to generate English text, Chinese poetry and lyrics, Japanese text, and text in other languages.

LSTM-Sentiment-Analysis - Sentiment Analysis with LSTMs in Tensorflow

  •    Jupyter

This repository contains the IPython notebook and training data to accompany the O'Reilly tutorial on sentiment analysis with LSTMs in Tensorflow. See the original tutorial to run this code in a pre-built environment on O'Reilly's servers with cell-by-cell guidance, or run these files on your own machine. There is also another file called Pre-Trained LSTM.ipynb that allows you to input your own text and see the output of the trained network. Before running the notebook, you'll first need to download all the data we'll be using. This data is located in the models.tar.gz and training_data.tar.gz tarballs. We will extract these into the same directory as Oriole LSTM.ipynb. As always, the first step is to clone the repository.
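
For orientation, the kind of model the tutorial builds can be sketched in a few lines of tf.keras (TensorFlow 2.x assumed; the vocabulary size and layer widths are illustrative, not the tutorial's actual values):

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=20000, output_dim=64),  # word vectors
    tf.keras.layers.LSTM(64),                                   # sequence -> single vector
    tf.keras.layers.Dense(1, activation="sigmoid"),             # positive / negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])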

Tensorflow-Programs-and-Tutorials - Implementations of CNNs, RNNs, GANs, etc

  •    Jupyter

CNN's with Noisy Labels - This notebook looks at a recent paper that discusses how convolutional neural networks that are trained on random labels (with some probability) are still able to achieve good accuracy on MNIST. I thought that the paper showed some eyebrow-raising results, so I went ahead and tried it out for myself. It was pretty amazing to see that even when training a CNN with random labels 50% of the time, and the correct labels the other 50% of the time, the network was still able to get a 90+% accuracy. Character Level RNN (Work in Progress) - This notebook shows you how to train a character level RNN in Tensorflow. The idea was inspired by Andrej Karpathy's famous blog post and was based on this Keras implementation. In this notebook, you'll learn more about what the model is doing and how you can input your own dataset and train a model to generate similar-looking text.
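
The noisy-label setup is easy to reproduce in miniature: with probability p, replace the true label with a uniformly random one. A NumPy sketch (the corrupt_labels helper is hypothetical, not the notebook's code):

import numpy as np

def corrupt_labels(labels, n_classes=10, p=0.5, seed=0):
    """Return a copy of labels where each entry is replaced, with probability p,
    by a uniformly random class (hypothetical helper)."""
    rng = np.random.default_rng(seed)
    noisy = labels.copy()
    flip = rng.random(len(noisy)) < p                     # which examples to corrupt
    noisy[flip] = rng.integers(0, n_classes, flip.sum())  # random replacement labels
    return noisy

noisy = corrupt_labels(np.arange(100) % 10, p=0.5)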

tf-rnn-attention - Tensorflow implementation of attention mechanism for text classification tasks.

  •    Python

Tensorflow implementation of attention mechanism for text classification tasks. Inspired by "Hierarchical Attention Networks for Document Classification", Zichao Yang et al. (http://www.aclweb.org/anthology/N16-1174).
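
The attention mechanism from the paper reduces to a few lines: score each RNN timestep, softmax the scores, and take the weighted sum. A NumPy sketch under those assumptions (the attention_pool name and variable names are illustrative):

import numpy as np

def attention_pool(H, W, b, u):
    """H: (T, d) RNN outputs; W: (d, a), b: (a,), u: (a,) are learned parameters."""
    scores = np.tanh(H @ W + b) @ u   # (T,) unnormalized importance per timestep
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()              # softmax over timesteps
    return alpha @ H                  # (d,) attention-weighted summary of the sequence

T, d, a = 5, 8, 4
summary = attention_pool(np.random.randn(T, d),
                         np.random.randn(d, a), np.zeros(a), np.random.randn(a))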

Conditional-PixelCNN-decoder - Tensorflow implementation of Gated Conditional Pixel Convolutional Neural Network

  •    Python

This is a Tensorflow implementation of Conditional Image Generation with PixelCNN Decoders, which introduces the Gated PixelCNN model based on the PixelCNN architecture originally mentioned in Pixel Recurrent Neural Networks. The model can be conditioned on a latent representation of labels or images to generate images accordingly. Images can also be modelled unconditionally. It can also act as a powerful decoder and can replace deconvolution (transposed convolution) in Autoencoders and GANs. A detailed summary of the paper can be found here. The gating allows the model to remember context and capture more complex interactions, as in an LSTM. The network stack on the left is the vertical stack, which takes care of blind spots that occur during convolution due to the masking layer (refer to the Pixel RNN paper to learn more about masking). The use of residual connections significantly improves the model's performance.
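
The gated activation unit at the heart of Gated PixelCNN is simple to sketch: the pre-activation feature map is split in two, and one half gates the other, much like an LSTM gate. A NumPy illustration (the shapes are illustrative):

import numpy as np

def gated_activation(features):
    """features: (..., 2k) pre-activation from a masked convolution."""
    k = features.shape[-1] // 2
    f, g = features[..., :k], features[..., k:]
    return np.tanh(f) * (1.0 / (1.0 + np.exp(-g)))  # tanh(W_f * x) * sigmoid(W_g * x)

out = gated_activation(np.random.randn(32, 32, 128))  # -> shape (32, 32, 64)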

tensorflow_cookbook - Code for Tensorflow Machine Learning Cookbook

  •    Jupyter

This chapter intends to introduce the main objects and concepts in TensorFlow. We also introduce how to access the data for the rest of the book and provide additional resources for learning about TensorFlow. After we have established the basic objects and methods in TensorFlow, we now want to establish the components that make up TensorFlow algorithms. We start by introducing computational graphs, and then move to loss functions and back propagation. We end with creating a simple classifier and then show an example of evaluating regression and classification algorithms.
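
In the spirit of those early chapters, here is a minimal end-to-end sketch that builds a model, picks a loss, and lets back propagation fit a simple classifier (tf.keras / TensorFlow 2.x assumed; the synthetic data is illustrative, not the book's code):

import numpy as np
import tensorflow as tf

X = np.random.randn(200, 2).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32")  # linearly separable synthetic labels

model = tf.keras.Sequential([tf.keras.layers.Dense(1, activation="sigmoid")])
model.compile(optimizer="sgd", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, verbose=0)           # back propagation does the fitting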

pixel-rnn-tensorflow - in progress

  •    Python

Samples generated with pixel_cnn after 50 epochs.

write-rnn-tensorflow - Generative Handwriting using LSTM Mixture Density Network with TensorFlow

  •    Python

An attempt to implement the random handwriting generation portion of Alex Graves' paper. See my blog post at blog.otoro.net for more information.

neuralmonkey - An open-source tool for sequence learning in NLP built on TensorFlow.

  •    Python

The Neural Monkey package provides a higher level abstraction for sequential neural network models, most prominently in Natural Language Processing (NLP). It is built on TensorFlow. It can be used for fast prototyping of sequential models in NLP, e.g. for neural machine translation or sentence classification. The higher-level API brings together a collection of standard building blocks (RNN encoder and decoder, multi-layer perceptron) and a simple way of adding new building blocks implemented directly in TensorFlow.
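
As a rough picture of the RNN encoder and decoder building blocks such a toolkit standardizes, here is a tf.keras sketch (TensorFlow 2.x assumed; the names and sizes are illustrative, not Neural Monkey's actual API):

import tensorflow as tf

# Encoder block: read the source sequence, keep only the final LSTM state.
enc_in = tf.keras.Input(shape=(None,), dtype="int32")
enc_emb = tf.keras.layers.Embedding(8000, 128)(enc_in)
_, h, c = tf.keras.layers.LSTM(256, return_state=True)(enc_emb)

# Decoder block: generate the target sequence from the encoder's state.
dec_in = tf.keras.Input(shape=(None,), dtype="int32")
dec_emb = tf.keras.layers.Embedding(8000, 128)(dec_in)
dec_out = tf.keras.layers.LSTM(256, return_sequences=True)(dec_emb, initial_state=[h, c])
logits = tf.keras.layers.Dense(8000)(dec_out)  # per-step logits over target vocabulary

model = tf.keras.Model([enc_in, dec_in], logits)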

Tensorflow-Tutorial - Tensorflow tutorial from basic to hard

  •    Python

In these tutorials, we will build our first Neural Network and try to build some advanced Neural Network architectures developed in recent years. All methods mentioned below have video and text tutorials in Chinese. Visit 莫烦 Python for more.

tensorflow-rnn-shakespeare - Code from the "Tensorflow and deep learning - without a PhD, Part 2" session on Recurrent Neural Networks

  •    Python

This sample has now been updated for Tensorflow 1.1. Please make sure you redownload the checkpoint files if you use rnn_play.py. The script rnn_train.py trains a language model on the complete works of William Shakespeare. You can also train on Tensorflow Python code. See comments in the file.

char-rnn - Multi-layer Recurrent Neural Networks (LSTM, GRU, RNN) for character-level language models in Torch

  •    Lua

This code implements multi-layer Recurrent Neural Networks (RNN, LSTM, and GRU) for training/sampling from character-level language models. In other words, the model takes one text file as input and trains a Recurrent Neural Network that learns to predict the next character in a sequence. The RNN can then be used to generate text character by character that will look like the original training data. The context of this code base is described in detail in my blog post. If you are new to Torch/Lua/Neural Nets, it might be helpful to know that this code is really just a slightly more fancy version of this 100-line gist that I wrote in Python/numpy. The code in this repo additionally: allows for multiple layers, uses an LSTM instead of a vanilla RNN, has more supporting code for model checkpointing, and is of course much more efficient since it uses mini-batches and can run on a GPU.
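
The next-character loop itself is tiny; here is a NumPy sketch of the sampling side in the spirit of that 100-line gist (the sample_text function is hypothetical, and rnn_step is a dummy stand-in for whatever trained recurrent cell you plug in):

import numpy as np

def sample_text(rnn_step, h, first_idx, vocab_size, length=100, seed=0):
    """Generate `length` character indices, one recurrent step at a time."""
    rng = np.random.default_rng(seed)
    idx, out = first_idx, []
    for _ in range(length):
        h, logits = rnn_step(h, idx)   # one step of the (trained) recurrent cell
        p = np.exp(logits - logits.max())
        p /= p.sum()                   # softmax -> distribution over next character
        idx = int(rng.choice(vocab_size, p=p))
        out.append(idx)
    return out

# Dummy stand-in cell, only so the loop runs; a real model replaces this:
step = lambda h, i: (np.tanh(h + 0.01 * i), np.tanh(h))
indices = sample_text(step, np.zeros(65), 0, vocab_size=65, length=20)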

seq2seq-signal-prediction - Signal prediction with a Sequence-to-Sequence (seq2seq) Recurrent Neural Network (RNN) model in TensorFlow - Guillaume Chevalier

  •    Jupyter

The goal of this project of mine is to bring users to try and experiment with the seq2seq neural network architecture. This is done by solving different simple toy problems about signal prediction. Normally, seq2seq architectures may be used for more sophisticated purposes than signal prediction, such as language modeling, but this project is an interesting tutorial for working up to more complicated things. Although I made a ".py" Python version of this tutorial available in the repository, it is more convenient to run the code inside the notebook; the exported ".py" code feels a bit raw.

DeepQA - My tensorflow implementation of "A neural conversational model", a Deep learning based chatbot

  •    Python

This work tries to reproduce the results of A Neural Conversational Model (aka the Google chatbot). It uses an RNN (seq2seq model) for sentence predictions and is implemented in Python using TensorFlow. The corpus-loading part of the program is inspired by the Torch neuralconvo from macournoyer.