tf-rnn-attention - Tensorflow implementation of attention mechanism for text classification tasks.


Tensorflow implementation of an attention mechanism for text classification tasks. Inspired by "Hierarchical Attention Networks for Document Classification" by Zichao Yang et al. (http://www.aclweb.org/anthology/N16-1174).

https://github.com/ilivans/tf-rnn-attention
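
The heart of this approach is an attention layer that pools the RNN's per-timestep outputs into a single document vector, as in Yang et al. Below is a minimal NumPy sketch of that pooling step; the parameter names (W, b, u) and all shapes are illustrative assumptions, not the repository's actual variables.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
batch, time, units, att_size = 2, 5, 8, 6

# hidden: RNN outputs, shape (batch, time, units)
hidden = rng.normal(size=(batch, time, units))

# Trainable attention parameters (random here, for illustration only)
W = rng.normal(size=(units, att_size))
b = np.zeros(att_size)
u = rng.normal(size=att_size)

v = np.tanh(hidden @ W + b)                        # (batch, time, att_size)
scores = v @ u                                     # (batch, time)
alphas = softmax(scores, axis=1)                   # attention weight per time step
output = (hidden * alphas[..., None]).sum(axis=1)  # (batch, units) weighted sum
```

The pooled `output` vector then feeds a softmax classifier; training learns which time steps the weights `alphas` should emphasize.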

Related Projects

LSTM-Sentiment-Analysis - Sentiment Analysis with LSTMs in Tensorflow

  •    Jupyter

This repository contains the iPython notebook and training data that accompany the O'Reilly tutorial on sentiment analysis with LSTMs in Tensorflow. See the original tutorial to run this code in a pre-built environment on O'Reilly's servers with cell-by-cell guidance, or run these files on your own machine. There is also a second notebook, Pre-Trained LSTM.ipynb, which lets you input your own text and see the output of the trained network. Before running the notebook, you'll first need to download all the data we'll be using, located in the models.tar.gz and training_data.tar.gz tarballs. We will extract these into the same directory as Oriole LSTM.ipynb. As always, the first step is to clone the repository.
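
A minimal sketch of the extraction step in Python, assuming both tarballs have already been downloaded into the working directory:

```python
import tarfile

# Extract the pre-trained models and training data
# into the same directory as Oriole LSTM.ipynb.
for archive in ("models.tar.gz", "training_data.tar.gz"):
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(".")
```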

Seq2Seq-PyTorch - Sequence to Sequence Models with PyTorch

  •    Python

A vanilla sequence-to-sequence model, presented in https://arxiv.org/abs/1409.3215 and https://arxiv.org/abs/1406.1078, consists of using a recurrent neural network such as an LSTM (http://dl.acm.org/citation.cfm?id=1246450) or GRU (https://arxiv.org/abs/1412.3555) to encode a sequence of words or characters in a source language into a fixed-length vector representation, and then decoding from that representation using another RNN in the target language. An extension of sequence-to-sequence models that incorporates an attention mechanism was presented in https://arxiv.org/abs/1409.0473; it uses information from the RNN hidden states in the source language at each time step of the decoder RNN. This attention mechanism significantly improves performance on tasks like machine translation. A few variants of the attention model for machine translation are presented in https://arxiv.org/abs/1508.04025.
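
To make the attention step concrete, here is a NumPy sketch of computing the context vector for one decoder time step. It uses simple dot-product scoring (one of the variants from the last paper above); Bahdanau-style attention scores with a small feed-forward network instead, but the weight-and-sum structure is the same. All names and shapes are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(1)
src_len, hidden_size = 7, 16

# Encoder hidden states for one source sentence, and the current decoder state
encoder_states = rng.normal(size=(src_len, hidden_size))
decoder_state = rng.normal(size=hidden_size)

scores = encoder_states @ decoder_state  # dot-product score per source position
alphas = softmax(scores)                 # attention distribution over the source
context = alphas @ encoder_states        # (hidden_size,) context vector

# The context vector is then combined with the decoder state
# (e.g. by concatenation) to predict the next target token.
```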

sha-rnn - Single Headed Attention RNN - "Stop thinking with your head"

  •    Python

For full details see the paper Single Headed Attention RNN: Stop Thinking With Your Head. In summary, "stop thinking with your (attention) head".

word-rnn-tensorflow - Multi-layer Recurrent Neural Networks (LSTM, RNN) for word-level language models in Python using TensorFlow

  •    Python

Multi-layer Recurrent Neural Networks (LSTM, RNN) for word-level language models in Python using TensorFlow. Mostly reused code from https://github.com/sherjilozair/char-rnn-tensorflow, which was in turn inspired by Andrej Karpathy's char-rnn.


Char-RNN-TensorFlow - Multi-language Char RNN for TensorFlow >= 1.2.

  •    Python

Multi-language Char RNN in TensorFlow. You can use this code to generate English text, Chinese poetry and lyrics, Japanese text, and text in other languages.

crfasrnn_keras - CRF-RNN Keras/Tensorflow version

  •    Python

This repository contains Keras/Tensorflow code for the "CRF-RNN" semantic image segmentation method, published in the ICCV 2015 paper Conditional Random Fields as Recurrent Neural Networks. The paper was initially described in an arXiv tech report. The online demo of this project won the Best Demo Prize at ICCV 2015. The original Caffe-based code of this project can be found here. Results produced with this Keras/Tensorflow code are almost identical to those of the Caffe-based version. The root directory of the clone will be referred to as crfasrnn_keras hereafter.

text-classification-models-tf - Tensorflow implementations of Text Classification Models.

  •    Python

Tensorflow implementations of text classification models. Semi-supervised text classification (transfer learning) models are implemented at [dongjun-Lee/transfer-learning-text-tf].

char-rnn-tensorflow - Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using Tensorflow

  •    Python

Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using Tensorflow. Inspired by Andrej Karpathy's char-rnn.

tensorflow_cookbook - Code for Tensorflow Machine Learning Cookbook

  •    Jupyter

This chapter introduces the main objects and concepts in TensorFlow. We also introduce how to access the data for the rest of the book and provide additional resources for learning about TensorFlow. Once the basic objects and methods are established, we move on to the components that make up TensorFlow algorithms: we start by introducing computational graphs, then move to loss functions and backpropagation, and end by creating a simple classifier and showing an example of evaluating regression and classification algorithms.
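
As a rough illustration of the pieces the chapter covers (graph construction, a loss function, and backpropagation), here is a hedged TF2-style sketch with toy data; the book itself targets the older TF1 session API, so this is not its literal code:

```python
import tensorflow as tf

# Toy two-class data
x = tf.random.normal([32, 4])
y = tf.one_hot(tf.random.uniform([32], maxval=2, dtype=tf.int32), depth=2)

# Parameters of a simple linear classifier
w = tf.Variable(tf.random.normal([4, 2]))
b = tf.Variable(tf.zeros([2]))

with tf.GradientTape() as tape:
    logits = x @ w + b                    # forward pass through the graph
    loss = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))

# Backpropagation: gradients of the loss w.r.t. the parameters
grad_w, grad_b = tape.gradient(loss, [w, b])
```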

Tensorflow-Tutorial - Tensorflow tutorial from basic to hard

  •    Python

In these tutorials, we will build our first neural network and then try some advanced neural network architectures developed in recent years. All methods mentioned below have video and text tutorials in Chinese; visit 莫烦 Python for more.

cnn-text-classification-tf-chinese - CNN for Chinese Text Classification in Tensorflow

  •    Python

This code belongs to the "Implementing a CNN for Text Classification in Tensorflow" blog post. It is a slightly simplified implementation of Kim's Convolutional Neural Networks for Sentence Classification paper in Tensorflow.
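
The core of Kim's model is convolving filters of several widths over the sentence's word-embedding matrix and max-pooling each feature map over time. A minimal NumPy sketch of that forward pass, with random weights and illustrative shapes:

```python
import numpy as np

rng = np.random.default_rng(2)
seq_len, embed_dim, num_filters = 10, 8, 4

x = rng.normal(size=(seq_len, embed_dim))   # embedded sentence

pooled = []
for width in (3, 4, 5):                     # filter widths, as in the paper
    filters = rng.normal(size=(num_filters, width, embed_dim))
    # Slide each filter over every window of `width` words...
    feats = np.array([
        [np.sum(x[t:t + width] * f) for t in range(seq_len - width + 1)]
        for f in filters
    ])
    # ...then keep only the strongest response (max-over-time pooling)
    pooled.append(feats.max(axis=1))
feature_vec = np.concatenate(pooled)        # fed to a softmax classifier
```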

Tensorflow-Programs-and-Tutorials - Implementations of CNNs, RNNs, GANs, etc

  •    Jupyter

CNNs with Noisy Labels - This notebook looks at a recent paper discussing how convolutional neural networks trained on random labels (with some probability) are still able to achieve good accuracy on MNIST. I thought the paper showed some eyebrow-raising results, so I went ahead and tried it out for myself. It was pretty amazing to see that even when training a CNN with random labels 50% of the time, and the correct labels the other 50% of the time, the network was still able to reach 90+% accuracy. Character Level RNN (Work in Progress) - This notebook shows you how to train a character-level RNN in Tensorflow. The idea was inspired by Andrej Karpathy's famous blog post and based on this Keras implementation. In this notebook, you'll learn more about what the model is doing, how you can input your own dataset, and how to train a model to generate similar-looking text.
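
The label-corruption setup from the first notebook is easy to reproduce: with some probability, swap the true label for a uniformly random one. A short NumPy sketch with stand-in labels (not the notebook's actual code):

```python
import numpy as np

rng = np.random.default_rng(3)
num_classes, p_noise = 10, 0.5                      # 50% corruption, as in the experiment

labels = rng.integers(0, num_classes, size=1000)    # stand-in for MNIST labels
flip = rng.random(len(labels)) < p_noise            # which examples to corrupt
random_labels = rng.integers(0, num_classes, size=len(labels))
noisy = np.where(flip, random_labels, labels)       # train on these instead
```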

NLP-Models-Tensorflow - Gathers machine learning and Tensorflow deep learning models for NLP problems

  •    Jupyter

NLP-Models-Tensorflow gathers machine learning and Tensorflow deep learning models for NLP problems, with all code simplified inside Jupyter Notebooks. For models I did not implement from scratch, I attach the original GitHub repositories; I basically copied, pasted, and fixed that code for deprecation issues.

LSTM-Human-Activity-Recognition - Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM RNN (Deep Learning algo)

  •    Jupyter

Compared to a classical approach, using Recurrent Neural Networks (RNNs) with Long Short-Term Memory cells (LSTMs) requires little or no feature engineering: data can be fed directly into the neural network, which acts like a black box that models the problem correctly. Other research on this activity recognition dataset uses a large amount of feature engineering, which is rather a signal-processing approach combined with classical data science techniques. The approach here is very simple in terms of how much the data was preprocessed. Let's use Google's neat deep learning library, TensorFlow, to demonstrate the usage of an LSTM, a type of artificial neural network that can process sequential data / time series.
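
The "little or no feature engineering" point is that the raw sensor stream is simply cut into fixed-length windows and fed straight to the LSTM. A NumPy sketch of that windowing, assuming 128-sample windows with 50% overlap as in the UCI HAR dataset (the stream here is random stand-in data):

```python
import numpy as np

# Raw smartphone sensor stream: (timesteps, channels),
# e.g. accelerometer + gyroscope axes
signal = np.random.randn(10000, 6)

window, step = 128, 64   # 128-sample windows, 50% overlap
windows = np.stack([signal[i:i + window]
                    for i in range(0, len(signal) - window + 1, step)])
# windows.shape == (num_windows, 128, 6): fed directly to an LSTM,
# no hand-made features required
```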

attention-ocr - A Tensorflow model for text recognition (CNN + seq2seq with visual attention) available as a Python package and compatible with Google Cloud ML Engine

  •    Python

Visual attention-based OCR model for image recognition with additional tools for creating TFRecords datasets and exporting the trained model with weights as a SavedModel or a frozen graph. This project is based on a model by Qi Guo and Yuntian Deng. You can find the original model in the da03/Attention-OCR repository.

sketch-rnn - Multilayer LSTM and Mixture Density Network for modelling path-level SVG Vector Graphics data in TensorFlow

  •    Python

This version of sketch-rnn has been deprecated. Please see the updated version of sketch-rnn, which is a full generative model for vector drawings. This is an implementation of a multi-layer recurrent neural network (RNN with LSTM/GRU cells) used to model and generate sketches stored in .svg vector graphics files. The methodology is to combine Mixture Density Networks with an RNN, along with modelling dynamic end-of-stroke and end-of-content probabilities learned from a large corpus of similar .svg files, to generate drawings that are similar to the vector training data.
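
To make the Mixture Density Network idea concrete: at each time step the RNN emits mixture weights, means, and standard deviations for the next pen offset, plus stroke-end probabilities, and the next point is sampled from that mixture. A NumPy sketch of one sampling step, with random placeholder parameters rather than real network outputs:

```python
import numpy as np

rng = np.random.default_rng(4)
K = 5                                   # number of mixture components

# Hypothetical MDN outputs for one time step
pi = rng.dirichlet(np.ones(K))          # mixture weights (sum to 1)
mu = rng.normal(size=(K, 2))            # per-component (dx, dy) offset means
sigma = rng.uniform(0.1, 1.0, (K, 2))   # per-component standard deviations
p_end_stroke = 0.1                      # end-of-stroke probability (placeholder)

k = rng.choice(K, p=pi)                 # pick a mixture component
offset = rng.normal(mu[k], sigma[k])    # sample the next (dx, dy) pen offset
pen_up = rng.random() < p_end_stroke    # sample whether the stroke ends here
```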

cnn-text-classification-tf - Convolutional Neural Network for Text Classification in Tensorflow

  •    Python

This code belongs to the "Implementing a CNN for Text Classification in Tensorflow" blog post. It is a slightly simplified implementation of Kim's Convolutional Neural Networks for Sentence Classification paper in Tensorflow.

advanced-tensorflow - Little More Advanced TensorFlow Implementations

  •    Jupyter

Collection of (Little More + Refactored) Advanced TensorFlow Implementations. I try my best to implement each algorithm in a single Jupyter Notebook.





