This is an attempt at implementing Sequence to Sequence Learning with Neural Networks (seq2seq) and reproducing the results in A Neural Conversational Model (aka the Google chatbot). A sample exchange: Human: What is the purpose of living? Machine: To live forever.
seq2seq torch machine-learning deep-learning neural-conversation-models

CakeChat is written in Theano and Lasagne. It uses end-to-end trained embeddings of 5 different emotions to generate responses conditioned on a given emotion. The code is flexible and allows conditioning a response on an arbitrary categorical variable defined for some samples in the training data. With CakeChat you can, for example, train your own persona-based neural conversational model[5] or create an emotional chatting machine without external memory[4].
conversational-ai conversational-agents conversational-bots dialogue-agents dialogue-systems dialog-systems nlp deep-learning seq2seq seq2seq-chatbot seq2seq-model theano lasagne
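
CakeChat itself is Theano/Lasagne code; purely as an illustration of the idea of conditioning a decoder on a learned categorical embedding, here is a minimal PyTorch sketch. All names and sizes are hypothetical, not CakeChat's API.

# Hypothetical sketch: condition an RNN decoder on a learned embedding
# of a categorical variable (e.g. one of 5 emotions), trained end-to-end.
import torch
import torch.nn as nn

class ConditionedDecoder(nn.Module):
    def __init__(self, vocab=10000, emb=128, n_conditions=5, cond_dim=16, hidden=256):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab, emb)
        self.cond_emb = nn.Embedding(n_conditions, cond_dim)  # learned with the rest of the model
        self.rnn = nn.GRU(emb + cond_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, tokens, condition, h0):
        # tokens: (batch, steps); condition: (batch,); h0: (1, batch, hidden)
        e = self.tok_emb(tokens)                      # (batch, steps, emb)
        c = self.cond_emb(condition).unsqueeze(1)     # (batch, 1, cond_dim)
        c = c.expand(-1, tokens.size(1), -1)          # repeat the condition at every step
        h, hn = self.rnn(torch.cat([e, c], dim=-1), h0)
        return self.out(h), hn                        # per-step vocabulary logits

dec = ConditionedDecoder()
logits, _ = dec(torch.randint(0, 10000, (2, 7)), torch.tensor([0, 3]), torch.zeros(1, 2, 256))
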
This work tries to reproduce the results of A Neural Conversational Model (aka the Google chatbot). It uses an RNN (seq2seq model) for sentence prediction, and is written in Python with TensorFlow. The corpus-loading part of the program is inspired by the Torch neuralconvo from macournoyer.
chatbot deep-learning tensorflow seq2seq

This is the code for the article 'Turning design mockups into code with deep learning' on FloydHub's blog. Within three years deep learning will change front-end development. It will increase prototyping speed and lower the barrier for building software.
keras deep-learning seq2seq encoder-decoder lstm floydhub machine-learning cnn cnn-keras jupyter-notebook jupyter

These tutorials have been merged into the official PyTorch tutorials. Please go there for better-maintained versions of these tutorials compatible with newer versions of PyTorch. Learn PyTorch with project-based tutorials. These tutorials demonstrate modern techniques with readable code and use regular data from the internet.
natural-language-processing natural-language-generation nlp nlg seq2seq

DELTA is a deep-learning-based end-to-end natural language and speech processing platform. DELTA aims to provide easy and fast experiences for using, deploying, and developing natural language processing and speech models for both academic and industrial use cases. DELTA is mainly implemented using TensorFlow and Python 3. For details of DELTA, please refer to this paper.
nlp deep-learning tensorflow speech sequence-to-sequence seq2seq speech-recognition text-classification speaker-verification nlu text-generation emotion-recognition tensorflow-serving tensorflow-lite inference asr serving front-end

Visual attention-based OCR model for image recognition with additional tools for creating TFRecords datasets and exporting the trained model with weights as a SavedModel or a frozen graph. This project is based on a model by Qi Guo and Yuntian Deng. You can find the original model in the da03/Attention-OCR repository.
machine-learning ocr tensorflow google-cloud ml cnn seq2seq image-recognition hacktoberfest ocr-recognition google-cloud-ml

This project provides source code for practicing TensorFlow step by step, from the basics through to applications. It covers most of the material in the guides on the official TensorFlow site, but is written far more concisely than the official sample code, so the concepts should be easy to pick up. Also, all comments are in Korean(!).
neural-network tensorflow mnist autoencoder rnn deep-learning tutorial chatbot seq2seq dqn word2vec cnn gan inception

🤗 Transformers provides thousands of pretrained models to perform tasks on texts such as classification, information extraction, question answering, summarization, translation, text generation and more in over 100 languages. Its aim is to make cutting-edge NLP easier to use for everyone. 🤗 Transformers provides APIs to quickly download and use those pretrained models on a given text, fine-tune them on your own datasets and then share them with the community on our model hub. At the same time, each Python module defining an architecture is fully standalone and can be modified to enable quick research experiments.
nlp natural-language-processing tensorflow pytorch transformer speech-recognition seq2seq flax gpt pretrained-models language-models natural-language-generation nlp-library language-model bert natural-language-understanding jax xlnet pytorch-transformers model-hub
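
As a minimal illustration of the download-and-use API the description mentions: the pipeline call below fetches a default pretrained model for the task from the model hub on first use, then runs inference locally. The task name and example sentence are arbitrary choices.

# Minimal use of the Transformers pipeline API.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default model on first call
print(classifier("Transformers makes pretrained NLP models easy to use."))
# Output is a list of dicts, e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
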
Stock-Prediction-Models gathers machine learning and deep learning models for stock forecasting, including trading bots and simulations. An LSTM recurrent neural network and a simple signal-rolling agent are implemented in TensorFlow.js; you can try them at huseinhouse.com/stock-forecasting-js, where you can download any historical CSV and upload it dynamically.
deep-learning monte-carlo trading-bot lstm stock-market stock-price-prediction seq2seq learning-agents stock-price-forecasting evolution-strategies lstm-sequence stock-prediction-models deep-learning-stock strategy-agent monte-carlo-markov-chain

An NLP library with awesome pre-trained Transformer models and an easy-to-use interface, supporting a wide range of NLP tasks from research to industrial applications.
nlp dataset transformer seq2seq pretrained-models embedding bert ernie paddlenlp

Lingvo is a framework for building neural networks in TensorFlow, particularly sequence models. A list of publications using Lingvo can be found here.
nlp research translation tensorflow machine-translation speech distributed tts speech-synthesis mnist speech-recognition lm seq2seq speech-to-text gpu-computing language-model asr

The goal of this project is to let users try and experiment with the seq2seq neural network architecture by solving simple toy problems in signal prediction. seq2seq architectures are normally used for more sophisticated purposes than signal prediction, such as language modeling, but this project is a useful tutorial before moving on to more complicated work. Although a ".py" Python export of the tutorial is available in the repository, the exported code feels a bit raw, and it is more convenient to run the code inside the notebook.
seq2seq tensorflow tensorflow-tutorials

A vanilla sequence to sequence model presented in https://arxiv.org/abs/1409.3215, https://arxiv.org/abs/1406.1078 consists of using a recurrent neural network such as an LSTM (http://dl.acm.org/citation.cfm?id=1246450) or GRU (https://arxiv.org/abs/1412.3555) to encode a sequence of words or characters in a source language into a fixed-length vector representation, and then decoding from that representation using another RNN in the target language. An extension of sequence to sequence models that incorporates an attention mechanism was presented in https://arxiv.org/abs/1409.0473; it uses information from the RNN hidden states in the source language at each time step of the decoder RNN. This attention mechanism significantly improves performance on tasks like machine translation. A few variants of the attention model for the task of machine translation have been presented in https://arxiv.org/abs/1508.04025.
pytorch seq2seq deep-learning rnn
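
For concreteness, here is a minimal PyTorch sketch of the vanilla encoder-decoder described above (no attention); the vocabulary and layer sizes are made up.

# Minimal vanilla seq2seq: the encoder's final LSTM state is the
# fixed-length representation from which the decoder generates.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab=8000, tgt_vocab=8000, emb=128, hidden=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.LSTM(emb, hidden, batch_first=True)
        self.decoder = nn.LSTM(emb, hidden, batch_first=True)
        self.proj = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt_in):
        # src: (batch, src_len); tgt_in: (batch, tgt_len) teacher-forced inputs
        _, state = self.encoder(self.src_emb(src))   # state = (h_n, c_n): the fixed-length code
        out, _ = self.decoder(self.tgt_emb(tgt_in), state)
        return self.proj(out)                        # (batch, tgt_len, tgt_vocab) logits

model = Seq2Seq()
logits = model(torch.randint(0, 8000, (4, 9)), torch.randint(0, 8000, (4, 6)))
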
Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton and Matt Post (2017): Sockeye: A Toolkit for Neural Machine Translation. In eprint arXiv:cs-CL/1712.05690. If you are interested in collaborating or have any questions, please submit a pull request or issue. You can also send questions to sockeye-dev-at-amazon-dot-com.
deep-learning deep-neural-networks mxnet machine-learning machine-translation neural-machine-translation encoder-decoder attention-mechanism sequence-to-sequence sequence-to-sequence-models sockeye attention-is-all-you-need attention-alignment-visualization attention-model seq2seq convolutional-neural-networks translation

This is a framework for sequence-to-sequence (seq2seq) models implemented in PyTorch. The framework has modularized and extensible components for seq2seq models, training and inference, checkpoints, etc. This is an alpha release. We appreciate any kind of feedback or contribution. This package requires Python 2.7 or 3.6. We recommend creating a new virtual environment for this project (using virtualenv or conda).
pytorch seq2seq deeplearning

Note: the repository is not maintained. Feel free to PM me if you'd like to take up the maintenance. Build a general-purpose conversational chatbot based on a hot seq2seq approach implemented in TensorFlow. Since it doesn't yet produce good results, consider other implementations of seq2seq as well.
seq2seq chatbot tensorflow

This is a research project, not an official NVIDIA product. OpenSeq2Seq's main goal is to allow researchers to explore various sequence-to-sequence models as effectively as possible. This efficiency is achieved by fully supporting distributed and mixed-precision training. OpenSeq2Seq is built using TensorFlow and provides all the necessary building blocks for training encoder-decoder models for neural machine translation and automatic speech recognition. We plan to extend it with other modalities in the future.
neural-machine-translation multi-gpu deep-learning sequence-to-sequence seq2seq multi-node speech-recognition speech-to-text mixed-precision float16

This project uses a seq2seq model to play couplets (对对联). It is written in TensorFlow. You can try the demo at https://ai.binwang.me/couplet. You will need some data to run this program; the dataset can be downloaded from this project.
seq2seq deep-learning machine-learning