OpenNMT is a full-featured, open-source (MIT) neural machine translation system built on the Torch mathematical toolkit. It requires only a Torch installation with few dependencies.
This is a PyTorch port of OpenNMT, an open-source (MIT) neural machine translation system. It is designed to be research friendly, making it easy to try out new ideas in translation, summarization, image-to-text, morphology, and many other domains. The codebase is relatively stable, but PyTorch is still evolving; we currently support only PyTorch 0.4 and recommend forking if you need stable code.
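As a rough illustration of the encoder-decoder pattern such a system trains, here is a minimal sketch in plain PyTorch. The Seq2Seq class and all hyperparameters below are invented for the example and are not OpenNMT-py's actual API:

```python
import torch
import torch.nn as nn

# Minimal encoder-decoder sketch (illustration only, not OpenNMT-py classes).
class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, hidden=256):
        super(Seq2Seq, self).__init__()
        self.src_emb = nn.Embedding(src_vocab, hidden)
        self.tgt_emb = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the source sentence into a final hidden state...
        _, state = self.encoder(self.src_emb(src))
        # ...and condition the decoder on it (teacher forcing).
        dec_out, _ = self.decoder(self.tgt_emb(tgt), state)
        return self.out(dec_out)

model = Seq2Seq(src_vocab=10000, tgt_vocab=10000)
src = torch.randint(0, 10000, (2, 7), dtype=torch.long)
tgt = torch.randint(0, 10000, (2, 5), dtype=torch.long)
print(model(src, tgt).shape)  # torch.Size([2, 5, 10000])
```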
Some examples require the MNIST dataset for training and testing. Don't worry: the dataset is downloaded automatically when running the examples (via input_data.py). MNIST is a database of handwritten digits; for a quick description of the dataset, you can check this notebook.
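For instance, with the TensorFlow 1.x tutorials helper that input_data.py wraps, loading looks roughly like this (a sketch; the data directory is an arbitrary choice):

```python
# Loading MNIST the way the examples do, via the TF 1.x tutorials helper.
from tensorflow.examples.tutorials.mnist import input_data

# Downloads the MNIST archives into MNIST_data/ on first run,
# then reuses the cached copies on subsequent runs.
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# Mini-batches of flattened 28x28 images and one-hot labels.
images, labels = mnist.train.next_batch(100)
print(images.shape, labels.shape)  # (100, 784) (100, 10)
```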
NLP-Models-Tensorflow gathers machine learning and TensorFlow deep learning models for NLP problems, with all code kept simple inside Jupyter Notebooks. For models I did not implement from scratch, I will attach the original GitHub repositories; basically, I copy, paste, and fix that code for deprecation issues.
SentencePiece is an unsupervised text tokenizer and detokenizer, mainly for neural network-based text generation systems where the vocabulary size is predetermined prior to neural model training. SentencePiece implements subword units (e.g., byte-pair encoding (BPE) and the unigram language model) with the extension of direct training from raw sentences. SentencePiece allows us to make a purely end-to-end system that does not depend on language-specific pre- and postprocessing.
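A minimal end-to-end sketch with the sentencepiece Python package (the file names, model prefix, and vocabulary size are arbitrary choices for the example):

```python
import sentencepiece as spm

# Train a subword model directly from raw sentences (no pre-tokenization).
# Assumes corpus.txt exists; "m" is an arbitrary model prefix.
spm.SentencePieceTrainer.Train(
    "--input=corpus.txt --model_prefix=m --vocab_size=8000 --model_type=bpe"
)

sp = spm.SentencePieceProcessor()
sp.Load("m.model")

pieces = sp.EncodeAsPieces("This is a test.")
ids = sp.EncodeAsIds("This is a test.")
print(pieces)
print(sp.DecodePieces(pieces))  # lossless round trip back to raw text
```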
Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton and Matt Post (2017): Sockeye: A Toolkit for Neural Machine Translation. arXiv:1712.05690 [cs.CL]. If you are interested in collaborating or have any questions, please submit a pull request or issue. You can also send questions to sockeye-dev-at-amazon-dot-com.
…and all of the above can be used simultaneously to train novel and complex architectures. See the predefined models to discover how they are defined, and the API documentation to customize them. Additional experimental models are available in the config/models/ directory and can be used with the option --model <model_file.py>.
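Mechanically, a --model <model_file.py> style option boils down to loading the user's file as a module and calling a builder function. Here is a hypothetical sketch of that mechanism (not OpenNMT-tf's actual loader; my_model.py and model() are placeholder names):

```python
import importlib.util

# Hypothetical sketch: load a user-supplied model file as a module
# and call its model() builder, as a --model option might do.
spec = importlib.util.spec_from_file_location("user_model", "my_model.py")
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)

model = module.model()  # my_model.py is expected to define model()
```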
The Neural Monkey package provides a higher-level abstraction for sequential neural network models, most prominently in natural language processing (NLP). It is built on TensorFlow and can be used for fast prototyping of sequential models in NLP, e.g. for neural machine translation or sentence classification. The higher-level API brings together a collection of standard building blocks (RNN encoder and decoder, multi-layer perceptron) and a simple way of adding new building blocks implemented directly in TensorFlow.
This is a research project, not an official NVIDIA product. OpenSeq2Seq's main goal is to allow researchers to explore sequence-to-sequence models as effectively as possible. Efficiency is achieved by fully supporting distributed and mixed-precision training. OpenSeq2Seq is built using TensorFlow and provides all the necessary building blocks for training encoder-decoder models for neural machine translation and automatic speech recognition. We plan to extend it to other modalities in the future.
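The core trick behind mixed-precision training is loss scaling. The following is an illustrative sketch of the idea in plain PyTorch, for exposition only; it is not OpenSeq2Seq's TensorFlow implementation, and the scale value is arbitrary:

```python
import torch

SCALE = 1024.0  # arbitrary static loss scale for the example

model = torch.nn.Linear(10, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.01)

x, y = torch.randn(8, 10), torch.randn(8, 1)
loss = torch.nn.functional.mse_loss(model(x), y)

# Scale the loss up so small gradients survive reduced precision...
(loss * SCALE).backward()

# ...then unscale the gradients before applying the update.
for p in model.parameters():
    p.grad.div_(SCALE)
opt.step()
```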
Feel free to add to this via a pull request, keeping each section alphabetically ordered.
This is an example of using Google's new TensorFlow library for monolingual translation from modern English to Shakespearean English, based on research by Wei Xu. Delete /cache to start anew.
Neural Machine Translation with Keras (Theano and TensorFlow). See the installation instructions for obtaining the packages required to run this library.
We implement a Deep Character-Level Neural Machine Translation system based on Theano and Blocks. Please install the relevant packages according to the Blocks documentation before testing our program. Note that you should use Python 3 instead of Python 2; there will be problems with Python 2. The architecture of DCNMT, shown in the following figure, is a single, large neural network.
CTranslate is a C++ implementation of OpenNMT's translate.lua script with no LuaTorch dependencies. It facilitates the use of OpenNMT models in existing products and on various platforms, using Eigen as a backend. CTranslate provides optimized CPU translation and can optionally offload matrix multiplication to a CUDA-compatible device using cuBLAS. It only supports OpenNMT models released with the release_model.lua script.
This is NPMT, the source code for Towards Neural Phrase-based Machine Translation and Sequence Modeling via Segmentations from Microsoft Research. It is built on top of the fairseq toolkit in Torch. We present the setup and neural machine translation (NMT) experiments in Towards Neural Phrase-based Machine Translation. Neural Phrase-based Machine Translation (NPMT) explicitly models the phrase structures in output sequences using Sleep-WAke Networks (SWAN), a recently proposed segmentation-based sequence modeling method. To mitigate the monotonic alignment requirement of SWAN, we introduce a new layer that performs (soft) local reordering of input sequences. Unlike existing NMT approaches, NPMT does not use attention-based decoding mechanisms; instead, it outputs phrases directly in sequential order and can decode in linear time.
After 15 epochs, I obtained a BLEU score of 7.38, which is far from good. Maybe some part of the implementation is incorrect, or maybe we need more data or a bigger model. Details are available in the results folder.
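For reference, a corpus-level BLEU score can be checked with NLTK along these lines (a generic sketch, not this repository's evaluation code; the sentences are toy data):

```python
from nltk.translate.bleu_score import corpus_bleu

# Toy data: one hypothesis with one reference translation.
references = [[["the", "cat", "sat", "on", "the", "mat"]]]
hypotheses = [["the", "cat", "sat", "on", "the", "rug"]]

# corpus_bleu takes a list of reference lists, one per hypothesis.
score = corpus_bleu(references, hypotheses)
print("BLEU: %.2f" % (100 * score))
```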