Apertium is a machine translation platform, initially aimed at related-language pairs but since expanded to deal with more divergent pairs (such as English-Catalan). The platform provides a language-independent machine translation engine, tools to manage the linguistic data needed to build a machine translation system for a given language pair, and linguistic data for a growing number of language pairs.
translation language-translation machine-translation linguistic

A general-purpose encoder-decoder framework for TensorFlow that can be used for Machine Translation, Text Summarization, Conversational Modeling, Image Captioning, and more. The official code used for the "Massive Exploration of Neural Machine Translation Architectures" paper.
tensorflow translation machine-translation neural-network deeplearning

OpenNMT is a full-featured, open-source (MIT) neural machine translation system utilizing the Torch mathematical toolkit. OpenNMT only requires a Torch installation with few dependencies.
neural-machine-translation torch opennmt machine-translation deep-learning

This is a PyTorch port of OpenNMT, an open-source (MIT) neural machine translation system. It is designed to be research-friendly for trying out new ideas in translation, summarization, image-to-text, morphology, and many other domains. The codebase is relatively stable, but PyTorch is still evolving; we currently only support PyTorch 0.4 and recommend forking if you need stable code.
deep-learning pytorch machine-translation neural-machine-translation

Facebook recently open-sourced word vectors in 89 languages. However, these vectors are monolingual: while similar words within a language share similar vectors, translations of words across languages do not. In a recent paper at ICLR 2017, we showed how the SVD can be used to learn a linear transformation (a matrix) that aligns monolingual vectors from two languages in a single vector space. In this repository we provide 78 matrices, which can be used to align the majority of the fastText languages in a single space. Word embeddings define the similarity between two words by the normalised inner product of their vectors. The matrices in this repository place languages in a single space without changing any of these monolingual similarity relationships, so when you use the resulting multilingual vectors for monolingual tasks, they perform exactly the same as the original vectors. To learn more about word embeddings, check out Colah's blog or Sam's introduction to vector representations.
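The alignment technique described above can be illustrated with a minimal numpy sketch (this is not the repository's code): given vectors for a bilingual dictionary, the orthogonal Procrustes solution obtained from an SVD gives the linear transformation that maps one language's space onto the other. The toy data below assumes the target vectors are an exact rotation of the source vectors, so the alignment is recovered perfectly.

```python
import numpy as np

def learn_alignment(X, Y):
    """Learn an orthogonal matrix W such that X @ W.T approximates Y
    (the orthogonal Procrustes solution, computed via SVD)."""
    U, _, Vt = np.linalg.svd(Y.T @ X)
    return U @ Vt  # orthogonal: W @ W.T == I

# toy bilingual dictionary: rows are unit-length word vectors
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# simulate a "target language" as a fixed rotation of the source space
R, _ = np.linalg.qr(rng.normal(size=(50, 50)))
Y = X @ R.T

W = learn_alignment(X, Y)
aligned = X @ W.T  # source vectors mapped into the target space
```

Because W is orthogonal, mapping the vectors through it preserves all inner products, which is why the monolingual similarity relationships are unchanged.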
word-vectors machine-learning machine-translation natural-language-processing nlp distributed-representations

This repo contains the source code for my personal column (https://zhuanlan.zhihu.com/zhaoyeyu), implemented using Python 3.6. It includes Natural Language Processing and Computer Vision projects, such as text generation, machine translation, deep convolutional GANs, and other hands-on example code.
deep-learning tensorflow-examples convolutional-neural-networks recurrent-neural-networks autoencoder gan style-transfer natural-language-processing machine-translation

BartyCrouch incrementally updates your Strings files from your code and from Interface Builder files. "Incrementally" means that BartyCrouch will, by default, keep both your already-translated values and even your altered comments. Additionally, you can use BartyCrouch for machine translation from one language into 40+ other languages. Using BartyCrouch is as easy as running a few simple commands from the command line, which can even be automated using a build script within your project. BartyCrouch is now part of Homebrew Core! No tap is needed any more.
localization xib storyboard code machine-translation translation language xcode incremental

Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton and Matt Post (2017): Sockeye: A Toolkit for Neural Machine Translation. arXiv:1712.05690 [cs.CL]. If you are interested in collaborating or have any questions, please submit a pull request or issue. You can also send questions to sockeye-dev-at-amazon-dot-com.
deep-learning deep-neural-networks mxnet machine-learning machine-translation neural-machine-translation encoder-decoder attention-mechanism sequence-to-sequence sequence-to-sequence-models sockeye attention-is-all-you-need attention-alignment-visualization attention-model seq2seq convolutional-neural-networks translation

All of the above can be used simultaneously to train novel and complex architectures. See the predefined models to discover how they are defined, and the API documentation to customize them. Additional experimental models are available in the config/models/ directory and can be used with the option --model <model_file.py>.
neural-machine-translation tensorflow opennmt machine-translation deep-learning natural-language-processing

The Neural Monkey package provides a higher-level abstraction for sequential neural network models, most prominently in Natural Language Processing (NLP). It is built on TensorFlow and can be used for fast prototyping of sequential models in NLP, e.g. for neural machine translation or sentence classification. The higher-level API brings together a collection of standard building blocks (RNN encoder and decoder, multi-layer perceptron) and a simple way of adding new building blocks implemented directly in TensorFlow.
neural-machine-translation tensorflow nlp sequence-to-sequence neural-networks nmt machine-translation mt deep-learning image-captioning encoder-decoder gpu

This project is initiated and actively maintained by the DiDi NLP team under DiDi AI Labs.
nlp chinese-nlp machine-translation chinese-word-segmentation entity-linking

A minimal Seq2Seq model with attention for neural machine translation in PyTorch. This implementation relies on torchtext to minimize the dataset management and preprocessing code.
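The core of such an attention mechanism can be sketched in a few lines of numpy (a hedged illustration, not this repository's PyTorch code, which may use a different scoring function): each encoder state is scored against the current decoder state, the scores are normalized with a softmax, and the encoder states are averaged with those weights to form a context vector.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())  # shift for numerical stability
    return e / e.sum()

def dot_product_attention(decoder_state, encoder_states):
    """Dot-product attention: score each encoder state against the
    decoder state, normalize, and return the weighted context vector."""
    scores = encoder_states @ decoder_state  # shape (T,)
    weights = softmax(scores)                # attention distribution over source positions
    context = weights @ encoder_states       # shape (H,)
    return context, weights

T, H = 5, 8  # source length, hidden size (illustrative values)
enc = np.random.default_rng(1).normal(size=(T, H))
dec = enc[2]  # a decoder state resembling the third encoder state
context, weights = dot_product_attention(dec, enc)
```

The weights form a probability distribution over source positions, which is what attention visualizations plot.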
seq2seq deep-learning machine-translation

Feel free to add to this via a pull request, keeping each section alphabetically ordered.
nmt mt neural-machine-translation machine-translation sequence-to-sequence deep-learning

Neural Machine Translation with Keras (Theano and TensorFlow). See the installation documentation for the packages required to run this library.
neural-machine-translation keras deep-learning sequence-to-sequence theano machine-learning nmt machine-translation lstm-networks gru tensorflow attention-mechanism web-demo transformer attention-is-all-you-need attention-model attention-seq2seq

The functionality provided by this plugin is built into OmegaT 4.1.0 and later; please uninstall the plugin if you are using such a version. Before running, log in to the Tencent Cloud API Key Console and create a secretId & secretKey, or use an existing pair.
omegat machine-translation tencent-cloud

This is a Node.js module that uses statistical machine translation to translate between two different languages. The module is loosely based on the IBM Model 1 algorithm and has been tested using English.
machine-translation nlp natural-language-processing language translation machine statistics text probability corpus

Natural Language Processing Pipeline - Sentence Splitting, Tokenization, Lemmatization, Part-of-speech Tagging and Dependency Parsing
embeddings parse nlp-cube language-pipeline tokenization sentence-splitting part-of-speech-tagger lemmatization dependency-parser dependency-parsing universal-dependencies machine-translation information-extraction

Evaluation code for various unsupervised automated metrics for NLG (Natural Language Generation). It takes as input a hypothesis file and one or more reference files, and outputs values of the metrics. Rows across these files should correspond to the same example: each line in the hypothesis file is a generated sentence, and the corresponding lines across the reference files are ground-truth reference sentences for that hypothesis.
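As an illustration of how such metrics consume a hypothesis with multiple references (a simplified sketch, not this repository's implementation), the unigram component of BLEU clips each hypothesis token's count by its maximum count in any single reference:

```python
from collections import Counter

def unigram_precision(hypothesis, references):
    """Modified unigram precision (the 1-gram part of BLEU): each
    hypothesis token is credited at most as many times as it appears
    in the most generous single reference."""
    hyp_counts = Counter(hypothesis.split())
    max_ref = Counter()
    for ref in references:
        for tok, n in Counter(ref.split()).items():
            max_ref[tok] = max(max_ref[tok], n)
    clipped = sum(min(n, max_ref[tok]) for tok, n in hyp_counts.items())
    return clipped / sum(hyp_counts.values())

# one hypothesis line and its aligned lines from two reference files
hyp = "the the the cat mat"
refs = ["the cat sat on the mat", "there is a cat on the mat"]
score = unigram_precision(hyp, refs)  # 4 of 5 tokens credited -> 0.8
```

Clipping is what prevents a degenerate hypothesis such as "the the the ..." from scoring perfectly; full BLEU combines this over higher-order n-grams with a brevity penalty.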
natural-language-generation natural-language-processing nlg nlp evaluation bleu bleu-score meteor cider rouge rouge-l task-oriented-dialogue machine-translation dialog dialogue skip-thought-vectors skip-thoughts

This is a data repository dedicated to the research and development of machine translation technology from and to Indonesian (GARENG). Only test data is stored here; the training data used is still private and cannot yet be made public for licensing reasons. The GARENG machine translation research and development project is a collaboration between Rekanalar and Beritagar.
machine-translation test-data

GNU Lesser General Public License version 2.1 or, at your option, any later version.
nlp machine-translation tokenizer