It is a generic, all-purpose ("golden") program for deep learning with TensorFlow. The supported features are reflected in the tags below.
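Since the entry centers on feeding standard data formats (TFRecords, CSV, libsvm) into TensorFlow models, here is a minimal sketch, not taken from the project, of reading dense features from a TFRecord file with tf.data; the field names, feature size, and file name are hypothetical.

    import tensorflow as tf

    FEATURE_DIM = 9  # hypothetical feature vector size

    feature_spec = {
        "features": tf.io.FixedLenFeature([FEATURE_DIM], tf.float32),
        "label": tf.io.FixedLenFeature([], tf.int64),
    }

    def parse_example(serialized):
        # Decode one serialized tf.train.Example into (features, label).
        parsed = tf.io.parse_single_example(serialized, feature_spec)
        return parsed["features"], parsed["label"]

    dataset = (
        tf.data.TFRecordDataset(["train.tfrecords"])  # hypothetical file name
        .map(parse_example)
        .shuffle(1024)
        .batch(64)
    )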
Tags: tensorflow tfrecords libsvm csv deep-learning machine-learning mlp cnn lstm classifier recommendation-system cpp spark grpc android maven

brain.js is a library of Neural Networks written in JavaScript. 💡 Note: This is a continuation of the harthur/brain repository (which is not maintained anymore). For more details, check out this issue.
Tags: neural-network brain recurrent-neural-networks easy-to-use api web nodejs browser convolutional-neural-networks node stream ai artificial-intelligence brainjs brain.js feed-forward classifier neural network neural-networks machine-learning synapse recurrent long-short-term-memory gated-recurrent-unit rnn lstm gru

The Intel MKL-DNN repository has migrated to https://github.com/intel/mkl-dnn. The old address will continue to be available and will redirect to the new repo; please update your links. Intel(R) Math Kernel Library for Deep Neural Networks (Intel(R) MKL-DNN) is an open source performance library for deep learning applications. The library accelerates deep learning applications and frameworks on Intel(R) architecture. Intel(R) MKL-DNN contains vectorized and threaded building blocks which you can use to implement deep neural networks (DNN) with C and C++ interfaces.
Tags: intel mkl-dnn deep-learning deep-neural-networks cnn rnn lstm c-plus-plus intel-architecture xeon xeon-phi atom core simd sse42 avx2 avx512 avx512-vnni performance

AiLearning: Machine Learning (ML), Deep Learning (DL), and Natural Language Processing (NLP).
Tags: fp-growth apriori mahchine-leaning naivebayes svm adaboost kmeans svd pca logistic regression recommendedsystem sklearn scikit-learn nlp deeplearning dnn lstm rnn

This package contains an OCR engine (libtesseract) and a command-line program (tesseract). The lead developer is Ray Smith; the maintainer is Zdenko Podobny. For a list of contributors, see AUTHORS and GitHub's log of contributors.
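Since the entry describes a command-line program, here is a minimal sketch, not part of the package, of driving the tesseract CLI from Python; the file names are hypothetical, tesseract must already be on PATH, and --oem 1 selects the LSTM engine in Tesseract 4+.

    import subprocess

    # Recognize text in page.png; tesseract writes the result to out.txt
    # (it appends the .txt extension itself). --oem 1 selects the LSTM engine.
    subprocess.run(["tesseract", "page.png", "out", "--oem", "1"], check=True)

    with open("out.txt", encoding="utf-8") as f:
        print(f.read())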
Tags: tesseract tesseract-ocr ocr lstm machine-learning ocr-engine

A TensorFlow implementation of Character-Aware Neural Language Models. The author's original code can be found here. The current implementation has a performance issue; see #3.
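As a reminder of what "character-aware" means here, the sketch below, which is not the repository's code, builds word representations with a character-level CNN and then models the word sequence with an LSTM; all layer sizes are illustrative rather than the paper's.

    import tensorflow as tf
    from tensorflow.keras import layers

    SEQ_LEN, MAX_WORD_LEN = 35, 20        # words per sample, characters per word (illustrative)
    CHAR_VOCAB, WORD_VOCAB = 70, 10000    # illustrative vocabulary sizes

    chars = layers.Input(shape=(SEQ_LEN, MAX_WORD_LEN), dtype="int32")
    char_emb = layers.Embedding(CHAR_VOCAB, 15)(chars)                     # (batch, 35, 20, 15)
    conv = layers.TimeDistributed(layers.Conv1D(100, 5, activation="tanh"))(char_emb)
    word_repr = layers.TimeDistributed(layers.GlobalMaxPooling1D())(conv)  # (batch, 35, 100)
    hidden = layers.LSTM(300, return_sequences=True)(word_repr)
    next_word = layers.Dense(WORD_VOCAB, activation="softmax")(hidden)

    model = tf.keras.Model(chars, next_word)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")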
Tags: tensorflow cnn lstm nlp

The model can be composed of an LSTM or a Quasi-Recurrent Neural Network (QRNN), which is two or more times faster than the cuDNN LSTM in this setup while achieving equivalent or better accuracy. The codebase is now PyTorch 0.4 compatible for most use cases (a big shoutout to https://github.com/shawntan for a fairly comprehensive PR, https://github.com/salesforce/awd-lstm-lm/pull/43). Mild readjustments to hyperparameters may be necessary to obtain the quoted performance. If you desire exact reproducibility (or wish to run on PyTorch 0.3 or lower), we suggest using an older commit of this repository. We are still working on pointer, finetune and generate functionalities.
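To make the LSTM/QRNN choice concrete, here is a heavily simplified sketch, not the repository's actual model, of swapping the recurrent block behind a flag; it assumes the separate pytorch-qrnn package for the QRNN layer, and the sizes are only illustrative.

    import torch.nn as nn

    def make_rnn(rnn_type, emb_size, hidden_size, num_layers, dropout):
        if rnn_type == "QRNN":
            from torchqrnn import QRNN  # assumption: the pytorch-qrnn package is installed
            return QRNN(emb_size, hidden_size, num_layers=num_layers, dropout=dropout)
        return nn.LSTM(emb_size, hidden_size, num_layers=num_layers, dropout=dropout)

    class RNNLanguageModel(nn.Module):
        def __init__(self, vocab=10000, emb=400, hidden=1150, layers=3, rnn_type="LSTM"):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb)
            self.rnn = make_rnn(rnn_type, emb, hidden, layers, dropout=0.3)
            self.decoder = nn.Linear(hidden, vocab)

        def forward(self, tokens, hidden=None):
            # tokens: (seq_len, batch) of word ids -> per-step vocabulary logits
            output, hidden = self.rnn(self.embed(tokens), hidden)
            return self.decoder(output), hidden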
Tags: lstm pytorch language-model sgd qrnn

This is a rough list of my favorite deep learning resources. It has been useful to me for learning how to do deep learning, and I use it for revisiting topics or as a reference. I (Guillaume Chevalier) have built this list and carefully gone through all of the content listed here. You might also want to look at Andrej Karpathy's new post about trends in Machine Learning research.
Tags: awesome awesome-list deep-learning machine-learning tensorflow lstm cnn

Compared to a classical approach, using a Recurrent Neural Network (RNN) with Long Short-Term Memory (LSTM) cells requires little or no feature engineering. Data can be fed directly into the neural network, which acts like a black box that models the problem correctly. Other research on this activity recognition dataset uses a large amount of feature engineering, which is essentially a signal-processing approach combined with classical data science techniques. The approach here is very simple in terms of how much the data is preprocessed. Let's use Google's Deep Learning library, TensorFlow, to demonstrate the usage of an LSTM, a type of artificial neural network that can process sequential data / time series.
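As a minimal sketch of the idea (not the repository's code), the model below feeds raw, windowed sensor signals straight into stacked LSTM layers and a softmax classifier; the window length, channel count, and number of classes are illustrative.

    import tensorflow as tf
    from tensorflow.keras import layers

    TIMESTEPS, CHANNELS, N_CLASSES = 128, 9, 6   # illustrative window/shape values

    model = tf.keras.Sequential([
        layers.Input(shape=(TIMESTEPS, CHANNELS)),   # raw time series, no hand-crafted features
        layers.LSTM(32, return_sequences=True),
        layers.LSTM(32),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(X_train, y_train, ...) with X_train of shape (samples, TIMESTEPS, CHANNELS)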
Tags: machine-learning deep-learning lstm human-activity-recognition neural-network rnn recurrent-neural-networks tensorflow activity-recognition

Sequence labeling models are quite popular in many NLP tasks, such as Named Entity Recognition (NER), part-of-speech (POS) tagging and word segmentation. State-of-the-art sequence labeling models mostly utilize a CRF structure over input word features. An LSTM (or bidirectional LSTM) is a popular deep learning based feature extractor for sequence labeling, and a CNN can also be used because it is faster to compute. Besides, features within a word are also useful for representing it; these can be captured by a character LSTM, a character CNN, or human-defined neural features. NCRF++ is a PyTorch based framework with flexible choices of input features and output structures. The design of neural sequence labeling models with NCRF++ is fully configurable through a configuration file and does not require any code work. NCRF++ is a neural version of CRF++, a well-known statistical CRF framework.
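For orientation, here is a condensed sketch of the kind of architecture such a framework wires together (character CNN features concatenated with word embeddings, a word-level BiLSTM, then per-token label scores); this is not NCRF++'s code, it omits the CRF layer, and all sizes are illustrative.

    import torch
    import torch.nn as nn

    class CharCNNWordBiLSTM(nn.Module):
        def __init__(self, word_vocab, char_vocab, n_labels,
                     word_dim=100, char_dim=30, char_filters=50, hidden=200):
            super().__init__()
            self.word_emb = nn.Embedding(word_vocab, word_dim)
            self.char_emb = nn.Embedding(char_vocab, char_dim)
            self.char_cnn = nn.Conv1d(char_dim, char_filters, kernel_size=3, padding=1)
            self.bilstm = nn.LSTM(word_dim + char_filters, hidden // 2,
                                  batch_first=True, bidirectional=True)
            self.classifier = nn.Linear(hidden, n_labels)

        def forward(self, words, chars):
            # words: (batch, seq_len); chars: (batch, seq_len, word_len)
            b, s, w = chars.shape
            c = self.char_emb(chars).view(b * s, w, -1).transpose(1, 2)
            c = torch.relu(self.char_cnn(c)).max(dim=2).values.view(b, s, -1)
            x = torch.cat([self.word_emb(words), c], dim=-1)
            out, _ = self.bilstm(x)
            return self.classifier(out)  # per-token emission scores (no CRF here)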
Tags: pytorch ner sequence-labeling crf lstm-crf char-rnn char-cnn named-entity-recognition part-of-speech-tagger chunking neural-networks nbest lstm cnn batch

An AI that automatically generates poems in Chinese. For a long time, we have wanted machines to compose poetry by themselves. While countless writers and editors have yet to pick up their pens, AI has already completed thousands of pieces. Now, here is the first step...
Tags: tensorflow poetry lstm rnn

Some interesting TensorFlow tutorials for beginners.
Tags: tensorflow tensorflow-tutorials lstm cnn

Multi-layer Recurrent Neural Networks (LSTM, RNN) for word-level language models in Python using TensorFlow. Mostly reuses code from https://github.com/sherjilozair/char-rnn-tensorflow, which was inspired by Andrej Karpathy's char-rnn.
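Because the model works at the word level, the main preprocessing step is turning raw text into batches of word ids with next-word targets. The sketch below is not the repository's utilities, just an illustration of that step with made-up batch and sequence sizes.

    import collections
    import numpy as np

    def make_batches(text, batch_size=50, seq_length=25):
        # Build a frequency-sorted vocabulary and map the text to integer ids.
        words = text.split()
        counts = collections.Counter(words)
        vocab = {w: i for i, (w, _) in enumerate(counts.most_common())}
        ids = np.array([vocab[w] for w in words], dtype=np.int32)

        # Trim to a whole number of batches, then shift by one for targets.
        n_batches = len(ids) // (batch_size * seq_length)
        ids = ids[: n_batches * batch_size * seq_length]
        targets = np.roll(ids, -1)                       # next-word targets
        x = ids.reshape(batch_size, -1)
        y = targets.reshape(batch_size, -1)
        return vocab, np.split(x, n_batches, axis=1), np.split(y, n_batches, axis=1)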
Tags: rnn tensorflow rnn-tensorflow lstm

This is the code for the article 'Turning design mockups into code with deep learning' on FloydHub's blog. Within three years deep learning will change front-end development. It will increase prototyping speed and lower the barrier for building software.
Tags: keras deep-learning seq2seq encoder-decoder lstm floydhub machine-learning cnn cnn-keras jupyter-notebook jupyter

This project is currently in development. ml5.js aims to make machine learning approachable for a broad audience of artists, creative coders, and students. The library provides access to machine learning algorithms and models in the browser, building on top of TensorFlow.js with no other external dependencies.
Tags: lstm deep-learning imagenet machine-learning neural-network p5xjs p5js

To the extent possible under law, Feross Aboukhadijeh has waived all copyright and related or neighboring rights to this work.
Tags: awesome awesome-list lstm npm nodejs browser browserify mad-science mad-science-modules science hackathon

Please use Python 2.7 to install LSTMVis.
Tags: lstm neural-network visualization recurrent-neural-networks

This is the solution for the Zhihu Machine Learning Challenge 2017. We won first place out of 963 teams. You may need tf.contrib.keras.preprocessing.sequence.pad_sequences for data preprocessing.
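For reference, here is a tiny sketch of what that padding step does: pad_sequences pads or truncates variable-length token-id lists to one fixed length so they can be batched. The sequences and maxlen are made up, and the import path shown is the tf.keras equivalent of the tf.contrib path mentioned above.

    from tensorflow.keras.preprocessing.sequence import pad_sequences

    title_ids = [[12, 7, 256], [4, 88, 3, 9, 41, 2], [903]]   # hypothetical token ids
    padded = pad_sequences(title_ids, maxlen=5, padding="post", truncating="post")
    # -> [[ 12   7 256   0   0]
    #     [  4  88   3   9  41]
    #     [903   0   0   0   0]]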
Tags: pytorch nlp textcnn textrnn fasttext textrcnn lstm

This repository contains the iPython notebook and training data to accompany the O'Reilly tutorial on sentiment analysis with LSTMs in Tensorflow. See the original tutorial to run this code in a pre-built environment on O'Reilly's servers with cell-by-cell guidance, or run these files on your own machine. There is also another file called Pre-Trained LSTM.ipynb which allows you to input your own text and see the output of the trained network. Before running the notebook, you'll first need to download all the data we'll be using. This data is located in the models.tar.gz and training_data.tar.gz tarballs. We will extract these into the same directory as Oriole LSTM.ipynb. As always, the first step is to clone the repository.
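As a small sketch of the setup step just described, the snippet below unpacks the two tarballs into the notebook's directory; it assumes both archives have already been downloaded next to Oriole LSTM.ipynb.

    import tarfile

    for archive in ("models.tar.gz", "training_data.tar.gz"):
        with tarfile.open(archive, "r:gz") as tar:
            tar.extractall(".")   # unpack into the current (notebook) directory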
Tags: sentiment-analysis tensorflow lstm rnn

Chatbot in 200 lines of code using TensorLayer.
Tags: tensorlayer tensorflow chatbot rnn lstm bot nlp chat corpus