Displaying 1 to 17 of 17 results

nolearn contains a number of wrappers and abstractions around existing neural network libraries, most notably Lasagne, along with a few machine learning utility modules. All code is written to be compatible with scikit-learn. We recommend using venv (when using Python 3) or virtualenv (Python 2) to install nolearn.
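The recommended isolated-environment install might look like the following sketch (the environment name `nolearn-env` is arbitrary; substitute `virtualenv` for the first command on Python 2):

```shell
# create and activate an isolated environment (Python 3)
python3 -m venv nolearn-env
source nolearn-env/bin/activate

# install nolearn into the environment
pip install nolearn
```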

scikit-learn lasagne deep-learning machine-learning

A course on reinforcement learning in the wild. Taught on campus at HSE and YSDA, and maintained to be friendly to online students (both English and Russian). The syllabus is approximate: the lectures may occur in a slightly different order, and some topics may end up taking two weeks.

reinforcement-learning course-materials deep-learning deep-reinforcement-learning git-course mooc theano lasagne tensorflow pytorch pytorch-tutorials keras

CakeChat is written in Theano and Lasagne. It uses end-to-end trained embeddings of five different emotions to generate responses conditioned on a given emotion. The code is flexible and allows conditioning a response on an arbitrary categorical variable defined for some samples in the training data. With CakeChat you can, for example, train your own persona-based neural conversational model [5] or create an emotional chatting machine without external memory [4].

conversational-ai conversational-agents conversational-bots dialogue-agents dialogue-systems dialog-systems nlp deep-learning seq2seq seq2seq-chatbot seq2seq-model theano lasagne

If you want to share your data and configuration between the host (your machine or VM) and the container in which you are running Deepo, use the -v option. This will make /host/data from the host visible as /data in the container, and /host/config as /config. Such isolation reduces the chance of your containerized experiments overwriting or using the wrong data.
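As a command-line sketch of the host-to-container mapping described above (`<image>` is a placeholder for the Deepo image tag, which is not given here):

```shell
# mount two host directories into the container at /data and /config
docker run -it \
  -v /host/data:/data \
  -v /host/config:/config \
  <image> bash
```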

deep-learning jupyter lasagne caffe tensorflow sonnet keras theano chainer torch pytorch mxnet cntk dockerfile-generator docker-image caffe2 onnx

This repo supplements the Deep Learning course taught at YSDA and Skoltech in spring '18. For the previous iteration, visit the fall17 branch. Lecture and seminar materials for each week are in the ./week* folders. Homeworks are in the ./homework* folders.

deep-learning course course-materials theano lasagne

AgentNet is a deep reinforcement learning framework designed for ease of research and prototyping of deep learning models for Markov decision processes. It has full in-and-out support for the Lasagne deep learning library, granting you access to all convolutions, maxouts, poolings, dropouts, and so on.

reinforcement-learning framework theano lasagne opeani-gym binder qlearning deep-learning deep-neural-networks

While research in deep learning continues to improve the world, we use a bunch of tricks to implement algorithms with TensorLayer day to day. Here is a summary of the tricks for using TensorLayer. If you find a trick that is particularly useful in practice, please open a pull request to add it to the document. If we find it reasonable and verified, we will merge it in.

tensorlayer tensorflow deep-learning machine-learning data-science neural-network reinforcement-learning neural-networks tensorflow-tutorials tensorflow-models computer-vision tensorflow-framework tensorflow-library tflearn keras tensorboard nlp natural-language-processing lasagne tensorflow-experiments

Recent results in Bayesian statistics for constructing robust neural networks have shown that it is one of the best ways to deal with uncertainty and overfitting while still achieving good performance. Gelato helps you use Bayesian methods for neural networks. The library relies heavily on Theano, Lasagne, and PyMC3. I use a generic approach to decorate all of Lasagne at once; thus, to use Gelato you only need to replace the import statements for the layers. To construct a network, you need to be in a pm.Model context environment.

bayesian-inference lasagne uncertainty variational-inference gelato neural-network theano deep-learning bayesian

This is a port to Theano and Lasagne of the Caffe implementation of the ICCV'15 paper "FlowNet: Learning Optical Flow with Convolutional Networks" by Dosovitskiy et al. It contains both the FlowNetS and FlowNetC models and a port of the correlation layer. The caffe_to_numpy.py script can be used to convert Caffe models to the npz format. The caffemodel and prototxt files should be placed in the model subdirectory. Alternatively, you can download the weights from Google Drive.

deep-learning optical-flow theano lasagne computer-vision

This repo contains the code for the paper "Deep Relative Attributes" by Yaser Souri (@yassersouri), Erfan Noury (@erfannoury), and Ehsan Adeli Mosabbeb (@eadeli), ACCV 2016.

deeplearning convolutional-neural-networks lasagne paper deep-learning

Welcome to my GitHub repo. I am a data scientist, and I code in R, Python, and Wolfram Mathematica. Here you will find some machine learning, deep learning, natural language processing, and artificial intelligence models I have developed.

r python3 python-3 mathematica lasagne theano theano-models autoencoder face-recognition natural-language-processing nlp nlp-machine-learning deep-learning keras lstm lstm-neural-networks timeseries time-series-analysis word2vec

This code implements Periodic Spatial Generative Adversarial Networks (PSGANs) on top of Lasagne/Theano. The code was tested with Lasagne 0.2.dev1 and Theano 0.9.0dev2. PSGANs can generate sample textures of arbitrary size that look strikingly similar to, but not exactly the same as, a single (or several) source image(s).

generative-adversarial-networks dcgan gan generative-model machine-learning theano lasagne

The code was tested with Lasagne 0.2.dev1 and Theano 0.9.0dev2. SGANs can generate sample textures of arbitrary size that look strikingly similar to, but not exactly the same as, a single (or several) source image(s).

generative-adversarial-network machine-learning dcgan gan theano lasagne

...and, most importantly, a SYMBOLIC/DECLARATIVE programming environment allowing CONCISE/EXPLICIT/OPTIMIZED computations. For a deep-network-oriented imperative library built on JAX and with a JAX syntax, check out FLAX.

deep-neural-networks lasagne theano deep-learning tensorflow numpy dataset jax

Formats and cleans your data to get it ready for machine learning!

neural-network machine-learning data-formatting normalization min-max-normalization min-max-normalizing brain.js automated-machine-learning bestbrain data-science kaggle scikit-learn sklearn scikit-neuralnetworks lasagne nolearn nolearn.lasagne data-cleaning data-munging data-preparation imputing-missing-values filling-in-missing-values dataset data-set training testing random-forest vectorization categorization one-hot-encoding dictvectorizer preprocessing feature-selection feature-engineering
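The tag cloud above mentions min-max normalization; as a generic illustration of that technique (a plain-Python sketch, not this project's actual API, which is not shown here):

```python
def min_max_normalize(values):
    """Rescale a list of numbers linearly into the [0, 1] range."""
    lo, hi = min(values), max(values)
    if hi == lo:
        # constant data: avoid division by zero, map everything to 0
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

print(min_max_normalize([10, 20, 40]))  # -> [0.0, 0.333..., 1.0]
```

The minimum maps to 0, the maximum to 1, and everything else falls proportionally in between, which keeps features on a comparable scale before training.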