The current test error is 0.21% and the best test error is 0.20%. The current test accuracy is 99.31% and the best test accuracy is 99.32%. A Capsule is a group of neurons whose activity vector represents the instantiation parameters of a specific type of entity such as an object or object part.
https://github.com/cedrickchee/capsule-net-pytorch
Tags | capsnet capsule-network neural-networks pytorch dynamic-routing dynamic-routing-between-capsules hinton capsule |
Implementation | Python |
License | Public |
Platform | Windows Linux |
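The capsule definition above relies on activity vectors whose length is kept below 1 so that it can be read as the probability that the entity is present. Below is a minimal sketch of the squash non-linearity from the Dynamic Routing paper; the tensor shapes are illustrative assumptions, not this repository's exact code.

```python
import torch

def squash(s, dim=-1, eps=1e-8):
    """Shrink a vector's length into [0, 1) while preserving its direction.

    v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||)
    """
    squared_norm = (s ** 2).sum(dim=dim, keepdim=True)
    scale = squared_norm / (1.0 + squared_norm)
    return scale * s / torch.sqrt(squared_norm + eps)

# Example (assumed shapes): a batch of 2 samples, 10 capsules, 16-D activity vectors.
capsules = torch.randn(2, 10, 16)
out = squash(capsules)
print(out.norm(dim=-1))  # every vector length is now strictly below 1
```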
Part I: Intuition. Part II: How Capsules Work.
Tags | pytorch pytorch-tutorials easy-to-use clean-code capsule-network |
A curated list of awesome resources related to capsule networks, maintained by AI Summary. Please open a pull request if you are aware of additional resources.
Tags | capsule-networks capsule-network computer-vision deep-learning neural-networks awesome-list |
A comprehensive list of Deep Learning / Artificial Intelligence and Machine Learning tutorials - rapidly expanding into areas of AI / Deep Learning / Machine Vision / NLP and industry-specific areas such as Automotive, Retail, Pharma, Medicine, and Healthcare - maintained by Tarry Singh until at least 2020, when he finishes his Ph.D. (which might end up being about interstellar cosmic networks! Who knows! 😀)
Tags | machine-learning deep-learning tensorflow pytorch keras matplotlib aws kaggle pandas scikit-learn torch artificial-intelligence neural-network convolutional-neural-networks tensorflow-tutorials python-data ipython-notebook capsule-network |
PyTorch is a flexible deep learning framework that allows automatic differentiation through dynamic neural networks (i.e., networks that utilise dynamic control flow like if statements and while loops). It supports GPU acceleration, distributed training, various optimisations, and plenty more neat features. These are some notes on how I think about using PyTorch, and don't encompass all parts of the library or every best practice, but may be helpful to others. Neural networks are a subclass of computation graphs. Computation graphs receive input data, and data is routed to and possibly transformed by nodes which perform processing on the data. In deep learning, the neurons (nodes) in neural networks typically transform data with parameters and differentiable functions, such that the parameters can be optimised to minimise a loss via gradient descent. More broadly, the functions can be stochastic, and the structure of the graph can be dynamic. So while neural networks may be a good fit for dataflow programming, PyTorch's API has instead centred around imperative programming, which is a more common way of thinking about programs. This makes it easier to read code and reason about complex programs, without necessarily sacrificing much performance; PyTorch is actually pretty fast, with plenty of optimisations that you can safely forget about as an end user (but you can dig in if you really want to).
Tags | deep-learning |
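To illustrate the dynamic control flow described above, here is a small sketch of a module whose forward pass uses an ordinary Python loop and a data-dependent if statement; autograd tracks whichever path actually ran. The layer sizes and the random depth are arbitrary assumptions for demonstration.

```python
import torch
import torch.nn as nn

class DynamicNet(nn.Module):
    """Forward pass picks its own depth at run time."""
    def __init__(self, width=32):
        super().__init__()
        self.inp = nn.Linear(8, width)
        self.hidden = nn.Linear(width, width)   # reused a variable number of times
        self.out = nn.Linear(width, 1)

    def forward(self, x):
        h = torch.relu(self.inp(x))
        # Ordinary Python control flow: repeat the hidden layer 0-3 times.
        for _ in range(int(torch.randint(0, 4, (1,)))):
            h = torch.relu(self.hidden(h))
        if h.mean() > 0:                         # data-dependent branch
            h = h * 2
        return self.out(h)

net = DynamicNet()
loss = net(torch.randn(4, 8)).sum()
loss.backward()                                  # gradients follow the path that was taken
```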
PyTorch is a deep learning framework that puts Python first. It is a Python package that provides Tensor computation (like NumPy) with strong GPU acceleration and deep neural networks built on a tape-based autograd system. You can reuse your favorite Python packages such as NumPy, SciPy and Cython to extend PyTorch when needed.
Tags | neural-network autograd gpu numpy deep-learning tensor |
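A short sketch of the two features mentioned above: NumPy-like tensors with GPU acceleration, and easy exchange with existing NumPy code. The array contents and shapes are placeholders.

```python
import numpy as np
import torch

a = np.random.rand(3, 3)                  # plain NumPy array
t = torch.from_numpy(a)                   # wraps the same memory, no copy

device = "cuda" if torch.cuda.is_available() else "cpu"
t = t.to(device)                          # move to GPU when one is available

result = (t @ t.T).cpu().numpy()          # back to NumPy for SciPy and friends
print(result.shape)
```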
Capsule is a packaging and deployment tool for JVM applications. A capsule is a single executable JAR that contains everything your application needs to run, either in the form of embedded files or as declarative metadata. It can contain your JAR artifacts, your dependencies and resources, native libraries, the required JRE version, the JVM flags required to run the application well, Java or native agents, and more. In short, a capsule is a self-contained JAR that knows everything there is to know about how to run your application the way it's meant to run.
Tags | jar jvm packaging deployment java-packaging-tool |
Open Neural Network Exchange (ONNX) is the first step toward an open ecosystem that empowers AI developers to choose the right tools as their project evolves. ONNX provides an open source format for AI models. It defines an extensible computation graph model, as well as definitions of built-in operators and standard data types. Initially we focus on the capabilities needed for inferencing (evaluation). Caffe2, PyTorch, Microsoft Cognitive Toolkit, Apache MXNet and other tools are developing ONNX support. Enabling interoperability between different frameworks and streamlining the path from research to production will increase the speed of innovation in the AI community. We are at an early stage and we invite the community to submit feedback and help us further evolve ONNX.
Tags | deep-learning deep-neural-networks neural-network onnx pytorch caffe2 cntk |
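For context, exporting a PyTorch model to the ONNX format typically goes through torch.onnx.export. The sketch below uses a torchvision ResNet-18 and a throwaway file name purely as assumptions; any traceable module would do.

```python
import torch
import torchvision

model = torchvision.models.resnet18()            # stand-in model, untrained
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)        # example input defines the graph shapes
torch.onnx.export(model, dummy_input, "resnet18.onnx")

# The resulting .onnx file can then be loaded by ONNX-compatible runtimes and frameworks.
```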
Capsule aims to become a full-fledged (immutable) collections library for Java 8+ that is solely built around persistent tries. The library is designed for standalone use and for being embedded in domain-specific languages. Capsule still has to undergo some incubation before it can ship as a well-rounded collection library. Nevertheless, the code is stable and performance is solid.
Tags | hashmap trie immutable immutable-collections persistent-data-structure hashset performance data-structure collections map list champ |
This is the code for the YouTube video by Siraj Raval on Capsule Networks. If you find that the WeChat group QR code is invalid, add my personal account.
The goal is to teach a siamese network to distinguish pairs of images. This project uses PyTorch. Any dataset can be used. Each class must be in its own folder. This is the same structure that PyTorch's own image folder dataset uses.
Tags | pytorch-tutorial deep-learning neural-network siamese-network pytorch face-recognition |
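A hedged sketch of the idea behind such a project: feed two images through the same weight-shared encoder and train with a contrastive loss, so that embeddings of same-class pairs move together and different-class pairs move apart. This is a common formulation rather than this project's exact code; the tiny encoder and the margin are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Tiny stand-in encoder; a real project would use something deeper.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(), nn.Linear(8 * 16, 32),
        )

    def forward(self, x1, x2):
        # Both inputs go through the *same* weights.
        return self.encoder(x1), self.encoder(x2)

def contrastive_loss(z1, z2, same_class, margin=2.0):
    d = F.pairwise_distance(z1, z2)
    # Pull same-class pairs together, push different-class pairs beyond the margin.
    return (same_class * d.pow(2) +
            (1 - same_class) * F.relu(margin - d).pow(2)).mean()

net = SiameseNet()
z1, z2 = net(torch.randn(4, 1, 28, 28), torch.randn(4, 1, 28, 28))
loss = contrastive_loss(z1, z2, torch.tensor([1., 0., 1., 0.]))
loss.backward()
```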
This repository contains material related to Udacity's Deep Reinforcement Learning Nanodegree program. The tutorials lead you through implementing various algorithms in reinforcement learning. All of the code is in PyTorch (v0.4) and Python 3.
Tags | deep-reinforcement-learning reinforcement-learning reinforcement-learning-algorithms neural-networks pytorch pytorch-rl ddpg dqn ppo dynamic-programming cross-entropy hill-climbing ml-agents openai-gym-solutions openai-gym rl-algorithms |
PyTorch implementation of Learning to Discover Cross-Domain Relations with Generative Adversarial Networks.
* All samples in README.md are generated by the neural network except the first image for each row.
* Network structure is slightly different (here) from the author's code.
Tags | gan generative-model unsupervised-learning pytorch |
Ignite is a high-level library to help with training neural networks in PyTorch. Compared to a hand-written training loop, the code is more concise and readable with Ignite. Furthermore, adding additional metrics or things like early stopping is a breeze in Ignite, but can start to rapidly increase the complexity of your code when "rolling your own" training loop.
Tags | pytorch neural-network machine-learning deep-learning |
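A minimal sketch of what a supervised training loop looks like with Ignite's create_supervised_trainer helper; the toy model, data, and hyper-parameters are placeholder assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from ignite.engine import Events, create_supervised_trainer

# Toy model and data standing in for a real experiment.
model = nn.Linear(10, 2)
loader = DataLoader(TensorDataset(torch.randn(64, 10),
                                  torch.randint(0, 2, (64,))), batch_size=16)

trainer = create_supervised_trainer(model,
                                    torch.optim.SGD(model.parameters(), lr=0.1),
                                    nn.CrossEntropyLoss())

@trainer.on(Events.EPOCH_COMPLETED)
def log_loss(engine):
    # engine.state.output holds the loss of the last processed batch.
    print(f"epoch {engine.state.epoch}: loss {engine.state.output:.4f}")

trainer.run(loader, max_epochs=3)
```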
Amazon Forest Computer Vision: Satellite Image tagging code using PyTorch / Keras with lots of PyTorch tricks.
Tags | pytorch data-augmentation kaggle-competition kaggle deep-learning computer-vision keras neural-networks neural-network-example transfer-learning |
Distiller is an open-source Python package for neural network compression research. Network compression can reduce the memory footprint of a neural network, increase its inference speed and save energy. Distiller provides a PyTorch environment for prototyping and analyzing compression algorithms, such as sparsity-inducing methods and low-precision arithmetic.
Tags | pytorch pruning quantization pruning-structures jupyter-notebook network-compression deep-neural-networks regularization group-lasso |
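To make the sparsity idea concrete, here is a plain-PyTorch sketch of one-shot magnitude pruning (zeroing the smallest weights). This illustrates the concept only; it is not Distiller's API, and the pruning fraction is an assumption.

```python
import torch
import torch.nn as nn

def magnitude_prune(module, fraction=0.5):
    """Zero out the `fraction` of weights with the smallest absolute value."""
    with torch.no_grad():
        w = module.weight
        k = int(w.numel() * fraction)
        if k == 0:
            return
        threshold = w.abs().flatten().kthvalue(k).values
        mask = (w.abs() > threshold).float()
        w.mul_(mask)                          # pruned weights become exact zeros

layer = nn.Linear(128, 64)
magnitude_prune(layer, fraction=0.5)
sparsity = (layer.weight == 0).float().mean().item()
print(f"sparsity: {sparsity:.2%}")            # roughly half the weights are now zero
```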
This repo provides a PyTorch implementation of MSG-Net (ours) and Neural Style (Gatys et al., CVPR 2016), which has been included by ModelDepot. We also provide a Torch implementation and an MXNet implementation. Image Style Transfer Using Convolutional Neural Networks by Leon A. Gatys, Alexander S. Ecker, and Matthias Bethge.
Tags | style-transfer deep-neural-networks real-time |
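The Gatys et al. style representation behind these implementations is built from Gram matrices of convolutional feature maps. A small sketch of that computation follows; the feature-map shape is an assumption.

```python
import torch

def gram_matrix(features):
    """Channel-by-channel correlations of a conv feature map of shape (B, C, H, W)."""
    b, c, h, w = features.size()
    flat = features.view(b, c, h * w)
    # (B, C, C) Gram matrix, normalised by the number of entries per channel map.
    return torch.bmm(flat, flat.transpose(1, 2)) / (c * h * w)

features = torch.randn(1, 64, 32, 32)      # e.g. an early VGG layer's activations
print(gram_matrix(features).shape)         # torch.Size([1, 64, 64])
```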
This repository contains the code (in PyTorch) for "LightNet: Light-weight Networks for Semantic Image Segmentation" (underway) by Huijun Liu @ TU Braunschweig. Semantic segmentation is a significant part of the modern autonomous driving system, as an exact understanding of the surrounding scene is very important for the navigation and driving decisions of the self-driving car. Nowadays, deep fully convolutional networks (FCNs) have a very significant effect on semantic segmentation, but most of the relevant research has focused on improving segmentation accuracy rather than model computation efficiency. However, the autonomous driving system is often based on embedded devices, where computing and storage resources are relatively limited. In this paper we describe several light-weight networks based on MobileNetV2, ShuffleNet and Mixed-scale DenseNet for the semantic image segmentation task. Additionally, we introduce GANs for data augmentation [17] (pix2pixHD), as well as concurrent Spatial-Channel Squeeze & Excitation (SCSE) and Receptive Field Block (RFB) modules, to the proposed network. We measure our performance on Cityscapes pixel-level segmentation, and achieve up to 70.72% class mIoU and 88.27% cat. mIoU. We evaluate the trade-offs between mIoU and the number of operations measured by multiply-adds (MAdd), as well as the number of parameters.
Tags | semantic-segmentation mobilenet-v2 deeplabv3plus mixedscalenet senet wide-residual-networks dual-path-networks pytorch cityscapes mapillary-vistas-dataset shufflenet inplace-activated-batchnorm encoder-decoder-model mobilenet light-weight-net deeplabv3 mobilenetv2plus rfmobilenetv2plus group-normalization semantic-context-loss |
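For reference, here is a hedged sketch of the concurrent Spatial-Channel Squeeze & Excitation (SCSE) block mentioned above, following the common formulation (a channel gate and a spatial gate applied in parallel and summed). The reduction ratio is an assumption and this is not necessarily LightNet's exact implementation.

```python
import torch
import torch.nn as nn

class SCSE(nn.Module):
    """Concurrent spatial and channel squeeze & excitation."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        # Channel gate: global pooling -> bottleneck -> per-channel weights in (0, 1).
        self.cse = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1), nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1), nn.Sigmoid(),
        )
        # Spatial gate: 1x1 conv -> per-pixel weight in (0, 1).
        self.sse = nn.Sequential(nn.Conv2d(channels, 1, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.cse(x) + x * self.sse(x)

x = torch.randn(2, 64, 32, 32)
print(SCSE(64)(x).shape)   # shape is preserved: torch.Size([2, 64, 32, 32])
```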
Repository for the book Introduction to Artificial Neural Networks and Deep Learning: A Practical Guide with Applications in Python. Deep learning is not just the talk of the town among tech folks. Deep learning allows us to tackle complex problems, training artificial neural networks to recognize complex patterns for image and speech recognition. In this book, we'll continue where we left off in Python Machine Learning and implement deep learning algorithms in PyTorch.
Tags | deep-learning neural-network machine-learning tensorflow artificial-intelligence data-science pytorch |
In these tutorials for PyTorch, we will build our first neural network and try to build some advanced neural network architectures developed in recent years. Thanks to liufuyang's notebook files, which are a great contribution to this tutorial.
Tags | neural-network pytorch-tutorial batch-normalization cnn rnn autoencoder pytorch regression classification batch tutorial dropout dqn reinforcement-learning gan generative-adversarial-network machine-learning |
PyTorch-NLP, or torchnlp for short, is a library of neural network layers, text processing modules and datasets designed to accelerate Natural Language Processing (NLP) research. Join our community, add datasets and neural network layers! Chat with us on Gitter and join the Google Group; we're eager to collaborate with you.
Tags | pytorch nlp natural-language-processing pytorch-nlp torchnlp data-loader embeddings word-vectors deep-learning dataset metrics neural-network sru machine-learning |