
ENAS-pytorch - PyTorch implementation of "Efficient Neural Architecture Search via Parameters Sharing"

  •    Python

PyTorch implementation of Efficient Neural Architecture Search via Parameters Sharing. ENAS reduces the computational cost (GPU hours) of Neural Architecture Search (NAS) by more than 1000x via parameter sharing between models that are subgraphs of a single large computational graph. It achieves state-of-the-art results on Penn Treebank language modeling.
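The parameter-sharing idea can be illustrated with a small self-contained sketch (plain NumPy, not the repo's actual code): every sampled architecture indexes into one shared weight bank, so evaluating a new architecture reuses weights already trained by earlier ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# One shared bank of weights; every sampled architecture reuses these.
# Here each of 3 "nodes" chooses one of 2 candidate ops (weight matrices).
shared_weights = {
    (node, op): rng.standard_normal((4, 4)) * 0.1
    for node in range(3) for op in range(2)
}

def sample_architecture():
    """Sample a subgraph: one op choice per node."""
    return tuple(int(rng.integers(0, 2)) for _ in range(3))

def forward(arch, x):
    """Run the sampled subgraph using the shared weight bank."""
    for node, op in enumerate(arch):
        x = np.tanh(x @ shared_weights[(node, op)])
    return x

x = rng.standard_normal(4)
arch_a, arch_b = sample_architecture(), sample_architecture()
# Both architectures share parameters: training one updates weights
# that the other may reuse, which is what cuts the search cost.
out_a, out_b = forward(arch_a, x), forward(arch_b, x)
```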

darts - Differentiable architecture search for convolutional and recurrent networks

  •    Python

DARTS: Differentiable Architecture Search. Hanxiao Liu, Karen Simonyan, Yiming Yang. arXiv:1806.09055. NOTE: PyTorch 0.4 is not supported at the moment and leads to out-of-memory (OOM) errors.
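The core DARTS trick is a continuous relaxation: instead of picking one operation per edge, evaluate a softmax-weighted mixture of all candidates, so the architecture parameters become differentiable. A minimal NumPy sketch (the candidate ops here are illustrative stand-ins for the paper's conv/pool/identity choices):

```python
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# Candidate operations on one edge (stand-ins for conv/pool/identity).
ops = [
    lambda x: x,                 # identity
    lambda x: np.maximum(x, 0),  # relu
    lambda x: np.zeros_like(x),  # zero op (effectively prunes the edge)
]

alpha = np.array([0.5, 1.5, -1.0])  # learnable architecture parameters

def mixed_op(x, alpha):
    """DARTS relaxation: softmax-weighted sum of all candidate ops,
    which makes the architecture choice differentiable in alpha."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))

x = np.array([-1.0, 2.0])
y = mixed_op(x, alpha)
# After search, the discrete architecture keeps argmax(alpha) per edge.
chosen = int(np.argmax(alpha))
```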

morph-net - Fast & Simple Resource-Constrained Learning of Deep Network Structure

  •    Python

FiGS is a probabilistic approach to channel regularization that we introduced in Fine-Grained Stochastic Architecture Search. It outperforms our previous regularizers and can be used either as a pruning algorithm or as a full-fledged differentiable architecture search method; this is the recommended way to apply MorphNet. In the documentation below it is referred to as the LogisticSigmoid regularizer.

MorphNet is a method for learning deep network structure during training. The key principle is continuous relaxation of the network-structure learning problem. In short, the MorphNet regularizer pushes the influence of filters down, and once a filter's influence is small enough, the corresponding output channel is marked for removal from the network.
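The "push filter influence down, then remove small channels" principle can be sketched in a few lines (a toy stand-in, not MorphNet's actual regularizer; the penalty strength and threshold are made-up values):

```python
import numpy as np

# Per-channel scale factors (e.g., batch-norm gammas); the regularizer
# pushes each channel's influence toward zero during training.
gamma = np.array([0.9, 0.02, 0.4, 0.001, 0.6])

def sparsity_penalty(gamma, strength=0.1):
    """Toy MorphNet-style regularizer: an L1 term added to the training
    loss, shrinking per-channel scales toward zero."""
    return strength * np.abs(gamma).sum()

def prune_mask(gamma, threshold=0.05):
    """Channels whose scale has fallen below the threshold are marked
    for removal from the network."""
    return np.abs(gamma) > threshold

penalty = sparsity_penalty(gamma)
mask = prune_mask(gamma)
kept = int(mask.sum())  # 3 of the 5 channels survive in this example
```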

slimmable_networks - Slimmable Networks, AutoSlim, and Beyond, ICLR 2019, and ICCV 2019

  •    Python

A slimmable neural network can run at different widths (numbers of active channels), permitting instant, adaptive accuracy-efficiency trade-offs. A universally slimmable network extends this so that the same model can run at arbitrary widths.
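The width-switching mechanism can be sketched in NumPy (a toy stand-in for the repo's code): running at a smaller width means using only a prefix of the output channels, so one set of weights serves every width.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))  # full-width dense layer, 8 output channels

def slim_forward(x, W, width_mult):
    """Run the same layer at a fraction of its width by keeping only
    the first k output channels (toy version of slimmable execution)."""
    k = max(1, int(W.shape[0] * width_mult))
    return W[:k] @ x

x = rng.standard_normal(8)
y_full = W @ x                    # width 1.0 -> 8 active channels
y_half = slim_forward(x, W, 0.5)  # width 0.5 -> 4 active channels
# The narrow model's outputs are exactly a prefix of the full model's.
```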

awesome-automl-papers - A curated list of automated machine learning papers, articles, tutorials, slides and projects


Awesome-AutoML-Papers is a curated list of automated machine learning papers, articles, tutorials, slides and projects. Star this repository to keep abreast of the latest developments in this booming research field. Thanks to everyone who has contributed to this project; you are welcome to join as a contributor. Automated Machine Learning (AutoML) provides methods and processes to make machine learning available to non-experts, to improve the efficiency of machine learning, and to accelerate machine learning research.

devol - Genetic neural architecture search with Keras

  •    Python

DEvol (DeepEvolution) is a basic proof of concept for genetic architecture search in Keras. The current setup is designed for classification problems, though it could be extended to other output types as well. See example/demo.ipynb for a simple example.
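The genetic loop behind this style of search can be sketched in pure Python (a hypothetical genome encoding; the fitness function is a runnable stand-in for "decode the genome into a Keras model, train it, and return accuracy"):

```python
import random

random.seed(0)

LAYER_CHOICES = [16, 32, 64, 128]  # hypothetical per-layer widths

def random_genome(n_layers=3):
    return [random.choice(LAYER_CHOICES) for _ in range(n_layers)]

def mutate(genome, rate=0.3):
    return [random.choice(LAYER_CHOICES) if random.random() < rate else g
            for g in genome]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def fitness(genome):
    """Stand-in for 'train the decoded model and return accuracy';
    here we simply prefer mid-sized layers to keep the sketch runnable."""
    return -sum(abs(g - 64) for g in genome)

population = [random_genome() for _ in range(8)]
for generation in range(10):
    population.sort(key=fitness, reverse=True)
    parents = population[:4]                      # selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(4)]
    population = parents + children               # next generation

best = max(population, key=fitness)
```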

autokeras - AutoML library for deep learning

  •    Python

AutoKeras is an AutoML system based on Keras, developed by the DATA Lab at Texas A&M University. The goal of AutoKeras is to make machine learning accessible to everyone. Tutorials are available on the official website.

AdaNet - Fast and flexible AutoML with learning guarantees

  •    Jupyter

AdaNet is a lightweight TensorFlow-based framework for automatically learning high-quality models with minimal expert intervention. AdaNet builds on recent AutoML efforts to be fast and flexible while providing learning guarantees. Importantly, AdaNet provides a general framework for not only learning a neural network architecture, but also for learning to ensemble to obtain even better models.
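The "grow an ensemble with learning guarantees" idea can be sketched with NumPy (a toy illustration of the iterative principle, not AdaNet's API: the random-feature "subnetworks", candidate widths, and penalty weight are all invented for the sketch). At each step, the candidate that best trades residual loss against a complexity penalty joins the ensemble:

```python
import numpy as np

x = np.linspace(-1, 1, 50)
y = np.sin(3 * x)  # toy regression target

def fit_candidate(residual, seed, width):
    """A 'subnetwork': random tanh features fit to the current residual
    by least squares. Returns its prediction and its complexity."""
    r = np.random.default_rng(seed)
    w = r.standard_normal((1, width))
    h = np.tanh(x[:, None] @ w)
    coef, *_ = np.linalg.lstsq(h, residual, rcond=None)
    return h @ coef, width

initial_mse = float(np.mean(y ** 2))
ensemble = np.zeros_like(y)
for step in range(5):
    residual = y - ensemble
    # Candidates of different complexity compete on an AdaNet-style
    # objective: training loss plus a complexity penalty.
    candidates = [fit_candidate(residual, seed=step * 10 + w, width=w)
                  for w in (2, 4, 8)]
    def objective(c):
        pred, width = c
        loss = np.mean((residual - pred) ** 2)
        return loss + 1e-3 * width        # hypothetical penalty weight
    pred, _ = min(candidates, key=objective)
    ensemble = ensemble + pred            # grow the ensemble

final_mse = float(np.mean((y - ensemble) ** 2))
```

Because each candidate is fit by least squares to the residual, the ensemble's error is non-increasing across steps in this sketch.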

nni - An open source AutoML toolkit for neural architecture search and hyper-parameter tuning.

  •    TypeScript

NNI (Neural Network Intelligence) is a toolkit that helps users run automated machine learning experiments. It dispatches and runs trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyper-parameters in different environments (e.g., a local machine, remote servers, or the cloud). Starting an experiment also launches a WebUI; the endpoint is shown in the command's output (for example, http://localhost:8080). Open this URL in your browser to analyze your experiment or browse the trials' TensorBoard logs.
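An experiment is described by a config file and launched with `nnictl create --config config.yml`. The fragment below follows NNI's classic config.yml schema (field names may differ across NNI versions), and `mnist.py` plus `search_space.json` are hypothetical paths for illustration:

```yaml
# Hypothetical NNI experiment config (classic config.yml schema;
# adjust field names and paths to your NNI version).
authorName: default
experimentName: example_mnist
trialConcurrency: 1
maxTrialNum: 10
trainingServicePlatform: local
searchSpacePath: search_space.json
useAnnotation: false
tuner:
  builtinTunerName: TPE
trial:
  command: python3 mnist.py
  codeDir: .
  gpuNum: 0
```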

Archai - Reproducible Rapid Research for Neural Architecture Search (NAS)

  •    Python

Archai is a platform for Neural Architecture Search (NAS) that allows you to generate efficient deep networks for your applications. Archai aims to accelerate NAS research by enabling easy mix-and-match of different techniques while ensuring reproducibility, self-documented hyper-parameters, and fair comparison. To achieve this, Archai uses a common code base that unifies several algorithms.

neural-architecture-search - Basic implementation of Neural Architecture Search with Reinforcement Learning

  •    Python

Basic implementation of the Controller RNN from Neural Architecture Search with Reinforcement Learning and Learning Transferable Architectures for Scalable Image Recognition. For full training details, please see train.py.
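The controller idea can be sketched in NumPy with the RNN reduced to independent per-decision logits (a toy REINFORCE loop, not the repo's code; the reward function is a runnable stand-in for "train the child network and return its accuracy"):

```python
import numpy as np

rng = np.random.default_rng(0)

# Controller reduced to independent softmax logits per decision:
# 3 decisions, each picking one of 4 options (e.g., filter sizes).
logits = np.zeros((3, 4))

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def sample():
    """Sample one architecture (a choice per decision)."""
    return [int(rng.choice(4, p=softmax(row))) for row in logits]

def reward(choices):
    """Stand-in for child-network accuracy: option 2 is best everywhere."""
    return sum(1.0 if c == 2 else 0.0 for c in choices) / 3

lr, baseline = 0.5, 0.0
for step in range(300):
    choices = sample()
    r = reward(choices)
    baseline = 0.9 * baseline + 0.1 * r       # moving-average baseline
    for i, c in enumerate(choices):
        p = softmax(logits[i])
        grad = -p                              # d log p(c) / d logits
        grad[c] += 1.0
        logits[i] += lr * (r - baseline) * grad  # REINFORCE update

# After training, the probability mass should concentrate on option 2.
best = [int(np.argmax(row)) for row in logits]
```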

NASLib - NASLib is a Neural Architecture Search (NAS) library for facilitating NAS research for the community by providing interfaces to several state-of-the-art NAS search spaces and optimizers

  •    Python

NASLib is a modular and flexible framework created to provide a common codebase to the community and facilitate research on Neural Architecture Search (NAS). It offers high-level abstractions for designing and reusing search spaces, along with interfaces to benchmarks and evaluation pipelines, enabling state-of-the-art NAS methods to be implemented and extended in a few lines of code. This modularity allows researchers to easily innovate on individual components (e.g., define a new search space while reusing an optimizer and evaluation pipeline, or propose a new optimizer with existing search spaces).

NASLib was developed by the AutoML Freiburg group with the help of the NAS community; new search spaces, optimizers and benchmarks are constantly being added to the library. Please reach out to zelaa@cs.uni-freiburg.de with any questions or potential collaborations.

awesome-transformer-search - A curated list of awesome resources combining Transformers with Neural Architecture Search


This repository is maintained by the AutoML Group Freiburg. Please feel free to open a pull request or an issue to add papers.
