auto-sklearn is an automated machine learning toolkit and a drop-in replacement for a scikit-learn estimator.
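As a rough illustration of the "drop-in replacement" idea, the minimal sketch below uses auto-sklearn's estimator interface in place of a scikit-learn classifier; the time budget argument is illustrative rather than a recommended setting.

```python
# Minimal sketch: auto-sklearn used like a scikit-learn estimator.
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
import autosklearn.classification

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

automl = autosklearn.classification.AutoSklearnClassifier(
    time_left_for_this_task=300,  # overall search budget in seconds (illustrative)
)
automl.fit(X_train, y_train)      # same fit/predict contract as a scikit-learn estimator
y_pred = automl.predict(X_test)
print("test accuracy:", accuracy_score(y_test, y_pred))
```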
Tags: automl scikit-learn automated-machine-learning hyperparameter-optimization hyperparameter-tuning hyperparameter-search bayesian-optimization metalearning meta-learning smac

Determined integrates features such as distributed training and hyperparameter tuning into an easy-to-use, high-performance deep learning environment, which means you can spend your time building models instead of managing infrastructure. To use Determined, you can continue using popular DL frameworks such as TensorFlow and PyTorch; you just need to update your model code to integrate with the Determined API.
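To give a feel for what integrating with the Determined API looks like for PyTorch, here is a hedged skeleton following the PyTorchTrial pattern; treat the method names, signatures, and context helpers as approximations of the documented interface, not a verified, complete trial.

```python
# Hedged skeleton of a Determined PyTorch trial (details approximate): the model,
# optimizer, and data loaders are declared on a Trial class, and Determined drives
# training, distributed execution, and hyperparameter search around it.
import torch
from determined.pytorch import DataLoader, PyTorchTrial, PyTorchTrialContext


class MyTrial(PyTorchTrial):
    def __init__(self, context: PyTorchTrialContext) -> None:
        self.context = context
        self.model = context.wrap_model(torch.nn.Linear(10, 2))  # your model here
        self.optimizer = context.wrap_optimizer(
            torch.optim.SGD(self.model.parameters(), lr=context.get_hparam("lr"))
        )

    def build_training_data_loader(self) -> DataLoader:
        ...  # return a determined.pytorch.DataLoader over the training set

    def build_validation_data_loader(self) -> DataLoader:
        ...  # return a determined.pytorch.DataLoader over the validation set

    def train_batch(self, batch, epoch_idx, batch_idx):
        data, labels = batch
        loss = torch.nn.functional.cross_entropy(self.model(data), labels)
        self.context.backward(loss)
        self.context.step_optimizer(self.optimizer)
        return {"loss": loss}

    def evaluate_batch(self, batch):
        data, labels = batch
        loss = torch.nn.functional.cross_entropy(self.model(data), labels)
        return {"validation_loss": loss}
```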
Tags: kubernetes machine-learning deep-learning tensorflow pytorch hyperparameter-optimization hyperparameter-tuning hyperparameter-search distributed-training ml-infrastructure ml-platform

EvalML is an AutoML library which builds, optimizes, and evaluates machine learning pipelines using domain-specific objective functions.
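A minimal sketch of how a search driven by an explicit objective might look with EvalML's AutoMLSearch; the dataset helper, parameter names, and the "f1" objective are illustrative choices, and a domain-specific objective would be swapped in where indicated.

```python
# Minimal sketch: EvalML pipeline search scored against an explicit objective.
import evalml
from evalml.automl import AutoMLSearch

X, y = evalml.demos.load_breast_cancer()
X_train, X_test, y_train, y_test = evalml.preprocessing.split_data(
    X, y, problem_type="binary"
)

automl = AutoMLSearch(
    X_train=X_train,
    y_train=y_train,
    problem_type="binary",
    objective="f1",        # swap in a domain-specific objective here
    max_iterations=10,
)
automl.search()

best = automl.best_pipeline
print(best.score(X_test, y_test, objectives=["f1"]))
```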
Tags: data-science machine-learning optimization feature-selection model-selection feature-engineering hyperparameter-tuning automl

Code for tuning hyperparameters with Hyperband, adapted from "Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization". Use defs.meta/defs_regression.meta to try many models in one Hyperband run. This is an automatic alternative to constructing search spaces with multiple models (like defs.rf_xt or defs.polylearn_fm_pn) by hand.
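Independently of this repo's defs.* files, the core Hyperband procedure (successive halving over randomly sampled configurations, run in several brackets) can be sketched as below; get_random_config and run_config are hypothetical stand-ins for sampling a model/hyperparameter set and training it with a given budget.

```python
import math
import random

def hyperband(get_random_config, run_config, max_budget=81, eta=3):
    """Sketch of Hyperband (Li et al.): several successive-halving brackets, each
    trading off number of configurations against per-configuration budget."""
    s_max = int(math.log(max_budget, eta))
    results = []
    for s in range(s_max, -1, -1):
        n = int(math.ceil((s_max + 1) * eta ** s / (s + 1)))  # configs in this bracket
        r = max_budget * eta ** (-s)                          # initial budget per config
        configs = [get_random_config() for _ in range(n)]
        for i in range(s + 1):                                # successive-halving rounds
            n_i = int(n * eta ** (-i))
            r_i = r * eta ** i
            losses = [run_config(c, r_i) for c in configs]
            ranked = sorted(zip(losses, configs), key=lambda t: t[0])
            results.extend(ranked)
            configs = [c for _, c in ranked[: max(1, int(n_i / eta))]]
    return min(results, key=lambda t: t[0])  # (best loss, best config)

# Toy usage with a hypothetical one-dimensional "hyperparameter":
best = hyperband(
    get_random_config=lambda: random.uniform(0, 1),
    run_config=lambda cfg, budget: (cfg - 0.3) ** 2 + random.gauss(0, 1.0 / budget),
)
print(best)
```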
Tags: hyperparameters hyperparameter-optimization hyperparameter-tuning gradient-boosting-classifier gradient-boosting machine-learning

Attention: this package (SMAC3) is under heavy development and subject to change. A stable release of SMAC (v2) in Java can be found here. The documentation can be found here.
Tags: bayesian-optimization bayesian-optimisation hyperparameter-optimization hyperparameter-tuning hyperparameter-search configuration algorithm-configuration automl automated-machine-learning

RL Baselines3 Zoo is a training framework for Reinforcement Learning (RL) using Stable Baselines3. It provides scripts for training and evaluating agents, tuning hyperparameters, plotting results, and recording videos.
Tags: reinforcement-learning robotics optimization lab openai gym hyperparameter-optimization rl sde hyperparameter-tuning hyperparameter-search pybullet stable-baselines pybullet-environments tuning-hyperparameters

AdaTune is a library for gradient-based hyperparameter tuning when training deep neural networks. AdaTune currently supports tuning of the learning_rate parameter, but some of the methods implemented here can be extended to other hyperparameters such as momentum or weight_decay. AdaTune provides the following gradient-based hyperparameter tuning algorithms: HD, RTHO, and our newly proposed algorithm, MARTHE. The repository also contains other commonly used non-adaptive learning_rate schedules such as staircase decay, exponential decay, and cosine annealing with restarts. The library is implemented in PyTorch. The goal of the methods in this package is to automatically compute, in an online fashion, a learning rate schedule for stochastic optimization methods (such as SGD) based only on the given learning task, with the aim of producing models with small validation error.
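The entry mentions HD (hypergradient descent). Purely as a conceptual illustration, and not AdaTune's API, the toy loop below adapts the learning rate online using the dot product of successive gradients on a simple quadratic objective.

```python
import numpy as np

# Conceptual sketch of hypergradient descent (HD): the learning rate itself is
# updated online from the dot product of the current and previous gradients.
def grad(theta):                       # gradient of f(theta) = 0.5 * ||theta||^2
    return theta

theta = np.array([5.0, -3.0])
lr, beta = 0.01, 0.001                 # initial learning rate and hyper-learning rate
prev_grad = np.zeros_like(theta)

for step in range(100):
    g = grad(theta)
    lr += beta * float(g @ prev_grad)  # hypergradient update of the learning rate
    theta -= lr * g                    # ordinary SGD step with the adapted rate
    prev_grad = g

print("final lr:", lr, "final theta:", theta)
```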
Tags: machine-learning deep-learning pytorch neural-networks hyperparameter-tuning automl learning-rate-scheduling

A library for performing hyperparameter optimization on top of Spark.
Tags: hyperparameter-optimization spark optimizer optimization-algorithms machine-learning machinelearning hyperparameter-tuning hyperparameters grid-search random-search vowpal-wabbit

MGO implements NSGA-II, CP (Calibration Profile), and PSE (Pattern Search Experiment). All algorithms in MGO have a version that can run on noisy fitness functions. MGO handles noisy fitness functions by resampling only the most promising individuals, and it uses an aggregation function to combine the multiple samples when needed.
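MGO itself is a Scala library; purely to illustrate the resample-and-aggregate idea in the same language as the other sketches here, the toy loop below re-evaluates only the currently best-ranked individuals and ranks them by the median of their accumulated fitness samples. It is not MGO's API.

```python
import random
import statistics

# Toy illustration: handle a noisy fitness by re-evaluating only the most
# promising individuals and ranking on an aggregate (median) of their samples.
def noisy_fitness(x):
    return (x - 0.3) ** 2 + random.gauss(0, 0.05)

population = [random.uniform(0, 1) for _ in range(20)]
samples = {i: [noisy_fitness(x)] for i, x in enumerate(population)}

for _ in range(30):
    # Rank by aggregated (median) fitness, then resample only the top quarter.
    ranked = sorted(samples, key=lambda i: statistics.median(samples[i]))
    for i in ranked[: len(ranked) // 4]:
        samples[i].append(noisy_fitness(population[i]))

best = min(samples, key=lambda i: statistics.median(samples[i]))
print("best individual:", population[best])
```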
Tags: genetic-algorithm optimisation functional-programming parameter-tuning hyperparameter-optimization hyperparameters hyperparameter-tuning

Milano (Machine learning autotuner and network optimizer) is a tool that enables machine learning researchers and practitioners to perform massive hyperparameter and architecture searches. Your script can use any framework of your choice (for example TensorFlow, PyTorch, or Microsoft Cognitive Toolkit) or no framework at all. Milano requires only minimal changes to what your script accepts via the command line and what it returns to stdout.
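The contract described here, hyperparameters in via command-line flags and a result out via stdout, can be sketched with a framework-free script like the one below; the flag names and the output format are assumptions for illustration, not Milano's documented specification.

```python
# benchmark.py -- hypothetical tunable script in the style such a tuner expects:
# hyperparameters arrive as command-line flags, the score is printed to stdout.
import argparse

def train_and_score(lr, batch_size):
    # Stand-in for real training; any framework (or none) could be used here.
    return -(lr - 0.01) ** 2 - 0.0001 * abs(batch_size - 64)

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--lr", type=float, required=True)
    parser.add_argument("--batch_size", type=int, default=64)
    args = parser.parse_args()

    score = train_and_score(args.lr, args.batch_size)
    print(f"score: {score}")   # the tuner parses this line from stdout
```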
Tags: deep-learning deep-neural-networks automl hyperparameter-tuning hyperparameter-optimization machine-learning

This project acts as both a tutorial and a demo for using Hyperopt with Keras, TensorFlow, and TensorBoard. Not only do we try to find the best hyperparameters within the given hyperspace, we also represent the neural network architecture itself as hyperparameters that can be tuned. This automates the search for the best neural architecture configuration and hyperparameters. Here, we are meta-optimizing a neural net and its architecture on the CIFAR-100 dataset (100 fine labels), a computer vision task. The code could easily be transferred to another vision dataset or even to another machine learning task.
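A compressed sketch of the idea of putting the architecture itself into the search space: Hyperopt's hp.choice/hp.quniform describe layer counts, widths, and activations, and fmin with TPE searches over them. The objective below is only a stand-in for building and training the actual Keras model on CIFAR-100.

```python
import numpy as np
from hyperopt import STATUS_OK, Trials, fmin, hp, tpe

# Architecture choices expressed as hyperparameters (sketch only).
space = {
    "n_conv_layers": hp.choice("n_conv_layers", [2, 3, 4]),
    "filters": hp.quniform("filters", 32, 128, 16),
    "activation": hp.choice("activation", ["relu", "elu"]),
    "lr": hp.loguniform("lr", np.log(1e-4), np.log(1e-1)),
}

def objective(params):
    # Stand-in for training: pretend deeper nets with a moderate lr do best.
    loss = (params["n_conv_layers"] - 3) ** 2 + abs(np.log10(params["lr"]) + 2.5)
    return {"loss": loss, "status": STATUS_OK}

trials = Trials()
best = fmin(objective, space, algo=tpe.suggest, max_evals=50, trials=trials)
print(best)
```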
Tags: hyperopt hyperparameter-optimization hyperparameter-tuning hyperparameters-optimization hyperparameter-search keras cnn cnn-keras tensorflow

Each of the broad sentence categories above can be expanded and made more in-depth. The way these networks and scripts are designed, it should be possible to expand them to classify other sentence types, provided the data is supplied. This was developed for applications at Metacortex and is accompanied by a guide on building practical/applied neural networks at austingwalters.com.
Tags: sentence-classification hyperparameter-tuning fasttext cnn rnn neural-network

Parameters which define the model architecture are referred to as hyperparameters, and the process of searching for the ideal model architecture is therefore referred to as hyperparameter tuning.
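For a concrete (hedged) instance of what such a search means in practice, a plain scikit-learn grid search over a couple of architecture-defining hyperparameters looks like this; the grid values are illustrative.

```python
# Simple hyperparameter tuning: exhaustively search a small grid of
# architecture-defining parameters and keep the best cross-validated model.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
param_grid = {
    "hidden_layer_sizes": [(16,), (32,), (32, 16)],  # architecture as a hyperparameter
    "alpha": [1e-4, 1e-3, 1e-2],                     # L2 regularization strength
}
search = GridSearchCV(MLPClassifier(max_iter=2000, random_state=0), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```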
Tags: machine-learning deep-learning neural-network mxnet tensorflow keras pytorch hyperparameter-optimization hyperparameter-tuning optimization-algorithms keras-tensorflow pytorch-implmention torchvision

This repo contains the underlying code for all the experiments from the paper "Automatic Discovery of Privacy-Utility Pareto Fronts" (https://arxiv.org/abs/1905.10862). We have verified that this works with pip 18.1 and 20.2.
Tags: hyperparameter-optimization bayesian-optimization pareto-front hyperparameter-tuning active-learning multiobjective-optimization differential-privacy differential-privacy-deep-learning

Getting the best models in minimum time: generate optimal models and achieve better performance by employing state-of-the-art hyperparameter optimization (HPO) and model compression techniques. Auptimizer will run and record sophisticated HPO and model compression experiments on compute resources of your choice, with effortless consistency and reproducibility. Making your models edge-ready: get model-device compatibility and improved on-device performance by converting models into the industry-standard ONNX and TensorFlow Lite formats. Auptimizer-Converter provides validated conversion techniques to ensure worry-free format transformations.
Tags: data-science machine-learning deep-learning data-engineering neural-networks hyperparameter-optimization hyperparameter-tuning hpo automl automated-machine-learning