auto-sklearn is an automated machine learning toolkit and a drop-in replacement for a scikit-learn estimator.
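As a drop-in replacement, it exposes the usual scikit-learn fit/predict interface. A minimal sketch (the dataset and time budget are illustrative):

```python
import autosklearn.classification
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Searches over preprocessing pipelines, models, and hyperparameters
# with Bayesian optimization (SMAC) under the given time budget.
automl = autosklearn.classification.AutoSklearnClassifier(
    time_left_for_this_task=120,  # seconds; illustrative budget
)
automl.fit(X_train, y_train)
print(accuracy_score(y_test, automl.predict(X_test)))
```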
automl scikit-learn automated-machine-learning hyperparameter-optimization hyperparameter-tuning hyperparameter-search bayesian-optimization metalearning meta-learning smac

Attention: This package is under heavy development and subject to change. A stable release of SMAC (v2) in Java can be found here. The documentation can be found here.
bayesian-optimization bayesian-optimisation hyperparameter-optimization hyperparameter-tuning hyperparameter-search configuration algorithm-configuration automl automated-machine-learning

This repository is a collection of notebooks covering various topics of Bayesian methods for machine learning. Gaussian processes: an introduction to Gaussian processes, with example implementations in plain NumPy/SciPy as well as with the libraries scikit-learn and GPy.
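As a taste of the scikit-learn material, a minimal Gaussian process regression sketch (the data is synthetic and illustrative):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Noisy observations of a smooth function (synthetic data).
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(20, 1))
y = np.sin(X).ravel() + 0.1 * rng.randn(20)

# RBF kernel for smoothness plus a white-noise term; kernel
# hyperparameters are tuned by maximizing the log marginal
# likelihood inside fit().
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)

X_new = np.linspace(-3, 3, 100).reshape(-1, 1)
mean, std = gpr.predict(X_new, return_std=True)  # posterior mean and std
```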
machine-learning bayesian-methods bayesian-machine-learning gaussian-processes bayesian-optimization variational-autoencoder

Model-based optimization with mlr. mlrMBO is a highly configurable R toolbox for model-based / Bayesian optimization of black-box functions.
model-based-optimization r r-package optimization mlr hyperparameter-optimization black-box-optimization bayesian-optimization

A Julia package for Bayesian optimization of black-box functions. Please see the sample notebook for details.
julia bayesian-optimization optimization machine-learning

This repository contains the code used in my master's thesis on LSTM-based anomaly detection for time series data. The thesis report can be downloaded from here. We explore the use of long short-term memory (LSTM) networks for anomaly detection in temporal data. Because labeled anomaly datasets are hard to obtain, an unsupervised approach is employed: we train recurrent neural networks (RNNs) with LSTM units to learn the normal time series patterns and predict future values, then model the resulting prediction errors to give anomaly scores. We investigate different ways of maintaining LSTM state and the effect of using a fixed number of time steps on LSTM prediction and detection performance. LSTMs are also compared to feed-forward neural networks with fixed-size time windows over the inputs. Our experiments on three real-world datasets show that while LSTM RNNs are suitable for general-purpose time series modeling and anomaly detection, maintaining LSTM state is crucial for getting the desired results. Moreover, LSTMs may not be required at all for simple time series.
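The error-modeling step can be sketched independently of the network: fit a Gaussian to the prediction errors on normal validation data, then score new points by how unlikely they are under that fit. This is a generic illustration of the approach described, not the thesis code; errors_val and errors_test are assumed to come from an already-trained LSTM predictor:

```python
import numpy as np

def fit_error_model(errors_val):
    """Fit a multivariate Gaussian to prediction errors on normal data.

    errors_val: shape (n_samples, n_error_dims), e.g. the errors over
    several prediction horizons for each time step.
    """
    mu = errors_val.mean(axis=0)
    precision = np.linalg.inv(np.cov(errors_val, rowvar=False))
    return mu, precision

def anomaly_scores(errors_test, mu, precision):
    """Squared Mahalanobis distance per error vector; larger = more anomalous."""
    d = errors_test - mu
    return np.einsum('ij,jk,ik->i', d, precision, d)
```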
lstm anomaly-detection bayesian-optimization time-series recurrent-neural-networks deep-learning lstm-neural-networks neural-networks

See our NIPS spotlight video for a TL;DR. BOPP is a package for automated marginal maximum a posteriori (MMAP) inference built around the probabilistic programming system Anglican. The user only needs to write their model in the same manner as existing Anglican programs and, by using the defopt construct instead of defquery, select the variables to be optimized, with the rest marginalized out. It can also be used as a means of exploiting the target source code to improve Bayesian optimization, delivering features such as automatic domain scaling, unbounded optimization, and implicit constraint satisfaction, including equality constraints. The key idea is to use a series of code transformations to extract from the original program everything needed to carry out the MMAP problem, such as the target function itself and a program for optimizing the acquisition function subject to the implicit constraints. These are then passed to our other package, Deodorant, which uses them to solve the problems probabilistic programs create for BO. The following paper should be referred to for full algorithmic details, and we ask that you cite it if you use BOPP in your work:
Rainforth, T., Le, T. A., van de Meent, J.-W., Osborne, M. A., & Wood, F. (2016). Bayesian Optimization for Probabilistic Programs. In Advances in Neural Information Processing Systems.

probabilistic-programming bayesian-optimization bayesian-inference

Deodorant is the Bayesian optimization backend for BOPP, which provides all the required inputs automatically given a program. Even when the intention is simply optimization, using BOPP rather than Deodorant directly is currently recommended. The rationale for providing Deodorant as its own independent package is to separate out the parts of BOPP that are Anglican-dependent from those that are not. As such, one may wish to integrate Deodorant into another similar package that provides all the required inputs.
bayesian-optimization bayesian-inference

A Bayesian optimization framework for black-box functions, inspired by Optuna. This library is not only for hyperparameter tuning of machine learning models: anything can be optimized as long as you can define an objective function (e.g., the number of goroutines of your server, or the memory buffer size of a caching system). See the blog post for more details: Practical Bayesian optimization using Goptuna.
black-box-optimization bayesian-optimization machine-learning tpe

This repository contains PyTorch implementations of several Laplace approximation (LA) methods [1]. It is similar to this TensorFlow implementation, which approximates the curvature of neural networks, except that our main purpose is approximate Bayesian inference rather than second-order optimization. The aim is to make LA easy to use; LA is itself a practical approach, because trained networks can be used without any modification. Our implementation supports this plug-and-play principle: you can make an already-trained network Bayesian and obtain calibrated uncertainty in the deep network's predictions. Our library also features a Bayesian optimization method for easier tuning of hyperparameters.
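The plug-and-play idea can be sketched generically (this is not the library's API): place a Gaussian over the weights of a pretrained network, centred at the trained values, with a diagonal covariance estimated from squared gradients, i.e. an empirical-Fisher diagonal Laplace approximation. Here model, train_loader, and loss_fn are assumed given:

```python
import torch

def diagonal_laplace_variances(model, train_loader, loss_fn, prior_precision=1.0):
    """Diagonal Laplace approximation around the trained weights.

    Accumulates squared per-batch gradients as a crude empirical-Fisher
    proxy for the Hessian diagonal; the posterior variance per parameter
    is then 1 / (curvature + prior precision).
    """
    params = [p for p in model.parameters() if p.requires_grad]
    fisher = [torch.zeros_like(p) for p in params]
    for x, y in train_loader:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for f, p in zip(fisher, params):
            if p.grad is not None:  # skip params unused by this batch
                f += p.grad.detach() ** 2
    return [1.0 / (f + prior_precision) for f in fisher]
```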
deep-learning robotics pytorch bayesian-inference bayesian-optimization bayesian-deep-learning laplace-approximation

This repo contains the underlying code for all the experiments from the paper "Automatic Discovery of Privacy-Utility Pareto Fronts" (https://arxiv.org/abs/1905.10862). We have verified that this works with pip 18.1 and 20.2.
hyperparameter-optimization bayesian-optimization pareto-front hyperparameter-tuning active-learning multiobjective-optimization differential-privacy differential-privacy-deep-learning

This project provides a benchmark framework to easily compare Bayesian optimization methods on real machine learning tasks. This project is experimental and the APIs are not considered stable.
machine-learning sklearn benchmark-framework bayesian-optimization

Currently, reaction optimisation in the fine chemicals industry is done by intuition or by design of experiments; both scale poorly with the complexity of the problem. Summit uses recent advances in machine learning to make the process of reaction optimisation faster. Essentially, it applies algorithms that learn which conditions (e.g., temperature, stoichiometry) are important to maximising one or more objectives (e.g., yield, enantiomeric excess). This is achieved through an iterative cycle.
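The iterative cycle (propose conditions, run the experiment, feed the outcome back) can be illustrated with a generic ask/tell optimizer. This sketch uses scikit-optimize, not Summit's own API; the simulated reaction and bounds are stand-ins for a real experiment:

```python
from skopt import Optimizer
from skopt.space import Real

# Reaction conditions to search over (illustrative bounds).
opt = Optimizer([Real(30.0, 110.0, name='temperature'),
                 Real(1.0, 5.0, name='stoichiometry')])

def run_experiment(temperature, stoichiometry):
    """Stand-in for a real reaction; returns negative yield (minimized)."""
    return -(100.0 - (temperature - 80.0) ** 2 / 50.0
             - 10.0 * (stoichiometry - 2.5) ** 2)

for _ in range(20):                        # the iterative cycle
    conditions = opt.ask()                 # model proposes conditions
    outcome = run_experiment(*conditions)  # run (or simulate) the reaction
    opt.tell(conditions, outcome)          # feed the result back
```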
machine-learning chemistry optimization neural-networks drug-discovery bayesian-optimization nelder-mead self-optimization tsemo snobfit

Bayex is a high-performance Bayesian global optimization library using Gaussian processes. In contrast to existing Bayesian optimization libraries, Bayex is designed to use JAX as its backend. Instead of relying on external libraries, Bayex relies only on JAX and its own custom implementations, without requiring imports of massive libraries such as sklearn.
automatic-differentiation bayesian-optimization gaussian-process-regression jax

First of all, we would like to thank the BBO Challenge organizers for this interesting competition, and congratulations to all the winners. Here is the code of the Optuna Developers' solution for the NeurIPS 2020 Black-Box Optimization Challenge. Our solution achieved a score of 96.939 on the public leaderboard and 91.806 on the private leaderboard, placing 9th in public and 5th in private.
gaussian-processes bayesian-optimization blackbox-optimization neurips-2020

For the full set of features refer to the documentation.
slurm hyperparameter-optimization tensorboard bayesian-optimization gpyopt

HPOBench is a library providing benchmarks for (multi-fidelity) hyperparameter optimization, with a focus on reproducibility. For more examples see /example/.
benchmarking benchmark hyperparameter-optimization bayesian-optimization containerized-benchmarks hpolib