
SMAC3 - Sequential Model-based Algorithm Configuration

  •    Python

Attention: This package is under heavy development and subject to change. A stable release of SMAC (v2), written in Java, is available separately, as is the documentation.

bayesian-machine-learning - Notebooks related to Bayesian methods for machine learning

  •    Jupyter

This repository is a collection of notebooks covering various topics in Bayesian methods for machine learning. Gaussian processes: an introduction to Gaussian processes, with example implementations in plain NumPy/SciPy as well as with the scikit-learn and GPy libraries.
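To give a flavour of what such notebooks cover, here is a minimal Gaussian process regression sketch in plain NumPy — a generic illustration of the standard zero-mean GP posterior equations, not code taken from the repository:

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D points."""
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    """Posterior mean and variance of a zero-mean GP at the test points."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s.T @ alpha                      # posterior mean
    v = np.linalg.solve(K, K_s)
    var = np.diag(K_ss - K_s.T @ v)           # posterior marginal variance
    return mean, var

x = np.array([-2.0, 0.0, 1.5])
y = np.sin(x)
mean, var = gp_posterior(x, y, np.array([0.0]))
```

At a training input the posterior mean reproduces the observed value (up to the noise term) and the posterior variance collapses toward zero, which is the interpolation behaviour the notebooks illustrate.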

mlrMBO - Toolbox for Bayesian Optimization and Model-Based Optimization in R

  •    R

Model-based optimization with mlr. mlrMBO is a highly configurable R toolbox for model-based / Bayesian optimization of black-box functions.
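The model-based optimization loop that such toolboxes implement can be sketched generically (in Python rather than R, with a simple quadratic surrogate standing in for the usual Gaussian process; all names here are illustrative): fit a cheap surrogate to the evaluations so far, optimize the surrogate to propose the next point, evaluate the black box there, and repeat.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    """Expensive black-box function (here a cheap stand-in)."""
    return (x - 2.0) ** 2

# Initial design: a few random evaluations.
xs = list(rng.uniform(-5, 5, size=4))
ys = [objective(x) for x in xs]

for _ in range(10):
    # Fit a cheap quadratic surrogate to all observations so far.
    coeffs = np.polyfit(xs, ys, deg=2)
    # Propose the candidate the surrogate predicts is best.
    candidates = rng.uniform(-5, 5, size=256)
    preds = np.polyval(coeffs, candidates)
    x_next = candidates[np.argmin(preds)]
    xs.append(x_next)
    ys.append(objective(x_next))

best = xs[int(np.argmin(ys))]   # converges toward the true minimum at x = 2
```

Real toolboxes replace the quadratic fit with a probabilistic surrogate and the naive "pick the predicted best" rule with an acquisition function that trades off exploration and exploitation.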

BayesianOptimization.jl - A Julia package for Bayesian optimization of black-box functions.

  •    Julia

A Julia package for Bayesian optimization of black-box functions. Please see the sample notebook for details.

lstm_anomaly_thesis - Anomaly detection for temporal data using LSTMs

  •    Jupyter

This repository contains the code used in my master's thesis on LSTM-based anomaly detection for time series data. The thesis report can be downloaded from here. We explore the use of long short-term memory (LSTM) networks for anomaly detection in temporal data. Due to the challenges of obtaining labeled anomaly datasets, an unsupervised approach is employed. We train recurrent neural networks (RNNs) with LSTM units to learn the normal time series patterns and predict future values. The resulting prediction errors are modeled to give anomaly scores. We investigate different ways of maintaining LSTM state, and the effect of using a fixed number of time steps on LSTM prediction and detection performance. LSTMs are also compared to feed-forward neural networks with fixed-size time windows over the inputs. Our experiments on three real-world datasets show that while LSTM RNNs are suitable for general-purpose time series modeling and anomaly detection, maintaining LSTM state is crucial for getting the desired results. Moreover, LSTMs may not be required at all for simple time series.
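The error-modelling step described above can be sketched without any neural network at all — a toy illustration with a trivial one-step persistence predictor standing in for the thesis's LSTM: predict each point, fit a Gaussian to the prediction errors, and use the squared z-score as the anomaly score.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a trained predictor: a one-step "persistence" forecast.
def predict_next(window):
    return window[-1]

# Normal data: a noisy sine wave with an injected spike anomaly at t = 150.
t = np.arange(200)
series = np.sin(0.1 * t) + 0.05 * rng.normal(size=200)
series[150] += 3.0

# Prediction errors over the series (error at time i uses data up to i-1).
errors = np.array([series[i] - predict_next(series[:i]) for i in range(1, 200)])

# Fit a Gaussian to the errors; score each point by its squared z-score.
mu, sigma = errors.mean(), errors.std()
scores = ((errors - mu) / sigma) ** 2
anomaly_index = int(np.argmax(scores)) + 1  # offset: errors start at t = 1
```

The spike produces a large error both when it appears and when the series returns to normal, so the top score lands at or just after the injected anomaly; with an LSTM predictor, the same Gaussian error model yields the anomaly scores the thesis evaluates.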

bopp - BOPP: Bayesian Optimization for Probabilistic Programs

  •    Clojure

See our NIPS Spotlight video for a TL;DR. BOPP is a package for automated marginal maximum a posteriori (MMAP) inference built around the probabilistic programming system Anglican. The user only needs to write their model in the same manner as existing Anglican programs and, by using the defopt construct instead of defquery, select the variables to be optimized, with the rest marginalized out. It can also be used as a means of exploiting the target source code to improve Bayesian optimization, delivering features such as automatic domain scaling, unbounded optimization, and implicit constraint satisfaction, including equality constraints. The key idea is to use a series of code transformations to extract from the original program everything needed to carry out the MMAP problem, such as the target function itself and a program for optimizing the acquisition function subject to the implicit constraints. These are then passed to our other package, Deodorant, which uses them to solve the problems probabilistic programs create for BO. Refer to the following paper for full algorithmic details, and please cite it if you use BOPP in your work.

deodorant - Deodorant: Solving the problems of Bayesian Optimization

  •    Clojure

Deodorant is the Bayesian optimization back end of BOPP (Rainforth, T., Le, T. A., van de Meent, J.-W., Osborne, M. A., & Wood, F. (2016). Bayesian Optimization for Probabilistic Programs. In Advances in Neural Information Processing Systems), which provides all the required inputs automatically given a program. Even when the intention is simply optimization, using BOPP rather than Deodorant directly is currently recommended. The rationale for providing Deodorant as its own independent package is to separate the parts of BOPP that are Anglican-dependent from those that are not. As such, one may wish to integrate Deodorant into another similar package that provides all the required inputs.

goptuna - Distributed Bayesian optimization framework, inspired by Optuna.

  •    Go

Bayesian optimization framework for black-box functions, inspired by Optuna. This library is not only for hyperparameter tuning of machine learning models: anything can be optimized if you can define an objective function (e.g., the number of goroutines in your server or the memory buffer size of a caching system). See the blog post for more details: Practical Bayesian optimization using Goptuna.

curvature - Official Code: Estimating Model Uncertainty of Neural Networks in Sparse Information Form, ICML2020

  •    Python

This repository contains PyTorch implementations of several Laplace approximation (LA) methods [1]. It is similar to this TensorFlow implementation, which approximates the curvature of neural networks, except that our main purpose is approximate Bayesian inference rather than second-order optimization. The aim is to make LA easy to use; LA is in itself a practical approach, because trained networks can be used without any modification. Our implementation supports this plug-and-play principle: you can make an already pretrained network Bayesian and obtain calibrated uncertainty for the deep neural network's predictions. Our library also features a Bayesian optimization method for easier tuning of hyperparameters.
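The core idea behind a Laplace approximation can be shown on a one-dimensional toy posterior — a generic sketch unrelated to this repository's PyTorch API: centre a Gaussian at the MAP estimate and set its variance to the inverse curvature of the negative log posterior there.

```python
import numpy as np

# Toy problem: posterior over a coin's bias p after 20 heads in 30 flips,
# with a uniform prior -> Beta(21, 11) posterior.
a, b = 21, 11

def neg_log_post(p):
    """Negative log posterior density, up to an additive constant."""
    return -((a - 1) * np.log(p) + (b - 1) * np.log(1 - p))

# The MAP is available in closed form for a Beta distribution.
p_map = (a - 1) / (a + b - 2)

# Laplace approximation: Gaussian centred at the MAP, with variance equal
# to the inverse second derivative (curvature) of the negative log posterior,
# estimated here by a central finite difference.
eps = 1e-5
curvature = (neg_log_post(p_map + eps) - 2 * neg_log_post(p_map)
             + neg_log_post(p_map - eps)) / eps**2
sigma = 1.0 / np.sqrt(curvature)   # std. dev. of the Gaussian approximation
```

For a neural network the same recipe applies in high dimensions: the trained weights play the role of the MAP, and the (approximated) Hessian of the loss supplies the curvature, which is why pretrained networks can be made Bayesian without retraining.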

bayesmark - Benchmark framework to easily compare Bayesian optimization methods on real machine learning tasks

  •    Python

This project provides a benchmark framework to easily compare Bayesian optimization methods on real machine learning tasks. This project is experimental and the APIs are not considered stable.

summit - Optimising chemical reactions using machine learning

  •    Jupyter

Currently, reaction optimisation in the fine chemicals industry is done by intuition or design of experiments. Both scale poorly with the complexity of the problem. Summit uses recent advances in machine learning to make the process of reaction optimisation faster. Essentially, it applies algorithms that learn which conditions (e.g., temperature, stoichiometry, etc.) are important to maximising one or more objectives (e.g., yield, enantiomeric excess). This is achieved through an iterative cycle.

bayex - Bayesian Optimization Python Library powered by JAX

  •    Python

Bayex is a high-performance Bayesian global optimization library using Gaussian processes. In contrast to existing Bayesian optimization libraries, Bayex is designed to use JAX as its backend. Instead of relying on external libraries, Bayex depends only on JAX and its own custom implementations, without importing massive libraries such as scikit-learn.

bboc-optuna-developers - Black-box optimizer submitted to BBO challenge at NeurIPS 2020

  •    Jupyter

First of all, we would like to thank the BBO Challenge organizers for this interesting competition, and congratulations to all the winners. This is the code of the Optuna Developers' solution for the NeurIPS 2020 Black-Box Optimization Challenge. Our solution scored 96.939 on the public leaderboard and 91.806 on the private leaderboard, placing 9th in public and 5th in private.

HPOBench - Collection of hyperparameter optimization benchmark problems

  •    Python

HPOBench is a library providing benchmarks for (multi-fidelity) hyperparameter optimization, with a focus on reproducibility. For more examples, see /example/.