hyperband - Tuning hyperparams fast with Hyperband

  •    Python

Code for tuning hyperparameters with Hyperband, adapted from Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization. Use defs.meta/defs_regression.meta to try many models in one Hyperband run; this is an automatic alternative to constructing search spaces with multiple models (such as defs.rf_xt or defs.polylearn_fm_pn) by hand.
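
The bracket/successive-halving schedule at the heart of Hyperband is compact enough to sketch. Below is a minimal, self-contained Python illustration of that loop from the paper; `sample_config` and `run_config` are hypothetical stand-ins for a real search space and training run, not this repository's defs.meta definitions.

```python
# Minimal sketch of the Hyperband loop from Li et al.; the sampler and the
# toy objective are illustrative stand-ins, not this repo's defs.meta models.
import math
import random


def sample_config():
    # Hypothetical search space: one learning-rate-like parameter.
    return {"lr": 10 ** random.uniform(-4, 0)}


def run_config(config, n_iters):
    # Stand-in for training a model for n_iters and returning a loss.
    return (config["lr"] - 0.01) ** 2 + 1.0 / n_iters


def hyperband(max_iter=81, eta=3):
    s_max = int(math.log(max_iter, eta))
    B = (s_max + 1) * max_iter  # total budget per bracket
    best = (float("inf"), None)
    for s in range(s_max, -1, -1):
        n = int(math.ceil(B / max_iter * eta ** s / (s + 1)))  # initial configs
        r = max_iter * eta ** (-s)                              # initial resource
        configs = [sample_config() for _ in range(n)]
        for i in range(s + 1):  # successive halving within the bracket
            n_i = int(n * eta ** (-i))
            r_i = int(r * eta ** i)
            losses = [run_config(c, r_i) for c in configs]
            ranked = sorted(zip(losses, configs), key=lambda t: t[0])
            best = min(best, ranked[0], key=lambda t: t[0])
            configs = [c for _, c in ranked[: max(1, n_i // eta)]]
    return best


print(hyperband())
```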

SMAC3 - Sequential Model-based Algorithm Configuration

  •    Python

Attention: this package is under heavy development and subject to change. A stable release of SMAC (v2), written in Java, is available separately, as is full documentation.
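
For context, the sequential model-based optimization (SMBO) loop that SMAC-style configurators implement can be sketched as follows. This is a conceptual illustration only, not the SMAC3 API; it uses a scikit-learn random forest as a stand-in surrogate and a greedy acquisition step where real SMAC uses expected improvement and more.

```python
# Conceptual SMBO loop: fit a surrogate to past evaluations, use it to pick
# the next candidate, evaluate, repeat. NOT the SMAC3 API.
import random

from sklearn.ensemble import RandomForestRegressor


def objective(x):
    # Hypothetical expensive function to minimize.
    return (x - 0.3) ** 2


history_x, history_y = [], []

# Bootstrap with a few random evaluations.
for _ in range(5):
    x = random.uniform(0, 1)
    history_x.append([x])
    history_y.append(objective(x))

for _ in range(20):
    # Fit the surrogate to all observations so far.
    surrogate = RandomForestRegressor(n_estimators=50).fit(history_x, history_y)
    # Greedy acquisition: evaluate the candidate the surrogate predicts best.
    candidates = [[random.uniform(0, 1)] for _ in range(100)]
    preds = surrogate.predict(candidates)
    chosen = candidates[min(range(len(preds)), key=preds.__getitem__)]
    history_x.append(chosen)
    history_y.append(objective(chosen[0]))

print(min(history_y))
```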

mgo - Purely functional genetic algorithms for multi-objective optimisation

  •    Scala

MGO implements NSGA-II, CP (Calibration Profile), and PSE (Pattern Search Experiment). Every algorithm in MGO has a variant that works on noisy fitness functions. MGO handles noisy fitness functions by resampling only the most promising individuals, using an aggregation function to combine the multiple samples when needed.
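
That noisy-fitness strategy can be sketched independently of the library. The following Python illustration (mgo itself is Scala) keeps a list of fitness samples per individual, aggregates them with a median, and spends new evaluations only on the most promising half; the names and the toy objective are illustrative, not mgo's actual types.

```python
# Sketch of resampling-based handling of a noisy fitness function: promising
# individuals accumulate samples, and an aggregation function (median here)
# turns the samples into a single comparable fitness value.
import random
import statistics


def noisy_fitness(genome):
    # Hypothetical noisy objective: true value plus Gaussian noise.
    return (genome - 0.5) ** 2 + random.gauss(0, 0.05)


def aggregate(samples):
    return statistics.median(samples)


# population: list of (genome, fitness_samples) pairs
population = [(random.uniform(0, 1), []) for _ in range(20)]

for generation in range(30):
    # Every individual gets at least one sample.
    for genome, samples in population:
        if not samples:
            samples.append(noisy_fitness(genome))
    # Resample only the most promising half to sharpen their estimates.
    population.sort(key=lambda ind: aggregate(ind[1]))
    for genome, samples in population[: len(population) // 2]:
        samples.append(noisy_fitness(genome))

best_genome, best_samples = population[0]
print(best_genome, aggregate(best_samples))
```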

Milano - A tool for automating hyper-parameter search for your models on a backend of your choice

  •    Python

Milano (Machine learning autotuner and network optimizer) is a tool that enables machine learning researchers and practitioners to perform massive hyperparameter and architecture searches. Your script can use any framework of your choice, for example TensorFlow, PyTorch, Microsoft Cognitive Toolkit, etc., or no framework at all. Milano only requires minimal changes to what your script accepts via the command line and what it returns to stdout.
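
The script contract this implies can be sketched as follows; the flag names and the stdout format below are assumptions for illustration, so check Milano's documentation for the exact convention it expects.

```python
# Sketch of a tunable script: hyperparameters arrive as command-line flags
# and the result metric is printed to stdout for the tuner to parse.
# Flag names and output format are illustrative assumptions.
import argparse


def train(lr, batch_size):
    # Stand-in for real training; returns a fake validation accuracy.
    return 1.0 - abs(lr - 0.01) - 0.001 * abs(batch_size - 64)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--lr", type=float, required=True)
    parser.add_argument("--batch_size", type=int, default=64)
    args = parser.parse_args()

    accuracy = train(args.lr, args.batch_size)
    # The tuner reads the metric from stdout.
    print("accuracy:", accuracy)
```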

Hyperopt-Keras-CNN-CIFAR-100 - Auto-optimizing a neural net (and its architecture) on the CIFAR-100 dataset

  •    Python

This project acts as both a tutorial and a demo for using Hyperopt with Keras, TensorFlow, and TensorBoard. Not only do we try to find the best hyperparameters for the given hyperspace, but we also represent the neural network architecture itself as hyperparameters that can be tuned. This automates the search for the best neural architecture configuration and hyperparameters. Here, we meta-optimize a neural net and its architecture on the CIFAR-100 dataset (100 fine labels), a computer vision task. This code could easily be transferred to another vision dataset or even to another machine learning task.
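
A minimal sketch of that idea with Hyperopt's primitives (hp.choice, hp.quniform, fmin) might look like this; the search space and the toy objective are illustrative stand-ins for the project's actual Keras/CIFAR-100 pipeline.

```python
# Architecture-as-hyperparameters: depth, layer width, and activation are
# part of the Hyperopt search space. The objective is a stand-in for
# building and training a Keras model on CIFAR-100.
from hyperopt import Trials, fmin, hp, tpe

space = {
    "lr": hp.loguniform("lr", -9, -2),
    "n_conv_layers": hp.choice("n_conv_layers", [2, 3, 4]),
    "filters": hp.quniform("filters", 32, 128, 16),
    "activation": hp.choice("activation", ["relu", "elu"]),
}


def objective(params):
    # Stand-in: build a model from `params`, train it on CIFAR-100, and
    # return the validation loss here instead of this toy score.
    return abs(params["n_conv_layers"] - 3) + abs(params["lr"] - 0.001)


trials = Trials()
best = fmin(objective, space, algo=tpe.suggest, max_evals=50, trials=trials)
print(best)
```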

sentence-classification - Sentence Classifications with Neural Networks

  •    Python

Each of the broad sentence categories above can be expanded and made more in-depth. The way these networks and scripts are designed, it should be possible to extend them to classify other sentence types, provided suitable data is available. This was developed for applications at Metacortex and is accompanied by a guide on building practical/applied neural networks on austingwalters.com.
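
That extensibility point can be sketched generically: size the output layer from whatever label set the data provides, so adding a new sentence type only means adding labeled examples. The toy model and data below are illustrative, not this repository's networks.

```python
# Generic sentence classifier whose output layer is sized from the data's
# label set; extending to a new sentence type only requires new examples.
import numpy as np
from tensorflow.keras import layers, models

sentences = ["what time is it", "close the door", "the sky is blue"]
labels = ["question", "command", "statement"]  # extend by adding data

vocab = {w: i + 1 for i, w in enumerate(sorted({w for s in sentences for w in s.split()}))}
label_ids = {lab: i for i, lab in enumerate(sorted(set(labels)))}

# Encode sentences as padded arrays of word ids.
max_len = max(len(s.split()) for s in sentences)
X = np.zeros((len(sentences), max_len), dtype="int32")
for i, s in enumerate(sentences):
    for j, w in enumerate(s.split()):
        X[i, j] = vocab[w]
y = np.array([label_ids[lab] for lab in labels])

model = models.Sequential([
    layers.Embedding(len(vocab) + 1, 16),
    layers.GlobalAveragePooling1D(),
    layers.Dense(len(label_ids), activation="softmax"),  # sized from the data
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=10, verbose=0)
```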