
determined - Determined: Deep Learning Training Platform

  •    Python

Determined integrates features such as distributed training, hyperparameter search, and experiment tracking into an easy-to-use, high-performance deep learning environment, which means you can spend your time building models instead of managing infrastructure. You can keep using popular DL frameworks such as TensorFlow and PyTorch; you just need to update your model code to integrate with the Determined API.
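
To make that concrete, here is a minimal sketch of a trial written against Determined's PyTorchTrial interface. The class and context methods follow Determined's documented API; the model, the random datasets, and the learning_rate hyperparameter name are placeholders for illustration.

    import torch
    from torch.utils.data import TensorDataset
    from determined.pytorch import DataLoader, PyTorchTrial, PyTorchTrialContext

    class ToyTrial(PyTorchTrial):
        def __init__(self, context: PyTorchTrialContext):
            self.context = context
            # Determined injects hyperparameters from the experiment config.
            lr = context.get_hparam("learning_rate")
            self.model = context.wrap_model(torch.nn.Linear(10, 2))
            self.optimizer = context.wrap_optimizer(
                torch.optim.SGD(self.model.parameters(), lr=lr))

        def build_training_data_loader(self):
            data = TensorDataset(torch.randn(256, 10),
                                 torch.randint(0, 2, (256,)))
            return DataLoader(data, batch_size=self.context.get_per_slot_batch_size())

        def build_validation_data_loader(self):
            data = TensorDataset(torch.randn(64, 10),
                                 torch.randint(0, 2, (64,)))
            return DataLoader(data, batch_size=self.context.get_per_slot_batch_size())

        def train_batch(self, batch, epoch_idx, batch_idx):
            x, y = batch
            loss = torch.nn.functional.cross_entropy(self.model(x), y)
            self.context.backward(loss)
            self.context.step_optimizer(self.optimizer)
            return {"loss": loss}

        def evaluate_batch(self, batch):
            x, y = batch
            accuracy = (self.model(x).argmax(1) == y).float().mean()
            return {"accuracy": accuracy}

Experiments are then launched with the det CLI against an experiment configuration file, rather than by running the script directly.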

evalml - EvalML is an AutoML library written in Python.

  •    Python

EvalML is an AutoML library which builds, optimizes, and evaluates machine learning pipelines using domain-specific objective functions.
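
A minimal search, sketched against EvalML's documented AutoMLSearch entry point (the demo dataset and the F1 objective are illustrative choices):

    import evalml
    from evalml.automl import AutoMLSearch

    # One of the demo datasets EvalML ships with.
    X, y = evalml.demos.load_breast_cancer()
    X_train, X_test, y_train, y_test = evalml.preprocessing.split_data(
        X, y, problem_type="binary")

    # Build and optimize pipelines against a domain-specific objective.
    automl = AutoMLSearch(X_train=X_train, y_train=y_train,
                          problem_type="binary", objective="f1")
    automl.search()

    best = automl.best_pipeline
    print(best.score(X_test, y_test, objectives=["f1"]))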

hyperband - Tuning hyperparams fast with Hyperband

  •    Python

Code for tuning hyperparameters with Hyperband, adapted from Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization. Use defs.meta/defs_regression.meta to try many models in one Hyperband run; this is an automatic alternative to constructing search spaces with multiple models (like defs.rf_xt or defs.polylearn_fm_pn) by hand.
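
For reference, the core Hyperband loop from the paper fits in a few lines. In this sketch, get_params and run_then_return_val_loss are hypothetical stand-ins for the repo's config-sampling and training hooks.

    import math, random

    def hyperband(get_params, run_then_return_val_loss, max_iter=81, eta=3):
        """Sketch of Hyperband (Li et al.). get_params() samples a random
        config; run_then_return_val_loss(n_iters, cfg) trains cfg for
        n_iters resource units and returns a validation loss."""
        s_max = int(math.log(max_iter) / math.log(eta))
        B = (s_max + 1) * max_iter
        best = (float("inf"), None)
        for s in reversed(range(s_max + 1)):
            n = int(math.ceil(B / max_iter / (s + 1) * eta ** s))  # initial configs
            r = max_iter * eta ** (-s)                             # initial resource
            T = [get_params() for _ in range(n)]
            for i in range(s + 1):                                 # successive halving
                n_i = int(n * eta ** (-i))
                r_i = int(r * eta ** i)
                losses = [run_then_return_val_loss(r_i, t) for t in T]
                ranked = sorted(zip(losses, T), key=lambda p: p[0])
                best = min(best, ranked[0], key=lambda p: p[0])
                T = [t for _, t in ranked[: max(1, n_i // eta)]]
        return best

    # Toy usage: search over learning rates with a synthetic loss.
    best_loss, best_cfg = hyperband(
        get_params=lambda: {"lr": 10 ** random.uniform(-4, -1)},
        run_then_return_val_loss=lambda n, cfg: (cfg["lr"] - 0.01) ** 2 + 1.0 / n)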

SMAC3 - Sequential Model-based Algorithm Configuration

  •    Python

Attention: this package is under heavy development and subject to change. A stable release of SMAC (v2), written in Java, is available separately, together with its documentation.

adatune - Gradient based Hyperparameter Tuning library in PyTorch

  •    Python

AdaTune is a library for gradient-based hyperparameter tuning when training deep neural networks. AdaTune currently supports tuning the learning_rate, but some of the methods implemented here can be extended to other hyperparameters such as momentum or weight_decay. AdaTune provides the following gradient-based hyperparameter tuning algorithms: HD, RTHO, and our newly proposed algorithm, MARTHE. The repository also contains commonly used non-adaptive learning-rate schedules such as staircase decay, exponential decay, and cosine annealing with restarts. The library is implemented in PyTorch. The goal of the methods in this package is to compute, online and solely from the given learning task, a learning rate schedule for stochastic optimization methods (such as SGD), aiming to produce models with small validation error.
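
As an illustration of the simplest of these rules, here is a NumPy sketch of hypergradient descent (HD), which adapts the learning rate online from the dot product of successive gradients; the function names are mine, not AdaTune's API.

    import numpy as np

    def sgd_hd(grad_fn, theta, alpha=0.01, beta=1e-4, steps=200):
        """SGD with hypergradient descent on the learning rate:
        alpha moves in the direction that would have reduced the
        loss at the previous step (Baydin et al.'s HD rule)."""
        prev_grad = np.zeros_like(theta)
        for _ in range(steps):
            g = grad_fn(theta)
            alpha += beta * float(g @ prev_grad)  # hypergradient step on alpha
            theta = theta - alpha * g             # ordinary SGD step
            prev_grad = g
        return theta, alpha

    # Toy usage: minimize ||theta||^2; the schedule emerges online.
    theta, lr = sgd_hd(lambda th: 2 * th, np.ones(5))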


mgo - Purely functional genetic algorithms for multi-objective optimisation

  •    Scala

MGO implements NSGA-II, CP (Calibration Profile), and PSE (Pattern Search Experiment). Every algorithm in MGO has a variant that works with noisy fitness functions: MGO handles noise by resampling only the most promising individuals, using an aggregation function to combine the multiple samples when needed.
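
The resampling strategy can be illustrated outside Scala as well. The following Python sketch (names are illustrative, not MGO's API) resamples only the best-ranked individuals and aggregates their noisy fitness samples by the mean.

    import random
    from statistics import mean

    def noisy_fitness(x):
        # Toy noisy objective: minimum near x = 0.3 plus Gaussian noise.
        return (x - 0.3) ** 2 + random.gauss(0, 0.05)

    # individual -> fitness samples collected so far
    samples = {random.uniform(0, 1): [] for _ in range(20)}

    for generation in range(50):
        # Rank by aggregated (mean) fitness; unevaluated individuals first.
        ranked = sorted(samples, key=lambda x: mean(samples[x])
                        if samples[x] else float("-inf"))
        # Resample only the most promising quarter to sharpen their estimates.
        for x in ranked[: len(ranked) // 4]:
            samples[x].append(noisy_fitness(x))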

Milano - Milano is a tool for automating hyper-parameters search for your models on a backend of your choice

  •    Python

Milano (Machine learning autotuner and network optimizer) is a tool that enables machine learning researchers and practitioners to perform massive hyperparameter and architecture searches. Your script can use any framework of your choice (for example TensorFlow, PyTorch, or Microsoft Cognitive Toolkit) or no framework at all. Milano only requires minimal changes to what your script accepts via the command line and what it returns to stdout.
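
The contract is roughly: accept hyperparameters as command-line arguments and report the result on stdout. A hypothetical trainable script might look like this; the flag names and output format are placeholders, since the real contract is whatever your Milano config declares.

    # train.py -- framework-agnostic trainable script of the kind Milano drives.
    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("--lr", type=float, required=True)
    parser.add_argument("--batch_size", type=int, default=32)
    args = parser.parse_args()

    # ... train a model with args.lr / args.batch_size in any framework ...
    val_accuracy = 1.0 - abs(args.lr - 0.01)  # placeholder metric

    # The tuner parses the result from stdout.
    print(f"val_accuracy: {val_accuracy}")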

Hyperopt-Keras-CNN-CIFAR-100 - Auto-optimizing a neural net (and its architecture) on the CIFAR-100 dataset

  •    Python

This project acts as both a tutorial and a demo for using Hyperopt with Keras, TensorFlow, and TensorBoard. Not only do we try to find the best hyperparameters for the given hyperspace, but we also represent the neural network architecture itself as hyperparameters that can be tuned, automating the search for the best architecture configuration and hyperparameters. Here, we meta-optimize a neural net and its architecture on the CIFAR-100 dataset (100 fine labels), a computer vision task. This code could easily be transferred to another vision dataset, or even to another machine learning task.
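
The key idea, encoding architecture decisions as Hyperopt hyperparameters, can be sketched as follows. The search space here is a toy stand-in for the project's much larger hyperspace, and the objective is stubbed so the sketch runs without training a network.

    from hyperopt import fmin, hp, tpe

    # Architecture choices expressed as tunable hyperparameters.
    space = {
        "lr": hp.loguniform("lr", -9, -2),
        "n_conv_layers": hp.choice("n_conv_layers", [2, 3, 4]),
        "activation": hp.choice("activation", ["relu", "elu"]),
        "use_batch_norm": hp.choice("use_batch_norm", [False, True]),
    }

    def objective(params):
        # In the real project: build and train a Keras model from `params`
        # and return its validation loss. Stubbed here for illustration.
        return params["lr"] + 0.1 * params["n_conv_layers"]

    best = fmin(objective, space, algo=tpe.suggest, max_evals=20)
    print(best)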

sentence-classification - Sentence Classifications with Neural Networks

  •    Python

Each of the broad sentence categories can be expanded and made more in-depth. The networks and scripts are designed so that they should be extensible to classify other sentence types, provided the data is supplied. This was developed for applications at Metacortex and is accompanied by a guide on building practical/applied neural networks on austingwalters.com.

auptimizer - An automatic ML model optimization tool.

  •    Python

Getting the best models in minimum time: generate optimal models and achieve better performance by employing state-of-the-art hyperparameter optimization (HPO) and model compression techniques. Auptimizer will run and record sophisticated HPO and model compression experiments on compute resources of your choice, with effortless consistency and reproducibility.

Making your models edge-ready: get model-device compatibility and enhanced on-device performance by converting models into the industry-standard ONNX and TensorFlow Lite formats. Auptimizer-Converter provides validated conversion techniques to ensure worry-free format transformations.
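
As an illustration of the conversion step, the sketch below exports a PyTorch model to ONNX using plain torch.onnx rather than Auptimizer-Converter's own API; the model and file name are placeholders.

    import torch

    model = torch.nn.Sequential(torch.nn.Linear(16, 8), torch.nn.ReLU(),
                                torch.nn.Linear(8, 2))
    model.eval()
    dummy = torch.randn(1, 16)  # example input fixes the exported graph's shapes
    torch.onnx.export(model, dummy, "model.onnx",
                      input_names=["input"], output_names=["logits"])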