
hypersearch - Hyperparameter Optimization for PyTorch

  •    Python

Tune the hyperparameters of your PyTorch models with HyperSearch. Keys take the form {layer_num}_{hyperparameter}, where layer_num can be the index of a layer in your nn.Sequential model, or all to apply the setting to every layer. Values take the form [distribution, x], where distribution can be one of uniform, quniform, choice, etc.
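Based on the key/value format described above, a HyperSearch-style configuration might look like the following sketch. The specific hyperparameter names (dropout, hidden, act, l2) are illustrative assumptions, not names confirmed by the project; check its README for the supported set.

```python
# Sketch of a configuration dictionary following the documented format:
# keys are "{layer_num}_{hyperparameter}" (layer index or "all"),
# values are [distribution, x]. Hyperparameter names are assumptions.
params = {
    "0_dropout": ["uniform", [0.1, 0.5]],     # dropout rate for layer 0
    "2_hidden": ["quniform", [64, 256, 64]],  # hidden units for layer 2
    "all_act": ["choice", ["relu", "tanh"]],  # activation for all layers
    "all_l2": ["uniform", [1e-5, 1e-3]],      # weight decay for all layers
}

# Validate that every entry obeys the documented structure.
for key, (dist, x) in params.items():
    layer, _, hyper = key.partition("_")
    assert layer == "all" or layer.isdigit()
    assert dist in {"uniform", "quniform", "choice"}
```

The validation loop is only a sanity check on the format; the library itself consumes the dictionary and samples each hyperparameter from the named distribution.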

mlrHyperopt - Easy Hyper Parameter Optimization with mlr and mlrMBO.

  •    HTML

Easy hyperparameter optimization with mlr and mlrMBO. It mainly uses the learners implemented in mlr together with the tuning methods also available in mlr. Unfortunately, mlr lacks well-defined search spaces for each learner, which would make hyperparameter tuning easy.
