New to MLJ? Start here. Wanting to integrate an existing machine learning model into the MLJ framework? Start here.
data-science machine-learning statistics pipeline clustering julia pipelines regression tuning classification ensemble-learning predictive-modeling tuning-parameters stacking

Tune the hyperparameters of your PyTorch models with HyperSearch. Keys are of the form {layer_num}_{hyperparameter}, where layer_num can be a layer index from your nn.Sequential model or all to signify all layers. Values are of the form [distribution, x], where distribution can be one of uniform, quniform, choice, etc.
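The key/value convention above can be sketched as a plain Python dict. This is an illustrative sketch only: the hyperparameter names (lr, dropout, act) and the shape of the x argument after each distribution name are assumptions, not HyperSearch's documented API, and the call that would consume this dict is not shown.

```python
# Hypothetical search space following the {layer_num}_{hyperparameter} key
# format and [distribution, x] value format described above.
space = {
    "0_lr": ["uniform", [1e-4, 1e-1]],           # layer 0: learning rate range
    "1_dropout": ["quniform", [0.1, 0.5, 0.1]],  # layer 1: quantized uniform
    "all_act": ["choice", ["relu", "tanh"]],     # "all": applies to every layer
}

# Each key splits into a layer selector and a hyperparameter name,
# and each value leads with a distribution name.
for key, (dist, x) in space.items():
    layer, _, hyperparam = key.partition("_")
    assert layer == "all" or layer.isdigit()
    assert dist in {"uniform", "quniform", "choice"}
```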
hyperband hyperparameter-optimization pytorch deep-learning tuning-parameters

Easy hyperparameter optimization with mlr and mlrMBO. It mainly uses the learners implemented in mlr together with the tuning methods also available in mlr. Unfortunately, mlr lacks well-defined search spaces for each learner, which would make hyperparameter tuning easier.
mlr optimization learners hyperparameter-optimization tuning-parameters r-package r machine-learning