Part 1: Introduction
- PyTorch example code
- which uses the following functions to define the hyperparameters:
    - trial.suggest_int("n_layers", 1, 3)
    - trial.suggest_categorical("optimizer", ["Adam", "RMSprop"])
    - trial.suggest_float("lr", 1e-5, 1e-1, log=True)
- In Optuna there are four key terms:
- objective: objective function that you want to optimize
- trial: a single call of the objective function
- study: an optimization session, which is a set of trials
    - parameter: a variable whose value is to be optimized