autogluon.core.space¶
Search space of possible hyperparameter values to consider.
Example
Define a dummy training function with searchable spaces for hyperparameters lr and wd:
>>> import numpy as np
>>> import autogluon.core as ag
>>> @ag.args(
...     lr=ag.space.Real(1e-3, 1e-2, log=True),
...     wd=ag.space.Real(1e-3, 1e-2),
...     epochs=10)
... def train_fn(args, reporter):
...     print('lr: {}, wd: {}'.format(args.lr, args.wd))
...     for e in range(args.epochs):
...         dummy_accuracy = 1 - np.power(1.8, -np.random.uniform(e, 2*e))
...         reporter(epoch=e+1, accuracy=dummy_accuracy, lr=args.lr, wd=args.wd)
Create a scheduler to manage training jobs and begin hyperparameter tuning with the provided search space:
>>> scheduler = ag.scheduler.HyperbandScheduler(train_fn,
...                                             resource={'num_cpus': 2, 'num_gpus': 0},
...                                             num_trials=10,
...                                             reward_attr='accuracy',
...                                             time_attr='epoch',
...                                             grace_period=1)
>>> scheduler.run()
>>> scheduler.join_jobs()
Visualize the results:
>>> scheduler.get_training_curves(plot=True)
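Once tuning finishes, the best configuration and its reward can typically be retrieved from the scheduler. A minimal sketch; get_best_config() and get_best_reward() are assumed here to be available on the scheduler, as in other AutoGluon scheduler examples:
>>> print('Best config: {}, best reward: {}'.format(
...     scheduler.get_best_config(), scheduler.get_best_reward()))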

Search Space¶
Real: Search space for numeric hyperparameter that takes continuous values.
Int: Search space for numeric hyperparameter that takes integer values.
Bool: Search space for hyperparameter that is either True or False.
Categorical: Nested search space for hyperparameters which are categorical. Such a hyperparameter takes one value out of the discrete set of provided options.
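As an illustration of how these space types fit together (a hedged sketch; the dictionary layout below is for demonstration only and is not a specific AutoGluon API), searchable hyperparameters of different kinds can be declared side by side:
>>> import autogluon.core as ag
>>> hyperparameters = {
...     'lr': ag.space.Real(1e-4, 1e-1, log=True),         # continuous, log scale
...     'num_layers': ag.space.Int(1, 4),                   # integer-valued
...     'pretrained': ag.space.Bool(),                      # True or False
...     'optimizer': ag.space.Categorical('sgd', 'adam'),   # 'sgd' tried first
... }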
Real¶
class autogluon.core.space.Real(lower, upper, default=None, log=False)[source]¶
Search space for numeric hyperparameter that takes continuous values.

Parameters
lower : float
    The lower bound of the search space (minimum possible value of the hyperparameter)
upper : float
    The upper bound of the search space (maximum possible value of the hyperparameter)
default : float, optional
    Default value tried first during hyperparameter optimization
log : bool
    Whether to search the values on a logarithmic rather than linear scale. This is useful for numeric hyperparameters (such as learning rates) whose search space spans many orders of magnitude.
Examples
>>> learning_rate = Real(0.01, 0.1, log=True)
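If a default is given, it is tried first during HPO and can be read back via the default attribute (a small sketch based on the parameters documented above; the returned value is not shown since it is not verified here):
>>> lr = Real(1e-3, 1e-2, default=5e-3, log=True)
>>> lr.default   # expected: 5e-3, the value tried first during HPO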
Attributes
default
Return default value of hyperparameter corresponding to this search space.
Methods
convert_to_sklearn
Int¶
class autogluon.core.space.Int(lower, upper, default=None)[source]¶
Search space for numeric hyperparameter that takes integer values.

Parameters
lower : int
    The lower bound of the search space (minimum possible value of the hyperparameter)
upper : int
    The upper bound of the search space (maximum possible value of the hyperparameter)
default : int, optional
    Default value tried first during hyperparameter optimization
Examples
>>> range = Int(0, 100)
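As with Real, a default value can be supplied and inspected (a brief sketch using only the parameters documented above; output not shown):
>>> num_layers = Int(1, 4, default=2)
>>> num_layers.default   # expected: 2, the value tried first during HPO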
Attributes
default
Return default value of hyperparameter corresponding to this search space.
Methods
convert_to_sklearn
Bool¶
class autogluon.core.space.Bool[source]¶
Search space for hyperparameter that is either True or False.
ag.Bool() serves as shorthand for: ag.space.Categorical(True, False)
Examples
pretrained = ag.space.Bool()
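Since a Categorical's first option is its default (see Categorical below), the default of Bool() would be expected to be True (a small sketch; output not verified, variable name illustrative):
>>> flag = ag.space.Bool()
>>> flag.default   # expected: True, the first option of Categorical(True, False)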
Attributes
default
Return default value of hyperparameter corresponding to this search space.
Methods
convert_to_sklearn
Categorical¶
class autogluon.core.space.Categorical(*data)[source]¶
Nested search space for hyperparameters which are categorical. Such a hyperparameter takes one value out of the discrete set of provided options.
The first value in the list of options will be the default value that gets tried first during HPO.

Parameters
data : Space or Python built-in objects
    The choice candidates
Examples
a = Categorical('a', 'b', 'c', 'd')  # 'a' will be the default value tried first during HPO
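Because the data parameter accepts Space objects as well as Python built-ins, the options may themselves be search spaces. A brief sketch of plain and nested usage (names are illustrative only):
>>> optimizer = Categorical('sgd', 'adam', 'adagrad')
>>> optimizer.default   # 'sgd' is the first option, so it is tried first during HPO
>>> # nested: each candidate is itself a searchable space
>>> width = Categorical(Int(8, 32), Int(64, 256))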
Attributes
default
Return default value of hyperparameter corresponding to this search space.
Methods
convert_to_sklearn