autogluon.core
Decorators are designed to apply hyperparameter tuning over arbitrary user-defined search spaces.
Toy Example
Create a training function decorated with a searchable hyperparameter space:
>>> @ag.args(
...     nested=ag.space.Categorical(
...         'test',
...         ag.space.Dict(
...             name=ag.space.Categorical('auto', 'gluon'),
...             idx=ag.space.Int(0, 100),
...         )
...     ),
...     obj_args=ag.space.Dict(
...         name=ag.space.Categorical('auto', 'gluon'),
...         idx=ag.space.Int(0, 100),
...     ),
...     fn_args=ag.space.Categorical('mxnet', 'pytorch'))
>>> def train_fn(args, reporter):
...     # Wrap parameterizable classes and functions inside train_fn
...     # to ensure they are portable for distributed training
...     class MyObj:
...         def __init__(self, name, idx):
...             self.name = name
...             self.idx = idx
...
...     def my_func(framework):
...         return framework
...
...     obj = MyObj(**args.obj_args)
...     func_result = my_func(args.fn_args)
...     nested = args.nested
...
...     assert hasattr(nested, 'name') or nested == 'test'
...     assert func_result in ['mxnet', 'pytorch']
...     assert obj.name in ['auto', 'gluon']
...     assert 0 <= obj.idx <= 100
...
...     reporter(epoch=1, accuracy=0)
Create a scheduler and run training trials to search for the best values of the hyperparameters:
>>> scheduler = ag.scheduler.FIFOScheduler(train_fn,
...                                        resource={'num_cpus': 2, 'num_gpus': 0},
...                                        num_trials=20,
...                                        reward_attr='accuracy',
...                                        time_attr='epoch')
>>> scheduler.run()
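Conceptually, the scheduler repeatedly samples a concrete configuration from the declared search space, runs the training function with it, and keeps track of the best reported reward. A minimal stdlib-only sketch of that loop follows; none of these names (`space`, `random_search`) belong to autogluon, and the zero-argument samplers are hypothetical stand-ins for `ag.space.Categorical` and `ag.space.Int`:

```python
import random

# Hypothetical stand-ins for ag.space.Categorical / ag.space.Int:
# each entry is just a zero-argument sampler.
space = {
    'name': lambda: random.choice(['auto', 'gluon']),  # like ag.space.Categorical
    'idx': lambda: random.randint(0, 100),             # like ag.space.Int
}

def train_fn(config):
    # Toy objective: reward peaks when name is 'gluon' and idx is 50.
    return (1.0 if config['name'] == 'gluon' else 0.5) - abs(config['idx'] - 50) / 100

def random_search(space, train_fn, num_trials=20, seed=0):
    """Sample num_trials configurations and keep the best reward,
    roughly what a FIFO (random-search) scheduler does."""
    random.seed(seed)
    best_config, best_reward = None, float('-inf')
    for _ in range(num_trials):
        config = {key: sample() for key, sample in space.items()}
        reward = train_fn(config)
        if reward > best_reward:
            best_config, best_reward = config, reward
    return best_config, best_reward

best_config, best_reward = random_search(space, train_fn)
print(best_config, best_reward)
```

Real schedulers add smarter searchers (e.g. Bayesian optimization) and early stopping, but the sample-evaluate-track loop is the same.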
Core APIs
args — Decorator for a Python training script that registers its arguments as hyperparameters.
obj — Decorator for a Python class that registers its arguments as hyperparameters.
func — Decorator for a function that registers its arguments as hyperparameters.
args
autogluon.core.args(default=None, **kwvars)[source]
Decorator for a Python training script that registers its arguments as hyperparameters.
Each hyperparameter takes a fixed value or is a searchable space, and the arguments may either be: built-in Python objects (e.g. floats, strings, lists, etc.), AutoGluon objects (see autogluon.obj()), or AutoGluon search spaces (see autogluon.space.Int, autogluon.space.Real, etc.).
Examples
>>> import autogluon.core as ag
>>> @ag.args(batch_size=10, lr=ag.Real(0.01, 0.1))
>>> def train_func(args):
...     print('Batch size is {}, LR is {}'.format(args.batch_size, args.lr))
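The registration mechanism can be sketched in plain Python. The decorator below is a hypothetical stand-in (not autogluon's actual implementation): it captures the keyword arguments as the registered hyperparameters and injects them into the wrapped function as an attribute-accessible namespace, with per-trial overrides standing in for sampled values:

```python
import functools
from types import SimpleNamespace

def toy_args(**kwvars):
    """Hypothetical stand-in for ag.args: record default hyperparameters
    and pass them to the wrapped function as a namespace."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(overrides=None):
            config = dict(kwvars)
            config.update(overrides or {})  # a trial overrides with sampled values
            return fn(SimpleNamespace(**config))
        wrapper.kwvars = kwvars             # registered search space, inspectable later
        return wrapper
    return decorator

@toy_args(batch_size=10, lr=0.05)
def train_func(args):
    return 'Batch size is {}, LR is {}'.format(args.batch_size, args.lr)

print(train_func())              # runs with the registered defaults
print(train_func({'lr': 0.1}))   # runs one trial with an overridden value
```

Keeping the declared space on the wrapper (here as `wrapper.kwvars`) is what lets a scheduler inspect it and sample trial configurations.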
obj
autogluon.core.obj(**kwvars)[source]
Decorator for a Python class that registers its arguments as hyperparameters.
Each hyperparameter may take a fixed value or be a searchable space (autogluon.space).
Returns
Instance of autogluon.space.AutoGluonObject: a lazily initialized object, which allows distributed training.
Examples
>>> import autogluon.core as ag
>>> from mxnet import optimizer as optim
>>> @ag.obj(
...     learning_rate=ag.space.Real(1e-4, 1e-1, log=True),
...     wd=ag.space.Real(1e-4, 1e-1),
... )
>>> class Adam(optim.Adam):
...     pass
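Lazy initialization can be sketched with a small wrapper that stores the class and its (possibly searchable) arguments and only constructs the instance when asked, e.g. on a remote worker after concrete values have been sampled. The names here (`LazyObject`, `toy_obj`, `init`) are hypothetical, not autogluon's:

```python
class LazyObject:
    """Hypothetical sketch of an AutoGluonObject-style wrapper: hold the
    class and its kwargs, defer construction until init() is called."""
    def __init__(self, cls, **kwvars):
        self.cls = cls
        self.kwvars = kwvars

    def init(self, **overrides):
        config = {**self.kwvars, **overrides}  # sampled values replace spaces
        return self.cls(**config)

def toy_obj(**kwvars):
    """Hypothetical stand-in for ag.obj: decorating a class yields a
    lazy wrapper instead of the class itself."""
    def decorator(cls):
        return LazyObject(cls, **kwvars)
    return decorator

@toy_obj(learning_rate=0.01, wd=1e-4)
class Adam:
    def __init__(self, learning_rate, wd):
        self.learning_rate = learning_rate
        self.wd = wd

opt = Adam.init(learning_rate=0.05)  # the instance is constructed only here
print(opt.learning_rate, opt.wd)
```

Because the wrapper holds only the class and plain keyword arguments, it can be pickled and shipped to workers before any heavyweight construction happens, which is the point of the lazy design.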
func
autogluon.core.func(**kwvars)[source]
Decorator for a function that registers its arguments as hyperparameters.
Each hyperparameter may take a fixed value or be a searchable space (autogluon.space).
Returns
Instance of autogluon.space.AutoGluonObject: a lazily initialized object, which allows for distributed training.
Examples
>>> import autogluon.core as ag
>>> from gluoncv.model_zoo import get_model
>>>
>>> @ag.func(pretrained=ag.space.Categorical(True, False))
>>> def cifar_resnet(pretrained):
...     return get_model('cifar_resnet20_v1', pretrained=pretrained)