
Keras Tuner Configuration

1. Define the model


- With a function:
o Argument: hp, a keras_tuner.HyperParameters instance

- With the HyperModel base class:
o Method: build(self, hp: keras_tuner.HyperParameters) -> keras.Model
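A minimal sketch of both options; the layer sizes, loss, and the hyperparameter name "units" are illustrative placeholders, not part of the original notes.

import keras
import keras_tuner
from keras import layers

# Option 1: a plain function that receives a HyperParameters object.
def build_model(hp):
    model = keras.Sequential([
        layers.Dense(hp.Int("units", min_value=32, max_value=256, step=32),
                     activation="relu"),
        layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Option 2: subclass keras_tuner.HyperModel and implement build().
class MyHyperModel(keras_tuner.HyperModel):
    def build(self, hp):
        return build_model(hp)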
2. Tuners
Tuners available:
- RandomSearch
- GridSearch
- BayesianOptimization
- Hyperband
- Sklearn
Some important arguments:
- hypermodel: A HyperModel instance (or a model-building function).
- objective: string | [string] | keras_tuner.Objective | [keras_tuner.Objective].
- max_trials: Optional integer, the maximum number of trials (model
configurations) to test.
Some important methods:
- get_best_hyperparameters:
o best_hp = tuner.get_best_hyperparameters()[0]
o model = tuner.hypermodel.build(best_hp)
- get_best_models:
o tuner.get_best_models(num_models=1)
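Put together, a tuner might be configured and queried like this; the tuner class, objective, directory/project names, and the data variables x_train / y_train are assumptions for illustration, with build_model taken from the sketch in section 1.

import keras_tuner

tuner = keras_tuner.RandomSearch(
    hypermodel=build_model,      # function or HyperModel instance from step 1
    objective="val_loss",
    max_trials=10,
    directory="tuner_dir",
    project_name="demo",
)
tuner.search(x_train, y_train, epochs=5, validation_split=0.2)

# Rebuild the model from the best hyperparameter values found.
best_hp = tuner.get_best_hyperparameters()[0]
model = tuner.hypermodel.build(best_hp)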
3. HyperParameters
hp = keras_tuner.HyperParameters()
Each hyperparameter defined with the methods below must have a unique name:
- hp.Boolean(name, default=False, parent_name=None, parent_values=None)
Choice between True and False.
- hp.Choice(name, values, ordered=None, default=None, parent_name=None,
parent_values=None)
Choice of one value among a predefined set of possible values.
values: A list of possible values. Values must be int, float, str, or bool. All values
must be of the same type.
- hp.Fixed(name, value, parent_name=None, parent_values=None)
Fixed, untunable value.
value: The value to use (can be any JSON-serializable Python type).
- hp.Float(name, min_value, max_value, step=None, sampling="linear",
default=None, parent_name=None, parent_values=None)
Floating point value hyperparameter.
o min_value: Float, the lower bound of the range.
o max_value: Float, the upper bound of the range.
o step: Optional float, the distance between two consecutive samples in the
range. If left unspecified, any value in the interval can be sampled. If
sampling="linear", it is the minimum additive difference between two samples;
if sampling="log", it is the minimum multiplier between two samples.
o sampling: String. One of "linear", "log", "reverse_log". Defaults to "linear".
Sampling always starts from a value in the range [0.0, 1.0); the sampling
argument decides how that value is projected into [min_value, max_value]:
"linear": min_value + value * (max_value - min_value)
"log": min_value * (max_value / min_value) ^ value
"reverse_log": max_value - min_value * ((max_value / min_value) ^ (1 - value) - 1)
- hp.Int(name, min_value, max_value, step=None, sampling="linear",
default=None, parent_name=None, parent_values=None)
Integer hyperparameter.
o min_value: Integer, the lower limit of the range, inclusive.
o max_value: Integer, the upper limit of the range, inclusive.
o step: Optional integer, the distance between two consecutive samples in
the range. If left unspecified, any integer in the interval can be sampled.
If sampling="linear", it is the minimum additive difference between two
samples; if sampling="log", it is the minimum multiplier between two samples.
o sampling: String. One of "linear", "log", "reverse_log". Defaults to "linear".
Sampling always starts from a value in the range [0.0, 1.0); the sampling
argument decides how that value is projected into [min_value, max_value]:
"linear": min_value + value * (max_value - min_value)
"log": min_value * (max_value / min_value) ^ value
"reverse_log": max_value - min_value * ((max_value / min_value) ^ (1 - value) - 1)
- hp.conditional_scope(parent_name, parent_values)
Scope in which hyperparameters are only active when the parent hyperparameter
named parent_name takes one of the values in parent_values (see the sketch
after this list).
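A minimal sketch of these definitions; the names, ranges, and choices below are illustrative, not from the original notes.

import keras_tuner

hp = keras_tuner.HyperParameters()

# Learning rate sampled on a log scale between 1e-4 and 1e-2.
lr = hp.Float("lr", min_value=1e-4, max_value=1e-2, sampling="log")

# Integer range with a fixed step.
units = hp.Int("units", min_value=32, max_value=256, step=32)

# "filters" only becomes active when "model_type" is "cnn".
model_type = hp.Choice("model_type", ["cnn", "mlp"])
with hp.conditional_scope("model_type", ["cnn"]):
    filters = hp.Int("filters", min_value=32, max_value=128, step=32)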
4. Highlight examples
- Toggle the existence of a layer with hp.Boolean:
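A minimal sketch; the dropout rate, layer sizes, and 10-class output are placeholder assumptions.

import keras
from keras import layers

def build_model(hp):
    model = keras.Sequential()
    model.add(layers.Flatten())
    model.add(layers.Dense(64, activation="relu"))
    # The tuner decides whether this Dropout layer exists at all.
    if hp.Boolean("dropout"):
        model.add(layers.Dropout(rate=0.25))
    model.add(layers.Dense(10, activation="softmax"))
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model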

- Define the choice of activation function with hp.Choice:
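A minimal sketch; the candidate activations and layer sizes are placeholder assumptions.

import keras
from keras import layers

def build_model(hp):
    # The tuner picks one activation from the predefined set.
    activation = hp.Choice("activation", values=["relu", "tanh", "elu"])
    model = keras.Sequential([
        layers.Flatten(),
        layers.Dense(64, activation=activation),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model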

- Tune the number of layers, giving each layer's hyperparameters a unique name:
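A minimal sketch; the layer counts, unit ranges, and naming scheme (units_0, units_1, ...) are illustrative assumptions.

import keras
from keras import layers

def build_model(hp):
    model = keras.Sequential()
    model.add(layers.Flatten())
    # Tune how many hidden layers to use; the f-string gives each
    # layer's "units" hyperparameter a unique name (units_0, units_1, ...).
    for i in range(hp.Int("num_layers", min_value=1, max_value=3)):
        model.add(layers.Dense(
            units=hp.Int(f"units_{i}", min_value=32, max_value=256, step=32),
            activation="relu"))
    model.add(layers.Dense(10, activation="softmax"))
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model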

- Build a model separately with a standalone HyperParameters object:
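A minimal sketch, assuming build_model is defined as in the examples above.

import keras_tuner

# A fresh HyperParameters object makes every hp.* call return its default
# value, so the model can be built directly, without running a tuner.
hp = keras_tuner.HyperParameters()
model = build_model(hp)

This is also the mechanism behind tuner.hypermodel.build(best_hp): the best hyperparameter values found by a search are replayed through the same build function.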
