KerasTuner

1.4.8 · active · verified Thu Apr 16

KerasTuner is a hyperparameter optimization library for Keras, making it easy to find the best hyperparameters for your machine learning models. It supports various tuning algorithms like RandomSearch, Hyperband, and BayesianOptimization. The library is actively maintained and frequently updated, with recent versions adding support for Keras 3 (multi-backend) and improving distributed execution.

Install

pip install keras-tuner

Imports

import keras_tuner as kt

Quickstart

This quickstart demonstrates how to use KerasTuner with a `HyperModel` to search for optimal hyperparameters for a simple Keras model on the Fashion MNIST dataset. It defines a model with tunable dense layer units and learning rate, then uses `RandomSearch` to find the best configuration.

import keras
import keras_tuner as kt

# Define a HyperModel
class MyHyperModel(kt.HyperModel):
    def build(self, hp):
        model = keras.Sequential()
        model.add(keras.layers.Input(shape=(28, 28)))  # Keras 3 prefers an explicit Input layer
        model.add(keras.layers.Flatten())
        model.add(keras.layers.Dense(units=hp.Int('units', min_value=32, max_value=512, step=32),
                                     activation='relu'))
        model.add(keras.layers.Dense(10, activation='softmax'))
        model.compile(optimizer=keras.optimizers.Adam(hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])),
                      loss='sparse_categorical_crossentropy',
                      metrics=['accuracy'])
        return model

# Load a subset of Fashion MNIST (small slices keep the search fast)
(x_train, y_train), (x_test, y_test) = keras.datasets.fashion_mnist.load_data()
x_train = x_train[:10000].astype('float32') / 255.0
y_train = y_train[:10000]
x_val = x_test[:2000].astype('float32') / 255.0
y_val = y_test[:2000]

# Instantiate a tuner
tuner = kt.RandomSearch(
    MyHyperModel(),
    objective='val_accuracy',
    max_trials=3,
    executions_per_trial=2,
    directory='my_dir',
    project_name='my_intro_to_kt'
)

# Search for the best hyperparameters
tuner.search(x_train, y_train, epochs=2, validation_data=(x_val, y_val))

# Get the best hyperparameters
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
print(f"Best units: {best_hps.get('units')}, best learning rate: {best_hps.get('learning_rate')}")

# Get the best model
best_model = tuner.get_best_models(num_models=1)[0]
best_model.evaluate(x_val, y_val)
