Visualization of TensorFlow Experiment Hyperparameters

This topic shows you how to set experiment hyperparams and view their effects. It builds on the script that resulted from the steps in Getting Started for TensorFlow.

The steps that are covered are:

  • Define a hyperparam.
  • Set hyperparams on a MissingLink experiment.

Note

For each framework that MissingLink supports, some hyperparameters are retrieved automatically. See the list at the end of this topic.

Preparation

Go through the steps in Getting Started for TensorFlow.

Note

Ensure that you can successfully run the mnist.py training script that resulted from integration with the MissingLink SDK. In the steps that follow, the script is further developed to include hyperparams.

Write code

  1. Add a dropout rate hyperparam:

    # Training params
    DROPOUT_RATE = 0.1
    LEARNING_RATE = 0.01
    
  2. Set hyperparams on the experiment:

    In the base script, edit the create_experiment call and pass the hyperparams argument:

    with missinglink_project.create_experiment(
        display_name='MNIST multilayer perceptron',
        description='Two fully connected hidden layers',
        hyperparams={'dropout_rate': DROPOUT_RATE}) as experiment:
    

You should have added hyperparams to your experiment successfully.
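If you do not have the MissingLink SDK at hand, the calling pattern above can be illustrated with a minimal stand-in context manager. This is a sketch only: the class and function bodies here are illustrative and are not the SDK's real internals.

```python
from contextlib import contextmanager


class _Experiment:
    """Toy stand-in for the object yielded by create_experiment (illustrative only)."""

    def __init__(self, display_name, description, hyperparams):
        self.display_name = display_name
        self.description = description
        # Copy the caller's dict so later mutations do not leak in.
        self.hyperparams = dict(hyperparams or {})


@contextmanager
def create_experiment(display_name='', description='', hyperparams=None):
    # The real SDK opens a session with the MissingLink server here;
    # this sketch only records the metadata so the calling shape is clear.
    experiment = _Experiment(display_name, description, hyperparams)
    try:
        yield experiment
    finally:
        pass  # the real SDK would flush and close the experiment here


DROPOUT_RATE = 0.1

with create_experiment(
        display_name='MNIST multilayer perceptron',
        description='Two fully connected hidden layers',
        hyperparams={'dropout_rate': DROPOUT_RATE}) as experiment:
    print(experiment.hyperparams)
```

The key point is that hyperparams is an ordinary dict keyed by name, so any value you want tracked (dropout rate, learning rate, and so on) is passed in the same way.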

  • Inspect the resulting script here.
  • Run the new script and see how MissingLink's dashboard helps you monitor the experiment's hyperparams, as described in the next section.

Viewing the new functionality on the dashboard

You can see the hyperparams across different experiments on your MissingLink dashboard.

Hyperparameters that are automatically retrieved

The following hyperparameters are retrieved automatically whenever they are defined, whether or not you set them explicitly through the hyperparams argument:

  • Algorithm
  • Learning rate
  • Learning rate decay
  • Nesterov
  • Epsilon
  • Rho
  • Beta 1
  • Beta 2
  • Schedule decay
  • Batch size
  • Total batches
  • Epoch size
  • Total epochs
  • Max iterations
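Conceptually, the automatic retrieval acts as a filter over the training configuration: any recognized name that is actually defined gets recorded, and everything else is ignored. The sketch below illustrates that idea in plain Python; the set of names and the auto_retrieved helper are this sketch's own, not the SDK's real extraction logic, which is framework-specific.

```python
# Recognized hyperparameter names from the list above, written as they
# might appear as keys in an optimizer/training config (assumed spelling).
RECOGNIZED = {
    'algorithm', 'learning_rate', 'learning_rate_decay', 'nesterov',
    'epsilon', 'rho', 'beta_1', 'beta_2', 'schedule_decay',
    'batch_size', 'total_batches', 'epoch_size', 'total_epochs',
    'max_iterations',
}


def auto_retrieved(config):
    """Return only the recognized hyperparams that are actually defined."""
    return {name: value for name, value in config.items() if name in RECOGNIZED}


# Example: an SGD-like config; 'momentum' is not on the list above,
# so this sketch drops it.
config = {'algorithm': 'sgd', 'learning_rate': 0.01,
          'nesterov': True, 'momentum': 0.9}
print(auto_retrieved(config))
```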