This example shows how to configure an experiment that initializes the weights of convolution and fully connected layers using different weight initializers for training. To compare the performance of different weight initializers for your task, create an experiment using this example as a guide.
When training a deep learning network, the initialization of layer weights and biases can have a big impact on how well the network trains. The choice of initializer has a bigger impact on networks without batch normalization layers. For more information on weight initializers, see Compare Layer Weight Initializers.
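You choose an initializer when you create a layer. For example, the convolution2dLayer function accepts WeightsInitializer and BiasInitializer name-value arguments:

```matlab
% Create a convolution layer whose weights use the He initializer
% and whose biases are initialized to zeros (both built-in options).
layer = convolution2dLayer(3,16, ...
    'WeightsInitializer','he', ...
    'BiasInitializer','zeros');
```

The fullyConnectedLayer function accepts the same two arguments, which is what allows the experiment to sweep over initializers for both layer types.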
First, open the example. Experiment Manager loads a project with a preconfigured experiment that you can inspect and run. To open the experiment, in the Experiment Browser pane, double-click the name of the experiment.
Built-in training experiments consist of a description, a table of hyperparameters, a setup function, and a collection of metric functions to evaluate the results of the experiment. For more information, see Configure Built-In Training Experiment.
The Description field contains a textual description of the experiment. For this example, the description is:
Perform transfer learning by initializing the weights of convolution and fully connected layers in a pretrained network.
The Hyperparameters section specifies the strategy (Exhaustive Sweep) and the hyperparameter values to use for the experiment. When you run the experiment, Experiment Manager trains the network using every combination of hyperparameter values specified in the hyperparameter table. This example uses the hyperparameters WeightsInitializer and BiasInitializer to specify the weight and bias initializers for the convolution and fully connected layers in a pretrained network. For more information about these initializers, see Compare Layer Weight Initializers.
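Experiment Manager passes the hyperparameter values for each trial to the setup function as fields of a structure. A minimal sketch of the setup function signature, assuming the hyperparameters are named WeightsInitializer and BiasInitializer (the function name here is illustrative):

```matlab
function [imds,layers,options] = WeightInitializerExperiment_setup(params)
% params holds the hyperparameter values for this trial; the field
% names match the names in the hyperparameter table.
weightsInit = params.WeightsInitializer;  % e.g. 'glorot', 'he', 'narrow-normal'
biasInit    = params.BiasInitializer;     % e.g. 'zeros', 'narrow-normal'
% ... use weightsInit and biasInit when constructing the layers ...
end
```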
The Setup Function configures the training data, network architecture, and training options for the experiment. To inspect the setup function, under Setup Function, click Edit. The setup function opens in MATLAB® Editor.
In this example, the setup function:
Downloads and extracts the Flowers data set, which is about 218 MB. For more information on this data set, see Image Data Sets.
Loads a pretrained GoogLeNet network and initializes the weights in the convolution and fully connected layers by using the initializers specified in the hyperparameter table. The auxiliary function
findLayersToReplace determines which layers in the network architecture can be modified for transfer learning.
Defines a trainingOptions object for the experiment. The example trains the network for 10 epochs, using a mini-batch size of 128 and validating the network every 5 epochs.
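The steps above can be sketched as follows. This is not the exact code in the setup function; the variables params, imdsValidation, and validationFrequency, and the replaced layer name, are assumptions based on the standard GoogLeNet architecture:

```matlab
% Load the pretrained network and convert it to a layer graph.
net = googlenet;                 % requires the GoogLeNet support package
lgraph = layerGraph(net);

% Replace the final fully connected layer with one that uses the
% initializers chosen for this trial.
numClasses = 5;                  % the Flowers data set has 5 classes
newFC = fullyConnectedLayer(numClasses, ...
    'Name','new_fc', ...
    'WeightsInitializer',params.WeightsInitializer, ...
    'BiasInitializer',params.BiasInitializer);
lgraph = replaceLayer(lgraph,'loss3-classifier',newFC);

% Training options matching the values described above. Note that
% ValidationFrequency is specified in iterations, so "every 5 epochs"
% corresponds to 5 times the number of iterations per epoch.
options = trainingOptions('sgdm', ...
    'MaxEpochs',10, ...
    'MiniBatchSize',128, ...
    'ValidationData',imdsValidation, ...
    'ValidationFrequency',validationFrequency);
```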
The Metrics section specifies optional functions that evaluate the results of the experiment. This example does not include any custom metric functions.
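If you want to add a metric of your own, a metric function takes a single structure input and returns a scalar that Experiment Manager adds as a column in the results table. A minimal sketch, with an illustrative function name:

```matlab
function metricOutput = FinalTrainingLoss(trialInfo)
% trialInfo contains the trained network, the training information,
% and the hyperparameter values for the trial.
info = trialInfo.trainingInfo;
% Return a scalar to display in the results table.
metricOutput = info.TrainingLoss(end);
end
```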
When you run the experiment, Experiment Manager trains the network defined by the setup function multiple times. Each trial uses a different combination of hyperparameter values. By default, Experiment Manager runs one trial at a time. If you have Parallel Computing Toolbox™, you can run multiple trials at the same time. For best results, before you run your experiment, start a parallel pool with as many workers as GPUs. For more information, see Use Experiment Manager to Train Networks in Parallel and GPU Support by Release (Parallel Computing Toolbox).
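One way to start such a pool before running the experiment (gpuDeviceCount with the "available" option requires a recent MATLAB release):

```matlab
% Start a parallel pool with one worker per available GPU.
numGPUs = gpuDeviceCount("available");
if numGPUs > 0
    parpool(numGPUs);
end
```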
To run one trial of the experiment at a time, on the Experiment Manager toolstrip, click Run.
To run multiple trials at the same time, click Use Parallel and then Run. If there is no current parallel pool, Experiment Manager starts one using the default cluster profile. Experiment Manager then executes multiple simultaneous trials, depending on the number of parallel workers available.
A table of results displays the accuracy and loss for each trial.
While the experiment is running, click Training Plot to display the training plot and track the progress of each trial.
Click Confusion Matrix to display the confusion matrix for the validation data in each completed trial.
When the experiment finishes, you can sort the results table by column, filter trials by using the Filters pane, or record observations by adding annotations. For more information, see Sort, Filter, and Annotate Experiment Results.
To test the performance of an individual trial, export the trained network or the training information for the trial. On the Experiment Manager toolstrip, select Export > Trained Network or Export > Training Information, respectively. For more information, see net and info.
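After exporting, the network is available in the workspace under the variable name you choose at export time. For example, assuming the exported variable is trainedNetwork and imdsTest is a labeled test datastore you have prepared:

```matlab
% Classify held-out images and compute the accuracy.
YPred = classify(trainedNetwork,imdsTest);
accuracy = mean(YPred == imdsTest.Labels);
```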
In the Experiment Browser pane, right-click the name of the project and select Close Project. Experiment Manager closes all of the experiments and results contained in the project.