Use Experiment Manager to Train Generative Adversarial Networks (GANs)

This example shows how to create a custom training experiment to train a generative adversarial network (GAN) that generates images of flowers. For a custom training experiment, you explicitly define the training procedure used by Experiment Manager. In this example, you implement a custom training loop to train a GAN, a type of deep learning network that can generate data with characteristics similar to those of the input real data. A GAN consists of two networks that train together:

  • Generator — Given a vector of random values (latent inputs) as input, this network generates data with the same structure as the training data.

  • Discriminator — Given batches of data containing observations from both the training data and generated data from the generator, this network attempts to classify the observations as "real" or "generated."

To train a GAN, train both networks simultaneously to maximize the performance of both:

  • Train the generator to generate data that "fools" the discriminator. To optimize the performance of the generator, maximize the loss of the discriminator when given generated data. In other words, the objective of the generator is to generate data that the discriminator classifies as "real."

  • Train the discriminator to distinguish between real and generated data. To optimize the performance of the discriminator, minimize the loss of the discriminator when given batches of both real and generated data. In other words, the objective of the discriminator is to not be "fooled" by the generator.

Ideally, these strategies result in a generator that generates convincingly realistic data and a discriminator that has learned strong feature representations that are characteristic of the training data. For more information, see Train Generative Adversarial Network (GAN).

Open Experiment

First, open the example. Experiment Manager loads a project with a preconfigured experiment that you can inspect and run. To open the experiment, in the Experiment Browser pane, double-click the name of the experiment (ImageGenerationExperiment).

Custom training experiments consist of a description, a table of hyperparameters, and a training function. For more information, see Configure Custom Training Experiment.

The Description field contains a textual description of the experiment. For this example, the description is:

Train a generative adversarial network (GAN) to generate images of flowers.
Use hyperparameters to specify:
- the probability of the dropout layer in the discriminator network
- the fraction of real labels to flip while training the discriminator network

The Hyperparameters section specifies the hyperparameter values to use for the experiment. When you run the experiment, Experiment Manager trains the network using every combination of hyperparameter values specified in the hyperparameter table. This example uses two hyperparameters:

  • dropoutProb sets the probability of the dropout layer in the discriminator network. By default, the values for this hyperparameter are specified as [0.25 0.5 0.75].

  • flipFactor sets the fraction of real labels to flip when you train the discriminator network. The experiment uses this hyperparameter to add noise to the real data and better balance the learning of the discriminator and the generator. Otherwise, if the discriminator learns to discriminate between real and generated images too quickly, then the generator can fail to train. The values for this hyperparameter are specified as [0.1 0.3 0.5].
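
For illustration, the label-flipping step can be implemented in a few lines of MATLAB code. This is a minimal sketch, not the exact code from the experiment; the variable name YReal, and the assumption that the real labels are stored as numeric values between zero and one, are illustrative.

% Flip a random fraction (flipFactor) of the real labels before
% computing the discriminator loss. YReal is assumed to hold the
% labels for the real images in the current mini-batch.
numObservations = numel(YReal);
idx = randperm(numObservations,floor(flipFactor*numObservations));
YReal(idx) = 1 - YReal(idx);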

The Training Function specifies the training data, network architecture, training options, and training procedure used by the experiment. To inspect the training function, under Training Function, click Edit. The training function opens in MATLAB® Editor.

The input to the training function is a structure with fields from the hyperparameter table and an experiments.Monitor object that you can use to track the progress of the training, record values of the metrics used by the training, and produce training plots. The training function returns a structure that contains the trained generator network, the trained discriminator network, and the execution environment used for training. Experiment Manager saves this output, so you can export it to the MATLAB workspace when the training is complete. The training function has six sections, described in the list below; illustrative sketches of these sections appear after the list.

  • Initialize Output sets the initial value of the networks to empty arrays to indicate that the training has not started. The experiment sets the execution environment to "auto", so it trains the networks on a GPU if one is available. Using a GPU requires Parallel Computing Toolbox™ and a supported GPU device. For more information, see GPU Support by Release (Parallel Computing Toolbox).

  • Load Training Data defines the training data for the experiment as an imageDatastore object. The experiment uses the Flowers data set, which contains 3670 images of flowers and is about 218 MB. For more information on this data set, see Image Data Sets.

  • Define Generator Network defines the architecture for the generator network as a layer graph that generates images from 1-by-1-by-100 arrays of random values. To train the network with a custom training loop and enable automatic differentiation, the training function converts the layer graph to a dlnetwork object.

  • Define Discriminator Network defines the architecture for the discriminator network as a layer graph that classifies real and generated 64-by-64-by-3 images. The dropout layer uses the dropout probability defined in the hyperparameter table. To train the network with a custom training loop and enable automatic differentiation, the training function converts the layer graph to a dlnetwork object.

  • Specify Training Options defines the training options used by the experiment. In this example, Experiment Manager trains the networks with a mini-batch size of 128 for 50 epochs using an initial learning rate of 0.0002, a gradient decay factor of 0.5, and a squared gradient decay factor of 0.999.

  • Train Model defines the custom training loop used by the experiment. The custom training loop uses minibatchqueue to process and manage the mini-batches of images. For each mini-batch, the minibatchqueue object rescales the images to the range [-1,1], discards any partial mini-batches with fewer than 128 observations, and formats the image data with the dimension labels 'SSCB' (spatial, spatial, channel, batch). By default, the minibatchqueue object converts the data to dlarray objects with underlying type single. For each epoch, the custom training loop shuffles the datastore and loops over mini-batches of data. If you train on a GPU, the data is converted to gpuArray objects. The training function then evaluates the model gradients and updates the discriminator and generator network parameters. After each iteration of the custom training loop, the training function saves the trained networks and updates the training progress.
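
To make the interface concrete, here is a minimal sketch of the training function's overall shape, covering the Initialize Output section. The function name and the output field names are assumptions for illustration; the inputs params and monitor follow the description above.

function output = ImageGenerationExperiment_training(params,monitor)
% params  - structure with fields from the hyperparameter table,
%           for example params.dropoutProb and params.flipFactor
% monitor - experiments.Monitor object for tracking progress,
%           recording metric values, and producing training plots

% Initialize Output: empty networks indicate training has not started.
output.generator = [];
output.discriminator = [];
output.executionEnvironment = "auto";

% ... load data, define networks, and run the custom training loop,
% updating output.generator and output.discriminator as training runs ...
end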
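
As a sketch of the Load Training Data section, assuming the flower images have already been downloaded to a folder named flowersFolder (the folder name and the use of augmentedImageDatastore for resizing are assumptions):

% Create a datastore over the flower images and resize them to the
% 64-by-64 input size that the discriminator expects.
imds = imageDatastore(flowersFolder,IncludeSubfolders=true);
augimds = augmentedImageDatastore([64 64],imds);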
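
The Define Generator Network section might resemble this sketch: a layer graph that upsamples a 1-by-1-by-100 latent array to a 64-by-64-by-3 image through transposed convolutions, converted to a dlnetwork object. The filter sizes and depths here are illustrative and not necessarily those of the experiment.

% Generator: map 1-by-1-by-100 noise to a 64-by-64-by-3 image.
layersG = [
    imageInputLayer([1 1 100],Normalization="none")
    transposedConv2dLayer(4,512)                            % 4-by-4
    batchNormalizationLayer
    reluLayer
    transposedConv2dLayer(4,256,Stride=2,Cropping="same")   % 8-by-8
    batchNormalizationLayer
    reluLayer
    transposedConv2dLayer(4,128,Stride=2,Cropping="same")   % 16-by-16
    batchNormalizationLayer
    reluLayer
    transposedConv2dLayer(4,64,Stride=2,Cropping="same")    % 32-by-32
    batchNormalizationLayer
    reluLayer
    transposedConv2dLayer(4,3,Stride=2,Cropping="same")     % 64-by-64
    tanhLayer];                                             % outputs in [-1,1]
netG = dlnetwork(layerGraph(layersG));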
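
Similarly, a sketch of the Define Discriminator Network section, with the dropout layer taking its probability from the hyperparameter table (the layer depths are again illustrative):

% Discriminator: classify 64-by-64-by-3 images as real or generated.
layersD = [
    imageInputLayer([64 64 3],Normalization="none")
    dropoutLayer(params.dropoutProb)             % from hyperparameter table
    convolution2dLayer(4,64,Stride=2,Padding="same")
    leakyReluLayer(0.2)
    convolution2dLayer(4,128,Stride=2,Padding="same")
    batchNormalizationLayer
    leakyReluLayer(0.2)
    convolution2dLayer(4,256,Stride=2,Padding="same")
    batchNormalizationLayer
    leakyReluLayer(0.2)
    convolution2dLayer(4,512,Stride=2,Padding="same")
    batchNormalizationLayer
    leakyReluLayer(0.2)
    convolution2dLayer(4,1)];                    % 1-by-1 score per image
netD = dlnetwork(layerGraph(layersD));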
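
Finally, a sketch of the Specify Training Options and Train Model sections, using augimds from the data-loading sketch above. The helpers preprocessMiniBatch (which rescales images to the range [-1,1]) and modelGradients are assumptions; the loss computation and the monitor updates are omitted.

% Training options from the description above.
numEpochs = 50;
miniBatchSize = 128;
learnRate = 0.0002;
gradientDecayFactor = 0.5;
squaredGradientDecayFactor = 0.999;

% minibatchqueue discards partial mini-batches, applies the preprocessing
% helper, and labels dimensions as 'SSCB'. OutputEnvironment defaults to
% "auto", so the data becomes gpuArray when you train on a GPU.
mbq = minibatchqueue(augimds, ...
    MiniBatchSize=miniBatchSize, ...
    PartialMiniBatch="discard", ...
    MiniBatchFcn=@preprocessMiniBatch, ...
    MiniBatchFormat="SSCB");

iteration = 0;
trailingAvgG = []; trailingAvgSqG = [];
trailingAvgD = []; trailingAvgSqD = [];
for epoch = 1:numEpochs
    shuffle(mbq);                 % reshuffle the data every epoch
    while hasdata(mbq)
        iteration = iteration + 1;
        X = next(mbq);            % one mini-batch of real images

        % Evaluate gradients for both networks (assumed helper).
        [gradG,gradD] = dlfeval(@modelGradients,netG,netD,X,params.flipFactor);

        % Update each network with the Adam solver.
        [netD,trailingAvgD,trailingAvgSqD] = adamupdate(netD,gradD, ...
            trailingAvgD,trailingAvgSqD,iteration, ...
            learnRate,gradientDecayFactor,squaredGradientDecayFactor);
        [netG,trailingAvgG,trailingAvgSqG] = adamupdate(netG,gradG, ...
            trailingAvgG,trailingAvgSqG,iteration, ...
            learnRate,gradientDecayFactor,squaredGradientDecayFactor);
    end
end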

Training GANs can be a challenging task because the generator and the discriminator networks compete against each other during the training. If one network learns too quickly, then the other network can fail to learn. To help you diagnose issues and monitor how well the generator and discriminator achieve their respective goals, this experiment displays a pair of scores in the training plot. The generator score scoreGenerator measures the likelihood that the discriminator can correctly distinguish generated images. The discriminator score scoreDiscriminator measures the likelihood that the discriminator can correctly distinguish all input images, assuming that the numbers of real and generated images passed to the discriminator are equal. In the ideal case, both scores are 0.5. Scores that are too close to zero or one can indicate that one network dominates the other. See Monitor GAN Training Progress and Identify Common Failure Modes.

To help you decide which trial produces the best results, this experiment combines the generator score and the discriminator score into a single numeric value, scoreCombined. This metric uses the L-∞ norm to determine how close the two networks are to the ideal scenario. It takes a value of one if both network scores equal 0.5, and zero if one of the network scores equals zero or one.
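
One way to implement such a metric is the following one-line sketch (the experiment's exact implementation may differ):

scoreCombined = 1 - 2*max(abs(scoreGenerator-0.5),abs(scoreDiscriminator-0.5));

The max term is the L-∞ distance of the score pair from the ideal point (0.5, 0.5). Scaling it by 2 and subtracting from 1 maps the ideal case to one, and a score of zero or one to zero.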

Run Experiment

When you run the experiment, Experiment Manager trains the network defined by the training function multiple times. Each trial uses a different combination of hyperparameter values. By default, Experiment Manager runs one trial at a time. If you have Parallel Computing Toolbox, you can run multiple trials at the same time. For best results, before you run your experiment, start a parallel pool with as many workers as GPUs, as the sketch after the following list shows. For more information, see Use Experiment Manager to Train Networks in Parallel.

  • To run one trial of the experiment at a time, on the Experiment Manager toolstrip, click Run.

  • To run multiple trials at the same time, click Use Parallel and then Run. If there is no current parallel pool, Experiment Manager starts one using the default cluster profile. Experiment Manager then executes multiple simultaneous trials, depending on the number of parallel workers available.
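
One way to start such a pool is this minimal sketch, which requests one worker per available GPU (gpuDeviceCount is part of Parallel Computing Toolbox; adjust the pool size to match your cluster):

% Start a parallel pool with one worker per available GPU.
parpool(gpuDeviceCount("available"));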

A table of results displays the scoreGenerator, scoreDiscriminator, and scoreCombined values for each trial.

While the experiment is running, click Training Plot to display the training plot and track the progress of each trial.

Evaluate Results

To find the best result for your experiment, sort the table of results using the combined score.

  1. Point to the scoreCombined column.

  2. Click the triangle icon.

  3. Select Sort in Descending Order.

The trial with the highest combined score appears at the top of the results table.

Evaluate the quality of the GAN by generating and inspecting the images produced by the trained generator.

  1. Select the trial with the highest combined score.

  2. On the Experiment Manager toolstrip, click Export.

  3. In the dialog window, enter the name of a workspace variable for the exported training output. The default name is trainingOutput.

  4. Test the trained generator network by calling the generateTestImages function. Use the exported training output as the input to the function. For instance, in the MATLAB Command Window, enter:

generateTestImages(trainingOutput)

The function creates a batch of 25 random vectors to input to the generator network and displays the resulting images.
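
Generating images from the trained generator amounts to something like the following sketch. The field name generator on the exported structure is an assumption; check the fields of your exported variable.

% Create 25 random latent vectors and pass them through the generator.
ZTest = randn(1,1,100,25,"single");
dlZTest = dlarray(ZTest,"SSCB");
XGen = predict(trainingOutput.generator,dlZTest);   % field name assumed
I = imtile(extractdata(XGen));
imshow(rescale(I))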

Using the combined score to sort your results might not identify the best trial in all cases. For best results, repeat this process for each trial with a high combined score, visually checking that the generator produces a variety of images without many duplicates. If the images have little diversity and some of them are almost identical, then your generator is likely affected by mode collapse. For more information, see Mode Collapse.

To record observations about the results of your experiment, add an annotation.

  1. In the results table, right-click the scoreCombined cell for the best trial.

  2. Select Add Annotation.

  3. In the Annotations pane, enter your observations in the text box.

For more information, see Sort, Filter, and Annotate Experiment Results.

Rerun Experiment

After you identify the combination of hyperparameters that generates the best images, run the experiment a second time to train the network for a longer time.

  1. Return to the experiment definition pane.

  2. In the hyperparameter table, enter the hyperparameter values from your best trial. For example, to use the values from trial 2, change the value of dropoutProb to 0.5 and flipFactor to 0.1.

  3. Open the training function and specify a longer training time. Under Specify Training Options, change the value of numEpochs to 500.

  4. Run the experiment using the new hyperparameter values and training function. Experiment Manager runs a single trial. Because this trial uses 10 times as many epochs, training takes about 10 times longer than the previous trials.

  5. When the experiment finishes, export the training output and run the generateTestImages function to test the new generator network. As before, visually check that the generator produces a variety of images without many duplicates.

Close Experiment

In the Experiment Browser pane, right-click the name of the project and select Close Project. Experiment Manager closes all of the experiments and results contained in the project.
