
Transfer Data To Amazon S3 Buckets and Access Data Using MATLAB Datastore

To work with data in the cloud, you can upload your data to Amazon S3, then use datastores to access the data in S3 from the workers in your cluster.

  1. For efficient file transfers to and from Amazon S3, download and install the AWS Command Line Interface tool from the AWS website.

  2. Specify the AWS Access Key ID, Secret Access Key, and Region of the bucket as system environment variables. If you are using AWS temporary credentials, also specify a Session Token.

    • For example, on Linux, macOS, or Unix with Bourne-based shell:

      export AWS_DEFAULT_REGION="us-east-1"
      If you are using a C-based shell, use the setenv command instead, for example setenv AWS_DEFAULT_REGION "us-east-1".

    • On Windows:

      set AWS_DEFAULT_REGION=us-east-1

      The set command affects only the current Command Prompt session. To set these environment variables permanently, use the setx command or set them in your user or system environment settings.


    For MATLAB® releases prior to R2020a, use AWS_REGION instead of AWS_DEFAULT_REGION.
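The variables from step 2 can be set together in one Bourne-shell snippet. The following is a minimal sketch with placeholder values; the access key and secret key shown are AWS's well-known documentation examples, not real credentials.

```shell
# Placeholder values -- substitute your own credentials.
export AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
export AWS_SECRET_ACCESS_KEY="wJalrXUtnFKXEN/K7MDENG/bPxRfiCYEXAMPLEKEY"
export AWS_DEFAULT_REGION="us-east-1"
# export AWS_SESSION_TOKEN="..."   # only needed with temporary credentials

# List the names of the AWS variables now set, without printing the secrets:
env | grep -o '^AWS_[A-Z_]*' | sort
```

Listing only the variable names is a simple way to confirm the variables are visible to child processes, such as the AWS CLI, without echoing secret values into your terminal history.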

  3. Create a bucket for your data, either in the Amazon S3 console or with a command like the following:

    aws s3 mb s3://mynewbucket

  4. Upload your data using a command like the following:

    aws s3 cp mylocaldatapath s3://mynewbucket --recursive

    For example:

    aws s3 cp path/to/cifar10/in/the/local/machine s3://MyExampleCloudData/cifar10/ --recursive
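Before running a large transfer, you can preview it: the AWS CLI supports a --dryrun flag for aws s3 cp that lists what would be copied without transferring anything. The sketch below reuses the hypothetical paths from the example above and only echoes the assembled command so you can inspect it first.

```shell
# Hypothetical source path and destination bucket from the example above.
LOCAL_DATA="path/to/cifar10/in/the/local/machine"
DEST="s3://MyExampleCloudData/cifar10/"

# --dryrun makes the AWS CLI list what would be copied without copying it.
CMD="aws s3 cp $LOCAL_DATA $DEST --recursive --dryrun"
echo "$CMD"

# When the preview looks right, run the command again without --dryrun.
```

For repeated uploads, aws s3 sync is a common alternative to cp --recursive, since it copies only new or changed files.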

  5. After creating a cloud cluster, copy your AWS credentials to your cluster workers. In MATLAB, select Parallel > Create and Manage Clusters. In the Cluster Profile Manager, select your cloud cluster profile. Scroll to the EnvironmentVariables property and add the environment variable names AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION (names only, not values). If you are using AWS temporary credentials, also add AWS_SESSION_TOKEN. Set the values of these environment variables in the shell before starting the MATLAB session, or set them directly in MATLAB using the setenv command before using the cluster.
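Setting the values before starting MATLAB matters because environment variables are inherited, not shared: a MATLAB session sees only the variables that existed in the shell that launched it. In the sketch below, the child sh process stands in for a launched MATLAB session, and the region value is a placeholder.

```shell
# Placeholder region value; export it before launching MATLAB from this shell.
export AWS_DEFAULT_REGION="us-east-1"

# A child process inherits the variable, just as a MATLAB session
# started from this shell would:
sh -c 'echo "child process sees region: $AWS_DEFAULT_REGION"'

# matlab &   # launch MATLAB from this same shell so it inherits the variables
```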

After you store your data in Amazon S3, you can use datastores to access the data from your cluster workers: create a datastore that points to the URL of the S3 bucket. For example, the following code uses an imageDatastore to access an S3 bucket. Replace 's3://MyExampleCloudData/cifar10' with the URL of your S3 bucket.

imds = imageDatastore('s3://MyExampleCloudData/cifar10', ...
    'IncludeSubfolders',true);

You can use an imageDatastore to read data from the cloud in your desktop client MATLAB, or when running code on your cluster workers, without changing your code. For details, see Work with Remote Data (MATLAB).

For a step-by-step example showing deep learning using data stored in Amazon S3, see the white paper Deep Learning with MATLAB and Multiple GPUs.
