Prerequisites for Deep Learning with MATLAB Coder
MathWorks Products
To use MATLAB® Coder™ to generate code for deep learning networks, you must also install:
Deep Learning Toolbox™
MATLAB Coder Interface for Deep Learning
Generate Code That Does Not Use Third-Party Libraries
You can use MATLAB Coder to generate generic C or C++ code for deep learning networks. Such C or C++ code does not depend on third-party libraries. For more information, see Generate Generic C/C++ Code for Deep Learning Networks.
MATLAB Coder locates and uses a supported installed compiler. For the list of supported compilers, see Supported and Compatible Compilers on the MathWorks® website.
You can use mex -setup to change the default compiler. See Change Default Compiler.
The C++ compiler must support C++11.
On Windows®, to generate generic C or C++ code that does not use third-party libraries, use Microsoft® Visual Studio® or the MinGW® compiler.
Generate Code That Uses Third-Party Libraries
You can use MATLAB Coder to generate C++ code for deep learning networks that you deploy to Intel® or ARM® processors. The generated code takes advantage of deep learning libraries optimized for the target CPU. The hardware and software requirements depend on the target platform.
Note
The paths to the required software libraries must not contain spaces or special characters, such as parentheses. On Windows operating systems, special characters and spaces are allowed only if 8.3 file names are enabled. For more information on 8.3 file names, refer to the Windows documentation.
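Since spaces and parentheses in library paths cause failures, it can help to check a candidate path before pointing the code generator at it. A minimal sketch, assuming a hypothetical install location:

```shell
# Check a candidate library path for spaces and parentheses before
# using it. The path below is a hypothetical install location --
# substitute your own.
LIB_PATH="/usr/local/mkl-dnn"

case "$LIB_PATH" in
  *" "*|*"("*|*")"*) echo "unsafe path: $LIB_PATH" ;;
  *)                 echo "path OK: $LIB_PATH" ;;
esac
```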
Hardware and Software Requirements
| | Intel CPUs | ARM Cortex-A CPUs |
|---|---|---|
| Hardware Requirements | Intel processor with support for Intel Advanced Vector Extensions 2 (Intel AVX2) instructions. | ARM Cortex®-A processors that support the NEON extension. |
| Software Libraries | Intel Math Kernel Library for Deep Neural Networks (MKL-DNN), v1.4. See https://github.com/uxlfoundation/oneDNN. Do not use a prebuilt library because some required files are missing; instead, build the library from the source code. See the instructions for building the library on GitHub®. For more information on building the library, see this post in MATLAB Answers™: How do I build the intel MKL-DNN library for Deep Learning C++ code generation and deployment. | ARM Compute Library for computer vision and machine learning, versions 19.05 and 20.02.1. See https://developer.arm.com/ip-products/processors/machine-learning/compute-library. Specify the version number in the code generation configuration object. Do not use a prebuilt library because it might be incompatible with the compiler on the ARM hardware; instead, build the library from the source code, either on your host machine or directly on the target hardware. See the instructions for building the library on GitHub. The folder that contains the library files, such as libarm_compute.so, must be named lib. For more information on building the library, see this post in MATLAB Answers: How do I build the ARM Compute Library for Deep Learning C++ code generation and deployment. To deploy generated code that performs inference computations in 8-bit integers on ARM processors, you must use ARM Compute Library version 20.02.1. |
| Operating System Support | Windows, Linux®, and macOS. | Windows and Linux only. |
| Supported Compilers | MATLAB Coder locates and uses a supported installed compiler. For the list of supported compilers, see Supported and Compatible Compilers on the MathWorks website. You can use mex -setup to change the default compiler. See Change Default Compiler. The C++ compiler must support C++11. On Windows, to generate code that uses the Intel MKL-DNN library, use Microsoft Visual Studio or the MinGW compiler. Note: On Windows, the MinGW compiler is not supported for generating a MEX function that uses the Intel MKL-DNN library. | Same compiler requirements as for Intel CPUs. |
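Because a prebuilt MKL-DNN library cannot be used, the build-from-source step above matters in practice. The following sketch writes a minimal Linux build script for MKL-DNN (now oneDNN) v1.4; the git tag name, install prefix, and build options are assumptions, so follow the build instructions in the oneDNN GitHub repository for your release:

```shell
# Write a build script for MKL-DNN (oneDNN) v1.4 rather than running the
# build here; review it, then run it on your build machine.
cat > build_mkldnn.sh <<'EOF'
#!/bin/sh
set -e
git clone https://github.com/uxlfoundation/oneDNN.git
cd oneDNN
git checkout v1.4                               # assumed tag for v1.4
mkdir -p build && cd build
cmake .. -DCMAKE_INSTALL_PREFIX=/usr/local/mkl-dnn   # assumed install prefix
make -j"$(nproc)"
make install
EOF
chmod +x build_mkldnn.sh
```

The install prefix chosen here is the value you later assign to the INTEL_MKLDNN environment variable.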
Environment Variables
MATLAB Coder uses environment variables to locate the libraries required to generate code for deep learning networks.
| Platform | Variable Name | Description |
|---|---|---|
| Windows | INTEL_MKLDNN | Path to the root folder of the Intel MKL-DNN library installation. |
| Windows | ARM_COMPUTELIB | Path to the root folder of the ARM Compute Library installation on the ARM target hardware. Set ARM_COMPUTELIB on the ARM target hardware. |
| Windows | CMSISNN_PATH | Path to the root folder of the CMSIS-NN library installation on the ARM target hardware. Set CMSISNN_PATH on the ARM target hardware. |
| Windows | PATH | Path to the Intel MKL-DNN library folder. |
| Linux | LD_LIBRARY_PATH | Path to the Intel MKL-DNN library folder. For ARM targets, path to the ARM Compute Library folder on the target hardware; set LD_LIBRARY_PATH on the ARM target hardware. |
| Linux | INTEL_MKLDNN | Path to the root folder of the Intel MKL-DNN library installation. |
| Linux | ARM_COMPUTELIB | Path to the root folder of the ARM Compute Library installation on the ARM target hardware. Set ARM_COMPUTELIB on the ARM target hardware. |
| Linux | CMSISNN_PATH | Path to the root folder of the CMSIS-NN library installation on the ARM target hardware. Set CMSISNN_PATH on the ARM target hardware. |
| macOS | INTEL_MKLDNN | Path to the root folder of the Intel MKL-DNN library installation. |
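On Linux, the variables in the table can be set in the shell before launching MATLAB or the generated executable. A minimal sketch; the install prefixes below are assumptions, so use the prefixes you chose when building the libraries:

```shell
# Example (Linux): export the variables for a from-source MKL-DNN build.
# The prefix /usr/local/mkl-dnn is an assumption -- use your own.
export INTEL_MKLDNN=/usr/local/mkl-dnn
export LD_LIBRARY_PATH="$INTEL_MKLDNN/lib:$LD_LIBRARY_PATH"

# On the ARM target hardware, set the ARM Compute Library variables instead:
export ARM_COMPUTELIB=/usr/local/arm_compute   # assumed install location
export LD_LIBRARY_PATH="$ARM_COMPUTELIB/lib:$LD_LIBRARY_PATH"
```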
Note
To generate code for Raspberry Pi® using the Raspberry Pi Blockset, you must set the environment variables non-interactively. For instructions, see https://www.mathworks.com/matlabcentral/answers/455591-matlab-coder-how-do-i-setup-the-environment-variables-on-arm-targets-to-point-to-the-arm-compute-li.
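One way to make the variables visible to non-interactive shells, which the linked answer describes in detail, is to place the exports at the top of ~/.bashrc on the target, before the guard that makes non-interactive shells return early. A sketch under that assumption; the library path and version are hypothetical:

```shell
# Prepend the exports to ~/.bashrc so they run BEFORE the interactivity
# guard (e.g. a line like: [ -z "$PS1" ] && return).
# The library path below is an assumption -- use your build location.
BASHRC="$HOME/.bashrc"
touch "$BASHRC"
TMP="$(mktemp)"
{
  echo 'export ARM_COMPUTELIB=/home/pi/arm_compute-v20.02.1'
  echo 'export LD_LIBRARY_PATH=$ARM_COMPUTELIB/lib:$LD_LIBRARY_PATH'
  cat "$BASHRC"
} > "$TMP"
mv "$TMP" "$BASHRC"
```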
Note
You might be able to improve the performance of the code generated for Intel CPUs by setting environment variables that control the binding of OpenMP threads to physical processing units. For example, on Linux, set the KMP_AFFINITY environment variable to scatter. On other platforms that use Intel CPUs, setting similar environment variables might also improve the performance of the generated code.
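For example, on Linux the variable is set in the shell that launches the generated executable (the executable name below is hypothetical):

```shell
# Bind OpenMP threads across physical cores before launching the
# generated executable. "scatter" spreads threads across cores;
# "compact" places them on adjacent cores.
export KMP_AFFINITY=scatter
echo "KMP_AFFINITY=$KMP_AFFINITY"
# ./my_generated_exe   # hypothetical name of the generated executable
```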