Text Analytics Toolbox Model for BERT-Mini Network

Pretrained BERT-Mini Network for MATLAB
Updated 25 Nov 2025
BERT-Mini is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of natural language processing (NLP) tasks. The model has 4 self-attention (encoder) layers and a hidden size of 256.
To load a BERT-Mini model, you can run the following code:
[net, tokenizer] = bert(Model="mini");
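Once loaded, the returned tokenizer encodes text into token codes that the network consumes. The sketch below follows the general Text Analytics Toolbox BERT workflow; the example sentence and variable names are illustrative, and the exact network input order is an assumption based on the standard bert workflow, so check the bert function documentation for your release.

% Load the pretrained BERT-Mini network and its tokenizer
% (requires Text Analytics Toolbox and this support package).
[net, tokenizer] = bert(Model="mini");

% Encode a sentence into token codes and segment indices.
str = "Text Analytics Toolbox supports BERT models.";
[tokenCodes, segments] = encode(tokenizer, str);

% Wrap the encoded inputs as dlarray objects with channel (C) and
% time (T) dimensions, then compute contextual embeddings.
X = dlarray(tokenCodes{1}, "CT");
S = dlarray(segments{1}, "CT");
embeddings = predict(net, X, S);

The embeddings output contains one 256-dimensional contextual vector per token (matching the model's hidden size), which can feed downstream tasks such as classification or similarity.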
MATLAB Release Compatibility
Created with R2023b
Compatible with R2023b to R2026a
Platform Compatibility
Windows macOS (Apple Silicon) macOS (Intel) Linux