Text Analytics Toolbox Model for BERT-Small Network

Pretrained BERT-Small Network for MATLAB.
BERT-Small is a pretrained language model based on the transformer deep learning architecture that can be used for a wide variety of natural language processing (NLP) tasks. The model has 4 self-attention layers and a hidden size of 512.
To load a pretrained BERT-Small model and its corresponding tokenizer, run the following code:
% Load the pretrained BERT-Small network and its tokenizer
[net,tokenizer] = bert(Model="small");
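Once loaded, the returned tokenizer converts raw text into the numeric token codes the network expects. The following is a minimal sketch of that step; the example sentence is illustrative, and the encode call follows the bertTokenizer interface in Text Analytics Toolbox (the exact shape of the network inputs for prediction may vary by release, so treat this as a sketch rather than a complete workflow):
% Load the model and tokenizer (requires Text Analytics Toolbox
% and Deep Learning Toolbox).
[net,tokenizer] = bert(Model="small");
% Encode a sentence into token codes and segment indices.
str = "Text Analytics Toolbox supports transformer models.";
[tokenCodes,segments] = encode(tokenizer,str);
% Inspect the returned dlnetwork object.
summary(net)
The encoded token codes and segment indices can then be passed to the network's predict function to obtain contextual embeddings for downstream NLP tasks; see the bert function documentation for the input format expected by your release.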
MATLAB Release Compatibility
Created with R2023b
Compatible with R2023b to R2026a
Platform Compatibility
Windows, macOS (Apple Silicon), macOS (Intel), Linux