Text Analytics Toolbox Model for BERT-Tiny Network
Pretrained BERT-Tiny Network for MATLAB.
117 Downloads | Updated 25 Nov 2025
BERT-Tiny is a pretrained language model based on the transformer deep learning architecture that can be used for a wide variety of natural language processing (NLP) tasks. This model has 2 self-attention layers and a hidden size of 128.
To load a pretrained BERT-Tiny model and its tokenizer, run the following code:
[net, tokenizer] = bert(Model="tiny");
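As a minimal sketch of using the loaded model, the snippet below tokenizes a string with the returned tokenizer and runs a forward pass to obtain contextual token embeddings. The input text and variable names are illustrative, and the names and order of the network inputs (token codes, attention mask, segment IDs) are assumptions that may need to be checked against the network's InputNames for your release.

```matlab
% Load the pretrained BERT-Tiny model and tokenizer.
[net, tokenizer] = bert(Model="tiny");

% Encode an example sentence into token codes and segment IDs.
str = "MATLAB supports transformer models.";
[tokenCodes, segments] = encode(tokenizer, str);

% Format the inputs as dlarray objects with channel (C) and time (T) dimensions.
inputIDs = dlarray(tokenCodes{1}, "CT");
segmentIDs = dlarray(segments{1}, "CT");
mask = dlarray(ones(1, numel(tokenCodes{1})), "CT");

% Forward pass; the output contains one 128-dimensional embedding per token.
% Note: the expected input order here is an assumption -- verify with net.InputNames.
embeddings = predict(net, inputIDs, mask, segmentIDs);
size(embeddings)
```

The per-token embeddings can then be pooled (for example, by taking the first token's vector or the mean over tokens) to produce a fixed-length feature vector for downstream tasks such as document classification.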
MATLAB Release Compatibility
Created with
R2023b
Compatible with R2023b to R2026a
Platform Compatibility
Windows, macOS (Apple Silicon), macOS (Intel), Linux
