Text Analytics Toolbox Model for BERT-Large Network

Pretrained BERT-Large Network for MATLAB
BERT-Large is a pretrained language model based on the transformer deep learning architecture that can be used for a wide variety of natural language processing (NLP) tasks. The model has 24 self-attention layers and a hidden size of 1024.
To load a BERT-Large model, you can run the following code:
[net, tokenizer] = bert(Model="large");
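Once loaded, the tokenizer converts text into token codes and segment indices that the network consumes. The following is a minimal sketch of running the network on a sentence to obtain contextual token embeddings; the sample string is illustrative, and the "CT" (channel-by-time) dlarray layout follows the convention documented for the bert function.
% Load the pretrained BERT-Large network and its tokenizer.
[net, tokenizer] = bert(Model="large");

% Encode a sample sentence into token codes and segment indices.
str = "Deep learning with MATLAB.";
[tokenCodes, segments] = encode(tokenizer, str);

% Convert to dlarray objects with channel (C) and time (T) dimensions,
% then run a forward pass through the network.
X1 = dlarray(tokenCodes{1}, "CT");
X2 = dlarray(segments{1}, "CT");
embeddings = predict(net, X1, X2);

% For BERT-Large, each token maps to a 1024-dimensional contextual embedding.
size(embeddings)
The resulting embeddings can serve as features for downstream NLP tasks such as text classification or token tagging.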
MATLAB Release Compatibility
Created with R2023b
Compatible with R2023b to R2026a
Platform Compatibility
Windows, macOS (Apple Silicon), macOS (Intel), Linux