Text Analytics Toolbox Model for BERT-Base Network

Pretrained BERT-Base Network for MATLAB
Updated 20 Mar 2024
BERT-Base is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of natural language processing (NLP) tasks. The model has 12 self-attention layers and a hidden size of 768.
To load a BERT-Base model, run the following code:
[net, tokenizer] = bert(Model="base");
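Once loaded, the returned tokenizer prepares raw text for the network. A minimal sketch of encoding a string with the tokenizer (the input string and variable names here are illustrative, and the exact tokens shown depend on the BERT vocabulary):

```matlab
% Load the pretrained BERT-Base network and its matching tokenizer.
[net, tokenizer] = bert(Model="base");

% Encode a string into BERT token codes. The result is a cell array
% with one element per input string.
str = "Text analytics with BERT in MATLAB";
tokenCodes = encode(tokenizer, str);

% Map the token codes back to subword tokens for inspection.
tokens = decode(tokenizer, tokenCodes)
```

The token codes can then be converted to a dlarray and passed to the network with predict to obtain contextual embeddings for downstream NLP tasks.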
MATLAB Release Compatibility
Created with R2023b
Compatible with R2023b to R2024a
Platform Compatibility
Windows, macOS (Apple silicon), macOS (Intel), Linux