Text Analytics Toolbox Model for BERT-Tiny Network

Pretrained BERT-Tiny Network for MATLAB.

BERT-Tiny is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of Natural Language Processing (NLP) tasks. This model has 2 transformer encoder layers and a hidden size of 128.
To load a BERT-Tiny model, you can run the following code:
[net, tokenizer] = bert(Model="tiny");
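Once loaded, the returned tokenizer can convert raw text into the subword token codes the network expects. A minimal sketch, assuming the Text Analytics Toolbox `bert` and `encode` functions are available (the sample sentence is illustrative):

% Load the pretrained BERT-Tiny network and its tokenizer.
[net, tokenizer] = bert(Model="tiny");

% Tokenize a sample sentence into subword token codes
% using the tokenizer's encode method.
codes = encode(tokenizer, "MATLAB is great for NLP.");

The token codes can then be passed, together with the corresponding attention mask and segment inputs, to the returned network for downstream NLP tasks.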

MATLAB Release Compatibility

  • Compatible with R2023b to R2026a

Platform Compatibility

  • Windows
  • macOS (Apple Silicon)
  • macOS (Intel)
  • Linux