outdated transformer models

JH on 6 May 2025 (Edited on 14 May 2025)

Why is RoBERTa not available as a pretrained model? It outperforms BERT on many tasks and has become more popular in the literature. For faster inference, you should also offer DistilBERT, which is more recent than BERT while being smaller and faster. The repository hasn't been updated in two years, which is a lifetime in deep learning.
https://github.com/matlab-deep-learning/transformer-models
Mike Croucher on 12 May 2025
Thanks for your suggestions. I have passed them on to development.