Why is RoBERTa not available as a pretrained model? It outperforms BERT on many benchmarks and has become more popular in the literature. For faster inference, you should also offer DistilBERT, which is more recent than BERT but smaller and faster. The repository hasn't been updated in two years, which is a lifetime in the field of deep learning.
https://github.com/matlab-deep-learning/transformer-models
1 Comment
Thanks for your suggestions. I have passed them on to development.