ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
Increasing model size when pretraining natural language representations often results in improved performance on downstream tasks. However, at some point further model increas...