BERT and GPT-2: we all love language models… I mean, who doesn’t? Language models like BERT and GPT-2 (and GPT-3) have had an enormous impact on the entire NLP field. Most of the models that achieved groundbreaking results on the famous GLUE benchmark are based on BERT. I, too, have benefited from BERT, since I released a library for topic modeling and some HuggingFace … [Read more...] about Can Too Much BERT Be Bad for You?
On The Gap Between Adoption And Understanding
This blog post describes our recent paper: Federico Bianchi and Dirk Hovy (2021). On the Gap between Adoption and Understanding in NLP. Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. The main focus of this work is to describe issues that currently affect NLP research and hinder scientific progress. NLP is driven by … [Read more...] about On The Gap Between Adoption And Understanding