This blog post summarizes the EMNLP 2019 paper Revealing the Dark Secrets of BERT by researchers from the Text Machine Lab at UMass Lowell: Olga Kovaleva (LinkedIn), Alexey Romanov (LinkedIn), Anna Rogers (Twitter: @annargrs), and Anna Rumshisky (Twitter: @arumshisky). Here are the topics covered: A brief intro to … [Read more...] about The Dark Secrets Of BERT
Data Labeling For Natural Language Processing
Why Does Training Data Matter? Machine Learning has made significant strides in the last decade. This can be attributed to parallel improvements in processing power and new breakthroughs in Deep Learning research. Another key reason is the abundance of data that has been accumulated. Analysts estimate humankind sits atop 44 zettabytes of information today. The … [Read more...] about Data Labeling For Natural Language Processing
Why Choosing a Heavier NLP Model Might Be a Good Choice?
From Google’s 43 rules of ML. Rule #4: Keep the first model simple and get the infrastructure right. With various opinions floating around the market, I feel it’s a good time to spark a discussion about this topic. Otherwise, the most popular opinions will simply drown out other ideas. Note: I work in NLP, and these opinions are focused on NLP applications. Cannot … [Read more...] about Why Choosing a Heavier NLP Model Might Be a Good Choice?
Dissecting The Transformer
Having seen how attention works and how it improved neural machine translation systems (see the previous blog post), we are now going to unveil the secrets behind the power of today’s most famous NLP models (a.k.a. BERT and friends): the transformer. In this second part, we are going to dive into the details of this architecture with the aim of getting a solid … [Read more...] about Dissecting The Transformer
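Before the full deep dive, a toy illustration helps: scaled dot-product attention, the core operation the transformer stacks and repeats. The NumPy sketch below is purely illustrative, not taken from the post; the function name and dimensions are my own choices.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V — the heart of the transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted sum of values

# Toy setup: 3 tokens, head dimension 4 (illustrative values only)
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((3, 4)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)   # -> (3, 4)
```

Each output row is a mixture of the value vectors, weighted by how strongly that token’s query matches every key; the full transformer runs many of these heads in parallel.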
Decoding NLP Attention Mechanisms
Arguably more famous today than Michael Bay’s Transformers, the transformer architecture and transformer-based models have been breaking all kinds of state-of-the-art records. They have (rightfully) been getting the attention of a large portion of the deep learning community and of researchers in Natural Language Processing (NLP) since their introduction in 2017 by the … [Read more...] about Decoding NLP Attention Mechanisms
Document Embedding Techniques
Word embedding — the mapping of words into numerical vector spaces — has proved to be an incredibly important method for natural language processing (NLP) tasks in recent years, enabling machine learning models that rely on vector representations as input to enjoy richer representations of text. These representations preserve more semantic and syntactic information … [Read more...] about Document Embedding Techniques
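As a concrete baseline for turning word embeddings into a document embedding, here is a minimal averaging sketch; the toy vectors and the document_embedding helper are hypothetical, and a real system would use pretrained vectors such as word2vec, GloVe, or fastText.

```python
import numpy as np

# Toy pretrained word vectors (hypothetical; real ones have hundreds of dims)
word_vectors = {
    "nlp": np.array([0.2, 0.8, 0.1]),
    "is":  np.array([0.1, 0.1, 0.1]),
    "fun": np.array([0.7, 0.3, 0.9]),
}

def document_embedding(tokens, vectors):
    """Average the vectors of known tokens — a simple, common baseline."""
    known = [vectors[t] for t in tokens if t in vectors]
    if not known:  # no known words: fall back to a zero vector
        return np.zeros_like(next(iter(vectors.values())))
    return np.mean(known, axis=0)

print(document_embedding(["nlp", "is", "fun"], word_vectors))
```

Averaging discards word order, which is exactly the weakness the more sophisticated techniques surveyed in the post try to address.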
NLP Interview Questions
Are you hiring technical AI talent for your company? Post your openings on the TOPBOTS jobs board (go to jobs board) to reach thousands of engineers, data scientists, and researchers currently looking for work. It's one thing to practice NLP and another to crack interviews. Interviewing for an NLP role is very different from interviewing for a generic data science profile. In just a … [Read more...] about NLP Interview Questions
Semantic Search: Theory And Implementation
It took me a long time to realise that search is the biggest problem in NLP. Just look at Google, Amazon, and Bing. These are multi-billion-dollar businesses made possible only by their powerful search engines. My initial thoughts on search centered around unsupervised ML, but when I participated in the Microsoft Hackathon 2018 for Bing, I learned about the various ways a … [Read more...] about Semantic Search: Theory And Implementation
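To make "semantic search" concrete, here is a minimal sketch using the sentence-transformers package with the all-MiniLM-L6-v2 model — my choices for illustration, not necessarily the post's — ranking a small corpus by cosine similarity to a query.

```python
# pip install sentence-transformers
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

corpus = [
    "How do I reset my password?",
    "Shipping times for international orders",
    "Refund policy for damaged goods",
]
# Normalized embeddings make the dot product equal to cosine similarity
corpus_emb = model.encode(corpus, normalize_embeddings=True)

query_emb = model.encode(["I forgot my login credentials"],
                         normalize_embeddings=True)
scores = corpus_emb @ query_emb.T        # cosine similarity per document
print(corpus[int(np.argmax(scores))])    # best match (likely the password FAQ)
```

Note that the query shares no keywords with its best match — that semantic, rather than lexical, matching is the whole point.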
Better Sentiment Analysis with BERT
Imagine you have a bot answering your clients, and you want to make it sound a little bit more natural, more human. To achieve that, you have to make the answers more personalized. One way to learn more about the customers you’re talking to is to analyze the polarity of their answers. By polarity here I mean detecting if the sentence (or group of sentences) is written with … [Read more...] about Better Sentiment Analysis with BERT
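For a quick taste of what such polarity detection looks like in code, here is a hedged sketch using the Hugging Face transformers sentiment-analysis pipeline — one common route, though the post may use a different setup (the pipeline's default English model is a distilled BERT fine-tuned for sentiment).

```python
# pip install transformers
from transformers import pipeline

# Downloads a default English sentiment model on first use
classifier = pipeline("sentiment-analysis")

print(classifier("The support team resolved my issue quickly, thank you!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```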
An Ultimate Guide To Transfer Learning In NLP
Natural language processing is a powerful tool, but in the real world we often come across tasks that suffer from data deficits and poor model generalisation. Transfer learning solves this problem by allowing us to take a model pre-trained on one task and reuse it for others. Today, transfer learning is at the heart of language models like Embeddings from Language … [Read more...] about An Ultimate Guide To Transfer Learning In NLP
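A minimal sketch of the core transfer-learning move described above, using Hugging Face transformers (with PyTorch) to load a pre-trained BERT body under a fresh classification head; the model name and label count are illustrative assumptions, and the new head still needs fine-tuning on the target task.

```python
# pip install transformers torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,  # pre-trained encoder, randomly initialized 2-class head
)

inputs = tokenizer(
    "Transfer learning reuses what the model already knows.",
    return_tensors="pt",
)
logits = model(**inputs).logits  # head is untrained: fine-tune before use
print(logits.shape)              # torch.Size([1, 2])
```

The encoder's weights already encode general language knowledge from pre-training, so fine-tuning on even a small labeled dataset is usually enough to adapt it.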