I have already demonstrated how to create a knowledge graph out of a Wikipedia page. However, since that post got a lot of attention, I've decided to explore other domains where using NLP techniques to construct a knowledge graph makes sense. In my opinion, the biomedical field is a prime example: representing the data as a graph is a natural fit, as you are often analyzing …
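As a minimal sketch of the general idea (not the pipeline from the post), extracted subject-relation-object triples can be assembled into a graph with networkx; the entities, relations, and triples below are invented for illustration.

```python
import networkx as nx

# Hypothetical relation triples, e.g. produced by a biomedical
# relation-extraction model; the entities and relations are made up.
triples = [
    ("metformin", "TREATS", "type 2 diabetes"),
    ("metformin", "CAUSES", "lactic acidosis"),
    ("type 2 diabetes", "ASSOCIATED_WITH", "obesity"),
]

# Build a directed graph where nodes are entities and
# edges carry the relation type.
graph = nx.DiGraph()
for head, relation, tail in triples:
    graph.add_edge(head, tail, relation=relation)

# Query the graph: everything directly connected to metformin.
for _, tail, data in graph.edges("metformin", data=True):
    print(f"metformin -{data['relation']}-> {tail}")
```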
Can Too Much BERT Be Bad for You?
BERT and GPT-2: we all love language models… I mean, who doesn't? Language models like BERT and GPT-2 (and GPT-3) have had an enormous impact on the entire NLP field. Most of the models that obtained groundbreaking results on the famous GLUE benchmark are based on BERT. I, too, have benefited from BERT, having released a library for topic modeling and some HuggingFace …
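For context, this is roughly what using BERT as a feature extractor looks like with the HuggingFace transformers library; the checkpoint name and the use of the [CLS] vector are common defaults, not anything specific to this post.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentence = "Language models like BERT changed NLP."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per token; the [CLS] vector is often
# used as a crude sentence representation.
token_embeddings = outputs.last_hidden_state   # (1, seq_len, 768)
cls_embedding = token_embeddings[:, 0]         # (1, 768)
print(cls_embedding.shape)
```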
On The Gap Between Adoption And Understanding
This blog post describes our recent paper: Federico Bianchi and Dirk Hovy (2021). On the Gap between Adoption and Understanding in NLP. Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. The main focus of this work is to describe issues that currently affect NLP research and hinder scientific development. NLP is driven by …
Topical Language Generation With Transformers
Large-scale transformer-based language models (LMs) demonstrate impressive capabilities in open text generation. However, controlling properties of the generated text, such as topic, style, and sentiment, is challenging and often requires significant changes to the model architecture or retraining and fine-tuning the model on new supervised data. We …
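The paper's approach is more principled than this, but as a rough sketch of the underlying idea, one can bias decoding toward a topic by adding a bonus to the logits of topic-related tokens via HuggingFace's LogitsProcessor hook; the topic words, bonus value, and model choice here are arbitrary.

```python
import torch
from transformers import (GPT2LMHeadModel, GPT2Tokenizer,
                          LogitsProcessor, LogitsProcessorList)

class TopicBoostProcessor(LogitsProcessor):
    """Adds a constant bonus to the logits of topic-related tokens."""
    def __init__(self, topic_token_ids, bonus=3.0):
        self.topic_token_ids = topic_token_ids
        self.bonus = bonus

    def __call__(self, input_ids, scores):
        scores[:, self.topic_token_ids] += self.bonus
        return scores

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical topic lexicon; leading spaces matter for GPT-2's BPE.
topic_words = [" science", " research", " scientists"]
topic_ids = [tokenizer.encode(w)[0] for w in topic_words]

inputs = tokenizer("The future of", return_tensors="pt")
output = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,
    logits_processor=LogitsProcessorList([TopicBoostProcessor(topic_ids)]),
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```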
The Secret Guide To Human-Like Text Summarization
Summarization has become a very helpful way of tackling the problem of data overload. In my earlier story, I shared how you can create your personal text summarizer using the extractive method. If you have tried that, you may have noticed that, because no new sentences are generated from the original content, it can at times be difficult to understand the generated …
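For contrast with the extractive approach, here is a minimal abstractive example built on a HuggingFace summarization pipeline; the checkpoint is a common public one, not necessarily what the story uses.

```python
from transformers import pipeline

# distilbart-cnn is a common publicly available summarization
# checkpoint; any seq2seq summarization model would do here.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

text = (
    "Summarization condenses a long document into a short one. "
    "Extractive methods copy sentences verbatim, while abstractive "
    "methods generate new sentences that paraphrase the source."
)
print(summarizer(text, max_length=40, min_length=10)[0]["summary_text"])
```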
Is Attention What You Really Need In Transformers?
In recent years there has been an explosion of methods based on self-attention, in particular Transformers, first in Natural Language Processing and more recently in Computer Vision as well. If you don't know what Transformers are, or if you want to learn more about the mechanism of self-attention, I suggest you have a look at my first article on this …
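For readers who want the gist before diving into that article, single-head scaled dot-product self-attention fits in a few lines of PyTorch; the dimensions and random weights below are toy values.

```python
import math
import torch

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention.

    x: (seq_len, d_model) input embeddings.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v          # project to queries/keys/values
    scores = q @ k.T / math.sqrt(k.shape[-1])    # pairwise similarity, scaled
    weights = torch.softmax(scores, dim=-1)      # each row sums to 1
    return weights @ v                           # weighted sum of values

torch.manual_seed(0)
d_model = 8
x = torch.randn(5, d_model)                      # 5 tokens, toy embeddings
w_q, w_k, w_v = (torch.randn(d_model, d_model) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)    # torch.Size([5, 8])
```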
On Transformers, TimeSformers, And Attention
Transformers are a very powerful Deep Learning architecture that has become a standard in many Natural Language Processing tasks and is poised to revolutionize the field of Computer Vision as well. It all began in 2017, when Google Brain published the paper destined to change everything, Attention Is All You Need [4]. Researchers applied this new architecture to …
AI Approaches For Text Generation In Marketing & Advertising Use Cases
This research summary is part of our AI for Marketing series, which covers the latest AI & machine learning approaches to 5 aspects of marketing automation: Attribution, Optimization, Personalization, Analytics, and Content Generation (Images, Videos, and Text). Can AI help you write high-converting copy for your advertising and marketing …
To ROUGE Or Not To ROUGE?
In this article, we will learn about …
… the difference between extractive and abstractive text summarization.
… what the ROUGE score is.
… why and where it fails.

Text Summarization

We refer to text summarization as the process of training an Artificial Intelligence (AI) model to produce a smaller chunk of text out of a bigger chunk of text. Where "smaller …
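To make the metric concrete before discussing where it fails, here is ROUGE-1 (unigram overlap) computed from scratch; real evaluations typically use a library such as rouge-score, and this sketch skips stemming and other preprocessing.

```python
from collections import Counter

def rouge_1(candidate: str, reference: str) -> dict:
    """ROUGE-1: unigram overlap between a candidate and a reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())          # clipped unigram matches
    recall = overlap / max(sum(ref.values()), 1)
    precision = overlap / max(sum(cand.values()), 1)
    f1 = 2 * precision * recall / max(precision + recall, 1e-12)
    return {"precision": precision, "recall": recall, "f1": f1}

print(rouge_1("the cat sat on the mat", "a cat was sitting on the mat"))
```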
BERT Inner Workings
I created this notebook to better understand the inner workings of BERT. I followed a lot of tutorials to try to understand the architecture, but I was never able to really grasp what was happening under the hood. For me, it always helps to see the actual code instead of abstract diagrams that often don't match the actual implementation. If you're like …
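In the same spirit as the notebook, one quick way to look under the hood is to print a single encoder layer of a pretrained BERT and see the submodules that diagrams usually abstract away; this uses only the standard HuggingFace API.

```python
from transformers import BertModel

model = BertModel.from_pretrained("bert-base-uncased")

# BERT-base stacks 12 identical encoder layers; print the first one
# to see its attention, intermediate, and output submodules.
print(model.encoder.layer[0])

# Count trainable parameters.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")
```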