Graph Neural Networks for Multi-Relational Data
This article describes how to extend the simplest formulation of Graph Neural Networks (GNNs) to encode the structure of multi-relational data, such as Knowledge Graphs (KGs). The article includes four main sections: an introduction to the key idea of multi-relational data, which describes the peculiarity of KGs; a summary of the standard components included in a GNN …
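As a rough sketch of the core idea (my own illustrative code, not the article's), a relational GNN layer keeps one weight matrix per relation type, so messages along "born_in" edges are transformed differently from messages along "works_for" edges. All names below are placeholders:

```python
import torch
import torch.nn as nn

class RelationalGraphLayer(nn.Module):
    """Minimal R-GCN-style layer: one transformation per relation type."""
    def __init__(self, in_dim, out_dim, num_relations):
        super().__init__()
        # One linear map per relation, plus a self-loop transformation.
        self.rel_weights = nn.ModuleList(
            [nn.Linear(in_dim, out_dim, bias=False) for _ in range(num_relations)]
        )
        self.self_loop = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x, edges_by_relation):
        # x: (num_nodes, in_dim); edges_by_relation[r]: (2, E_r) index tensor.
        out = self.self_loop(x)
        for r, edges in enumerate(edges_by_relation):
            src, dst = edges
            msg = self.rel_weights[r](x[src])   # relation-specific transform
            out = out.index_add(0, dst, msg)    # sum messages at target nodes
        return torch.relu(out)
```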
Graph Attention Networks Under the Hood
Graph Neural Networks (GNNs) have emerged as the standard toolbox for learning from graph data. GNNs are able to drive improvements on high-impact problems in different fields, such as content recommendation or drug discovery. Unlike learning from other types of data such as images, learning from graph data requires specific methods. As defined by Michael Bronstein: [..] these methods …
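To make the attention mechanism concrete, here is a minimal single-head sketch in the spirit of GAT, assuming a dense adjacency matrix that includes self-loops (my own simplification, not the article's code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionLayer(nn.Module):
    """Single-head GAT-style layer over a dense adjacency matrix."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)  # attention scorer

    def forward(self, x, adj):
        # x: (N, in_dim); adj: (N, N), 1 where an edge exists (with self-loops).
        h = self.W(x)
        N = h.size(0)
        # Score every pair (i, j) from the concatenation [h_i || h_j].
        pairs = torch.cat(
            [h.unsqueeze(1).expand(N, N, -1), h.unsqueeze(0).expand(N, N, -1)],
            dim=-1,
        )
        e = F.leaky_relu(self.a(pairs).squeeze(-1))   # (N, N) raw scores
        e = e.masked_fill(adj == 0, float("-inf"))    # attend only to neighbors
        alpha = torch.softmax(e, dim=-1)              # normalize per node
        return alpha @ h                              # weighted neighbor sum
```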
Graph Transformer: A Generalization of Transformers to Graphs
This blog post is based on the paper A Generalization of Transformer Networks to Graphs with Xavier Bresson, presented at the 2021 AAAI Workshop on Deep Learning on Graphs: Methods and Applications (DLG-AAAI’21). We present Graph Transformer, a transformer neural network that can operate on arbitrary graphs. BLOG OUTLINE: Background; Objective; Key Design Aspects for …
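One of the paper's key design aspects is restricting self-attention to a node's graph neighborhood rather than attending over all tokens. A compressed, single-head sketch of that idea (the function name and simplifications are mine, assuming an adjacency matrix with self-loops):

```python
import torch

def graph_masked_attention(h, adj):
    """Transformer attention where node i attends only to its neighbors.
    h: (N, d) node features; adj: (N, N) adjacency with self-loops."""
    q = k = v = h  # single head, no projections, purely illustrative
    scores = (q @ k.T) / h.size(-1) ** 0.5
    scores = scores.masked_fill(adj == 0, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v
```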
Autoencoders: Overview of Research and Applications
Since the early days of machine learning, researchers have attempted to learn good representations of data in an unsupervised manner. The hypothesis underlying this effort is that disentangled representations translate well to downstream supervised tasks. For example, if a human is told that a Tesla is a car and has a good representation of what a car looks like, they can probably …
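The canonical instance of this idea is the autoencoder, which learns a compressed representation purely from reconstruction error, with no labels. A minimal PyTorch sketch (dimensions are arbitrary placeholders):

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    """Compress inputs to a low-dimensional code and reconstruct them."""
    def __init__(self, in_dim=784, code_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                     nn.Linear(128, code_dim))
        self.decoder = nn.Sequential(nn.Linear(code_dim, 128), nn.ReLU(),
                                     nn.Linear(128, in_dim))

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Training minimizes reconstruction error; no labels are required.
model = Autoencoder()
x = torch.rand(16, 784)
loss = nn.MSELoss()(model(x), x)
```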
Variational Methods in Deep Learning
Deep neural networks are a flexible family of models with wide applications in AI and other fields. Even though these networks often encompass millions or even billions of parameters, it is still possible to train them effectively using the maximum likelihood principle together with stochastic gradient descent techniques. Unfortunately, this learning procedure only gives us a …
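As a toy illustration of the variational toolkit such an article surveys (my own sketch, not the article's code): the reparameterization trick lets us sample from a learned Gaussian differentiably, and the KL divergence to a standard normal, one term of the ELBO, has a closed form:

```python
import torch

def sample_gaussian(mu, log_var):
    # z ~ N(mu, sigma^2), written so gradients flow into mu and log_var.
    eps = torch.randn_like(mu)
    return mu + torch.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL( N(mu, sigma^2) || N(0, 1) ), summed over dimensions.
    return 0.5 * torch.sum(torch.exp(log_var) + mu ** 2 - 1.0 - log_var)
```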
Step-By-Step Implementation of GANs on Custom Image Data in PyTorch: Part 2
In Part 1 on GANs, we started to build intuition regarding what GANs are, why we need them, and how the entire point of training GANs is to create a generator model that knows how to convert a random noise vector into a (beautiful) almost-real image. Since we have already discussed the pseudocode in great depth in Part 1, be sure to check that out as …
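To ground the "noise vector in, image out" idea, here is a minimal generator sketch (layer sizes and the flattened 28x28 output are my placeholders, not the tutorial's architecture):

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Map a random noise vector to a flattened 28x28 image in [-1, 1]."""
    def __init__(self, noise_dim=100, img_dim=28 * 28):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 256), nn.ReLU(),
            nn.Linear(256, img_dim), nn.Tanh(),  # Tanh keeps pixels in [-1, 1]
        )

    def forward(self, z):
        return self.net(z)

fake = Generator()(torch.randn(8, 100))  # 8 noise vectors -> 8 fake images
```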
How I Would Explain GANs From Scratch to a 5-Year Old: Part 1
Note: Quite frankly, there are already a zillion articles out there explaining the intuition behind GANs. While I will briefly touch upon it, the rest of the article will be an absolute deep dive into the GAN architecture and, mainly, coding, but with a very, very detailed explanation of the pseudocode (open-sourced as an example by PyTorch on GitHub). Why do I need …
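The training scheme such pseudocode describes alternates between a discriminator update and a generator update. A compressed toy version of that loop (networks, sizes, and the random "real" batch are stand-ins, not the PyTorch example's code):

```python
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(100, 784), nn.Tanh())    # toy generator
D = nn.Sequential(nn.Linear(784, 1), nn.Sigmoid())   # toy discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

real = torch.rand(16, 784)                 # stand-in for a real data batch
ones, zeros = torch.ones(16, 1), torch.zeros(16, 1)

# 1) Discriminator step: push real images toward 1, generated toward 0.
fake = G(torch.randn(16, 100))
d_loss = bce(D(real), ones) + bce(D(fake.detach()), zeros)
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# 2) Generator step: fool the discriminator into predicting 1 on fakes.
g_loss = bce(D(fake), ones)
opt_g.zero_grad()
g_loss.backward()
opt_g.step()
```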
Fine-tune Transformers in PyTorch Using Hugging Face Transformers
This notebook is designed to use a pretrained transformers model and fine-tune it on a classification task. The focus of this tutorial is on the code itself and how to adjust it to your needs. This notebook uses the AutoClasses functionality from the Hugging Face transformers library, which can guess a model’s configuration, tokenizer and …
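In practice, the AutoClasses infer the right tokenizer and architecture from the checkpoint name alone. A minimal sketch using the real transformers API (the checkpoint and label count are just examples):

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "bert-base-uncased"  # any Hub checkpoint works here
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=2)

inputs = tokenizer("A sentence to classify.", return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, num_labels)
```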
Pretrain Transformers Models in PyTorch Using Hugging Face Transformers
This notebook is used to pretrain transformers models with Hugging Face on your own custom dataset. What do I mean by pretraining transformers? The definition of pretraining is to train in advance. That is exactly what I mean! Train a transformer model so that it can be used as a pretrained model and later fine-tuned on a specific …
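For masked-language-model pretraining, the transformers library supplies a collator that randomly masks tokens so the model learns to fill them in. A minimal sketch (checkpoint and example text are placeholders; in a real run you would iterate this over your corpus):

```python
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# The collator masks 15% of tokens and builds the matching labels.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer,
                                           mlm_probability=0.15)
batch = collator([tokenizer("Your own domain-specific text.")])
loss = model(**batch).loss  # minimize this over your corpus, then fine-tune
```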
From Text to Knowledge: The Information Extraction Pipeline
I am thrilled to present the latest project I have been working on. In this blog post, I will present my implementation of an information extraction data pipeline, following my passion for combining natural language processing and knowledge graphs. Later on, I will also explain why I see the combination of NLP and graphs as one of the paths to explainable AI. If this in-depth …
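As a flavor of what such a pipeline does, here is a toy sketch, emphatically not the author's implementation, that extracts entities with spaCy and links the first two through the sentence's root verb to form a knowledge-graph triple:

```python
import spacy  # pip install spacy && python -m spacy download en_core_web_sm

nlp = spacy.load("en_core_web_sm")
doc = nlp("Elon Musk founded SpaceX in 2002.")

# Step 1: named entity recognition.
entities = [(ent.text, ent.label_) for ent in doc.ents]

# Step 2 (toy relation extraction): connect entities via the root verb.
root = next(tok for tok in doc if tok.dep_ == "ROOT")
ents = list(doc.ents)
if len(ents) >= 2:
    triple = (ents[0].text, root.lemma_, ents[1].text)
    print(entities, triple)  # e.g. ('Elon Musk', 'found', 'SpaceX')
```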