A brief history: ImageNet was first published in 2009 and, over the next four years, went on to form the bedrock of most computer vision models. To this day, whether you are training a model to detect pneumonia or to classify models of cars, you will probably start with a model pre-trained on ImageNet or some other large, general image dataset. More recently, papers … [Read more...] about Transfer Learning for Time Series Forecasting and Classification
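To make the "start from a pre-trained model" idea concrete, here is a minimal transfer-learning sketch, assuming PyTorch and torchvision; the two-class pneumonia task and the ResNet-18 backbone are illustrative stand-ins, not something prescribed by the post.

```python
import torch
import torch.nn as nn
from torchvision import models

num_classes = 2  # hypothetical downstream label count (e.g. pneumonia / no pneumonia)

# Load a ResNet-18 with ImageNet weights (newer torchvision versions use the `weights=` argument instead).
backbone = models.resnet18(pretrained=True)

# Freeze the pretrained feature extractor so only the new head is trained at first.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with one sized for the new task.
backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)

# Only the new head's parameters are passed to the optimizer.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
```

After the new head converges, it is common to unfreeze some or all of the backbone and fine-tune at a lower learning rate.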
Attention For Time Series Forecasting And Classification
Transformers (specifically self-attention) have powered significant recent progress in NLP. They have enabled models like BERT, GPT-2, and XLNet to form powerful language models that can generate text, translate, answer questions, classify documents, summarize text, and much more. Given their recent success in NLP, one would expect widespread adaptation to … [Read more...] about Attention For Time Series Forecasting And Classification
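For readers unfamiliar with the mechanism behind these models, here is a minimal sketch of scaled dot-product self-attention, assuming PyTorch; the tensor shapes and projection names below are illustrative, not taken from the post.

```python
import torch
import torch.nn.functional as F

batch, seq_len, d_model = 4, 16, 32  # hypothetical sizes
x = torch.randn(batch, seq_len, d_model)

# Learned projections map the same input sequence to queries, keys, and values.
w_q = torch.nn.Linear(d_model, d_model)
w_k = torch.nn.Linear(d_model, d_model)
w_v = torch.nn.Linear(d_model, d_model)

q, k, v = w_q(x), w_k(x), w_v(x)

# attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V:
# similarity of every position with every other position, scaled by sqrt(d_k).
scores = q @ k.transpose(-2, -1) / (d_model ** 0.5)
weights = F.softmax(scores, dim=-1)   # (batch, seq_len, seq_len)
attended = weights @ v                # each position becomes a weighted mix of all values
```

In a forecasting setting, a causal mask would typically be added to the scores so each time step attends only to earlier steps.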