Transformer-XL

Abstract

This talk summarizes the paper Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context. It assumes the audience is already familiar with the Attention Is All You Need paper and also discusses some of its high-level concepts.

Location
MICL Lab, Singapore
M Saiful Bari
Senior Research Scientist

Computer Science enthusiast working in deep learning and natural language processing.