Transformers — Intuitively and Exhaustively Explained


Exploring the modern wave of machine learning: taking apart the transformer step by step

Daniel Warfield
Towards Data Science
Image by author using MidJourney. All images by the author unless otherwise specified.

In this post you will learn about the transformer architecture, which sits at the core of nearly all cutting-edge large language models. We’ll start with a brief chronology of some relevant natural language processing concepts, then we’ll go through the transformer step by step and uncover how it works.

Who is this useful for? Anyone interested in natural language processing (NLP).

How advanced is this post? This is not a complex post, but there are a lot of concepts, so it might be daunting to less experienced data scientists.

Prerequisites: A good working understanding of a standard neural network. Some cursory experience with embeddings, encoders, and decoders would probably also be helpful.

The following sections contain useful concepts and technologies to know before getting into transformers. Feel free to skip ahead if you feel confident.

Word Vector Embeddings

A conceptual understanding of word vector embeddings is fundamental to understanding natural language processing. In essence, a word vector embedding takes individual words and translates them into vectors which somehow represent their meaning.

The job of a word to vector embedder: turn words into numbers which somehow capture their general meaning.

The details can vary from implementation to implementation, but the end result can be thought of as a “space of words”, where the space obeys certain convenient relationships. Words are hard to do math on, but vectors which contain information about a word, and how it relates to other words, are significantly easier to do math on. This task of converting words to vectors is often referred to as an “embedding”.
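To make the “math on words” idea concrete, here is a minimal sketch using hand-written toy vectors rather than a learned embedding (the numbers, the 4-dimensional size, and the classic “king − man + woman ≈ queen” analogy are purely illustrative assumptions):

```python
import numpy as np

# Toy, hand-written 4-dimensional "embeddings". Real embeddings are learned
# and typically have hundreds of dimensions; these values are made up purely
# to illustrate the idea of doing math on word vectors.
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1, 0.3]),
    "queen": np.array([0.9, 0.1, 0.8, 0.3]),
    "man":   np.array([0.1, 0.9, 0.1, 0.2]),
    "woman": np.array([0.1, 0.2, 0.8, 0.2]),
}

def cosine_similarity(a, b):
    """How closely two vectors point in the same direction (1.0 = identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# The classic analogy: king - man + woman should land near queen.
result = embeddings["king"] - embeddings["man"] + embeddings["woman"]

for word, vec in embeddings.items():
    print(f"{word:>6}: {cosine_similarity(result, vec):.3f}")
# With these toy numbers, "queen" scores highest, which is the kind of
# relationship a well-trained embedding space tends to exhibit.
```

The point is not the specific numbers, but that once words live in a vector space, relationships between them become arithmetic.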

Word2Vec, a landmark paper in the natural language processing space, sought to create an embedding which obeyed certain useful characteristics. Essentially…
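As a rough illustration of what training such an embedding looks like in practice, here is a minimal sketch using the gensim library (the tiny corpus and the parameter values are illustrative assumptions, not the setup used in the original paper):

```python
# A minimal sketch of training a Word2Vec-style embedding with gensim
# (assumes gensim >= 4.0 is installed; the corpus and parameters are
# illustrative only).
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walked", "through", "the", "city"],
    ["the", "woman", "walked", "through", "the", "city"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=16,   # dimensionality of each word vector
    window=2,         # context window defining which words count as "nearby"
    min_count=1,      # keep every word, even if it appears only once
    sg=1,             # use the skip-gram training variant
    epochs=200,
)

# Each word now maps to a dense vector...
print(model.wv["king"].shape)            # (16,)
# ...and words used in similar contexts end up with similar vectors.
print(model.wv.most_similar("king", topn=3))
```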


