King minus man plus woman equals queen. That's not a metaphor. That's an equation — and a machine solved it. With no dictionary. No grammar rules. No human telling it what "king" means.

In this video, you'll learn exactly how that's possible — and why the idea behind it, called an embedding, might be the most important concept in modern AI.

We'll cover:
- Why feeding words into a computer is harder than it sounds
- How word2vec learns meaning from billions of sentences
- Why geometry appears — out of nowhere — from raw text
- How the same idea powers AlphaFold, recommendation systems, and drug discovery
- What BERT changed, and why context changes everything

By the end, you'll understand embeddings well enough to explain them to someone else. And you'll never look at a search engine, a language model, or an AI recommendation the same way again. (Want to try the king minus man plus woman arithmetic yourself? There's a short Python snippet at the bottom of this description.)

Chapters
00:00 Introduction
00:57 Chapter 1 — How to Feed a Word into a Computer
02:12 Chapter 2 — Geometry Appears
04:25 Chapter 3 — It's Not Just Words
05:45 Chapter 4 — Context Changes Everything
06:25 Conclusion

References
Mikolov et al. (2013) — Efficient Estimation of Word Representations in Vector Space: arxiv.org/abs/1301.3781
Mikolov et al. (2013) — Distributed Representations of Words and Phrases and their Compositionality (NeurIPS): proceedings.neurips.cc
Pennington et al. (2014) — GloVe: Global Vectors for Word Representation: nlp.stanford.edu/pubs/glove.pdf
Devlin et al. (2018) — BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding: arxiv.org/abs/1810.04805
Goldberg & Levy (2014) — word2vec Explained: Deriving Mikolov et al.'s Negative-Sampling Word-Embedding Method: arxiv.org/abs/1402.3722
Firth, J.R. (1957) — "You shall know a word by the company it keeps." A Synopsis of Linguistic Theory 1930–1955, Studies in Linguistic Analysis, Philological Society of London

#ai #machinelearning #embeddings #artificialintelligence #deeplearning
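
Bonus: a minimal Python sketch of the king minus man plus woman trick. This assumes you have the gensim library installed (pip install gensim) and don't mind a one-time download of pretrained GloVe vectors; it isn't code from the video, just one easy way to reproduce the result at home.

import gensim.downloader as api

# First run downloads the pretrained 50-dimensional GloVe vectors (~66 MB).
vectors = api.load("glove-wiki-gigaword-50")

# An embedding is just a list of numbers, so the "equation" is ordinary
# vector arithmetic: king - man + woman. most_similar() then returns the
# vocabulary word whose vector is closest (by cosine similarity) to the result.
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
# Should print something like: [('queen', 0.85...)]

Note that gensim's most_similar() excludes the input words from the candidates, which is why "king" itself can't win; "queen" comes out on top.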