How do we feed language to a transformer? A transformer is just a model: it doesn't understand characters, it understands numbers, vectors, and matrices. So we need to convert the input words into numbers. Each word is mapped to an embedding, a vector that captures the meaning of that word. We can create these meaningful vectors with PyTorch embeddings, and in this way a sentence becomes a sequence of vectors that is eventually passed into the architecture. Check the channel for a full video.
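To make this concrete, here is a minimal sketch of the word-to-vector step using torch.nn.Embedding; the vocabulary, example sentence, and embedding size below are toy values chosen purely for illustration, not anything from the video.

```python
import torch
import torch.nn as nn

# Toy vocabulary: each word gets an integer id (made up for illustration).
vocab = {"the": 0, "cat": 1, "sat": 2, "on": 3, "mat": 4}
sentence = ["the", "cat", "sat", "on", "the", "mat"]

# Step 1: convert words to numbers (their ids in the vocabulary).
token_ids = torch.tensor([vocab[w] for w in sentence])  # shape: (6,)

# Step 2: an embedding table holds one learnable vector per word in the vocabulary.
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

# Step 3: look up each id; the sentence is now a sequence of vectors
# ready to be fed into the transformer.
vectors = embedding(token_ids)
print(vectors.shape)  # torch.Size([6, 8])
```

The vectors start out random and are learned during training, which is how they come to encode meaning.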