Lesson 14 of 20 · Smart Helpers

Tokenization and embeddings

Before processing text, LLMs split it into tokens (word pieces) and convert them to number vectors called embeddings that capture meaning.

  • Tokenization splits text into processable pieces.
  • Embeddings are numerical representations of meaning.
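The two steps above can be sketched in a few lines of Python. This is a toy illustration, not a real LLM tokenizer: real systems split words into subword pieces (for example with byte-pair encoding) and learn embeddings with hundreds of dimensions, while here we simply split on spaces and use a tiny hand-made table so the idea is visible.

```python
def tokenize(text):
    """Toy tokenizer: split text into lowercase word tokens."""
    return text.lower().split()

# Hand-made 2-number "embeddings" (made up for illustration):
# words with similar meanings get similar numbers.
embeddings = {
    "cat":    [0.9, 0.1],
    "kitten": [0.8, 0.2],
    "car":    [0.1, 0.9],
}

tokens = tokenize("Cat kitten car")      # step 1: text -> tokens
vectors = [embeddings[t] for t in tokens]  # step 2: tokens -> vectors
print(tokens)   # ['cat', 'kitten', 'car']
print(vectors)  # [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9]]
```

Notice that "cat" and "kitten" get vectors that are close together, while "car" is far away: that closeness is how embeddings capture meaning.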

Think about it

What is the context window in an LLM?


Tokenization and embeddings - Smart Helpers | 7th Grade AI for Kids | LittleActivity