Lesson 12 of 20 · Asking Questions

How LLMs process prompts

LLMs tokenize prompts, embed tokens as vectors, and process them through transformer layers. The prompt's structure directly affects attention patterns.

  • Prompts are tokenized and embedded.
  • Prompt structure affects model attention.
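The first two steps, tokenization and embedding lookup, can be sketched in a few lines. The vocabulary, tokenizer, and embedding table below are toy stand-ins for illustration, not any real model's:

```python
import random

random.seed(0)

# Toy vocabulary; real models use subword vocabularies of ~50k+ entries.
VOCAB = {"what": 0, "is": 1, "a": 2, "token": 3, "?": 4, "<unk>": 5}
EMB_DIM = 4
# One learned embedding vector per vocabulary entry (random here).
EMBEDDINGS = [[random.random() for _ in range(EMB_DIM)] for _ in VOCAB]

def tokenize(prompt: str) -> list[int]:
    """Split on whitespace and map each token to a vocabulary id."""
    return [VOCAB.get(w, VOCAB["<unk>"]) for w in prompt.lower().split()]

def embed(token_ids: list[int]) -> list[list[float]]:
    """Look up one embedding vector per token id."""
    return [EMBEDDINGS[i] for i in token_ids]

ids = tokenize("What is a token ?")
vectors = embed(ids)
print(ids)           # [0, 1, 2, 3, 4]
print(len(vectors))  # 5 vectors, one per token
```

The resulting sequence of vectors is what the transformer layers actually attend over, which is why the prompt's structure shapes the attention patterns.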

Think about it

What is the difference between zero-shot, one-shot, and few-shot prompting?
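As a hint, the three styles differ only in how many worked examples the prompt includes before the actual query. The translation task and examples below are invented for illustration:

```python
# Zero-shot: the task is described, but no worked examples are given.
zero_shot = "Translate to French: cheese"

# One-shot: exactly one worked example precedes the query.
one_shot = (
    "Translate to French:\n"
    "sea -> mer\n"
    "cheese ->"
)

# Few-shot: several worked examples precede the query.
few_shot = (
    "Translate to French:\n"
    "sea -> mer\n"
    "dog -> chien\n"
    "apple -> pomme\n"
    "cheese ->"
)

# Counting the arrows shows the growing number of demonstrations.
print(zero_shot.count("->"))  # 0
print(one_shot.count("->"))   # 2 (one example + the query)
print(few_shot.count("->"))   # 4 (three examples + the query)
```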
