Lesson 19 of 20 · What is AI?

Transformers and attention

The transformer architecture is built around an attention mechanism, which lets the model weigh how relevant each part of the input is when producing each part of the output. This architecture powers GPT, BERT, and similar modern language models.

  • Attention lets models focus on relevant input.
  • Transformers power modern language models like GPT.
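The core idea can be sketched in a few lines of plain Python. This is a minimal illustration of scaled dot-product attention for a single query, not code from any real model or library; the function names and toy vectors are invented for this example.

```python
import math

def softmax(scores):
    # Numerically stable softmax: turns raw scores into weights summing to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector.

    Each key is scored by its similarity to the query; the output is
    the values averaged according to those (softmaxed) scores.
    """
    d = len(query)
    # Dot product of query with each key, scaled by sqrt(d).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Weighted blend of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# Toy example: the query matches the first key most closely,
# so the output leans toward the first value vector.
q = [1.0, 0.0]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
out = attention(q, K, V)
print(out)
```

Because the query lines up with the first key, the first value gets the larger attention weight, so the output is pulled toward `[10.0, 0.0]`. A real transformer does this in parallel for every position, with learned projections producing the queries, keys, and values.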

Think about it

What is the attention mechanism in transformers?
