This is a video series about the much-talked-about Transformer architecture. The first video covers self-attention in general. The second explains how self-attention is used in Transformers. The third covers positional encoding and input embeddings.