LLM Everything

🦖Transformer

  • Tokenizer
  • Embeddings
  • Positional Encoding
  • Self Attention
  • Multi-Head Attention
  • Add & Norm
  • FeedForward
  • Linear & Softmax
  • Decoding Strategy
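The components listed above compose into a single Transformer block: token ids from the tokenizer are embedded, summed with a positional encoding, passed through self-attention and a feed-forward sublayer (each wrapped in Add & Norm), and finally projected to vocabulary logits via a linear layer and softmax. A minimal NumPy sketch, with random weights, a single attention head, and illustrative sizes (none of these names or dimensions come from this book):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d_model, d_ff, seq_len = 16, 8, 32, 4   # toy sizes, chosen for illustration

def softmax(x, axis=-1):
    # numerically stable softmax, used in both attention and the output layer
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def layer_norm(x, eps=1e-5):
    # normalize each position's feature vector (the "Norm" in Add & Norm)
    return (x - x.mean(-1, keepdims=True)) / np.sqrt(x.var(-1, keepdims=True) + eps)

def positional_encoding(n, d):
    # sinusoidal positional encoding: sin on even dims, cos on odd dims
    pos = np.arange(n)[:, None]
    i = np.arange(d)[None, :]
    angles = pos / 10000 ** (2 * (i // 2) / d)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

# random parameters; a real model learns these during training
E = rng.normal(size=(vocab, d_model))            # embedding table
Wq = rng.normal(size=(d_model, d_model))
Wk = rng.normal(size=(d_model, d_model))
Wv = rng.normal(size=(d_model, d_model))
W1 = rng.normal(size=(d_model, d_ff))
W2 = rng.normal(size=(d_ff, d_model))
Wout = rng.normal(size=(d_model, vocab))

tokens = np.array([3, 1, 4, 1])                  # pretend tokenizer output (ids)
x = E[tokens] + positional_encoding(seq_len, d_model)   # embeddings + PE

# self-attention sublayer (single head), then residual + layer norm
q, k, v = x @ Wq, x @ Wk, x @ Wv
attn = softmax(q @ k.T / np.sqrt(d_model)) @ v   # scaled dot-product attention
x = layer_norm(x + attn)                         # Add & Norm

# position-wise feed-forward sublayer, again with Add & Norm
ff = np.maximum(0, x @ W1) @ W2                  # ReLU MLP
x = layer_norm(x + ff)

# final linear projection + softmax: next-token probabilities per position
probs = softmax(x @ Wout)
print(probs.shape)                               # (4, 16)
```

Each row of `probs` is a distribution over the 16-word toy vocabulary; a decoding strategy (greedy, top-k, sampling, etc.) would then pick tokens from these distributions.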