📄️ 1.1 What is an LLM?
Neural networks, transformers, and LLMs — parameters, and training vs. inference.
📄️ 1.2 Tokenization
Tokens, BPE encoding, context-length limits, and why some text explodes into many tokens.
📄️ 1.3 Text Generation
Autoregressive generation, temperature, top-p, and sampling strategies explained.
📄️ 1.4 Model Landscape
GPT, Claude, DeepSeek, Llama, Mistral — open vs closed models and size trade-offs.
📄️ 1.5 First LLM
Setting up Ollama locally, making a first API call to OpenAI/Anthropic, and comparing outputs.