the curriculum
Pick a path. Climb it. Walk away knowing how AI works.
Four tracks, four reasons to be here. The whole site is structured around your starting point — not ours.
If you have 30 minutes
Five hand-picked interactive demos, in order. No prerequisites — open one in a tab, lose track of time. Each is built on real models or real algorithms; nothing's a cartoon.
The full curriculum
15 stages, top to bottom. Each stage is a coherent group of articles plus one or more demos that bring the lesson to life. Click any article to start; the demo pills are shortcuts.
Foundations
linear algebra → probability → neural nets → language modeling.
Math Foundations
Linear algebra, probability, calculus, information theory.
ML Fundamentals
Supervised and unsupervised learning, evaluation, regularization.
Neural Networks
Backprop, activations, optimizers, regularization.
Language Modeling
From n-grams to RNNs to why transformers won.
LLM Internals
tokenization → attention → transformer block → modern architectures.
Tokens & Embeddings
How text becomes vectors.
Transformers
Self-attention, multi-head, KV caching, GPT from scratch.
Modern LLMs
Scaling laws, MoE, reasoning models, long context.
Builder track
prompting, RAG, fine-tuning, agents, multimodal — the production toolbox.
Prompting
Few-shot, CoT, structured outputs, sampling.
RAG
Chunking, embeddings, hybrid search, reranking, evals.
Fine-Tuning
SFT, LoRA, RLHF, DPO, GRPO, embedding fine-tuning.
Agents
Loops, tools, memory, planning, multi-agent, browser/vision.
Multimodal
CLIP, VLMs, diffusion, video, speech, synthetic data.
Production track
cost, latency, evals, guardrails, observability, applications.
Production
Deployment, evals, guardrails, observability, cost, data systems.
Applications
Text-to-SQL, code, browser agents, finance, case studies.
Career
Roles, learning roadmap, staying current.