Orchestra-Research/AI-Research-SKILLs/nanogpt
Educational GPT implementation in ~300 lines. Reproduces GPT-2 (124M) on OpenWebText. Clean, hackable code for learning transformers. By Andrej Karpathy. Perfect for understanding GPT architecture from scratch. Train on Shakespeare (CPU) or OpenWebText (multi-GPU).
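The core building block that nanoGPT implements is causal (masked) self-attention, which lets each token attend only to earlier positions in the sequence. A minimal single-head sketch in plain numpy (illustrative weight matrices and sizes, not nanoGPT's actual PyTorch module):

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    # x: (T, C) sequence of token embeddings; single head, toy sizes
    T, C = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    att = (q @ k.T) / np.sqrt(C)              # scaled dot-product scores
    mask = np.triu(np.ones((T, T)), k=1)      # upper triangle = future positions
    att = np.where(mask == 1, -np.inf, att)   # block attention to the future
    att = np.exp(att - att.max(axis=-1, keepdims=True))
    att = att / att.sum(axis=-1, keepdims=True)  # row-wise softmax over the past
    return att @ v                            # weighted sum of value vectors

rng = np.random.default_rng(0)
T, C = 4, 8
x = rng.normal(size=(T, C))
Wq, Wk, Wv = (rng.normal(size=(C, C)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one output vector per input position
```

Because of the causal mask, the first position can only attend to itself, so its output is exactly its own value vector — a quick sanity check when hacking on attention code.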
Risk Score: 0 / 100
Popularity: 2,669 stars, 223 forks
Updated: Feb 10, 2026