🔮 The Codex
Chain-of-Thought (CoT)
A prompting technique that makes AI show its reasoning step by step.
📖 Apprentice Explanation
Chain-of-thought prompting is when you ask an AI to 'think step by step.' This often leads to better answers because the model works through the problem one step at a time instead of jumping straight to a conclusion.
🧙 Archmage Notes
CoT prompting typically improves accuracy on reasoning tasks by 10-40%. Variants include zero-shot CoT (appending 'Let's think step by step'), self-consistency (sampling multiple reasoning paths and majority-voting on the final answer), and tree-of-thought (branching exploration with backtracking). Most effective for math, logic, and multi-step reasoning; gains are smaller on simple recall tasks.
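A minimal sketch of two of the variants above, zero-shot CoT and self-consistency. The model call itself is assumed and omitted; `samples` stands in for final answers extracted from several sampled reasoning paths.

```python
from collections import Counter

def zero_shot_cot(question: str) -> str:
    # Zero-shot CoT: append the trigger phrase so the model
    # reasons step by step before answering.
    return f"{question}\nLet's think step by step."

def self_consistency(final_answers: list[str]) -> str:
    # Self-consistency: sample several independent reasoning paths,
    # then take a majority vote over the final answers they reach.
    return Counter(final_answers).most_common(1)[0][0]

# Hypothetical final answers parsed from three sampled reasoning paths:
samples = ["42", "42", "41"]
print(zero_shot_cot("What is 6 x 7?"))
print(self_consistency(samples))  # majority answer: "42"
```

In practice, each sampled path would come from a separate model call with nonzero temperature, and the final answer would be parsed from the end of each response before voting.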
