🔮The Codex

Temperature

A setting that controls how creative or predictable an AI's responses are.

📖 Apprentice Explanation

Temperature is like a creativity dial for AI. A low temperature (e.g. 0.1) gives predictable, focused answers; a high temperature (e.g. 0.9) gives more creative, varied, and sometimes surprising responses.

🧙 Archmage Notes

Temperature scales the logits before softmax: p_i = exp(z_i / T) / Σ_j exp(z_j / T). T < 1 sharpens the distribution toward the most likely tokens, while T > 1 flattens it and increases randomness; T = 1 is standard sampling. T = 0 is typically special-cased as greedy (argmax) decoding, since the formula itself is undefined there. Often combined with top-p (nucleus) sampling for finer control over output diversity.
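The scaling described above can be sketched in a few lines of plain Python. This is an illustrative sketch, not any particular library's implementation; the function name and the example logits are made up for demonstration.

```python
import math

def softmax_with_temperature(logits, temperature):
    """Divide logits by T, then apply a numerically stable softmax."""
    if temperature <= 0:
        # T=0 is special-cased as greedy decoding:
        # all probability mass goes to the argmax token.
        probs = [0.0] * len(logits)
        probs[max(range(len(logits)), key=lambda i: logits[i])] = 1.0
        return probs
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max so exp() cannot overflow
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical logits for three tokens
# Low T concentrates mass on the top token; high T spreads it out.
print(softmax_with_temperature(logits, 0.1))
print(softmax_with_temperature(logits, 1.0))
print(softmax_with_temperature(logits, 2.0))
```

Running this shows the top token's probability shrinking as T rises, which is exactly why higher temperatures feel "more creative": lower-ranked tokens get sampled more often.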