Background Temperature: New Metric for Hidden Randomness in LLMs
A new arXiv preprint (arXiv:2604.22411) from Thinking Machines Lab introduces "background temperature" (T_bg), a metric that characterizes residual nondeterminism in large language models (LLMs) even when decoding at nominal temperature T=0. The work identifies implementation-level sources of randomness, including batch-size variation, kernel non-invariance, and floating-point non-associativity. The authors formalize T_bg as the effective temperature induced by these implementation-dependent perturbations, relate it to a stochastic process governed by the inference environment I, and propose an empirical protocol that estimates T_bg via the equivalent temperature T_n(I) of an ideal reference system. Pilot experiments across major LLM providers illustrate the concept.
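The estimation idea behind such a protocol can be sketched in a few lines. The toy Python simulation below is not the paper's protocol: it models implementation noise as a small Gaussian perturbation of the logits before greedy decoding, then recovers an equivalent temperature by grid-searching for the softmax temperature whose distribution best matches the observed token frequencies. The noise model, vocabulary size, and KL-based fit are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(logits, temperature):
    z = logits / temperature
    z -= z.max()  # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

# Toy next-token logits for a 5-token vocabulary (hypothetical values).
logits = np.array([3.0, 2.8, 1.0, 0.5, -1.0])

# Model implementation-level perturbations (batching, kernel choice,
# floating-point reordering) as Gaussian noise on the logits, then
# decode greedily: nominal T=0, yet the argmax varies across runs.
noise_scale = 0.1
n_runs = 20_000
samples = np.argmax(
    logits + rng.normal(0.0, noise_scale, size=(n_runs, logits.size)),
    axis=1,
)
empirical = np.bincount(samples, minlength=logits.size) / n_runs

# Estimate the equivalent temperature: grid-search for the softmax
# temperature whose distribution is closest (in KL divergence) to the
# observed greedy-decoding frequencies.
def kl(p, q, eps=1e-12):
    return np.sum(p * np.log((p + eps) / (q + eps)))

grid = np.linspace(0.01, 1.0, 200)
t_bg_estimate = min(grid, key=lambda t: kl(empirical, softmax(logits, t)))
print(f"estimated background temperature: {t_bg_estimate:.3f}")
```

In this toy setup, more implementation noise maps to a higher equivalent temperature, which is the intuition the T_bg metric captures.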
Key facts
- arXiv:2604.22411
- Introduces background temperature T_bg
- Nondeterminism persists at nominal T=0
- Sources: batch-size variation, kernel non-invariance, floating-point non-associativity (see the sketch after this list)
- Authored by Thinking Machines Lab
- Empirical protocol to estimate T_bg via T_n(I)
- Pilot experiments on major LLM providers
- Published as a short note
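As a concrete illustration of the floating-point non-associativity listed above, the snippet below (plain Python floats, i.e., IEEE-754 doubles) shows that regrouping or reordering a sum changes the result. This is the same effect that arises inside matmul and attention reductions, where the accumulation order depends on batch size and kernel choice.

```python
import random

# Grouping changes the result: (a + b) + c != a + (b + c).
a, b, c = 1e16, -1e16, 1.0
print((a + b) + c)  # 1.0
print(a + (b + c))  # 0.0 -- b + c rounds back to -1e16

# Reordering a long sum (as batch-dependent parallel reductions do)
# also shifts the result by a small, run-dependent amount.
random.seed(0)
xs = [random.uniform(-1.0, 1.0) for _ in range(100_000)]
s1 = sum(xs)
random.shuffle(xs)
s2 = sum(xs)
print(s1 - s2)  # typically a tiny nonzero difference
```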
Entities
Institutions
- Thinking Machines Lab
- arXiv