ARTFEED — Contemporary Art Intelligence

Llama-3.1-8B Uses Base-10 Addition for Cyclic Reasoning

ai-technology · 2026-05-06

A new study on arXiv finds that Llama-3.1-8B, despite representing cyclic concepts such as months with circularly structured features, does not compute modular addition directly within the concept's period (e.g., 12 for months). Instead, it reuses a generic base-10 addition mechanism: it sums the two inputs as ordinary numbers (e.g., for "six months after August", 6 + 8 = 14) and then maps the out-of-range result back into the cyclic concept space (14 → February). The model relies on task-agnostic Fourier features whose periods (2, 5, 10) suit base-10 addition rather than matching the cyclic concept's period. A sparse set of 28 MLP neurons is identified as key to this process.
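The two-stage computation the study attributes to the model can be sketched in a few lines. This is an illustrative reconstruction, not the paper's code; the function name and 1-indexed month convention are assumptions made for clarity:

```python
# Months indexed 1-12, matching the usual calendar convention.
MONTHS = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]

def months_after(start: str, k: int) -> str:
    """Stage 1: plain base-10 addition; Stage 2: wrap back into the cycle."""
    total = MONTHS.index(start) + 1 + k      # e.g. August (8) + 6 = 14
    return MONTHS[(total - 1) % 12]          # 14 falls outside 1-12, wraps to February

print(months_after("August", 6))  # → February
```

The point of the finding is that the model's circuit resembles stage 1 plus a separate mapping step, rather than a single mod-12 operation.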

Key facts

  • Study published on arXiv with ID 2605.01148
  • Focuses on Llama-3.1-8B model
  • Model uses base-10 addition for cyclic concepts like months
  • Example: six months after August is computed as 6 + 8 = 14, then mapped back to February
  • Fourier features have periods 2, 5, 10 instead of 12
  • 28 MLP neurons identified as critical
  • Model re-uses generic addition mechanism across tasks
  • Representations are circularly structured but not used for direct modular addition
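The role of the periods 2, 5, and 10 can be illustrated numerically. The sketch below is a hedged toy, not the paper's probe: it simply shows that Fourier features with these base-10 periods assign identical representations to n and n + 10, which is what a generic digit-addition mechanism needs, whereas a period-12 concept would require different features:

```python
import math

def fourier_features(n: int, periods=(2, 5, 10)) -> list[float]:
    """Cosine/sine features of n at the base-10 periods reported in the study."""
    feats = []
    for T in periods:
        feats += [math.cos(2 * math.pi * n / T),
                  math.sin(2 * math.pi * n / T)]
    return feats

# n and n + 10 are indistinguishable under periods 2, 5, 10:
assert all(abs(a - b) < 1e-9
           for a, b in zip(fourier_features(4), fourier_features(14)))
```

Because 2, 5, and 10 all divide 10, these features are periodic in steps of ten, supporting base-10 arithmetic across tasks rather than month-specific mod-12 arithmetic.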

Entities

Institutions

  • arXiv

Sources