ARTFEED — Contemporary Art Intelligence

Shannon's Mutual Information Quantifies Creative Writing Quality

publication · 2026-04-30

A new paper on arXiv (2604.26269) proposes an information-theoretic model for evaluating creative writing quality. The authors argue that good writing achieves "calibrated surprise": constraints from author intent, reader expectation, and reality converge to narrow the solution space. Working from Shannon's mutual information, I(X;Y) = H(X) - H(X|Y), they define "calibrated" as the conditional entropy H(X|Y) approaching zero and "surprise" as the entropy H(X) increasing. The paper claims that full-dimensional accuracy and mediocrity are mutually exclusive, two sides of the same constraint structure. The abstract was released as a cross-list announcement on arXiv.
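The decomposition the authors rely on can be made concrete with a small numeric sketch. The code below is not from the paper; it is a minimal illustration of the identity I(X;Y) = H(X) - H(X|Y) on a toy joint distribution, where the paper's "calibrated" regime corresponds to H(X|Y) near zero and "surprise" to a large H(X).

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) in bits for a probability vector p."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(joint):
    """I(X;Y) = H(X) - H(X|Y), computed from a joint table P(X, Y)."""
    px = joint.sum(axis=1)  # marginal P(X)
    py = joint.sum(axis=0)  # marginal P(Y)
    # H(X|Y) = sum over y of P(y) * H(X | Y=y)
    h_x_given_y = sum(
        pyj * entropy(joint[:, j] / pyj)
        for j, pyj in enumerate(py) if pyj > 0
    )
    return entropy(px) - h_x_given_y

# Perfectly correlated X and Y: H(X|Y) = 0 ("calibrated"),
# so all of H(X) = 1 bit survives as mutual information.
correlated = np.array([[0.5, 0.0],
                       [0.0, 0.5]])
print(mutual_information(correlated))  # -> 1.0

# Independent X and Y: knowing Y tells us nothing, I(X;Y) = 0.
independent = np.full((2, 2), 0.25)
print(mutual_information(independent))  # -> 0.0
```

In this framing, high-quality writing would sit at high H(X) (many possible continuations, i.e. surprise) combined with low H(X|Y) once the converging constraints Y are taken into account.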

Key facts

  • Paper ID: arXiv:2604.26269
  • Release type: cross-list announcement (abstract)
  • Uses Shannon's mutual information as analysis tool
  • Defines "calibrated" as conditional entropy approaching zero
  • Defines "surprise" as increasing entropy
  • Claims full-dimensional accuracy and mediocrity are mutually exclusive
  • Focuses on creative writing quality
  • Published on arXiv

Entities

Institutions

  • arXiv

Sources