ARTFEED — Contemporary Art Intelligence

Randomized Hadamard Transforms Proven Effective for Quantization

other · 2026-05-09

A recent study posted to arXiv (2605.06014) proves that applying two independent randomized Hadamard transforms (RHTs) to any input vector of dimension d leaves each fixed coordinate with a marginal distribution within O(d^{-1/2}) of a standard Gaussian. The result closes a known worst-case gap for a single RHT, which is widely used as a fast substitute for a uniform random rotation (URR) in gradient compression, inference acceleration, KV-cache compression, model weight quantization, and approximate nearest-neighbor search. Coordinates under a URR approach a Gaussian distribution in high dimensions, but a single RHT can deviate sharply on adversarial inputs: a standard basis vector, for instance, is mapped to a vector whose coordinates all share a single magnitude, nothing like a Gaussian sample. The theorem therefore justifies the common practice of composing two RHTs to recover near-Gaussian behavior.
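
The mechanics are easy to see in code. The sketch below assumes the standard RHT construction (multiply by i.i.d. random signs, apply the fast Walsh-Hadamard transform, rescale by 1/sqrt(d)); the function and variable names are illustrative, not taken from the paper. It shows the degenerate worst case for one RHT and the near-Gaussian coordinates after two.

```python
import numpy as np

def fwht(x):
    """In-place fast Walsh-Hadamard transform; O(d log d), d a power of 2."""
    d = x.shape[0]
    h = 1
    while h < d:
        for i in range(0, d, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x

def rht(x, rng):
    """One randomized Hadamard transform: x -> (1/sqrt(d)) * H * D * x, with D
    a diagonal matrix of i.i.d. random signs. The map is orthogonal, so it
    preserves norms and pairwise orthogonality."""
    d = x.shape[0]
    signs = rng.choice([-1.0, 1.0], size=d)
    return fwht(signs * x) / np.sqrt(d)

rng = np.random.default_rng(0)
d = 4096
e1 = np.zeros(d)
e1[0] = 1.0                      # worst-case input for a single RHT

once = rht(e1, rng)              # every coordinate equals the same +-1/sqrt(d)
twice = rht(rht(e1, rng), rng)   # coordinates now look like N(0, 1/d) samples

print(np.unique(np.round(np.sqrt(d) * once, 6)))  # one spike: [1.] or [-1.]
print(np.sqrt(d) * twice.std())                   # ~1.0, Gaussian-like spread
```

Composing two RHTs keeps the speed advantage that motivates them in the first place: two sign multiplications plus two O(d log d) transforms, versus the O(d^2) cost of applying a dense random rotation matrix.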

Key facts

  • Paper arXiv:2605.06014 proves two composed RHTs yield Gaussian-like marginals.
  • Single RHT can perform poorly on worst-case inputs.
  • Coordinates of a URR output converge to Gaussian in high dimensions.
  • RHTs preserve orthogonality and admit fast implementations.
  • Quantization applications include gradient compression and inference acceleration.
  • Also used for KV-cache compression and model weight quantization.
  • Approximate nearest-neighbor search in vector databases benefits.
  • Result shows each coordinate's marginal is within O(d^{-1/2}) of a standard Gaussian (formalized in the sketch after this list).
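
For reference, one plausible formalization of the headline bound. This is a sketch: the sqrt(d) scaling (so the coordinate has unit variance) and the Kolmogorov distance are assumptions here, since the summary does not state the metric.

```latex
% One RHT: R_j x = (1/\sqrt{d}) H_d D_j x, with H_d the Sylvester-Hadamard
% matrix and D_j a diagonal of i.i.d. uniform signs; each R_j is orthogonal.
% Claimed guarantee for the composition, for any fixed unit vector x and any
% fixed coordinate i, with \Phi the standard normal CDF:
\[
  \sup_{t \in \mathbb{R}}
  \left| \Pr\!\left[ \sqrt{d}\,\bigl(R_2 R_1 x\bigr)_i \le t \right]
         - \Phi(t) \right|
  = O\!\left(d^{-1/2}\right).
\]
```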

Entities

Institutions

  • arXiv
