ARTFEED — Contemporary Art Intelligence

Exploratory Sampling Boosts Semantic Diversity in LLMs

ai-technology · 2026-04-30

Researchers have introduced Exploratory Sampling (ESamp), a decoding technique for large language models that actively promotes semantic diversity during text generation. Standard stochastic sampling tends to yield only lexical variation, which restricts semantic exploration. ESamp trains a lightweight Distiller at test time to predict deep-layer hidden representations from shallow-layer ones, effectively modeling how representations evolve with depth. The Distiller adapts to the current generation context, and its prediction error serves as a novelty signal used to adjust token probabilities. The method rests on the observation that neural networks predict familiar inputs more accurately than novel ones, and it aims to improve test-time scaling toward richer semantic exploration.
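The core loop can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `LinearDistiller` class, the temperature-based probability adjustment, and all hyperparameters are assumptions chosen for clarity.

```python
# Illustrative sketch of ESamp-style decoding (assumptions throughout):
# a linear Distiller predicts deep-layer hidden states from shallow ones,
# its prediction error acts as a novelty signal, and that signal flattens
# the sampling distribution to encourage exploration.
import numpy as np

class LinearDistiller:
    """Lightweight probe predicting a deep-layer state from a shallow one."""
    def __init__(self, d, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(d, d))
        self.lr = lr

    def novelty(self, h_shallow, h_deep):
        """Mean squared prediction error: low on familiar states, high on novel ones."""
        err = self.W @ h_shallow - h_deep
        return float(np.mean(err ** 2))

    def adapt(self, h_shallow, h_deep):
        """One gradient step on the squared error (test-time adaptation)."""
        err = self.W @ h_shallow - h_deep
        grad = 2.0 * np.outer(err, h_shallow) / len(err)
        self.W -= self.lr * grad

def esamp_step(logits, h_shallow, h_deep, distiller, beta=2.0, rng=None):
    """Sample one token, raising the temperature in proportion to novelty
    (one plausible way to let prediction error modify token probabilities)."""
    rng = rng or np.random.default_rng()
    nov = distiller.novelty(h_shallow, h_deep)
    temp = 1.0 + beta * nov                  # more novelty -> more exploration
    z = logits / temp
    p = np.exp(z - z.max())                  # numerically stable softmax
    p /= p.sum()
    token = int(rng.choice(len(logits), p=p))
    distiller.adapt(h_shallow, h_deep)       # keep the probe fitted to context
    return token, nov
```

In a real decoder, `h_shallow` and `h_deep` would come from a shallow and a deep transformer layer at the current position; here they are stand-in vectors, and the online update makes familiar contexts yield progressively lower novelty.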

Key facts

  • ESamp is a decoding approach for LLMs
  • It encourages semantic diversity during generation
  • Standard stochastic sampling yields only lexical variation
  • A lightweight Distiller is trained at test time
  • Distiller predicts deep-layer representations from shallow-layer ones
  • Prediction error serves as a novelty signal
  • Method is motivated by neural network error patterns
  • Aims to improve test-time scaling

Entities

Institutions

  • arXiv

Sources