ARTFEED — Contemporary Art Intelligence

Neural Combinatorial Solvers Efficiency Threshold Defined

other · 2026-05-16

A recent study posted to arXiv introduces the Amortized Efficiency Threshold (AET), a metric for comparing neural and heuristic combinatorial optimization solvers. The AET is the deployment volume at which a neural solver's total energy or carbon cost, training included, breaks even with that of a heuristic baseline under a fixed solution-quality constraint. The authors argue that the common criticism that neural solvers are less energy-efficient than CPU metaheuristics ignores deployment volume: training a neural network incurs a large fixed GPU energy cost, but metaheuristics pay a small CPU energy cost on every instance they solve. Once the neural solver's per-instance cost is lower, the training cost amortizes and the ratio of cumulative energies tends to a constant below one, so a neural solver can come out ahead at sufficient deployment volume. The paper formalizes this into a framework for fair comparison and argues that the inferential leap from 'training is expensive' to 'net-inefficient' is flawed.
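The break-even logic described above can be sketched with simple arithmetic. Assuming a linear cost model (a fixed training cost plus a constant per-instance cost, which the paper's definition implies but whose exact formulation is not given here), the threshold and the limiting ratio follow directly; all energy figures below are hypothetical illustrations, not values from the study.

```python
import math

def aet(e_train: float, e_neural: float, e_heur: float) -> float:
    """Deployment volume n at which cumulative neural cost
    e_train + n * e_neural equals cumulative heuristic cost n * e_heur.
    Returns infinity when the neural solver never breaks even."""
    if e_neural >= e_heur:
        return math.inf
    return e_train / (e_heur - e_neural)

def energy_ratio(n: float, e_train: float, e_neural: float, e_heur: float) -> float:
    """Ratio of cumulative neural to cumulative heuristic energy after n instances.
    As n grows, this tends to e_neural / e_heur -- a constant below one
    whenever the neural solver is cheaper per instance."""
    return (e_train + n * e_neural) / (n * e_heur)

# Hypothetical numbers in Wh: 5 MWh of training, 2 Wh per neural
# inference vs 10 Wh per heuristic run.
threshold = aet(5_000_000, 2.0, 10.0)
print(threshold)                                            # 625000.0 instances
print(energy_ratio(10 * threshold, 5_000_000, 2.0, 10.0))  # 0.28, heading toward 0.2
```

Past the threshold the fixed training term shrinks relative to the accumulated per-instance savings, which is why the ratio approaches the per-instance quotient rather than one.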

Key facts

  • Paper introduces Amortized Efficiency Threshold (AET) for comparing neural and heuristic solvers.
  • AET is defined as deployment volume where neural solver breaks even with heuristic in total energy or carbon.
  • Common critique of neural solvers' energy efficiency is challenged.
  • Training neural network costs large fixed GPU energy.
  • Metaheuristics cost small CPU energy per instance.
  • Cumulative energy ratio between the two solvers tends to a constant below one when the neural solver's per-instance cost is lower.
  • Paper provides framework for fair comparison under solution quality constraint.
  • Inferential step from 'training is expensive' to 'net-inefficient' is identified as flawed.

Entities

Institutions

  • arXiv
