ARTFEED — Contemporary Art Intelligence

TRAM: Joint Optimization of Approximate Multipliers and AI Models for Low-Power Accelerators

ai-technology · 2026-05-12

TRAM (Training Approximate Multiplier Structures) is a new method that jointly optimizes approximate multiplier (AxM) architecture and AI model parameters to cut the power consumption of AI accelerators. Unlike prior approaches, which design AxMs separately from model training, TRAM folds AxM design into the training process, achieving power reductions of up to 25.05% on CNNs with CIFAR-10 and up to 27.09% on vision transformers with ImageNet compared with state-of-the-art AxMs. The paper is available on arXiv in the computer science / machine learning category.
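To make the AxM idea concrete, here is a minimal sketch (not from the paper) of one common approximate-multiplier scheme, operand truncation: zeroing the low-order bits shrinks the partial-product array a hardware multiplier must sum, saving power at the cost of a bounded error. The function name and truncation width are illustrative assumptions, not TRAM's actual design.

```python
def approx_mul(a: int, b: int, trunc_bits: int = 4) -> int:
    """Truncation-based approximate multiply: zero the low `trunc_bits`
    bits of each unsigned operand before multiplying. In hardware this
    removes rows from the partial-product array, reducing power."""
    mask = ~((1 << trunc_bits) - 1)
    return (a & mask) * (b & mask)

# Exact vs. approximate product for two 8-bit operands:
exact = 200 * 117              # 23400
approx = approx_mul(200, 117)  # 192 * 112 = 21504
```

With 4 truncated bits the result is within a few percent of the exact product, which is the kind of bounded error an AxM-aware training step can compensate for.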

Key facts

  • TRAM jointly optimizes AxM structure and AI model parameters.
  • Achieves up to 25.05% power reduction on CNNs with CIFAR-10.
  • Achieves up to 27.09% power reduction on vision transformers with ImageNet.
  • Published on arXiv.
  • Focuses on low-power AI accelerators.
  • Approximate computing reduces power with minimal accuracy loss.
  • Multipliers are power-hungry components in AI models.
  • TRAM outperforms state-of-the-art AxMs.
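The joint-optimization idea behind the facts above can be caricatured in a few lines: sweep candidate AxM configurations, let the "model" adapt to each one (here a single calibration scalar stands in for retraining the network weights), and keep the most aggressive configuration that stays within an accuracy budget. This is a toy sketch under assumed names, a truncation-based AxM, and a made-up 5% error budget; TRAM's actual search and training procedure are more involved.

```python
import random

def approx_mul(a: int, b: int, t: int) -> int:
    # Truncation-based AxM: drop the low t bits of each operand.
    m = ~((1 << t) - 1)
    return (a & m) * (b & m)

def calibrate(t, samples):
    """Toy stand-in for retraining the model around the AxM: fit one
    scalar correction c so that c * approx_mul tracks the exact product."""
    ratios = [a * b / approx_mul(a, b, t)
              for a, b in samples if approx_mul(a, b, t) > 0]
    return sum(ratios) / len(ratios)

def mean_rel_error(t, c, samples):
    # Mean relative error of the calibrated AxM over the sample pairs.
    errs = [abs(a * b - c * approx_mul(a, b, t)) / (a * b) for a, b in samples]
    return sum(errs) / len(errs)

random.seed(0)
samples = [(random.randint(16, 255), random.randint(16, 255)) for _ in range(500)]

# Joint search: for each candidate AxM (truncation width t, a crude proxy
# for power savings), re-calibrate, then keep the most aggressive t that
# stays under a 5% mean-error budget.
budget = 0.05
feasible = [t for t in range(8)
            if mean_rel_error(t, calibrate(t, samples), samples) <= budget]
best_t = max(feasible)
```

The point of the co-adaptation step is visible even in this caricature: without calibration, truncation biases every product downward, while the fitted scalar recenters the error and admits more aggressive (lower-power) configurations.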

Entities

Institutions

  • arXiv
