Physics-Informed AI Model Forecasts GPU Power in Data Centers
A new physics-informed deep learning model, PI-DLinear, forecasts GPU power consumption in AI data centers 5–80 minutes ahead. The model embeds a multi-node lumped thermal RC network consistent with Newton's law of cooling, using time-dependent ODEs to couple power consumption with GPU compute utilization, memory utilization, and temperature. It targets the rapid power fluctuations caused by heterogeneous LLM inference and training workloads, which can destabilize electricity grids. The authors claim it is the first physics-informed DLinear time-series model for short-term GPU power forecasting.
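Newton's law of cooling applied to a multi-node lumped thermal RC network is usually written as a set of coupled first-order ODEs. The form below is a generic illustration of that structure, not the paper's exact equations; the node heat capacities C_i, resistances R_i and R_ij, and the function f mapping compute and memory utilization to power are placeholder assumptions.

```latex
% Generic multi-node lumped thermal RC network (illustrative form only).
% C_i: node heat capacity, R_i: resistance to ambient, R_ij: inter-node coupling.
\begin{aligned}
  C_i \,\frac{dT_i}{dt} &= P_i(t) - \frac{T_i - T_{\mathrm{amb}}}{R_i}
      - \sum_{j \ne i} \frac{T_i - T_j}{R_{ij}}, \\
  P_i(t) &\approx f\bigl(u_i^{\mathrm{compute}}(t),\, u_i^{\mathrm{mem}}(t)\bigr).
\end{aligned}
```

In models of this kind, each GPU or board-level component is one node, so the ODEs tie the power forecast to measured temperatures and utilization rather than treating power as an isolated time series.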
Key facts
- PI-DLinear is a physics-informed DLinear time-series model (see the illustrative sketch after this list).
- It forecasts GPU power utilization 5–80 minutes into the future.
- The model uses a multi-node lumped thermal RC network consistent with Newton's law of cooling.
- Time-dependent ordinary differential equations (ODEs) link power consumption with GPU compute utilization, memory utilization, and temperature.
- AI data centers experience rapid power demand fluctuations due to heterogeneous computational tasks.
- Power profiles of LLM inference and training are distinct and can cause grid instability.
- The paper is published on arXiv with ID 2605.04074.
- The model is designed for short-term forecasting of AI data center power.
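The facts above describe a DLinear forecasting backbone constrained by the RC thermal model. One common way to realize such a physics-informed coupling is to add a soft physics-residual term to the forecasting loss; the PyTorch sketch below follows that pattern. All names, the single-node thermal model, the 5-minute sampling interval, and the loss weight `lam` are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a physics-informed DLinear forecaster (PyTorch).
# Layer sizes, constants, and the physics-residual weighting are assumed for
# illustration; they are not the PI-DLinear paper's actual choices.
import torch
import torch.nn as nn


class DLinear(nn.Module):
    """DLinear backbone: moving-average trend/remainder decomposition,
    with one linear projection per component from lookback L to horizon H."""

    def __init__(self, lookback: int, horizon: int, kernel: int = 25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel_size=kernel, stride=1,
                                padding=kernel // 2, count_include_pad=False)
        self.trend_proj = nn.Linear(lookback, horizon)
        self.season_proj = nn.Linear(lookback, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, lookback) past power series -> (batch, horizon) forecast.
        trend = self.avg(x.unsqueeze(1)).squeeze(1)
        season = x - trend
        return self.trend_proj(trend) + self.season_proj(season)


def physics_residual(power_hat, temp, temp_amb, R=0.1, C=50.0, dt=300.0):
    """Finite-difference residual of a single-node RC model,
    C dT/dt = P - (T - T_amb) / R, evaluated over the forecast horizon.
    R, C, dt are placeholder constants (dt = 300 s for 5-minute samples)."""
    dT_dt = (temp[:, 1:] - temp[:, :-1]) / dt
    rhs = power_hat[:, :-1] - (temp[:, :-1] - temp_amb) / R
    return ((C * dT_dt - rhs) ** 2).mean()


def loss_fn(power_hat, power_true, temp, temp_amb, lam=0.1):
    """Data-fit MSE plus a physics-consistency penalty (weight lam is a guess)."""
    mse = nn.functional.mse_loss(power_hat, power_true)
    return mse + lam * physics_residual(power_hat, temp, temp_amb)


if __name__ == "__main__":
    lookback, horizon = 96, 16        # e.g. 16 steps of 5 min = 80 minutes ahead
    model = DLinear(lookback, horizon)
    x = torch.randn(8, lookback)      # past GPU power (normalized), batch of 8
    y = torch.randn(8, horizon)       # future GPU power targets
    temp = torch.randn(8, horizon)    # GPU temperature over the horizon
    y_hat = model(x)
    print(loss_fn(y_hat, y, temp, temp_amb=0.0))
```

In this pattern the DLinear layers stay purely linear and the physics enters only through the training objective; the paper's actual coupling between the ODEs and the network may differ.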
Entities
Institutions
- arXiv