ARTFEED — Contemporary Art Intelligence

Random Cloud: Training-Free Neural Architecture Search

ai-technology · 2026-04-30

Random Cloud is a training-free strategy for neural architecture search that identifies minimal feedforward network structures through stochastic exploration and incremental structural reduction. In contrast to traditional post-training pruning methods, which require a complete train-prune-retrain cycle, Random Cloud evaluates randomly initialized networks without backpropagation, gradually simplifies their structure, and ultimately trains only the most promising minimal candidate. The approach was tested on seven classification benchmarks against magnitude pruning and random pruning. Random Cloud matches or surpasses both baselines on six of seven datasets, with a notable improvement of 4.9 percentage points in accuracy on Sonar (p=0.017 vs magnitude pruning) while reducing parameters by 87%. It also runs faster than both pruning methods on four of five datasets (0.67–0.94× the cost of full training), since the full network never needs to be trained.
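The workflow described above can be sketched in code. The sketch below is illustrative, not the authors' implementation: it assumes a ReLU feedforward network, uses a closed-form ridge readout on random features as the training-free fitness proxy (the actual scoring function used by Random Cloud is not specified here), and halves layer widths as the progressive-reduction step. All function names and tolerances are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_features(X, hidden_sizes, rng):
    """Forward pass through a randomly initialized ReLU network (no backprop)."""
    H = X
    for width in hidden_sizes:
        W = rng.normal(0.0, 1.0 / np.sqrt(H.shape[1]), size=(H.shape[1], width))
        H = np.maximum(H @ W, 0.0)  # ReLU activation
    return H

def proxy_score(X, y, hidden_sizes, rng, trials=3):
    """Hypothetical training-free fitness proxy: accuracy of a ridge readout
    fitted in closed form on random features, averaged over a few random
    initializations. No gradient-based training takes place."""
    scores = []
    for _ in range(trials):
        H = random_features(X, hidden_sizes, rng)
        # Ridge regression readout on +/-1 labels, solved analytically.
        A = H.T @ H + 1e-2 * np.eye(H.shape[1])
        w = np.linalg.solve(A, H.T @ (2 * y - 1))
        pred = (H @ w > 0).astype(int)
        scores.append((pred == y).mean())
    return float(np.mean(scores))

def random_cloud_search(X, y, n_candidates=20, max_width=64, rng=rng):
    """Sketch of the search: sample a 'cloud' of random topologies, keep the
    best one under the proxy, then progressively shrink it while the proxy
    score does not degrade by more than a small tolerance. Only the final
    minimal candidate would then be trained for real."""
    # 1. Stochastic exploration over one- and two-layer topologies.
    best_arch, best_score = None, -1.0
    for _ in range(n_candidates):
        depth = int(rng.integers(1, 3))
        arch = [int(rng.integers(4, max_width + 1)) for _ in range(depth)]
        s = proxy_score(X, y, arch, rng)
        if s > best_score:
            best_arch, best_score = arch, s
    # 2. Progressive structural reduction: halve widths while the proxy holds.
    tol = 0.02  # hypothetical tolerance for accepting a smaller topology
    improved = True
    while improved:
        improved = False
        for i in range(len(best_arch)):
            if best_arch[i] <= 2:
                continue
            trial = best_arch.copy()
            trial[i] = max(2, trial[i] // 2)
            s = proxy_score(X, y, trial, rng)
            if s >= best_score - tol:
                best_arch, best_score = trial, max(best_score, s)
                improved = True
    return best_arch, best_score
```

Because each candidate is only evaluated with random weights and an analytic readout, the search cost stays a fraction of full training, which is consistent with the 0.67–0.94× runtime figures reported above.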

Key facts

  • Random Cloud is a training-free neural architecture search method.
  • It discovers minimal feedforward network topologies via stochastic exploration and progressive structural reduction.
  • It evaluates randomly initialized networks without backpropagation.
  • It only trains the best minimal candidate at the end.
  • Evaluated on 7 classification benchmarks against magnitude pruning and random pruning.
  • Matches or outperforms both baselines in 6 of 7 datasets.
  • Achieves +4.9 percentage points accuracy on Sonar (p=0.017 vs magnitude pruning).
  • Achieves 87% parameter reduction on Sonar.
  • Faster than both pruning baselines in 4 of 5 datasets (0.67–0.94× cost of full training).
