iPhoneBlur Benchmark Exposes Motion Deblurring Performance Gaps
Researchers have introduced iPhoneBlur, a benchmark for evaluating motion deblurring on mobile devices across graded levels of difficulty. The dataset comprises 7,400 image pairs derived from high-frame-rate videos captured on the iPhone 17 Pro in diverse real-world settings. Images are stratified into Easy, Medium, and Hard tiers via PSNR-guided adaptive temporal windowing, a split validated by a 2.2x increase in optical flow magnitude from the easiest to the hardest tier. Each pair ships with extensive metadata to support ISP-aware and difficulty-adaptive restoration methods. Spectral analysis shows that the synthesized blur reproduces the high-frequency suppression seen in real motion degradation. Evaluating six architectures revealed a consistent 7-9 dB performance drop from Easy to Hard, underscoring the need for more robust deblurring techniques.
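The paper does not publish its exact windowing algorithm, but the general idea of PSNR-guided adaptive temporal windowing can be sketched as follows: blur is synthesized by averaging consecutive video frames, and the averaging window is grown until the PSNR between the blurred result and the sharp reference frame falls below a chosen floor. The function names and the thresholds in the usage example are hypothetical illustrations, not the benchmark's actual parameters.

```python
import numpy as np

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((np.asarray(a, float) - np.asarray(b, float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def adaptive_window(frames, psnr_floor):
    """Return the largest frame count k such that averaging frames[:k]
    still keeps PSNR against the sharp first frame at or above psnr_floor.

    Larger windows produce heavier simulated blur, so a lower floor
    corresponds to a harder difficulty tier.
    """
    sharp = np.asarray(frames[0], float)
    best = 1
    for k in range(2, len(frames) + 1):
        blurred = np.mean([np.asarray(f, float) for f in frames[:k]], axis=0)
        if psnr(blurred, sharp) < psnr_floor:
            break
        best = k
    return best
```

A tier split could then be driven by per-tier PSNR floors (e.g. a high floor yielding short windows for Easy, a low floor yielding long windows for Hard); the floors themselves would be tuned empirically.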
Key facts
- iPhoneBlur is a difficulty-stratified benchmark for motion deblurring on consumer devices.
- The dataset contains 7,400 image pairs from iPhone 17 Pro videos.
- Samples are categorized into Easy, Medium, and Hard tiers.
- Stratification is validated by a 2.2x increase in optical flow magnitude across tiers.
- Each sample includes metadata for ISP-aware and difficulty-adaptive strategies.
- Spectral analysis confirms realistic high-frequency suppression in synthesized blur.
- Six architectures show 7-9 dB performance degradation across difficulty levels.
- The benchmark addresses performance variation obscured by aggregate metrics.
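The spectral-realism check mentioned above is typically done by comparing radially averaged power spectra: realistic motion blur should attenuate high spatial frequencies relative to the sharp image. The following is a minimal, self-contained sketch of such an analysis (the helper name and bin count are illustrative, not from the paper):

```python
import numpy as np

def radial_power_spectrum(img, nbins=16):
    """Radially averaged power spectrum of a 2-D image.

    Returns one mean power value per radial frequency bin,
    from DC (bin 0) to the highest frequencies (last bin).
    """
    f = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(f) ** 2
    h, w = img.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h / 2, xx - w / 2)           # radial frequency per pixel
    edges = np.linspace(0, r.max(), nbins + 1)
    idx = np.clip(np.digitize(r.ravel(), edges) - 1, 0, nbins - 1)
    sums = np.bincount(idx, weights=power.ravel(), minlength=nbins)
    counts = np.maximum(np.bincount(idx, minlength=nbins), 1)
    return sums / counts
```

Comparing the high-frequency bins of a synthetically blurred image against its sharp counterpart shows the expected suppression, which is the kind of evidence the benchmark uses to argue its simulated blur mirrors real degradation.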