ARTFEED — Contemporary Art Intelligence

Cross-Modal 2D-3D Change Detection Method Proposed for Urban Monitoring

ai-technology · 2026-05-11

arXiv paper 2605.07151 introduces DPG-CD, a depth-prior-guided cross-modal joint 2D-3D change detection method. The approach tracks urban spatial evolution by combining a pre-event Digital Surface Model (DSM) with post-event imagery to detect both 2D semantic changes and 3D height changes. Because 3D observations are costly and infrequently acquired, this cross-modal input is practical for high-frequency urban monitoring, disaster assessment, and emergency response. The method must bridge the spectral-geometric representation gap between imagery and DSM data and distinguish mere modality differences from actual scene changes.
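The summary does not detail the paper's actual architecture. As a minimal illustrative sketch of the joint 2D-3D decision it describes, the fragment below fuses a pre-event DSM with a height map derived from post-event imagery and a 2D semantic-change probability map. All names, thresholds, and the assumption that the image-derived depth has already been converted to a DSM-like height map are hypothetical, not from the paper:

```python
import numpy as np

def joint_2d3d_change(pre_dsm, post_height, sem_change_prob,
                      height_thresh=2.0, prob_thresh=0.5):
    """Hypothetical sketch: combine 3D height change and 2D semantic change.

    pre_dsm         -- pre-event DSM heights (meters), 2D array
    post_height     -- DSM-like height map derived from post-event imagery
                       (assumed already aligned and scaled; hypothetical step)
    sem_change_prob -- per-pixel probability of 2D semantic change in [0, 1]
    """
    # Signed 3D height change between epochs (positive = construction).
    dh = post_height - pre_dsm
    # 3D change mask: height difference exceeds the threshold.
    changed_3d = np.abs(dh) > height_thresh
    # 2D change mask: semantic-change probability exceeds the threshold.
    changed_2d = sem_change_prob > prob_thresh
    # Joint mask: a pixel is flagged if either modality reports change.
    joint = changed_2d | changed_3d
    return dh, joint

# Toy 2x2 example: one semantic change, one 5 m height increase.
pre = np.array([[10.0, 10.0], [10.0, 10.0]])
post = np.array([[10.0, 15.0], [10.0, 10.0]])
sem = np.array([[0.9, 0.1], [0.2, 0.1]])
dh, joint = joint_2d3d_change(pre, post, sem)
```

In this toy case the top-left pixel is flagged by the 2D branch and the top-right by the 3D branch; the real method would learn this fusion rather than threshold it.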

Key facts

  • DPG-CD is a depth-prior-guided cross-modal joint 2D-3D change detection method.
  • The method uses pre-event DSM and post-event imagery as input.
  • It jointly captures 2D semantic changes and 3D height changes.
  • The approach is designed for urban morphology analysis and emergency management.
  • Collecting 3D observations is constrained by high acquisition costs.
  • The cross-modal setting addresses spectral-geometric representation gaps.
  • Modality differences between imagery and DSM data can be mistaken for actual changes, which the method is designed to avoid.
  • The paper is published on arXiv with ID 2605.07151.

Entities

Institutions

  • arXiv

Sources