To CLEAN or not to CLEAN: Data Processing in the ngVLA era

Hendrik Müller

Published: 2025/09/18

Abstract

Radio interferometric imaging has long relied on the CLEAN algorithm, valued for its speed, robustness, and integration with calibration pipelines. However, next-generation facilities such as the ngVLA, SKA, and ALMA's Wideband Sensitivity Upgrade will produce data volumes and dynamic ranges that exceed the scalability of traditional methods. CLEAN remains dominant due to its simplicity and accumulated expertise, yet its underlying assumption that the sky can be modeled as a collection of point sources limits its ability to recover extended emission and hampers automation. We review CLEAN's limitations and survey alternatives, including multiscale extensions, compressive sensing, Regularized Maximum Likelihood, Bayesian inference, and AI-driven approaches. Forward-modeling methods enable higher fidelity, flexible priors, and uncertainty quantification, albeit at greater computational cost. Hybrid approaches such as Autocorr-CLEAN, CG-CLEAN, and PolyCLEAN retain CLEAN's workflow while incorporating modern optimization. We argue that hybrids are best suited for the near term, while Bayesian and AI-based frameworks represent the long-term future of interferometric imaging.
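For context, the point-source assumption the abstract refers to is visible in the core iteration of classical (Högbom-style) CLEAN: the algorithm repeatedly finds the peak of the residual image, records a scaled delta function there, and subtracts the correspondingly shifted point-spread function. The sketch below is a minimal illustration of that loop, not the implementation used in any production pipeline; the function name, the `gain` and `threshold` parameters, and the circular `np.roll` shift (a simplification of a proper PSF subtraction near image edges) are all illustrative assumptions.

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, threshold=0.01, max_iter=1000):
    """Minimal Hogbom-style CLEAN sketch.

    dirty : 2-D dirty image (same shape as psf, for simplicity)
    psf   : 2-D point-spread function, peak-normalized and centered
    Returns the point-source model and the final residual image.
    """
    residual = dirty.copy()
    model = np.zeros_like(dirty)
    cy, cx = psf.shape[0] // 2, psf.shape[1] // 2  # PSF center pixel

    for _ in range(max_iter):
        # Locate the current absolute peak of the residual image.
        py, px = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        peak = residual[py, px]
        if np.abs(peak) < threshold:
            break  # residual is noise-like; stop cleaning
        # Record a scaled point source ("CLEAN component") in the model ...
        model[py, px] += gain * peak
        # ... and subtract the shifted, scaled PSF from the residual.
        # NOTE: np.roll wraps around at the edges; a real implementation
        # would crop or pad instead.
        shifted = np.roll(np.roll(psf, py - cy, axis=0), px - cx, axis=1)
        residual -= gain * peak * shifted

    return model, residual
```

The loop makes the abstract's point concrete: every component added to `model` is a delta function, so smooth extended emission must be approximated by many point sources, which is precisely the limitation that multiscale extensions and forward-modeling methods address.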
