A Fog and Haze Data Domain Transfer Model Based on Style-Content Decoupling
Graphical Abstract
Abstract
To address the domain gap between synthetic hazy data used for training and real-world haze, we propose a frequency-domain feature–guided haze data domain translation model. Specifically, content features from clear images and haze-related features from hazy images are separately extracted via wavelet transform and aligned through a cross-attention mechanism. The proposed domain translation model is further optimized using a depth consistency loss and a color consistency loss, enabling effective transfer of haze characteristics from hazy images to clear ones. Based on the translated results, a large-scale hazy dataset is synthesized to augment the training domain, thereby reducing domain discrepancy and enhancing the performance of dehazing models. Extensive experiments on the O-HAZE, NH-HAZE, and OTS datasets demonstrate that the proposed approach significantly improves the generalization ability of existing dehazing methods in real-world scenarios. In particular, on the O-HAZE dataset, our method outperforms the second-best approach by 1.13 dB in PSNR and 0.0054 in SSIM.
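The wavelet-based separation mentioned above can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; it only shows, under the assumption of a one-level 2-D Haar transform, how an image splits into a low-frequency approximation band and three high-frequency detail bands, the kind of frequency-domain features that content and haze extraction could operate on.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar wavelet transform (illustrative sketch).

    Returns the low-frequency approximation band LL and the three
    high-frequency detail bands (LH, HL, HH). Assumes `img` is a 2-D
    array with even height and width.
    """
    # Average/difference along rows (vertical filtering)
    a = (img[0::2, :] + img[1::2, :]) / 2.0
    d = (img[0::2, :] - img[1::2, :]) / 2.0
    # Average/difference along columns (horizontal filtering)
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0  # low-low: coarse structure
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0  # horizontal detail
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0  # vertical detail
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0  # diagonal detail
    return LL, LH, HL, HH

# Toy check: a constant image puts all its energy in the LL band,
# while the detail bands are exactly zero.
img = np.ones((4, 4))
LL, LH, HL, HH = haar_dwt2(img)
```

In a translation model of this kind, the band coefficients (rather than raw pixels) would then be fed to the cross-attention module that aligns content features from the clear image with haze-related features from the hazy one.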