Abstract:
Color accuracy is a crucial factor in perceived visual quality. However, existing no-reference image quality assessment (NR-IQA) methods focus primarily on structural distortions such as blur and noise, so effective models and benchmark datasets dedicated to color accuracy are lacking. To address this gap, we propose SAM2ICAA, a no-reference image color accuracy assessment method built on the SAM2 vision foundation model, which aims to quantify image color accuracy objectively under no-reference conditions. First, we construct ICAA-4K, a benchmark dataset for color accuracy assessment that covers eight common types of color distortion in both synthetic and real-world scenarios. Second, the method uses SAM2 as its backbone and fuses local details with global semantic information through a multi-level feature fusion module. Finally, a dual-task regression module jointly predicts the color quality score and the distortion type, mimicking the human cognitive process of identifying a distortion before judging quality. Extensive experiments on the ICAA-4K dataset demonstrate that SAM2ICAA outperforms existing mainstream NR-IQA methods in terms of SRCC and PLCC, and it is positioned to serve as a new benchmark for no-reference color quality assessment.
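The dual-task design described above can be illustrated with a minimal sketch: a shared fused-feature vector feeds two branches, one regressing a scalar color-quality score and one producing a probability distribution over the eight distortion types. All dimensions, weights, and function names here are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a dual-task head: shared features -> (score, distortion type).
# Dimensions and weights are illustrative; only the two-branch structure reflects the abstract.
import math
import random

FEAT_DIM = 16        # assumed dimension of the fused backbone features
NUM_DISTORTIONS = 8  # eight color-distortion types, as in ICAA-4K

random.seed(0)

def linear(x, w, b):
    """Plain dense layer: y_j = sum_i x_i * w[i][j] + b[j]."""
    return [sum(xi * w[i][j] for i, xi in enumerate(x)) + b[j]
            for j in range(len(b))]

def softmax(z):
    """Numerically stable softmax over a list of logits."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

# Randomly initialized illustrative weights for the two branches.
w_score = [[random.gauss(0, 0.1)] for _ in range(FEAT_DIM)]
b_score = [0.0]
w_cls = [[random.gauss(0, 0.1) for _ in range(NUM_DISTORTIONS)]
         for _ in range(FEAT_DIM)]
b_cls = [0.0] * NUM_DISTORTIONS

def dual_task_head(features):
    """Return (quality_score, distortion_probabilities) from shared features."""
    score = linear(features, w_score, b_score)[0]
    probs = softmax(linear(features, w_cls, b_cls))
    return score, probs

features = [random.gauss(0, 1) for _ in range(FEAT_DIM)]
score, probs = dual_task_head(features)
print(len(probs))  # 8 distortion-type probabilities
```

In practice the two branches would be trained jointly, e.g. with a regression loss on the score and a cross-entropy loss on the distortion type, so that distortion identification informs quality prediction.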