http://scholars.ntou.edu.tw/handle/123456789/26359

| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Tsai, Yu-Shiuan | en_US |
| dc.contributor.author | Chang, Keng-Wei | en_US |
| dc.contributor.author | Lan, Yinjie | en_US |
| dc.date.accessioned | 2026-03-12T03:36:15Z | - |
| dc.date.available | 2026-03-12T03:36:15Z | - |
| dc.date.issued | 2025/7/1 | - |
| dc.identifier.issn | 0932-8092 | - |
| dc.identifier.uri | http://scholars.ntou.edu.tw/handle/123456789/26359 | - |
| dc.description.abstract | This study presents an advanced approach to underwater image color restoration, building on the FUnIE-GAN network with the integration of residual blocks and a novel image segmentation and linear blending technique. Evaluated on a diverse dataset of 109 underwater images collected at varying depths and conditions, our method consistently outperformed state-of-the-art models, including UICRN, FUnIE-GAN, URanker, and CCL-Net. Key metrics demonstrate the efficacy of our approach, achieving SSIM values of 0.817, 0.885, and 0.830 across scenarios of minimal color deviation, blue-green color cast, and broad blue color shift, respectively. The method also recorded PSNR values of 28.332, 28.970, and 28.977, showcasing superior clarity and noise reduction, and UIQM scores of 2.941, 2.730, and 2.975, highlighting balanced perceptual quality. The incorporation of residual blocks enhanced feature preservation, ensuring fine details and textures were retained during restoration, while image segmentation and linear blending minimized gridline artifacts, achieving seamless image continuity. Notably, our approach reduced Delta E 2000 values to as low as 16.1 in challenging scenarios, underscoring its ability to restore accurate and natural colors in environments where red wavelengths are heavily attenuated. These innovations were validated on both real-world underwater images captured with a 4K GoPro 8 and simulated datasets generated from atmospheric images, ensuring robust model training and evaluation. By addressing the inherent challenges of underwater imaging, this study provides a reliable and effective solution for restoring color fidelity and structural integrity in underwater imagery. The proposed method holds significant potential for applications in ecological monitoring, marine research, and underwater object recognition, contributing valuable advancements to the field of underwater imaging. | en_US |
| dc.language.iso | English | en_US |
| dc.publisher | SPRINGER | en_US |
| dc.relation.ispartof | MACHINE VISION AND APPLICATIONS | en_US |
| dc.subject | Underwater image restoration | en_US |
| dc.subject | Generative adversarial network | en_US |
| dc.subject | Color correction | en_US |
| dc.subject | Image segmentation | en_US |
| dc.subject | Residual blocks | en_US |
| dc.subject | Oceanographic imaging | en_US |
| dc.title | Advancing underwater image clarity: a GAN-based approach with residual blocks and linear blending | en_US |
| dc.type | journal article | en_US |
| dc.identifier.doi | 10.1007/s00138-025-01698-5 | - |
| dc.identifier.isi | WOS:001507038100001 | - |
| dc.relation.journalvolume | 36 | en_US |
| dc.relation.journalissue | 4 | en_US |
| dc.identifier.eissn | 1432-1769 | - |
| item.cerifentitytype | Publications | - |
| item.grantfulltext | none | - |
| item.openairetype | journal article | - |
| item.fulltext | no fulltext | - |
| item.languageiso639-1 | English | - |
| item.openairecristype | http://purl.org/coar/resource_type/c_6501 | - |
| crisitem.author.dept | College of Electrical Engineering and Computer Science | - |
| crisitem.author.dept | Department of Computer Science and Engineering | - |
| crisitem.author.dept | National Taiwan Ocean University, NTOU | - |
| crisitem.author.orcid | 0000-0001-8264-9601 | - |
| crisitem.author.parentorg | National Taiwan Ocean University, NTOU | - |
| crisitem.author.parentorg | College of Electrical Engineering and Computer Science | - |
| Appears in Collections: | Department of Computer Science and Engineering | |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.