Computer Science ›› 2023, Vol. 50 ›› Issue (8): 133-141. doi: 10.11896/jsjkx.220600065

• Computer Graphics & Multimedia •

  • Corresponding author: SUI Yi (suiyi@qdu.edu.cn)
  • About author: (2020020641@qdu.edu.cn)
Remote Sensing Image Pan-sharpening Method Based on Generative Adversarial Network

YAN Yan1, SUI Yi1, SI Jianwei2   

  1 College of Computer Science and Technology,Qingdao University,Qingdao,Shandong 266071,China
    2 Faculty of Information Science and Engineering,Ocean University of China,Qingdao,Shandong 266100,China
  • Received:2022-06-07 Revised:2022-10-11 Online:2023-08-15 Published:2023-08-02
  • About author:YAN Yan,born in 1996,master candidate.Her main research interests include deep learning and image processing.
    SUI Yi,born in 1984,Ph.D,associate professor,master supervisor.Her main research interests include big data modeling and analysis,artificial intelligence and image processing.
  • Supported by:
    Young Scientists Fund of the National Natural Science Foundation of China(41706198).


Abstract: Existing remote sensing image pan-sharpening methods are generally based on the Wald protocol, which leads to blurred spatial texture details and colors and over-smoothed edges in the reconstructed images. To solve this problem, a remote sensing image pan-sharpening method based on generative adversarial networks (GAN), called PAN-GAN, is proposed in this paper. The multispectral image is employed as the reference image. A grayscale version of the reference image is used to simulate the panchromatic image, and it is fed to the generator together with a blurred version of the reference image. The generator extracts the texture detail features of the grayscale reference image and the spectral features of the blurred reference image and fuses them for reconstruction. Meanwhile, a perceptual loss is introduced and combined with the adversarial loss and pixel loss to optimize the reconstruction, so that the reconstructed images have spectral and texture detail features closer to those of the reference image. Experiments are carried out on datasets from three remote sensing satellites: QuickBird, GaoFen-2 and WorldView-2. The results show that, compared with commonly used methods, the reconstructed images obtained by PAN-GAN have more realistic spectral and spatial texture details; using grayscale reference images significantly improves the performance of the original methods, with average-grayscale conversion bringing the largest gain; and the perceptual loss further improves the reconstruction results, verifying the effectiveness of the proposed method.
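The input construction described in the abstract (simulating a panchromatic image by grayscale conversion of the multispectral reference, plus a blurred copy of the same reference) can be sketched as below. This is a minimal NumPy illustration under assumptions: a plain band average stands in for the "average grayscale" conversion, and a box filter stands in for the blur; the paper's exact kernel and weights are not specified here.

```python
import numpy as np

def simulate_pan(ms, weights=None):
    """Simulate a panchromatic image from a multispectral image.

    ms: array of shape (H, W, C). With weights=None this is the plain
    band average ("average grayscale"); per-band weights are also possible.
    Returns an (H, W) single-channel image.
    """
    h, w, c = ms.shape
    if weights is None:
        weights = np.full(c, 1.0 / c)
    return np.tensordot(ms, weights, axes=([2], [0]))

def blur_ms(ms, k=5):
    """Blur each band with a k x k box filter (edge-replication padding).

    A stand-in for the low-pass degradation of the reference image; the
    paper's actual blur kernel may differ.
    """
    pad = k // 2
    padded = np.pad(ms, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros_like(ms, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + ms.shape[0], dx:dx + ms.shape[1], :]
    return out / (k * k)

# The generator would then receive (simulate_pan(ms), blur_ms(ms)) as its
# two inputs and be trained to reconstruct the original reference ms.
```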

Key words: Remote sensing images, Pan-sharpening, Generative adversarial networks, Perceptual features
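The composite objective mentioned in the abstract (adversarial loss jointly optimized with pixel and perceptual losses) can be sketched as follows. The weighting coefficients and the feature extractor are illustrative assumptions, not the paper's reported configuration:

```python
import numpy as np

def pixel_loss(fake, ref):
    """Mean absolute error between the reconstruction and the reference."""
    return float(np.mean(np.abs(fake - ref)))

def perceptual_loss(feat, fake, ref):
    """Mean squared distance between features of a fixed extractor `feat`
    (in practice a pretrained CNN such as VGG; here any callable works)."""
    return float(np.mean((feat(fake) - feat(ref)) ** 2))

def generator_loss(adv, pix, perc, lambda_pix=100.0, lambda_perc=1.0):
    """Weighted sum of the adversarial, pixel and perceptual terms.
    The weights are placeholders for illustration only."""
    return adv + lambda_pix * pix + lambda_perc * perc
```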

CLC number: TP391