
SCIENCE CHINA Information Sciences, Volume 63, Issue 12: 222101 (2020). https://doi.org/10.1007/s11432-020-2880-3

Color and direction-invariant nonlocal self-similarity prior and its application to color image denoising

  • Received: Jan 4, 2020
  • Accepted: Apr 14, 2020
  • Published: Nov 2, 2020

Abstract


Supplement

Appendixes A–D.


References

[1] Buades A, Coll B, Morel J M. A non-local algorithm for image denoising. In: Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2005.

[2] Dabov K, Foi A, Katkovnik V. Image denoising by sparse 3-D transform-domain collaborative filtering. IEEE Trans Image Process, 2007, 16: 2080-2095.

[3] Gu S, Xie Q, Meng D. Weighted nuclear norm minimization and its applications to low level vision. Int J Comput Vis, 2017, 121: 183-208.

[4] Ji H, Liu C C, Shen Z W, et al. Robust video denoising using low rank matrix completion. In: Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2010.

[5] Yu G, Sapiro G, Mallat S. Solving inverse problems with piecewise linear estimators: from Gaussian mixture models to structured sparsity. IEEE Trans Image Process, 2012, 21: 2481-2499.

[6] Mairal J, Bach F, Ponce J, et al. Non-local sparse models for image restoration. In: Proceedings of the 12th International Conference on Computer Vision, 2009. 2272--2279.

[7] Dong W, Zhang L, Shi G. Nonlocally centralized sparse representation for image restoration. IEEE Trans Image Process, 2013, 22: 1620-1630.

[8] Dabov K, Foi A, Katkovnik V, et al. Color image denoising via sparse 3D collaborative filtering with grouping constraint in luminance-chrominance space. In: Proceedings of IEEE International Conference on Image Processing, 2007.

[9] Xu J, Zhang L, Zhang D, et al. Multi-channel weighted nuclear norm minimization for real color image denoising. In: Proceedings of IEEE International Conference on Computer Vision, 2017. 1096--1104.

[10] McLachlan G J, Basford K E. Mixture Models: Inference and Applications to Clustering. New York: Marcel Dekker, 1988.

[11] Rudin L I, Osher S, Fatemi E. Nonlinear total variation based noise removal algorithms. Physica D: Nonlinear Phenomena, 1992, 60: 259-268.

[12] Zuo W M, Zhang L, Song C W, et al. Texture enhanced image denoising via gradient histogram preservation. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2013. 1203--1210.

[13] Do M N, Vetterli M. Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance. IEEE Trans Image Process, 2002, 11: 146-158.

[14] Portilla J, Strela V, Wainwright M J. Image denoising using scale mixtures of Gaussians in the wavelet domain. IEEE Trans Image Process, 2003, 12: 1338-1351.

[15] Peyré G, Bougleux S, Cohen L. Non-local regularization of inverse problems. In: Proceedings of European Conference on Computer Vision, 2008. 57--68.

[16] Zoran D, Weiss Y. From learning models of natural image patches to whole image restoration. In: Proceedings of International Conference on Computer Vision, 2011. 479--486.

[17] Dong W, Shi G, Li X. Nonlocal image restoration with bilateral variance estimation: a low-rank approach. IEEE Trans Image Process, 2013, 22: 700-711.

[18] Rajwade A, Rangarajan A, Banerjee A. Image denoising using the higher order singular value decomposition. IEEE Trans Pattern Anal Mach Intell, 2013, 35: 849-862.

[19] Zhang K, Zuo W, Chen Y. Beyond a Gaussian denoiser: residual learning of deep CNN for image denoising. IEEE Trans Image Process, 2017, 26: 3142-3155.

[20] Chen Y J, Yu W, Pock T. On learning optimized reaction diffusion processes for effective image restoration. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2015. 5261--5269.

[21] Chen F, Zhang L, Yu H M. External patch prior guided internal clustering for image denoising. In: Proceedings of IEEE International Conference on Computer Vision, 2015. 603--611.

[22] Elad M, Aharon M. Image denoising via sparse and redundant representations over learned dictionaries. IEEE Trans Image Process, 2006, 15: 3736-3745.

[23] Pang Y, Xie J, Li X. Visual haze removal by a unified generative adversarial network. IEEE Trans Circuits Syst Video Technol, 2019, 29: 3211-3221.

[24] Sun H, Pang Y. GlanceNets - efficient convolutional neural networks with adaptive hard example mining. Sci China Inf Sci, 2018, 61: 109101.

[25] Pang Y W, Li Y Z, Shen J B, et al. Towards bridging semantic gap to improve semantic segmentation. In: Proceedings of IEEE International Conference on Computer Vision, 2019. 4230--4239.

[26] Zhou Z H. Abductive learning: towards bridging machine learning and logical reasoning. Sci China Inf Sci, 2019, 62: 76101.

[27] Kervrann C, Boulanger J, Coupé P. Bayesian non-local means filter, image redundancy and adaptive dictionaries for noise removal. In: Proceedings of International Conference on Scale Space and Variational Methods in Computer Vision, 2007. 520--532.

[28] Zhu F Y, Chen G Y, Heng P A. From noise modeling to blind image denoising. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2016. 420--429.

[29] Lebrun M, Colom M, Morel J M. Multiscale image blind denoising. IEEE Trans Image Process, 2015, 24: 3149-3161.

[30] Lebrun M, Colom M, Morel J M. The Noise Clinic: a blind image denoising algorithm. Image Processing On Line, 2015, 5: 1-54.

[31] Nam S, Hwang Y, Matsushita Y, et al. A holistic approach to cross-channel image noise modeling and its application to image denoising. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2016. 1683--1691.

[32] Elmoataz A, Lezoray O, Bougleux S. Nonlocal discrete regularization on weighted graphs: a framework for image and manifold processing. IEEE Trans Image Process, 2008, 17: 1047-1060.

[33] Gilboa G, Osher S. Nonlocal operators with applications to image processing. Multiscale Model Simul, 2009, 7: 1005-1028.

[34] International Telecommunication Union. Parameter values for the HDTV standards for production and international programme exchange. BT.709-5. https://www.itu.int/rec/R-REC-BT.709/en.

[35] Roth S, Black M J. Fields of experts. Int J Comput Vis, 2009, 82: 205-229.

[36] Cao X Y, Chen Y, Zhao Q, et al. Low-rank matrix factorization under general mixture noise distributions. In: Proceedings of IEEE International Conference on Computer Vision, 2015. 1493--1501.

[37] Yong H W, Meng D Y, Zuo W M, et al. Robust online matrix factorization for dynamic background subtraction. IEEE Trans Pattern Anal Mach Intell, 2018.

[38] Dempster A P, Laird N M, Rubin D B. Maximum likelihood from incomplete data via the EM algorithm. J R Stat Soc Ser B (Methodological), 1977, 39: 1-22.

[39] Boyd S, Parikh N, Chu E, et al. Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers. Boston: Now Foundations and Trends, 2011.

[40] Krishnan D, Fergus R. Fast image deconvolution using hyper-Laplacian priors. In: Proceedings of International Conference on Neural Information Processing Systems, 2009. 1033--1041.

[41] Wang Z, Bovik A C, Sheikh H R. Image quality assessment: from error visibility to structural similarity. IEEE Trans Image Process, 2004, 13: 600-612.

[42] Zhang L, Zhang L, Mou X. FSIM: a feature similarity index for image quality assessment. IEEE Trans Image Process, 2011, 20: 2378-2386.

[43] Wang Z, Simoncelli E P, Bovik A C. Multiscale structural similarity for image quality assessment. In: Proceedings of the 37th Asilomar Conference on Signals, Systems & Computers, 2003. 1398--1402.

[44] Chen G Y, Zhu F Y, Heng P A. An efficient statistical method for image noise level estimation. In: Proceedings of IEEE International Conference on Computer Vision, 2015. 477--485.

  • Figure 1

    Illustration of the CDI-NSS in a color image. (a) A natural color image. (b) Image patches with similar structure but different colors. (c) Image patches with similar structure but different texture directions. (d) Angle-adjusted results of the patches in (c). (e) The core structural patch of the image patches in (b) and (d). (f) Several core patches obtained by applying the proposed method to the clean image in (a). (g), (h) Illustrations of the color vectors ${\boldsymbol c}$ and ${\boldsymbol b}$ estimated by the proposed method; since there are $H\times W$ such 3-dimensional vectors, we illustrate them as $H\times W\times 3$ RGB images. (i) Rotation angles $\boldsymbol{\theta}\in\mathbb{R}^{H\times W}$ estimated by the proposed method, where the correspondence between angle and color is shown at the top left corner. (j) The class labels of the patches obtained by the proposed method, where different colors refer to different class labels.

  • Figure 2

    (a) An example of a circle-like patch with diameter $m=9$ and $p=4$. (b) Illustration of the fitting function $z(\cdot,\cdot)$, whose polynomial coefficients are ${\boldsymbol a}$. (c) Illustration of the function $\tilde{z}(\cdot,\cdot)$, whose polynomial coefficients are $f_\theta({\boldsymbol a})$. (d) Rotated result of the patch shown in (a).
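
The fit-and-rotate step illustrated in Figure 2 can be prototyped directly: fit a low-order bivariate polynomial to the circle-like patch by least squares, then evaluate the fitted surface at back-rotated coordinates; the coefficient mapping $f_\theta({\boldsymbol a})$ is exactly this operation expressed as a linear map on ${\boldsymbol a}$. The sketch below is a minimal numpy illustration under these assumptions (a generic least-squares fit, a square array with a circular mask, polynomial order $p$); it is not the paper's closed-form construction, and all names are illustrative.

```python
import numpy as np

def poly_design(xs, ys, p):
    """Monomial design matrix [x^i * y^j for i + j <= p] at the given coordinates."""
    cols = [(xs ** i) * (ys ** j) for i in range(p + 1) for j in range(p + 1 - i)]
    return np.stack(cols, axis=1)

def fit_patch(patch, p=4):
    """Least-squares fit of an order-p bivariate polynomial to a circle-like patch;
    pixels outside the inscribed circle are ignored."""
    m = patch.shape[0]
    r = (m - 1) / 2.0
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1]
    mask = xs ** 2 + ys ** 2 <= r ** 2 + 1e-9
    A = poly_design(xs[mask], ys[mask], p)
    a, *_ = np.linalg.lstsq(A, patch[mask], rcond=None)
    return a, (xs, ys, mask)

def rotate_patch(patch, theta, p=4):
    """Rotate a circle-like patch by theta (radians): evaluate the fitted polynomial
    at back-rotated coordinates, i.e., apply a linear map to the coefficients a."""
    a, (xs, ys, mask) = fit_patch(patch, p)
    xr = np.cos(theta) * xs + np.sin(theta) * ys   # back-rotated coordinates
    yr = -np.sin(theta) * xs + np.cos(theta) * ys
    out = np.full_like(patch, np.nan, dtype=float)
    out[mask] = poly_design(xr[mask], yr[mask], p) @ a
    return out

# toy usage: a 9x9 patch with a diagonal ramp, rotated by 45 degrees
patch = np.add.outer(np.arange(9.0), np.arange(9.0))
rotated = rotate_patch(patch, np.pi / 4)
```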

  • Figure 3

    (a) Performing ${\boldsymbol f}_k^{\rm T}{\boldsymbol z}$ for all image patches in an image ${\boldsymbol X}$ is equivalent to performing the convolution ${\boldsymbol F}_k\ast{\boldsymbol X}$. (b) An image matrix ${\boldsymbol X}$. (c) Polynomial coefficients of ${\boldsymbol X}$. (d) Illustration of rotating a set of polynomial coefficients by the rotation matrix ${\boldsymbol U}_\theta$, where we take the third-order case as an example. (e) The rotated polynomial coefficients.
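
The equivalence stated in Figure 3(a) is the standard identity between patch-wise inner products and 2-D convolution with the flipped filter. A small numerical check, using square $m\times m$ patches for simplicity (the paper's patches are circle-like), might look like the following; the names and sizes are chosen only for illustration.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)
X = rng.standard_normal((32, 32))    # image matrix X
m = 5                                # patch size for this toy check
f_k = rng.standard_normal(m * m)     # one analysis filter, as a length-m^2 vector

# patch-wise inner products: f_k^T z for every m x m patch z of X
H, W = X.shape
patchwise = np.empty((H - m + 1, W - m + 1))
for i in range(H - m + 1):
    for j in range(W - m + 1):
        patchwise[i, j] = f_k @ X[i:i + m, j:j + m].ravel()

# the same result as a single convolution F_k * X, where F_k is f_k reshaped to
# m x m and flipped in both directions (convolution vs. correlation)
F_k = f_k.reshape(m, m)[::-1, ::-1]
assert np.allclose(patchwise, convolve2d(X, F_k, mode='valid'))
```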

  • Figure 4

    The 16 color images employed in the synthetic experiments.

  • Figure 5

    (a) Clean image. (b) Corresponding noisy image. (c)–(j) Restored images obtained by the 8 competing methods. The demarcated areas are provided for easier observation of details.

  • Figure 6

    (a) A real noisy color image from dataset [29]. (b)–(i) The restored images obtained by 8 competing methods.

  • Figure 7

    (a) A real noisy color image from dataset [30]. (b)–(i) The restored images obtained by 8 competing methods.

  • Table 1  

    Table 1  Average performance of the 8 competing methods with respect to 4 PQIs. For both noise settings, the results are averaged over the 16 images.

    Method    |        $\lambda=0.10^2$        |        $\lambda=0.15^2$        | Average
              | PSNR    SSIM   FSIM   MS-SSIM  | PSNR    SSIM   FSIM   MS-SSIM  | time (s)
    Noisy     | 19.998  0.484  0.812  0.295    | 16.476  0.350  0.728  0.272    | -
    BM3D      | 27.913  0.845  0.925  0.320    | 25.781  0.782  0.893  0.312    | 0.47
    WNNM      | 28.249  0.852  0.928  0.321    | 26.185  0.791  0.896  0.313    | 484.94
    NCSR      | 27.963  0.843  0.923  0.320    | 25.769  0.777  0.883  0.311    | 1427.69
    PCLR      | 28.341  0.855  0.930  0.321    | 26.281  0.796  0.897  0.314    | 279.19
    EPLL      | 27.973  0.850  0.932  0.321    | 25.972  0.789  0.902  0.313    | 105.28
    CBM3D     | 29.094  0.881  0.935  0.324    | 26.793  0.823  0.904  0.317    | 0.45
    MCWNNM    | 29.119  0.874  0.930  0.323    | 26.926  0.819  0.896  0.316    | 115.60
    CDI-MoG   | 29.257  0.882  0.937  0.324    | 27.130  0.831  0.910  0.317    | 984.05
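
For reference, two of the four PQIs reported above (PSNR and SSIM) can be computed with scikit-image as sketched below; MS-SSIM and FSIM are not part of scikit-image and need separate implementations. The snippet assumes images scaled to $[0,1]$, interprets $\lambda=0.10^2$ as the noise variance (standard deviation 0.10), and is only an illustration, not the exact evaluation code behind Table 1.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity  # scikit-image >= 0.19

def psnr_ssim(clean, denoised):
    """PSNR and SSIM for color images with values in [0, 1]."""
    psnr = peak_signal_noise_ratio(clean, denoised, data_range=1.0)
    ssim = structural_similarity(clean, denoised, data_range=1.0, channel_axis=-1)
    return psnr, ssim

# toy usage with the lambda = 0.10^2 noise setting
rng = np.random.default_rng(0)
clean = rng.random((64, 64, 3))
noisy = np.clip(clean + 0.10 * rng.standard_normal(clean.shape), 0.0, 1.0)
print(psnr_ssim(clean, noisy))
```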
  •   

    Algorithm 1  Algorithm for the CDI-MoG method

    Input: Noisy image $\mathcal{Y}$.
    Output: Denoised image $\mathcal{X}^{(L)}$.
    Initialize $\mathcal{X}^{(0)}$, $\boldsymbol\theta^{(0)}$, ${\boldsymbol c}^{(0)}$, ${\boldsymbol b}^{(0)}$, $\boldsymbol\mu^{(0)}$, $\boldsymbol\pi^{(0)}$, and $\sigma^{(0)}$;
    for $l=1:L$ do
        E step: update the posterior responsibilities by Eq. (21);
        M step: update $\boldsymbol\mu$, ${\boldsymbol c}$, ${\boldsymbol b}$, $\sigma$, $\boldsymbol\pi$, and $\boldsymbol\theta$ by Eqs. (23)–(25);
        while not converged do
            Update $\mathcal{X}$ by Eq. (36) and update $\mathcal{B}$ by Eq. (38);
            Update $\mathcal{L}$ by Eq. (39) and let $\mu := \rho\mu$;
        end while
    end for
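
Algorithm 1 alternates an outer EM loop (the E step of Eq. (21) and the M step of Eqs. (23)–(25)) with an inner ADMM-style loop for the image update, whose penalty parameter grows as $\mu:=\rho\mu$ [39]. The sketch below only illustrates the generic E-step/M-step mechanics on a plain 1-D mixture of Gaussians; it does not implement the CDI-MoG update rules, and every name in it is illustrative.

```python
import numpy as np

def mog_em(x, K=3, n_iter=50, seed=0):
    """Plain EM for a 1-D mixture of Gaussians: the same E/M alternation that the
    outer loop of Algorithm 1 performs, without the CDI-specific parameters."""
    rng = np.random.default_rng(seed)
    n = x.size
    pi = np.full(K, 1.0 / K)              # mixing weights
    mu = rng.choice(x, K, replace=False)  # component means
    var = np.full(K, x.var())             # component variances
    for _ in range(n_iter):
        # E step: posterior responsibility of each component for each sample
        logp = (-0.5 * (x[:, None] - mu) ** 2 / var
                - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        logp -= logp.max(axis=1, keepdims=True)
        resp = np.exp(logp)
        resp /= resp.sum(axis=1, keepdims=True)
        # M step: re-estimate weights, means and variances from the responsibilities
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# toy usage: residuals drawn from a two-component mixture
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 0.05, 5000), rng.normal(0.0, 0.3, 1000)])
print(mog_em(x, K=2))
```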