Low-rank tensor completion using nonconvex total variation

dc.citation.epage374
dc.citation.issue2
dc.citation.journalTitleМатематичне моделювання та комп'ютинг
dc.citation.spage365
dc.contributor.affiliationУніверситет Каді Айяд
dc.contributor.affiliationCadi Ayyad University
dc.contributor.authorМохауї, С.
dc.contributor.authorЕль Кате, К.
dc.contributor.authorХакім, А.
dc.contributor.authorРагей, С.
dc.contributor.authorMohaoui, S.
dc.contributor.authorEl Qate, K.
dc.contributor.authorHakim, A.
dc.contributor.authorRaghay, S.
dc.coverage.placenameЛьвів
dc.coverage.placenameLviv
dc.date.accessioned2025-03-04T11:14:23Z
dc.date.created2022-02-28
dc.date.issued2022-02-28
dc.description.abstractУ цій роботі вивчається задача тензорного доповнення, в якій головним є передбачення відсутніх значень у візуальних даних. Щоб отримати максимальну користь із гладкої структури та властивості збереження країв у візуальних зображеннях, пропонується модель тензорного доповнення, яка шукає розрідженість градієнта за допомогою l0-норми. Пропозиція поєднує в собі матричну факторизацію низького рангу, яка гарантує властивість низького рангу, та неопуклу повну варіацію (ПВ). Подано декілька експериментів, щоб продемонструвати ефективність запропонованої моделі порівняно з популярними методами тензорного доповнення з точки зору візуальних і кількісних показників.
dc.description.abstractIn this work, we study the tensor completion problem in which the main point is to predict the missing values in visual data. To greatly benefit from the smoothness structure and edge-preserving property in visual images, we suggest a tensor completion model that seeks gradient sparsity via the l0-norm. The proposal combines the low-rank matrix factorization which guarantees the low-rankness property and the nonconvex total variation (TV). We present several experiments to demonstrate the performance of our model compared with popular tensor completion methods in terms of visual and quantitative measures.
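The combination described in the abstract, a low-rank factorization term plus an l0-norm (nonconvex TV) gradient-sparsity term with the observed entries kept fixed, can be sketched very loosely on a single matrix slice. The NumPy toy below is our illustrative assumption, not the paper's algorithm: the function name `low_rank_tv_complete`, the truncated-SVD projection standing in for the matrix factorization, and the gradient-snapping surrogate for the l0 prox are all simplifications.

```python
import numpy as np

def low_rank_tv_complete(M, mask, rank=2, lam=0.05, n_iter=300):
    """Fill in the missing entries of M (observed where mask is True).

    A minimal sketch of the idea, not the authors' method: alternate
    (1) a hard low-rank projection via truncated SVD, (2) a crude
    surrogate for the l0-gradient (nonconvex TV) prox that flattens
    finite-difference jumps smaller than `lam`, and (3) re-imposing
    the observed entries.  Names and thresholds are assumptions.
    """
    X = np.where(mask, M, M[mask].mean())        # initialise missing entries
    for _ in range(n_iter):
        # (1) project onto the set of rank-`rank` matrices
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # (2) sparsify the horizontal gradient: snap near-equal
        #     neighbours to their average (an l0-style smoothing)
        d = np.diff(X, axis=1)
        small = np.abs(d) < lam
        avg = 0.5 * (X[:, :-1] + X[:, 1:])
        Y = X.copy()
        Y[:, :-1] = np.where(small, avg, Y[:, :-1])
        Y[:, 1:] = np.where(small, avg, Y[:, 1:])
        # (3) data consistency: observed entries are kept exact
        X = np.where(mask, M, Y)
    return X
```

With `lam=0` the smoothing step is inert and the loop reduces to plain iterative hard-rank SVD completion; the TV-like step only matters on piecewise-smooth data, which is where the paper's nonconvex term is claimed to help.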
dc.format.extent365-374
dc.format.pages10
dc.identifier.citationLow-rank tensor completion using nonconvex total variation / S. Mohaoui, K. El Qate, A. Hakim, S. Raghay // Mathematical Modeling and Computing. — Lviv : Lviv Politechnic Publishing House, 2022. — Vol. 9. — No. 2. — P. 365–374.
dc.identifier.doihttps://doi.org/10.23939/mmc2022.02.365
dc.identifier.urihttps://ena.lpnu.ua/handle/ntb/63437
dc.language.isoen
dc.publisherВидавництво Львівської політехніки
dc.publisherLviv Politechnic Publishing House
dc.relation.ispartofМатематичне моделювання та комп'ютинг, 2 (9), 2022
dc.relation.ispartofMathematical Modeling and Computing, 2 (9), 2022
dc.relation.references[1] Kolda T. G., Bader B. W. Tensor decompositions and applications. SIAM Review. 51 (3), 455–500 (2009).
dc.relation.references[2] Xu Y., Hao R., Yin W., Su Z. Parallel matrix factorization for low-rank tensor completion. Preprint arXiv:1312.1254 (2013).
dc.relation.references[3] He W., Zhang H., Zhang L., Shen H. Total-variation-regularized low-rank matrix factorization for hyperspectral image restoration. IEEE Transactions on Geoscience and Remote Sensing. 54 (1), 178–188 (2015).
dc.relation.references[4] Ji T.-Y., Huang T.-Z., Zhao X.-L., Ma T.-H., Liu G. Tensor completion using total variation and low-rank matrix factorization. Information Sciences. 326, 243–257 (2016).
dc.relation.references[5] Jiang T.-X., Huang T.-Z., Zhao X.-L., Ji T.-Y., Deng L.-J. Matrix factorization for low-rank tensor completion using framelet prior. Information Sciences. 436–437, 403–417 (2018).
dc.relation.references[6] Ben-Loghfyry A., Hakim A. Time-fractional diffusion equation for signal and image smoothing. Mathematical Modeling and Computing. 9 (2), 351–364 (2022).
dc.relation.references[7] Alaa H., Alaa N. E., Atounti M., Aqel F. A new mathematical model for contrast enhancement in digital images. Mathematical Modeling and Computing. 9 (2), 342–350 (2022).
dc.relation.references[8] Alaa H., Alaa N. E., Aqel F., Lefraich H. A new Lattice Boltzmann method for a Gray-Scott based model applied to image restoration and contrast enhancement. Mathematical Modeling and Computing. 9 (2), 187–202 (2022).
dc.relation.references[9] Mohaoui S., Hakim A., Raghay S. Bi-dictionary learning model for medical image reconstruction from undersampled data. IET Image Processing. 14 (10), 2130–2139 (2020).
dc.relation.references[10] Mohaoui S., Hakim A., Raghay S. A combined dictionary learning and TV model for image restoration with convergence analysis. Journal of Mathematical Modeling. 9 (1), 13–30 (2021).
dc.relation.references[11] Rudin L. I., Osher S., Fatemi E. Nonlinear total variation based noise removal algorithms. Physica D: Nonlinear Phenomena. 60 (1–4), 259–268 (1992).
dc.relation.references[12] Wang M., Wang Q., Chanussot J. Tensor low-rank constraint and l0 total variation for hyperspectral image mixed noise removal. IEEE Journal of Selected Topics in Signal Processing. 15 (3), 718–733 (2021).
dc.relation.references[13] Banouar O., Mohaoui S., Raghay S. Collaborating filtering using unsupervised learning for image reconstruction from missing data. EURASIP Journal on Advances in Signal Processing. 2018, 72 (2018).
dc.relation.references[14] Mohaoui S., Hakim A., Raghay S. Tensor completion via bilevel minimization with fixed-point constraint to estimate missing elements in noisy data. Advances in Computational Mathematics. 47 (1), 10 (2021).
dc.relation.references[15] Liu J., Musialski P., Wonka P., Ye J. Tensor completion for estimating missing values in visual data. IEEE Transactions on Pattern Analysis and Machine Intelligence. 35 (1), 208–220 (2012).
dc.relation.references[16] Xu L., Zheng S., Jia J. Unnatural l0 sparse representation for natural image deblurring. 2013 IEEE Conference on Computer Vision and Pattern Recognition. 1107–1114 (2013).
dc.relation.references[17] Ono S. l0 gradient projection. IEEE Transactions on Image Processing. 26 (4), 1554–1564 (2017).
dc.relation.references[18] Xue S., Qiu W., Liu F., Jin X. Low-rank tensor completion by truncated nuclear norm regularization. 2018 24th International Conference on Pattern Recognition (ICPR). 2600–2605 (2018).
dc.relation.references[19] Wright J., Ganesh A., Rao S., Ma Y. Robust principal component analysis: Exact recovery of corrupted low-rank matrices via convex optimization. Advances in Neural Information Processing Systems 22 (NIPS 2009). 22 (2009).
dc.rights.holder© Lviv Polytechnic National University, 2022
dc.subjectтензорне доповнення
dc.subjectпропущені значення
dc.subjectпаралельна матрична факторизація
dc.subjectнеопукла ПВ
dc.subjecttensor completion
dc.subjectmissing values
dc.subjectparallel matrix factorization
dc.subjectnonconvex TV
dc.titleLow-rank tensor completion using nonconvex total variation
dc.title.alternativeДоповнення тензора низького рангу з використанням неопуклої повної варіації
dc.typeArticle

Files

Original bundle (2 files)

Name: 2022v9n2_Mohaoui_S-Low_rank_tensor_completion_365-374.pdf
Size: 1.46 MB
Format: Adobe Portable Document Format

Name: 2022v9n2_Mohaoui_S-Low_rank_tensor_completion_365-374__COVER.png
Size: 462.74 KB
Format: Portable Network Graphics

License bundle (1 file)

Name: license.txt
Size: 1.84 KB
Format: Plain Text