Adaptive Total Variation And Second-order Total Variation-based Model For Low-rank Tensor Completion And Its Applications

Posted on: 2022-06-21 | Degree: Master | Type: Thesis
Country: China | Candidate: X Li | Full Text: PDF
GTID: 2480306524481314 | Subject: Mathematics

Abstract/Summary:
In recent years, with the rapid advancement of the information age, the data handled in many industries have become increasingly complex and large in scale, for example ultra-high-definition color images, videos, traffic information, and remote sensing data. Such higher-dimensional data are referred to as tensors. During tensor acquisition and transmission, however, data loss is inevitable due to a variety of factors. Inferring the unknown entries from the known data is crucial for subsequent applications; this task is called tensor completion. Since most real-world data are approximately low-rank, the problem is also known as low-rank tensor completion (LRTC). This thesis studies low-rank tensor completion, and the main contributions are as follows.

A parallel matrix factorization model regularized by first- and second-order total variation is proposed to address this problem. Although low-rank regularization has recently achieved considerable success in tensor completion, considering global low-rankness alone is insufficient: it exploits only part of the tensor structure, particularly at a low sampling rate (SR), and the recovered data may lose many critical details. Since total variation regularization exploits local prior information of the data, it is effective for ill-posed inverse problems such as tensor completion. However, total variation favors piecewise-constant solutions and usually introduces an undesirable staircase effect and false edges. Higher-order total variation can suppress the staircase effect by exploiting higher-order information, but it also smooths away details and destroys genuine discontinuities. To address these problems, the proposed model integrates local smoothness and global low-rankness by combining first-order and second-order total variation regularization terms with a data-fidelity term, so that it balances discontinuity and smoothness while still capturing global structure. To solve the model, an efficient algorithm based on proximal alternating optimization (PAO) is developed, and its convergence is analyzed theoretically. In addition, a parameter-selection strategy is introduced to automatically update the two regularization parameters, which balances the two regularization terms and yields better results.

The algorithm is applied to three kinds of tensor data, namely color images, grayscale video, and magnetic resonance imaging (MRI), at different sampling rates. Experimental results show that the proposed method is effective, especially at very low sampling rates. Compared with other methods, it obtains better numerical results, alleviates the staircase effect, and produces good visual quality while preserving edges.
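As a rough illustration of how the data-fidelity term and the two total variation terms fit together, the following sketch evaluates such a combined objective for a 3-way tensor with a known observation mask. This is only a minimal NumPy sketch under assumed definitions (anisotropic total variation along the two spatial modes, illustrative weights lam1 and lam2, and illustrative function names); the thesis's actual model additionally couples these terms with parallel matrix factorization and is solved by proximal alternating optimization, not by directly evaluating this objective.

```python
import numpy as np

def tv1(x, axis):
    # First-order (anisotropic) total variation along one mode: sum |x_{i+1} - x_i|.
    return np.abs(np.diff(x, n=1, axis=axis)).sum()

def tv2(x, axis):
    # Second-order total variation along one mode: sum |x_{i+2} - 2*x_{i+1} + x_i|.
    return np.abs(np.diff(x, n=2, axis=axis)).sum()

def objective(X, M, mask, lam1, lam2):
    # Illustrative objective for TV-regularized tensor completion:
    # data fidelity on the observed entries plus weighted first- and
    # second-order TV over the two spatial modes of a 3-way tensor X.
    # M holds the observed values; mask is 1 where an entry is known, 0 otherwise.
    fidelity = 0.5 * np.sum(mask * (X - M) ** 2)
    smooth1 = sum(tv1(X, ax) for ax in (0, 1))
    smooth2 = sum(tv2(X, ax) for ax in (0, 1))
    return fidelity + lam1 * smooth1 + lam2 * smooth2

# Example: a 64x64x3 "image" with 20% of entries observed (sampling rate 0.2).
rng = np.random.default_rng(0)
M = rng.random((64, 64, 3))
mask = (rng.random(M.shape) < 0.2).astype(float)
X = mask * M  # trivial initialization with the observed values
print(objective(X, M, mask, lam1=0.1, lam2=0.05))
```

In this simplified picture, lam1 and lam2 play the role of the two regularization parameters that the thesis updates automatically to trade off edge preservation (first-order term) against staircase suppression (second-order term).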
Keywords/Search Tags: Low-rank tensor completion, first-order total variation, staircase effect, second-order total variation, proximal alternating optimization