
Projective Non-negative Matrix Factorization And Its Applications

Posted on: 2016-08-20    Degree: Doctor    Type: Dissertation
Country: China    Candidate: X Zha    Full Text: PDF
GTID: 1310330536967162    Subject: Computer Science and Technology
Abstract/Summary:
Projective non-negative matrix factorization (PNMF) learns a projective matrix that maps high-dimensional data into a lower-dimensional subspace, and it also directly yields the coefficients of the high-dimensional data; the former is used for classification and the latter for clustering. Unlike traditional decomposition methods such as eigenvalue decomposition, PNMF is designed to factorize non-negative data while preserving the non-negativity of the original data. Because its operations are purely additive and never subtractive, PNMF learns a parts-based representation consistent with the human intuition that parts form the whole, and this representation also admits a physical interpretation. In contrast with non-negative matrix factorization (NMF), PNMF employs the transpose of the basis as the projective matrix, which guarantees the non-negativity of the coefficients of both the training and the test set; NMF cannot achieve this and therefore suffers from the so-called "out-of-sample" problem. In addition, PNMF yields a sparse representation and learns coefficients that act as a class indicator matrix. Benefiting from these properties, PNMF has been widely applied in fields such as image classification, image segmentation, text clustering, and visual tracking. In summary, PNMF is of significant practical value: it not only complements NMF theoretically but also has broad applicability.

This dissertation presents a line of work on PNMF and its applications. To exploit label information, we propose a discriminant PNMF (DPNMF) that enhances the classification performance of PNMF. To improve the scalability of DPNMF, we develop an online DPNMF method that incrementally updates both the within-class and between-class scatter matrices while learning the non-negative projective matrix. Since analyzing the convergence of PNMF's optimization method is non-trivial, we present the box-constrained PNMF (BPNMF). To further strengthen PNMF for classification and image retrieval, we propose the box-constrained DPNMF (BDPNMF) and the relevance-feedback-based biased BDPNMF (B2DPNMF); we develop an efficient algorithm for BDPNMF and prove that it converges to a local minimum when the regularization term of BDPNMF is convex. Because PNMF completely ignores label information, it performs unsatisfactorily in clustering; we therefore propose semi-supervised PNMF (Semi-PNMF) to remedy this deficiency. Finally, to handle multi-modal features, we propose online multi-modal robust non-negative dictionary learning (OMRNDL) for visual tracking. Our major contributions are summarized as follows.

1. Discriminative projective non-negative matrix factorization. Since PNMF completely ignores label information, it does not work well for classification. To overcome this deficiency, discriminative NMF (DNMF) was developed by incorporating Fisher's criterion into NMF; however, like NMF, DNMF suffers from the "out-of-sample" problem, which conflicts with the non-negativity property. Another alternative is non-negative projective graph embedding (NPGE), built on the graph embedding framework, but NPGE has difficulty partitioning the subspace and therefore offers limited classification performance. We thus propose a discriminant PNMF method (DPNMF) that incorporates Fisher's criterion into PNMF and adopts a weighting strategy to guarantee the convexity of the Fisher criterion, thereby avoiding the singularity problem of Fisher linear discriminant analysis (FLDA). To solve DPNMF, we develop a multiplicative update rule (MUR) and prove its convergence. Experimental results on four popular face datasets show that DPNMF quantitatively outperforms DNMF and NPGE.
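For concreteness, the following is a minimal NumPy sketch of the projective factorization used throughout this work: it approximately minimizes ||X - W W^T X||_F^2 over a non-negative basis W with a multiplicative update of the form commonly used for PNMF. The function name, iteration count, and normalization choice are illustrative, not the dissertation's exact algorithm.

```python
import numpy as np

def pnmf(X, r, n_iter=200, eps=1e-9):
    """Minimal PNMF sketch: approximately minimize ||X - W W^T X||_F^2
    over a non-negative basis W.

    X : (d, n) non-negative data matrix whose columns are samples.
    r : dimension of the learned subspace.
    """
    d, _ = X.shape
    W = np.abs(np.random.rand(d, r))
    XXt = X @ X.T                              # cache X X^T once (d x d)
    for _ in range(n_iter):
        XXtW = XXt @ W
        numer = 2.0 * XXtW                     # positive part of the gradient
        denom = W @ (W.T @ XXtW) + XXtW @ (W.T @ W) + eps
        W *= numer / denom                     # multiplicative update keeps W >= 0
        W /= np.linalg.norm(W, 2)              # normalization step mentioned in the text
    return W

# Coefficients of any data (training or unseen test samples) follow directly
# as H = W.T @ X_new, which stays non-negative and sidesteps the
# "out-of-sample" problem discussed above.
```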
2. Online discriminant PNMF. Neither NMF nor PNMF can handle large-scale or streaming datasets, which motivated online NMF (ONMF) and online PNMF (OPNMF), both of which work incrementally; however, they still underperform in classification. This dissertation proposes an online discriminant PNMF (ODPNMF) method to improve on both ONMF and OPNMF. ODPNMF builds an online MUR through two strategies: 1) it retains historical knowledge incrementally while updating both scatter matrices, and 2) it uses a randomized algorithm to obtain the weight of the Fisher criterion, reducing the time overhead of the SVD operation. Experiments on four face datasets suggest that ODPNMF consistently outperforms both ONMF and OPNMF. To demonstrate its promise, we develop an ODPNMF tracker within the particle filter framework; to suppress the influence of the background template on the object template, this tracker updates only the within-class scatter matrix, keeping examples of the same class close to each other. Experimental results on four video sequences show that the ODPNMF tracker tracks stably.
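As a hedged illustration of the incremental scatter bookkeeping described above, the sketch below maintains per-class sums and counts so that both scatter matrices can be rebuilt at any time without storing historical samples; the ScatterTracker class and its fields are illustrative and not the dissertation's implementation.

```python
import numpy as np
from collections import defaultdict

class ScatterTracker:
    """Incrementally maintain the statistics needed for the within-class (Sw)
    and between-class (Sb) scatter matrices from streaming labeled samples."""

    def __init__(self, dim):
        self.dim = dim
        self.count = defaultdict(int)                    # samples seen per class
        self.sum = defaultdict(lambda: np.zeros(dim))    # per-class feature sums
        self.sq = np.zeros((dim, dim))                   # running sum of x x^T
        self.total = np.zeros(dim)                       # running sum of all x
        self.n = 0

    def update(self, x, y):
        """Absorb one sample x (1-D array of length dim) with class label y."""
        self.count[y] += 1
        self.sum[y] += x
        self.sq += np.outer(x, x)
        self.total += x
        self.n += 1

    def scatters(self):
        """Rebuild (Sw, Sb) from the accumulated statistics."""
        mean = self.total / self.n
        Sb = np.zeros((self.dim, self.dim))
        class_term = np.zeros((self.dim, self.dim))
        for y, c in self.count.items():
            mu = self.sum[y] / c
            Sb += c * np.outer(mu - mean, mu - mean)
            class_term += c * np.outer(mu, mu)
        Sw = self.sq - class_term    # equals sum over classes of (x - mu)(x - mu)^T
        return Sw, Sb
```

Because only the per-class sums, counts, and one running second-moment matrix are kept, the memory footprint stays constant as the stream grows, which is the point of the incremental strategy.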
3. Box-constrained PNMF. The multiplicative update rule (MUR) of PNMF fails to converge because of the non-convex fourth-order term in its objective, so PNMF must bound the basis with a normalization step in every iteration. Convergent PNMF (CPNMF) and fast PNMF (FPNMF) were recently proposed to address this issue: the former approximates the PNMF objective with a higher-order Taylor expansion to remove the fourth-order term, while the latter adaptively tunes the MUR step size to avoid the normalization step. However, neither proves that its optimization method converges to a local minimum. We propose box-constrained PNMF (BPNMF) to overcome these deficiencies. BPNMF incorporates two types of constraints: 1) a box constraint that bounds the basis, and 2) an auxiliary coefficient matrix together with an equality constraint that keeps its objective equivalent to that of PNMF. To optimize BPNMF, we develop an algorithm within the augmented Lagrangian method (ALM) framework and prove that it converges to a local minimum of BPNMF. Experimental results suggest that BPNMF performs very well in terms of both accuracy and time cost.

4. Box-constrained DPNMF. PNMF neglects label information and thus performs unsatisfactorily in classification. Although DPNMF can enhance PNMF, it is expensive in time and suffers from the same convergence problem as PNMF; BPNMF alleviates the convergence problem but still underperforms in classification. We therefore propose a box-constrained DPNMF (BDPNMF) to avoid the deficiencies of both BPNMF and DPNMF. BDPNMF guarantees that the Fisher criterion is positive semi-definite, so its optimization algorithm converges. Experiments on four face datasets suggest that BDPNMF outperforms DPNMF in terms of both accuracy and time overhead. To show the promise of BDPNMF in content-based image retrieval (CBIR), we propose the relevance-feedback-based biased BDPNMF (B2DPNMF): CBIR is a "1+x" rather than a "1+1" problem, so the positive class must be biased to improve discriminative power. Extensive image retrieval experiments suggest that B2DPNMF outperforms NMF, BDNMF, PNMF, and BDPNMF in terms of precision.

5. Semi-supervised PNMF. PNMF directly learns the coefficients of examples for classification but ignores their labels, leaving considerable room for improvement. Recently proposed semi-supervised NMF methods have been applied to both image and text clustering but still underperform. To address this issue, we propose a semi-supervised PNMF (Semi-PNMF) method that exploits both labeled and unlabeled samples to boost the classification performance of PNMF. To solve Semi-PNMF, we develop a multiplicative update rule and prove its convergence. Experimental results on two multiclass cancer gene expression profile datasets suggest that Semi-PNMF consistently outperforms representative classification methods.

6. Online multi-modal robust non-negative dictionary learning. Dictionary learning aims to represent any vector, especially under sparse representation, and most dictionary learning methods incorporate discriminative information to enhance classification performance. However, such methods cannot learn the dictionary in an online fashion, which motivated online discriminative dictionary learning (ODDL), online robust dictionary learning (ORDL), and online robust non-negative dictionary learning (ONNDL) for large-scale or streaming datasets. Yet none of these handles multi-modal features. We therefore propose online multi-modal robust non-negative dictionary learning (OMRNDL) for visual tracking. Since each modality describes one aspect of an example, it is reasonable to assume that all modalities share common semantic content; OMRNDL accordingly learns an individual dictionary for each modality while sharing a common semantic representation to update the object template. For efficient learning, we introduce multiple auxiliary variables that keep the memory footprint constant, so OMRNDL can cope with large-scale or streaming datasets. Visual tracking experiments on a benchmark demonstrate the effectiveness of OMRNDL both qualitatively and quantitatively.
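To make the shared-representation idea concrete, here is a hedged batch sketch (plain multiplicative updates, not OMRNDL's online or robust machinery) in which each modality keeps its own non-negative dictionary while all modalities share one coefficient matrix; the function name and parameters are illustrative.

```python
import numpy as np

def multimodal_nmf(X_list, r, n_iter=200, eps=1e-9):
    """Batch sketch of multi-modal non-negative dictionary learning:
    each modality m has its own dictionary D_m, and all modalities share
    one common coefficient (semantic) matrix H, i.e.
        min sum_m ||X_m - D_m H||_F^2   s.t.  D_m >= 0, H >= 0.
    X_list : list of (d_m, n) non-negative matrices with aligned columns."""
    n = X_list[0].shape[1]
    D = [np.abs(np.random.rand(X.shape[0], r)) for X in X_list]
    H = np.abs(np.random.rand(r, n))
    for _ in range(n_iter):
        # Update each modality's dictionary with a standard NMF-style step.
        for m, X in enumerate(X_list):
            D[m] *= (X @ H.T) / (D[m] @ (H @ H.T) + eps)
        # Update the shared coefficients using gradients summed over modalities.
        num = sum(D[m].T @ X for m, X in enumerate(X_list))
        den = sum(D[m].T @ D[m] @ H for m in range(len(X_list))) + eps
        H *= num / den
    return D, H
```

Under this coupling, H plays the role of the common semantic representation; OMRNDL would additionally update it online from streaming frames and weight it robustly against occlusion and background clutter.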
Keywords/Search Tags: Non-negative matrix factorization, projective non-negative matrix factorization, Fisher's criterion, online learning, augmented Lagrangian method, semi-supervised classification, visual tracking, dictionary learning