In the real world, multi-label data can be found everywhere: a sample often belongs to several categories at the same time. Because the results of multi-label learning serve as a reference for subsequent study, classifying such data correctly is very important. This thesis focuses on applying sparse representation to multi-label learning and proposes two methods: a multi-label learning algorithm based on non-negative sparse representation, which imposes a non-negativity constraint on the sparse representation; and a multi-label learning algorithm based on non-negative sparse neighbor representation, which combines non-negative sparse representation with k nearest neighbors. The main work and innovations of this thesis are as follows:

(1) To avoid the loss caused by discarding negative sparse coefficients in multi-label learning, a new multi-label learning algorithm based on non-negative sparse representation (ML-NSRC) is proposed. Because discarding negative sparse coefficients loses information useful for classification, this thesis introduces non-negative sparse representation, which avoids negative coefficients altogether and retains as much information as possible for the subsequent classification step. To obtain the non-negative sparse coefficients, a fast iterative algorithm is given, together with an analysis of its convergence to the global minimum. The proposed algorithm is tested on several public multi-label data sets and compared with the classical multi-label learning algorithms ML-SRC and ML-KNN (A Lazy Learning Approach to Multi-Label Learning).
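The thesis does not spell out its iterative scheme here, but a non-negative LASSO-type objective of the kind described can be minimized by projected ISTA; the sketch below is my own illustration under that assumption, not the thesis's exact algorithm. Since the objective is convex, the iterates approach the global minimum:

```python
import numpy as np

def nonneg_sparse_code(D, y, lam=0.1, n_iter=500):
    """Solve  min_x 0.5*||y - D x||^2 + lam*sum(x)  s.t. x >= 0
    by projected ISTA (illustrative; not the thesis's exact solver).
    For non-negative x the L1 penalty reduces to lam*sum(x)."""
    L = np.linalg.norm(D, 2) ** 2      # Lipschitz constant of the smooth gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)       # gradient of the least-squares term
        # gradient step, L1 shrinkage, then projection onto x >= 0 in one line
        x = np.maximum(x - (grad + lam) / L, 0.0)
    return x
```

The projection `np.maximum(..., 0)` is what enforces the non-negativity constraint; it replaces the sign-dependent soft-thresholding of ordinary ISTA.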
Experimental results show that the proposed method classifies multi-label data effectively and achieves better performance than ML-SRC and ML-KNN.

(2) To reduce the influence of nonlinear structure in the training data and to preserve more discriminative information in sparse-representation-based multi-label learning, a new multi-label learning algorithm based on non-negative sparse neighbor representation is proposed. For each test sample, the k nearest neighbors within each class are found by Euclidean distance and form a new training set, which alleviates the impact of the nonlinear structure of the data set. Based on a non-negative LASSO-type sparse minimization, a fast iterative algorithm is given, together with an analysis of its convergence to the global minimum. Using this algorithm, the test sample is reconstructed as a non-negative linear combination of the new training set. A method is then proposed that computes the membership of the test sample in each class from the reconstruction errors, and classification is performed by ranking these memberships. The proposed algorithm is tested on several public multi-label data sets and outperforms the classical multi-label learning algorithms ML-KNN and ML-SRC. It also improves on ML-NSRC (A Multi-Label Learning Algorithm Based on Non-negative Sparse Representation) in both time complexity and classification performance.
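The per-class neighbor selection, non-negative reconstruction, and error-based membership steps described above can be sketched as follows. This is a minimal illustration under my own assumptions: SciPy's NNLS solver stands in for the thesis's LASSO-type iterative algorithm, and reconstruction errors are inverted so that a smaller error yields a larger membership:

```python
import numpy as np
from scipy.optimize import nnls

def membership_scores(X_train, Y_train, x_test, k=5):
    """For each label, take the k nearest training samples (Euclidean
    distance) carrying that label, reconstruct the test sample as a
    non-negative combination of them (via NNLS here, as a stand-in for
    the thesis's solver), and map reconstruction errors to memberships.
    Ranking the returned scores gives the predicted label order."""
    n_labels = Y_train.shape[1]
    errors = np.empty(n_labels)
    for l in range(n_labels):
        idx = np.flatnonzero(Y_train[:, l] == 1)           # samples with label l
        d = np.linalg.norm(X_train[idx] - x_test, axis=1)  # distances to test sample
        nn = idx[np.argsort(d)[:k]]                        # k nearest with label l
        coef, _ = nnls(X_train[nn].T, x_test)              # non-negative coefficients
        errors[l] = np.linalg.norm(x_test - X_train[nn].T @ coef)
    return 1.0 / (errors + 1e-12)                          # small error -> large membership
```

Restricting the dictionary to the k nearest neighbors per class is what gives the method its local, piecewise-linear character: each class is judged only by how well its nearby samples can rebuild the test point.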