
Study Of Rejection Algorithm And Feature Extraction Technique On Radar HRRP Target Recognition

Posted on: 2011-10-22  Degree: Doctor  Type: Dissertation
Country: China  Candidate: J Chai  Full Text: PDF
GTID: 1118360305964259  Subject: Signal and Information Processing
Abstract/Summary:
A radar high-resolution range profile (HRRP) is the vector of coherent sums of the complex echoes from a target's scattering centers projected onto the radar line-of-sight (LOS), and it reflects the structural information of the target along the LOS. Compared with synthetic aperture radar (SAR) or inverse synthetic aperture radar (ISAR) images, HRRPs are easier to acquire and require far less storage, and hence attract considerable attention in the radar automatic target recognition (RATR) community. Motivated by the engineering background of HRRP recognition, this dissertation presents our research on the theories and techniques of RATR, focusing on two aspects: outlier target rejection and feature extraction. This work is supported by the Advanced Defense Research Programs of China and the National Science Foundation of China. The dissertation consists of five sections: Section 1 gives a brief introduction, Section 2 addresses the outlier target rejection problem, and Sections 3, 4, and 5 address the feature extraction problem.

1. In Section 1, we first briefly analyze the physical properties of HRRPs. We then introduce the background of the outlier target rejection problem from real engineering considerations, analyze how it differs from traditional pattern recognition problems, and describe the main difficulties in solving it.

2. In Section 2, we propose a method to artificially generate outlier training samples, which provides data support for the subsequent classifier design procedure.
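The scattering-center picture of an HRRP sketched above can be illustrated with a toy stepped-frequency simulation. The bandwidth, number of frequency steps, and scatterer layout below are illustrative assumptions, not parameters from the dissertation.

```python
import numpy as np

def synthesize_hrrp(ranges_m, amplitudes, bandwidth=500e6, n_freq=256):
    """Toy stepped-frequency model: the HRRP is the magnitude of the IFFT
    of the coherent sum of point-scatterer echoes over the swept band.
    All parameters here (bandwidth, scatterer layout) are illustrative."""
    c = 3e8
    freqs = np.arange(n_freq) * bandwidth / n_freq
    echo = np.zeros(n_freq, dtype=complex)
    for a, r in zip(amplitudes, ranges_m):
        echo += a * np.exp(-1j * 4 * np.pi * freqs * r / c)  # two-way phase
    return np.abs(np.fft.ifft(echo))

# Two scatterers 3 m apart; with 500 MHz bandwidth the range-bin size is
# c / (2 * bandwidth) = 0.3 m, so they resolve into separate range bins.
profile = synthesize_hrrp(ranges_m=[0.0, 3.0], amplitudes=[1.0, 0.8])
```

The scatterer at 3 m lands ten 0.3 m bins away from the one at 0 m, so the profile shows two isolated peaks whose heights track the scatterer amplitudes.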
To address the drawback that support vector domain description (SVDD) uses an overly simple kernel form, we extend SVDD from a single kernel to a linear combination of multiple kernels, obtaining two extended versions, Multikernel-SVDD1 and Multikernel-SVDD2, which differ in the degrees of freedom allowed for the combination coefficients. SVDD, Multikernel-SVDD1, and Multikernel-SVDD2 can all be solved to global optimality via quadratic programming (QP), second-order cone programming (SOCP), and semidefinite programming (SDP), respectively. Experimental results show that: (1) owing to the more flexible kernel form, Multikernel-SVDD1 and Multikernel-SVDD2 achieve better rejection performance than SVDD; (2) owing to more degrees of freedom in the combination coefficients of the multiple kernel matrices, Multikernel-SVDD2 achieves better rejection performance than Multikernel-SVDD1. All three methods seek hypersphere boundaries in high-dimensional kernel spaces and differ only in which kernel space they operate in. In contrast to the hypersphere boundary, we also propose three algorithms, i.e., the nearest neighbor (NN) classifier, the average K nearest neighbors (A-KNN) classifier, and the weighted K nearest neighbors (W-KNN) classifier, all of which adopt a neighboring boundary to handle the rejection problem. Experimental results show that the neighboring boundary outperforms the hypersphere boundary for radar HRRP outlier rejection. Comparing the three neighboring algorithms, W-KNN outperforms both NN and A-KNN, perhaps because W-KNN utilizes more information while preserving strong local learning ability.

3. A large margin nearest local mean (LMNLM) algorithm is proposed in Section 3.
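A neighboring-boundary rejection rule of the kind described above can be sketched as a weighted distance-to-neighbors score with a threshold calibrated on in-library data. The inverse-rank weighting and the 99th-percentile threshold below are illustrative assumptions, not the dissertation's exact scheme.

```python
import numpy as np

def wknn_score(x, train, k=5):
    """Outlier score: weighted average distance from x to its k nearest
    training samples. The inverse-rank weights (nearer neighbors weigh
    more) are an illustrative assumption."""
    d = np.sort(np.linalg.norm(train - x, axis=1))[:k]
    w = 1.0 / np.arange(1, k + 1)
    return float(np.sum(w * d) / np.sum(w))

# Calibrate a threshold on in-library data, then reject anything beyond it.
rng = np.random.default_rng(0)
train = rng.normal(size=(200, 8))          # stand-in for in-library features
thr = np.percentile([wknn_score(t, train) for t in train], 99)
outlier = np.full(8, 6.0)                  # a point far from the library
is_outlier = wknn_score(outlier, train) > thr
```

Unlike a hypersphere boundary, this score follows the local shape of the training set, which is consistent with the reported advantage of neighboring boundaries on HRRP data.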
LMNLM maps the initial Euclidean distance space to a Mahalanobis one via a linear transformation, and then introduces large classification margins for the nearest local mean (NLM) classifier in the projected space, with the expectation that the generalization ability of the NLM classifier is improved. By performing generalized eigenvalue decomposition on the obtained Mahalanobis matrix, we can recover the projection matrix and carry out feature extraction on HRRP data. LMNLM can be expressed as an SDP problem, which guarantees access to global optimal solutions. Experimental results show that LMNLM can reduce the data's dimensionality and enhance its discriminability simultaneously, making it especially suitable for multimodally distributed HRRP data corrupted by noisy/redundant components.

4. Linear discriminant analysis (LDA) is a representative feature extraction algorithm optimized under a global criterion and widely used in pattern recognition. To remedy the unsuitability of global criteria for multimodally distributed data, researchers have proposed algorithms based on local criteria, such as marginal Fisher analysis (MFA) and local discriminant embedding (LDE), to handle the feature extraction and classification of such data. In Section 4, we analyze these algorithms from two aspects, robustness and flexibility, and conclude that global algorithms have stronger robustness but weaker flexibility, in contrast to local algorithms' weaker robustness and stronger flexibility.
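The step of recovering a projection matrix from a learned Mahalanobis matrix can be sketched as follows. The metric M below is synthetic; how LMNLM actually learns M (the SDP) is not reproduced here.

```python
import numpy as np

def projection_from_metric(M, d):
    """Given a PSD Mahalanobis matrix M = V diag(lam) V^T, the projection
    L = diag(sqrt(lam_d)) V_d^T (top-d eigenpairs) satisfies
    (x - y)^T M (x - y) = ||L x - L y||^2 exactly when d covers all nonzero
    eigenvalues, and approximately otherwise (dimensionality reduction)."""
    lam, V = np.linalg.eigh(M)               # ascending eigenvalues
    lam, V = lam[::-1], V[:, ::-1]           # sort descending
    return np.sqrt(np.maximum(lam[:d], 0))[:, None] * V[:, :d].T  # (d, D)

rng = np.random.default_rng(1)
A = rng.normal(size=(5, 5))
M = A @ A.T                                  # a synthetic PSD metric
L = projection_from_metric(M, d=5)
x, y = rng.normal(size=5), rng.normal(size=5)
mahal = (x - y) @ M @ (x - y)                # distance under the metric
eucl = np.sum((L @ x - L @ y) ** 2)          # distance after projection
```

Keeping fewer than D eigenpairs discards the directions the learned metric deems least discriminative, which is how the same decomposition performs feature extraction.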
Based on an analysis of the effect of the training data's sampling extent on classification, we propose a new algorithm, combinatorial discriminant analysis (CDA), to seek a proper tradeoff between robustness and flexibility, and successfully apply it to radar HRRP target recognition.

5. In Section 5, we show that LDA has four drawbacks: (1) homogeneous samples are assumed to be Gaussian distributed; (2) the number of available projection vectors is limited; (3) different discrepancy vectors are treated equivalently, so their different effects on classification do not receive the necessary attention; (4) the effect of the norms of the projection vectors on classification is neglected. Based on this analysis, we propose a new feature extraction algorithm, local mean discriminant analysis (LMDA), to remedy the first three drawbacks, and a generalized re-weighting (GRW) framework to remedy the fourth. LMDA and GRW can be solved by generalized eigenvalue decomposition and linear programming (LP), respectively. The combination of LMDA and GRW significantly enhances the data's discriminability, as justified by extensive experiments on synthetic data, benchmark data, and radar HRRP data.
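For reference, the classical LDA baseline that these drawbacks concern reduces to a generalized eigenproblem. This sketch shows only that shared computational core; the dissertation's LMDA replaces class means with local means and re-weights discrepancy vectors, which is not reproduced here.

```python
import numpy as np
from scipy.linalg import eigh

def lda_directions(X, y, n_dir):
    """Classical LDA: solve the generalized eigenproblem Sb w = lam * Sw w
    and keep the eigenvectors with the largest eigenvalues."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    D = X.shape[1]
    Sw, Sb = np.zeros((D, D)), np.zeros((D, D))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)                 # within-class scatter
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)    # between-class scatter
    Sw += 1e-6 * np.eye(D)                            # regularize Sw
    lam, W = eigh(Sb, Sw)                             # ascending eigenvalues
    return W[:, ::-1][:, :n_dir]                      # top directions

# Drawback (2) in action: with C classes, rank(Sb) <= C - 1, so LDA yields
# at most C - 1 informative projection vectors regardless of D.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))
X[50:, 0] += 8.0                                      # classes split along axis 0
y = np.array([0] * 50 + [1] * 50)
w = lda_directions(X, y, 1)[:, 0]
w /= np.linalg.norm(w)
```

With two classes separated along the first axis, the single recovered direction is dominated by that axis, illustrating both how LDA works and why it offers only C - 1 directions.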
Keywords/Search Tags:Radar automatic target recognition (RATR), High-resolution range profile (HRRP), Outlier target rejection, Feature extraction, Hypersphere, Neighbor, Large margin, Local mean, Discriminant analysis