
Joint Tuning Method Of Feature Selection And Classifier Parameters Based On Multi-objective Optimization

Posted on: 2024-06-12
Degree: Master
Type: Thesis
Country: China
Candidate: Y Y Pang
Full Text: PDF
GTID: 2568307064985689
Subject: Software engineering
Abstract/Summary:
Feature selection is an effective way to improve machine learning performance, and wrapper-based feature selection methods outperform filter and embedded methods in classification accuracy. In most studies, however, the classifier parameters used within the wrapper are fixed, even though parameter settings also directly affect classification accuracy. To address this, this paper formulates a multi-objective minimisation problem that jointly optimises feature selection and classifier parameters, minimising both the number of selected features and the classification error rate. A Mixed Grey Wolf Optimization (MGWO) algorithm with mixed solution vectors, tailored to the characteristics of the feature selection problem, is proposed to select a smaller feature subset while maximising classification performance. The mixed solution is a new row vector formed by concatenating the feature vector with the classifier parameter vector; using it, the classifier parameters are tuned while the features are selected, maximising classification accuracy and improving the classification performance of the algorithm. Since the traditional Grey Wolf Optimization (GWO) algorithm cannot solve multi-objective feature selection problems directly, MGWO uses linear weighting to combine the multiple objective functions into a single fitness function for subsequent solving. Because traditional GWO is designed for continuous optimisation and is therefore unsuited to discrete feature selection, a transfer function is introduced into the MGWO position-update mechanism; after evaluating the binarisation performance of the tanh and sigmoid transfer functions, the tanh transfer function was chosen to enhance GWO. Experiments on twenty commonly used datasets from the UC Irvine (UCI) machine learning repository, compared against a baseline with an unoptimised classifier, show that MGWO has the best overall performance.

To address the shortcomings of the traditional Multi-Objective Grey Wolf Optimizer (MOGWO), namely poor population diversity, slow early convergence and a tendency to fall into local optima in later iterations, an Improved Multi-Objective Grey Wolf Optimizer (IMOGWO) is proposed to solve the formulated multi-objective feature selection problem. To improve the quality of the initial solutions and distribute them as evenly as possible in the solution space, initialisation strategies based on tent and sinusoidal chaotic maps and on Opposition-Based Learning (OBL) are introduced. A local search strategy mutates the classifier parameters of solutions on the Pareto front in later iterations, increasing the probability of finding an optimum and improving classifier performance. To prevent performance degradation caused by the leader of the wolf pack falling into a local optimum, the algorithm sets a threshold for mutating one feature dimension of the leader, improving its robustness and search ability. Simulation experiments on sixteen commonly used datasets from the public UCI machine learning repository demonstrate that IMOGWO outperforms other multi-objective optimisation algorithms, and that the multi-objective IMOGWO achieves better overall performance than the single-objective MGWO.
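As an illustrative sketch only (not the thesis's actual implementation), the tanh-based binarisation of a continuous wolf position and the linear weighted fitness described above could look like the following; the function names, the bit-setting rule, and the weight `alpha = 0.99` are assumptions for the example.

```python
import numpy as np

def tanh_transfer(v):
    # Map a continuous position component to a probability in [0, 1)
    # via the absolute value of tanh (one common transfer function).
    return np.abs(np.tanh(v))

def binarize(continuous_pos, rng):
    # A feature bit is set to 1 when a uniform draw falls below the
    # transfer probability (assumed binarisation rule for the sketch).
    probs = tanh_transfer(continuous_pos)
    return (rng.random(continuous_pos.shape) < probs).astype(int)

def weighted_fitness(error_rate, n_selected, n_total, alpha=0.99):
    # Linear weighting of the two objectives: classification error
    # (weight alpha) and the fraction of selected features (1 - alpha).
    return alpha * error_rate + (1 - alpha) * n_selected / n_total
```

With this weighting, a solution with 10% error selecting half the features scores `0.99 * 0.1 + 0.01 * 0.5 = 0.104`, so accuracy dominates the trade-off while a small pressure toward fewer features remains.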
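The initialisation ideas behind IMOGWO can also be sketched in a few lines. The snippet below is a hypothetical illustration, not the thesis's code: it uses the classic tent map parameterisation and the standard OBL mirror `lb + ub - x`, keeping whichever of each candidate and its opposite has the better (lower) fitness.

```python
import numpy as np

def tent_sequence(n, x0=0.37):
    # Classic tent chaotic map on (0, 1); the seed x0 is arbitrary
    # (avoid 0.5, which collapses the sequence).
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = 2 * x if x < 0.5 else 2 * (1 - x)
        xs[i] = x
    return xs

def obl_init(pop, lb, ub, fitness):
    # Opposition-Based Learning: evaluate each candidate and its
    # mirror image across the search bounds, and keep the better
    # of the two (minimisation assumed).
    opposite = lb + ub - pop
    f_pop = np.apply_along_axis(fitness, 1, pop)
    f_opp = np.apply_along_axis(fitness, 1, opposite)
    return np.where((f_pop <= f_opp)[:, None], pop, opposite)
```

The effect is that a poorly placed initial candidate is replaced by its reflection, which tends to spread the starting population more evenly across the solution space before the grey wolf search begins.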
Keywords/Search Tags: Feature selection, Classification, Multi-objective optimization, Grey wolf optimizer