Optimization methods play a crucial role in intelligent decision-making and are an important part of machine learning and operations optimization. As the concept of “intelligence” spreads across fields, traditional gradient-based optimization methods are often no longer applicable, because they require the objective function of the problem to be continuous, differentiable, and convex. Intelligent optimization methods are free of these restrictions; they are widely used and have contributed greatly to the advance of “intelligence” in many fields. Despite this remarkable progress on complex optimization problems, intelligent optimization methods still face several challenges: their performance is sensitive to the initial population, their search efficiency is lower than that of traditional optimization algorithms, and they easily fall into local optima. The Gravitation Field Algorithm (GFA) is an intelligent optimization algorithm that simulates the process of planet formation; it performs well and has a simple model. To address the shortcomings of existing intelligent optimization methods, this paper proposes the Explosion Gravitation Field Algorithm (EGFA), which introduces the corresponding improvement strategies based on the GFA model. Building on EGFA, the paper then studies two typical complex optimization tasks, feature selection (FS) and neural architecture search (NAS), and develops solutions for each. The main contributions of this paper are as follows:

(1) To address the sensitivity of swarm intelligence algorithms to the initial population, their low search efficiency, and their tendency to fall into local optima, this paper proposes the Explosion Gravitation Field Algorithm (EGFA), which extends GFA with a dust sampling (DS) strategy and an explosion operator. The DS strategy quickly locates a target subspace that contains the optimal solution, and the algorithm is then initialized within this subspace; this alleviates the sensitivity of the algorithm's performance to the initial population. The explosion operator helps the algorithm escape when its improvement stalls or when it falls into a local optimum, thereby improving overall performance. In addition, compared with the original GFA, EGFA merges the movement and rotation operators to simplify the algorithm. Experimental results show that EGFA has excellent global search capability and good search efficiency, and that it performs well on complex real-world optimization problems. A minimal sketch of the two new operators is given below.
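The abstract does not give the operators' exact formulas, so the following Python sketch is only a minimal illustration under assumed parameterizations: the sample size, the fraction of dust points kept, the Gaussian scatter radius, and the function names are hypothetical rather than taken from the thesis.

```python
import numpy as np

def dust_sampling(objective, bounds, n_samples=200, keep_ratio=0.2, seed=None):
    # Illustrative dust sampling (DS): scatter sample points ("dust") over the
    # whole space, keep the best-scoring fraction, and return the bounding box
    # they span as the target subspace used to initialize the population.
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dust = rng.uniform(lo, hi, size=(n_samples, len(lo)))
    scores = np.apply_along_axis(objective, 1, dust)
    best = dust[np.argsort(scores)[: max(1, int(keep_ratio * n_samples))]]
    return best.min(axis=0), best.max(axis=0)        # target subspace

def explode(center, bounds, pop_size, radius=0.3, seed=None):
    # Illustrative explosion operator: when improvement stalls or the search is
    # trapped in a local optimum, scatter a fresh population around the current
    # best solution across the whole space.
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    debris = center + radius * (hi - lo) * rng.standard_normal((pop_size, len(lo)))
    return np.clip(debris, lo, hi)

# Toy usage on a 5-dimensional sphere function.
f = lambda x: float(np.sum(x ** 2))
lo, hi = np.full(5, -10.0), np.full(5, 10.0)
sub_lo, sub_hi = dust_sampling(f, (lo, hi), seed=0)                  # DS initialization
pop = np.random.default_rng(0).uniform(sub_lo, sub_hi, size=(30, 5))
best = pop[np.argmin([f(x) for x in pop])]
pop = explode(best, (lo, hi), pop_size=30, seed=1)                   # restart around the best solution
```

The two operators pull in opposite directions by design: DS narrows the initial population to a promising region, while the explosion operator widens the search again whenever that focus becomes a trap.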
(2) To address the difficulty of handling datasets with high-dimensional features and small sample sizes, such as gene expression datasets, this paper adapts EGFA to the feature selection task and proposes a feature selection method based on the Explosion Gravitation Field Algorithm (EGFA-FS). The method scores feature importance with a series of random forests (RFs) and selects the most important features to construct a feature recommended pool (FRP). The search is initialized from the FRP, so that it first concentrates on the most important features and can find a feature subset quickly, improving search efficiency; the explosion operator then disperses the search over the whole feature space to reduce the probability that the selected feature subset is only a local optimum. Experimental results show that the feature subsets found by EGFA-FS achieve good classification performance, and that the selected genes (features) play important roles in the differential co-expression network and are involved in important biological functions.

(3) To address the long computation time, high demand for computational resources, and consequently limited adoption of NAS methods, this paper adapts EGFA to the NAS task and proposes an efficient neural architecture search method, EGFA-NAS, which combines the working mechanism of EGFA with gradient descent to jointly optimize the candidate operation weights of the architectures. Specifically, the method adopts a cell-based micro search space, represents the cell structure as a directed acyclic graph, and encodes the candidate operation weights of a cell as a tensor of size e×|O|. The discrete search space is made continuous by a search space relaxation strategy, as illustrated in the sketch below. A training strategy based on the population mechanism of EGFA-NAS is proposed to reduce the computational cost and improve search efficiency, and a weight inheritance strategy applied during the explosion operation further improves performance and efficiency. Experimental results on the NAS-Bench-201 and DARTS search spaces show that EGFA-NAS has good global search capability and search efficiency, and that the discovered network architectures achieve highly competitive accuracy.
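The e×|O| encoding and the relaxation of the discrete search space follow the familiar DARTS-style scheme, so the short PyTorch sketch below only illustrates that idea under assumed names; the operation set, the number of edges, and the placeholder operations are hypothetical and not the thesis's implementation.

```python
import torch
import torch.nn.functional as F

# Example candidate operation set O (hypothetical; the real set depends on the search space).
OPS = ["none", "skip_connect", "conv_3x3", "conv_5x5", "avg_pool_3x3"]

def mixed_edge_output(x, edge_weights, candidate_ops):
    # Relaxed (continuous) edge: softmax over the |O| candidate-operation weights,
    # then a weighted sum of every candidate operation's output.
    probs = F.softmax(edge_weights, dim=-1)          # shape (|O|,)
    return sum(p * op(x) for p, op in zip(probs, candidate_ops))

num_edges = 6                                        # e edges in the cell's DAG (example value)
arch = torch.randn(num_edges, len(OPS))              # one architecture: an e x |O| weight tensor

# Toy demonstration with placeholder operations standing in for real convolutions/pools.
toy_ops = [lambda x: torch.zeros_like(x), lambda x: x,
           lambda x: 2 * x, lambda x: 3 * x, lambda x: 0.5 * x]
x = torch.ones(4)
y = mixed_edge_output(x, arch[0], toy_ops)           # relaxed output of edge 0

# After the search, a discrete cell is recovered by keeping the argmax operation on each edge.
discrete_cell = [OPS[int(i)] for i in arch.argmax(dim=-1)]
```

Under this encoding, each individual of the EGFA-NAS population would carry such a tensor, which is what allows the population-level operators of EGFA and the gradient updates on the operation weights to be combined as described above.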