Meta-heuristic intelligent optimization algorithms are widely used in many fields, and the optimization of problems with nonlinear objectives and constraints has long been a central research topic. The butterfly optimization algorithm (BOA) offers strong performance and simple implementation, but it converges slowly and is prone to becoming trapped in local optima. Therefore, this paper studies improvements to the butterfly optimization algorithm based on adaptive techniques, perturbation, and hybridization. The proposed algorithms are verified on smooth function optimization, statistical parameter optimization, and engineering design optimization. The main work is as follows.

First, an adaptive weight butterfly optimization algorithm (AWBOA) is proposed, which introduces adaptive parameters and an adaptive weight, a Gaussian perturbation term, and a dynamic switching probability to improve convergence accuracy and accelerate convergence. The performance of AWBOA is evaluated on a set of 15 benchmark functions in terms of the mean, the standard deviation, the Wilcoxon rank-sum test, and high-dimensional performance. The results show that AWBOA is significantly better than the ant lion optimizer, the butterfly optimization algorithm, the chicken swarm optimization algorithm, the dragonfly algorithm, and the whale optimization algorithm in optimization accuracy, convergence speed, effectiveness, and stability. AWBOA is then applied to two practical problems, credit evaluation and engineering design; the results show that, compared with classical optimization algorithms, AWBOA is strongly competitive and can be used to solve practical problems.

Then, a hybrid salp swarm and butterfly optimization algorithm (HSSBOA) is proposed, combining the butterfly optimization algorithm with an improved salp swarm algorithm, neighborhood centroid opposition-based learning, and a dynamic switching probability. Ten mathematical functions are used to evaluate HSSBOA in terms of the mean, the standard deviation, the two-tailed t-test, and high-dimensional performance. Compared with four optimization algorithms, the experimental results show that HSSBOA achieves better accuracy, speed, effectiveness, and stability. An ablation study verifies that every improvement has a positive effect on the algorithm, with the improved SSA having the greatest impact. In addition, HSSBOA is used to fit the Richards model, and the fitted model is applied to predict the growth of glutamate colonies and the growth of the Chinese population. The results show that the Richards model fitted by HSSBOA is strongly competitive with many algorithms, demonstrating that HSSBOA can be used to fit the parameters of statistical models and to solve practical optimization problems.

Finally, the work of this paper is summarized, and directions for future research are outlined.
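
To illustrate the mechanisms summarized above, the following Python sketch shows one AWBOA-style iteration. Only the standard BOA fragrance model f = c * I^a is taken as given; the particular adaptive-weight schedule, Gaussian perturbation form, and dynamic switching-probability schedule below are illustrative assumptions, not the exact formulas proposed in the paper.

import numpy as np

def awboa_step(pop, fitness, best, t, T, c=0.01, a=0.1):
    # One illustrative AWBOA-style iteration; the schedules here are assumptions, not the paper's exact formulas.
    n, dim = pop.shape
    p = 0.8 - 0.4 * t / T                      # assumed dynamic switching probability, decreasing over iterations
    w = 1.0 - t / T                            # assumed adaptive weight, shrinking steps in later iterations
    fragrance = c * np.abs(fitness) ** a       # standard BOA fragrance f = c * I^a, with |fitness| as the stimulus
    new_pop = pop.copy()
    for i in range(n):
        r = np.random.rand()
        if r < p:                              # global phase: move toward the best butterfly
            step = (r ** 2) * best - pop[i]
        else:                                  # local phase: move between two random butterflies
            j, k = np.random.randint(n, size=2)
            step = (r ** 2) * pop[j] - pop[k]
        gauss = np.random.normal(0.0, 1.0, dim) * pop[i]   # assumed Gaussian perturbation term
        new_pop[i] = w * pop[i] + fragrance[i] * step + gauss
    return new_pop

In each call, pop holds the butterfly positions, fitness their objective values, and best the current best position; the returned array is the perturbed population for the next iteration.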
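The Richards fitting task mentioned above can be cast as a least-squares problem that any of the discussed optimizers could minimize. The sketch below uses SciPy's differential_evolution purely as a stand-in optimizer, since the HSSBOA implementation itself is not reproduced here, and assumes a common four-parameter Richards form, which may differ from the exact parameterization used in the paper. The data in the usage example are synthetic, not the colony or population data of the study.

import numpy as np
from scipy.optimize import differential_evolution

def richards(t, A, k, b, m):
    # Common four-parameter Richards growth curve (assumed parameterization)
    return A / (1.0 + b * np.exp(-k * t)) ** (1.0 / m)

def fit_richards(t, y, bounds):
    # Fit Richards parameters by minimizing the sum of squared errors;
    # differential_evolution stands in for HSSBOA, which plays this role in the paper.
    def sse(params):
        return float(np.sum((richards(t, *params) - y) ** 2))
    result = differential_evolution(sse, bounds, seed=0)
    return result.x, result.fun

# Hypothetical usage with synthetic growth data
t = np.linspace(0.0, 10.0, 50)
y = richards(t, A=100.0, k=1.2, b=20.0, m=1.0) + np.random.normal(0.0, 1.0, t.size)
params, err = fit_richards(t, y, bounds=[(50, 150), (0.1, 5), (1, 50), (0.1, 5)])
print(params, err)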