Neural Architecture Search (NAS) is the process of automatically designing the neural network architecture that best meets the requirements of a task. Designing a neural network by hand requires experts to invest substantial labor and time, and the automation provided by NAS effectively solves this problem. Although NAS has achieved excellent performance, its search process remains very time-consuming. The contribution of this paper is to propose three efficient NAS methods.

First, this paper proposes an efficient NAS method based on the Estimation of Distribution Algorithm (EDA), called EDNAS, to address the time-consuming search of NAS. EDA is an optimization method that combines probabilistic modeling with population-based evolutionary strategies and can quickly solve complex optimization problems. EDNAS assumes that the best-performing architectures follow a certain probability distribution over the search space, so NAS can be transformed into the problem of learning this distribution. EDNAS first constructs a probability model over the search space and then learns the distribution with the EDA method. Meanwhile, EDNAS adopts parameter sharing to reduce the computational cost of training networks, thereby speeding up the evaluation of architectures.

Second, this paper proposes CO-EDNAS, a NAS method that extends EDNAS with the ideas of cooperative optimization and simulated annealing to improve search capability under a similar search time. The algorithm uses multiple probability models for cooperative optimization, so it can explore the search space more broadly and further strengthen its global search capability. In addition, CO-EDNAS dynamically adjusts the population size according to the convergence of the probability models by means of simulated annealing, which not only ensures a large initial population that explores a wide range of the search space, but also allows the algorithm to reduce the population size and search time as the probability models gradually converge. Moreover, CO-EDNAS decouples the training of shared parameters and the learning of probability models into two independent processes and uses a learning rate compensation method to train the shared parameters, making the evaluation of architectures fairer and reducing bias in the search process.

Third, this paper combines the fast non-dominated sorting strategy with CO-EDNAS to propose the MO-EDNAS method, which performs multi-objective optimization for the NAS task. In a single run, MO-EDNAS searches for multiple network architectures that are widely distributed in the objective space.

Experimental results on different datasets demonstrate the effectiveness of the three methods. The significance of this work is to propose three efficient NAS methods that not only improve the efficiency and performance of NAS, but also support further in-depth study of NAS. In addition, this work benefits many realistic applications of NAS.
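To make the EDA view of NAS concrete, the following is a minimal sketch of the kind of loop EDNAS describes, not the authors' implementation. It assumes each architecture is a sequence of `n_layers` categorical choices over `n_ops` candidate operations, and `evaluate_arch` is a hypothetical stand-in for evaluating an architecture with shared parameters; the smoothed update rule is an illustrative assumption.

```python
import numpy as np

def eda_nas_search(evaluate_arch, n_layers=8, n_ops=5,
                   pop_size=50, top_k=10, n_iters=20, lr=0.3):
    # Probability model: one categorical distribution per layer,
    # initialized uniformly over the candidate operations.
    probs = np.full((n_layers, n_ops), 1.0 / n_ops)
    rng = np.random.default_rng(0)
    for _ in range(n_iters):
        # Sample a population of architectures from the current model.
        pop = np.stack([
            [rng.choice(n_ops, p=probs[l]) for l in range(n_layers)]
            for _ in range(pop_size)
        ])
        # Evaluate each architecture (higher score = better) and
        # keep the top-k performers as the elite set.
        scores = np.array([evaluate_arch(arch) for arch in pop])
        elite = pop[np.argsort(scores)[-top_k:]]
        # Re-estimate the per-layer distributions from the elite
        # samples and blend them with the old model (smoothed update).
        counts = np.zeros_like(probs)
        for arch in elite:
            for l, op in enumerate(arch):
                counts[l, op] += 1
        probs = (1 - lr) * probs + lr * counts / top_k
    # Return the most probable architecture under the learned model.
    return probs.argmax(axis=1), probs
```

Here learning the distribution replaces direct search: as `probs` concentrates, sampled architectures cluster around the high-performing region of the search space.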
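The dynamic population sizing in CO-EDNAS can be sketched as a cooling-style schedule. The sketch below is an assumption-laden illustration: it measures convergence of the probability model by mean normalized entropy and maps it linearly to a population size, whereas the paper's simulated annealing formulation may differ.

```python
import numpy as np

def population_size(probs, pop_max=64, pop_min=8):
    # Mean normalized entropy of the per-layer distributions:
    # 1.0 when uniform (unconverged), 0.0 when one-hot (converged).
    # This entropy proxy for convergence is an assumption, not the
    # paper's exact criterion.
    n_ops = probs.shape[1]
    ent = -(probs * np.log(probs + 1e-12)).sum(axis=1) / np.log(n_ops)
    convergence = 1.0 - ent.mean()
    # Shrink the population as the model converges, trading broad
    # exploration early for reduced search time late.
    return int(round(pop_max - convergence * (pop_max - pop_min)))
```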
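Fast non-dominated sorting, the strategy MO-EDNAS combines with CO-EDNAS, is the standard procedure from NSGA-II (Deb et al.). A minimal sketch follows, assuming all objectives are to be minimized; the function name and interface are illustrative.

```python
def fast_non_dominated_sort(objectives):
    """objectives: one tuple of objective values per candidate.
    Returns a list of fronts of candidate indices; front 0 is the
    Pareto front."""
    n = len(objectives)

    def dominates(a, b):
        # a dominates b if a is no worse in all objectives
        # and strictly better in at least one.
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    dominated_by = [[] for _ in range(n)]  # candidates each p dominates
    domination_count = [0] * n             # how many candidates dominate p
    fronts = [[]]
    for p in range(n):
        for q in range(n):
            if dominates(objectives[p], objectives[q]):
                dominated_by[p].append(q)
            elif dominates(objectives[q], objectives[p]):
                domination_count[p] += 1
        if domination_count[p] == 0:
            fronts[0].append(p)
    i = 0
    while fronts[i]:
        nxt = []
        for p in fronts[i]:
            for q in dominated_by[p]:
                domination_count[q] -= 1
                if domination_count[q] == 0:
                    nxt.append(q)
        fronts.append(nxt)
        i += 1
    fronts.pop()  # drop the trailing empty front
    return fronts
```

Ranking candidates by front in this way is what lets a multi-objective search return a set of architectures spread across the objective space in a single run, rather than a single best network.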