Model averaging differs from traditional machine learning methods in that its theory focuses not only on predicting the outcome but also on quantifying the uncertainty of that prediction. Model averaging reduces this uncertainty, minimizes the loss of useful information, and avoids selecting a poor model. Since the L1 norm in the elastic net performs variable selection and dimensionality reduction, while the L2 norm shrinks the coefficients to yield more stable predictions, this paper combines model averaging with the elastic net for the first time, formulating the estimation as least squares with L1 and L2 regularization terms; we denote this method MMAe. Second, the objective function and constraints of MMAe are convex; this paper discusses its sparsity and establishes its asymptotic optimality, which holds even when the candidate model set is allowed to grow exponentially under Gaussian noise. In addition, the elastic net involves tuning parameters for the L1 and L2 norms, and the selection of this parameter pair is derived by the generalized cross-validation method. We further show that, under relatively mild conditions, MMAe tuned by generalized cross-validation achieves lower risk than Mallows model averaging. Finally, the optimal weights are computed efficiently by a coordinate descent algorithm. Extensive simulation studies show that MMAe has certain advantages over some commonly used methods. In the empirical analysis, the method is applied to the practical problem of predicting the volume of China's import and export trade with SCO countries in 2020.
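To make the penalized criterion concrete, a minimal sketch of an elastic-net least squares objective is given below; the symbols $y$, $X$, $\beta$, $\lambda_1$, and $\lambda_2$ are generic notation introduced here for illustration, and the exact MMAe criterion used in the paper may differ:
\[
\hat{\beta} = \arg\min_{\beta}\; \|y - X\beta\|_2^2 + \lambda_1 \|\beta\|_1 + \lambda_2 \|\beta\|_2^2 .
\]
Here the pair $(\lambda_1, \lambda_2)$ plays the role of the L1 and L2 tuning parameters whose selection is derived by generalized cross-validation; in MMAe this penalty structure is combined with model averaging, with the optimal weights computed by coordinate descent as described above.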