
Transformer Surrogate Model Based On Attention And Long Short-term Memory Network

Posted on: 2022-09-29
Degree: Master
Type: Thesis
Country: China
Candidate: Y L Feng
Full Text: PDF
GTID: 2512306494490844
Subject: Electrical engineering
Abstract/Summary:
Transformer performance analysis must consider multiple factors such as energy conversion efficiency, noise, volume, and weight, and transformer design parameters are steadily growing in dimension. Establishing a performance analysis method suited to high-dimensional data has therefore become an urgent problem. Surrogate models can effectively address the long run times of numerical simulation, but traditional surrogate models struggle to extract deep features from high-dimensional data, so their predictions often fail to meet accuracy requirements. Based on the Attention-LSTM deep learning method, this thesis designs a transformer surrogate model suited to high-dimensional data, achieving high-precision prediction and analysis of transformer performance and markedly shortening the performance-analysis cycle. The main research contents are as follows.

(1) The basic theory of surrogate models and deep learning is studied. The principles and components of surrogate models are explained, the basic ideas and characteristics of different surrogate models are analyzed, along with the difficulties these models face on high-dimensional data. Building on a theoretical analysis of deep learning models, the application of deep learning to high-dimensional problems is introduced, providing the theoretical foundation for the high-dimensional transformer surrogate model proposed later.

(2) An amorphous-alloy transformer is selected as the research object, and a finite element simulation model is built from the structural dimensions of the transformer prototype. The material properties of the prototype are determined by experimental measurement, and a vibration measurement platform is set up to verify the accuracy of the simulation model, laying the foundation for generating sample data for the surrogate model.

(3) Targeting the traditional high-dimensional surrogate models in wide use, a PCA-SVM surrogate model is designed. Its prediction accuracy is compared with that of other common surrogate models; the influence of principal component analysis on the prediction accuracy of traditional surrogate models, and the limitations of the method, are studied; and the scope of applicability of the PCA-SVM surrogate model is analyzed.

(4) The basic ideas of the attention mechanism and the long short-term memory (LSTM) network, and the principle by which this combination extracts deep features from high-dimensional data, are studied. Classic high-dimensional benchmark data sets are used to evaluate the Attention-LSTM model, demonstrating that it predicts well on high-dimensional problems. Combined with finite element simulation, a sample data set for the surrogate model is built; the surrogate model is designed according to the characteristics of the transformer data set, and five types of its structural hyperparameters are tuned. Its predictions are compared with the traditional high-dimensional surrogate model, and the results show that the model accurately extracts deep features from high-dimensional data. Compared with the traditional surrogate model, the Attention-LSTM model has advantages in both prediction accuracy and computation time.
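The PCA-SVM baseline of point (3) can be illustrated with a minimal scikit-learn sketch. The data here is a synthetic stand-in with a low intrinsic dimension, not the thesis's finite-element sample set, and all sizes and hyperparameters are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in: 200 designs described by 50 parameters that in fact
# vary on a 5-dimensional latent manifold, mapped to one performance value.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 5))
W = rng.normal(size=(5, 50))
X = latent @ W + 0.01 * rng.normal(size=(200, 50))
y = latent.sum(axis=1) + 0.05 * rng.normal(size=200)

# PCA compresses the 50 inputs to 10 principal components;
# an RBF support vector regressor then fits the performance response.
model = make_pipeline(StandardScaler(), PCA(n_components=10), SVR(kernel="rbf"))
model.fit(X[:150], y[:150])
score = model.score(X[150:], y[150:])  # R^2 on 50 held-out designs
```

Because the synthetic inputs are genuinely low-rank, PCA retains the signal and the pipeline generalizes well; on truly high-dimensional responses with no such structure, this is exactly where (as the abstract argues) the traditional surrogate degrades.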
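The attention step of point (4) can be sketched in NumPy: given the hidden states an LSTM produces over the (grouped) design parameters, a score per step is computed, normalized with softmax, and used to form a weighted context vector. The dot-product scoring vector `w` and the shapes are assumptions, since the abstract does not specify the scoring form.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical shapes: T steps (groups of design parameters), hidden size d.
# In the real model, H would be the output of an LSTM layer.
T, d = 8, 16
rng = np.random.default_rng(1)
H = rng.normal(size=(T, d))   # LSTM hidden states h_1..h_T
w = rng.normal(size=d)        # learnable scoring vector (assumed form)

scores = H @ w                # e_t = w . h_t, one score per step
alpha = softmax(scores)       # attention weights, nonnegative, sum to 1
context = alpha @ H           # weighted sum of hidden states, shape (d,)
```

The context vector then feeds the prediction head, letting the model weight the parameter groups that matter most for a given performance target instead of relying only on the last hidden state.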
Keywords/Search Tags: surrogate model, amorphous transformer, deep learning, long short-term memory network, attention mechanism