
Tensor Network Machine Learning Modeling

Posted on: 2020-11-14    Degree: Doctor    Type: Dissertation
Country: China    Candidate: S Cheng    Full Text: PDF
GTID: 1360330596978175    Subject: Theoretical Physics
Abstract/Summary:
The strongly correlated electron system has been one of the central problems of condensed matter physics for nearly half a century. Many interesting and not yet fully understood phenomena are closely related to it, such as high-temperature superconductivity, Mott insulators, fractional excitations and spin liquids. The essential difficulty is that perturbation theory, mean-field theory and other one-body-approximation methods cannot be applied when the interaction is strong. Because the strong interaction cannot be ignored, the general many-body wave function has to be constructed in the space spanned by direct products of single-particle wave functions, so the dimension of the Hilbert space grows exponentially with the system size and quickly exceeds what a real computer can handle.

To study the many-body wave function, approximation is therefore necessary. When the one-body approximation is not available, the density matrix renormalization group and tensor network renormalization methods, which rely on the locality of the system, become practical and effective numerical algorithms for quantum many-body systems. In recent years, physicists have begun to notice similar challenges in the field of machine learning. The configuration space of an image grows exponentially with the image size, and a machine learning model has to approximate the feature distribution of a natural image set over this configuration space with only a finite number of parameters, which is closely related to the idea of modeling many-body wave functions. By comparing and studying the two fields, we therefore expect physical algorithms to bring more interpretable learning theories and models to machine learning, and machine learning models to bring new techniques to physics research.

Specifically, the comparative study can be divided into three parts: data, model and algorithm. The data part examines whether the complexity of quantum wave functions is comparable to that of classical natural images; identifying the theoretical origin of why natural image sets can be approximated provides a physical explanation for the success of machine learning and a motivation for applying quantum algorithms to natural image sets. The model part examines whether traditional machine learning models, such as probabilistic graphical models, have a correspondence with quantum many-body models such as tensor networks, whether tensor network algorithms can serve as an extension of probabilistic graphical algorithms, and how to construct quantum machine learning algorithms based on tensor networks. The algorithm part examines how natural images can be modeled by constructing new computable models and applying common many-body algorithms such as DMRG, and whether methods in machine learning, such as their feedback mechanisms, can further optimize existing quantum many-body algorithms.

The specific arrangement of this thesis is as follows:

In the first chapter, we give the necessary introduction to tensor network states and machine learning. We introduce the area law, which is the essential reason for the success of tensor network states; common tensor network states such as the matrix product state, the tree tensor network and the projected entangled pair state; common tensor network algorithms such as renormalization and variational algorithms; common machine learning paradigms such as supervised learning, unsupervised learning, discriminative learning and generative learning; and common supervised and generative learning models. We also briefly discuss the reasons for the success of machine learning algorithms and the similarities between the problems studied in machine learning and those studied in quantum many-body physics.
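For reference, the matrix product state introduced in the first chapter is the simplest of these ansätze; in its standard textbook form (quoted here only for orientation, not taken from the thesis) an N-site wave function is written as

$$
|\Psi\rangle \;=\; \sum_{s_1,\dots,s_N} \mathrm{Tr}\!\left( A^{[1]}_{s_1} A^{[2]}_{s_2} \cdots A^{[N]}_{s_N} \right) |s_1 s_2 \cdots s_N\rangle ,
$$

where each $A^{[k]}_{s_k}$ is a $D\times D$ matrix and the bond dimension $D$ controls how much entanglement the ansatz can carry, which is where the area law enters.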
In the second chapter, we examine the motivation for and feasibility of using tensor network states for machine learning problems from the perspective of information theory. Specifically, we point out the formal similarity between the Rényi entropy of a quantum state and the mutual information of a classical image. We estimate the mutual information of image sets and find that the mutual information of natural image sets is much smaller than its theoretical upper bound, indicating that the feature distribution of natural images is a function that can be approximated efficiently. By constructing a restricted Boltzmann machine with only nearest-neighbor connections, we show that the correlations in natural image sets are dominated by neighboring pixels, with only a small number of sparse long-range correlations. This implies that tensor network models based on local approximations are suitable for natural image sets. At the same time, we find that tensor networks have a close correspondence with probabilistic graphical models, and we give an exact mapping from the restricted Boltzmann machine to the matrix product state.

In the third chapter, we give a brief review of supervised learning algorithms based on tensor networks. We introduce the models, algorithms and image-to-tensor-network mappings used in tensor network supervised learning, as well as the existing results of such algorithms and their possible future directions. In addition, we discuss in this chapter the duality of the probabilistic graphical model and the factor graph model with the tensor network state.

In the fourth chapter, building on the locality of natural image sets observed and verified above, we use the simplest tensor networks, the matrix product state and the matrix product operator, to optimize the highly successful residual networks and densely connected networks for supervised classification. We replace the fully connected layers in these deep networks with matrix product operators. After doing so, the number of parameters of the whole model is reduced significantly while its performance is hardly affected. This also indicates that the vast majority of the connections characterizing long-range correlations in these machine learning models are redundant.
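As an illustration of the replacement described in the fourth chapter, the sketch below factorizes a single fully connected layer into a four-site matrix product operator. All names, shapes and the bond dimension are hypothetical choices made for illustration; this is a minimal sketch under those assumptions, not the implementation used in the thesis.

```python
import numpy as np

# Hypothetical shapes: a 784 -> 256 fully connected layer factorized as a
# 4-site matrix product operator (MPO). 784 = 4*7*4*7, 256 = 4*4*4*4.
in_dims, out_dims, D = (4, 7, 4, 7), (4, 4, 4, 4), 8   # D: illustrative bond dimension

rng = np.random.default_rng(0)
# One core per site, with legs (left_bond, in_k, out_k, right_bond); boundary bonds are 1.
bonds = (1, D, D, D, 1)
cores = [rng.normal(scale=0.1, size=(bonds[k], in_dims[k], out_dims[k], bonds[k + 1]))
         for k in range(4)]

def mpo_to_dense(cores):
    """Contract the MPO cores into an ordinary (in_total, out_total) weight matrix."""
    w = cores[0]                               # legs: (1, i1, o1, bond)
    for core in cores[1:]:
        w = np.tensordot(w, core, axes=1)      # contract the shared bond index
    # w now has legs (1, i1, o1, i2, o2, i3, o3, i4, o4, 1); drop the boundary bonds
    w = w.reshape(w.shape[1:-1])
    # group the input legs together and the output legs together
    w = w.transpose(0, 2, 4, 6, 1, 3, 5, 7)
    return w.reshape(np.prod(in_dims), np.prod(out_dims))

# In a real network one would contract the input with the cores one site at a time;
# here the dense matrix is rebuilt only to check shapes and count parameters.
x = rng.normal(size=(32, 784))                 # a batch of flattened inputs
y = x @ mpo_to_dense(cores)                    # forward pass: (32, 256)

n_dense = 784 * 256                            # parameters of the ordinary dense layer
n_mpo = sum(c.size for c in cores)             # parameters of the MPO cores
print(y.shape, n_dense, n_mpo)
```

With these illustrative shapes the parameter count drops from 784 × 256 ≈ 2.0 × 10^5 for the dense layer to a few thousand for the MPO cores, which is the kind of reduction the chapter refers to.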
In the fifth chapter, we apply tensor networks to unsupervised generative learning. Building on the earlier matrix product state generative model, we propose a generative model based on the tree tensor network. The tree tensor network model retains all the advantages of the matrix product state model, allows genuinely two-dimensional modeling, and its training efficiency is not affected by the image size. On the same data, the test NLL on binarized MNIST is reduced to a lower level (the NLL metric is recalled after the chapter overviews), pushing tensor network generative models a step further.

In the sixth chapter, we give a summary and an outlook on this emerging field. The key points are: 1. the future potential and concrete development directions of tensor network generative models; 2. the inspiration that tensor network machine learning models can offer to quantum machine learning; 3. the prospects of tensor network algorithms for machine learning problems beyond images; 4. how machine learning algorithms may in the future improve and optimize traditional quantum many-body algorithms such as tensor network methods.
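For reference, the test NLL quoted in the fifth chapter is the standard negative log-likelihood of a generative model $p_\theta$ averaged over a test set $\mathcal{T}$ (commonly reported in nats for binarized MNIST):

$$
\mathrm{NLL} \;=\; -\frac{1}{|\mathcal{T}|} \sum_{x \in \mathcal{T}} \ln p_{\theta}(x),
$$

so lower values mean the model assigns higher probability to unseen test images.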
Keywords/Search Tags: Tensor Network, Tensor Network Renormalization Group, Machine Learning, Supervised Learning, Generative Learning, Probability Graph Model, Factor Graph