In recent years, with the development of information technology and the rapid rise of artificial intelligence, machine learning, the core technology of artificial intelligence, has attracted attention from computer science, physics, neuroscience, cognitive science, and other fields, and has gradually been applied to many aspects of society and the economy, affecting people's daily lives. However, traditional machine learning algorithms require large amounts of data and time to reach results good enough for practical use, which limits the range of problems machine learning can address. To tackle this problem, a new approach called multi-task learning has been proposed. Multi-task learning learns from many tasks at the same time and captures the relationships between tasks, which reduces the amount of data and time required. With continued research on multi-task learning, multi-task models have come to play an important role in fields such as computer vision and natural language processing. However, every multi-task learning model proposed so far must address one problem: how to balance each task's own features against the relationships it shares with other tasks. Based on these considerations, we propose two multi-task models based on latent variables: an expectation-maximization latent variable multi-task model and a neural network latent variable multi-task model. Unlike other multi-task models based on regularization or neural networks, these two models use latent variables to represent the relationships between tasks, and they outperform other models on the regression tasks of the School and SARCOS datasets. We first describe the implementation and computation of the two models in detail and analyze their effectiveness theoretically. We then verify their ability to be combined with other multi-task learning models to improve generalization. We conduct experiments on both simulated and real data and, compared with models that do not use latent variables, confirm the advantage of the latent variable models: they accelerate convergence without increasing the original time complexity of the algorithm. Finally, we discuss directions for further development of the two latent variable models.
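To make the idea of a latent variable multi-task model concrete, the following is a minimal sketch of the neural network variant, assuming the latent variable enters as a learned per-task embedding concatenated with the input of a shared regression network; the class name, layer sizes, and dataset dimensions are illustrative assumptions, not the thesis's actual implementation.

```python
# Hypothetical sketch: a shared network conditioned on a learned per-task
# latent vector; the latent vectors are trained jointly across tasks and
# are meant to capture the relationships between tasks.
import torch
import torch.nn as nn

class LatentVariableMTL(nn.Module):
    def __init__(self, n_tasks, in_dim, latent_dim=8, hidden=64):
        super().__init__()
        # One learnable latent vector per task (the "latent variable").
        self.task_latent = nn.Embedding(n_tasks, latent_dim)
        # Shared regression network over [input features, task latent].
        self.shared = nn.Sequential(
            nn.Linear(in_dim + latent_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, task_id):
        z = self.task_latent(task_id)               # (batch, latent_dim)
        return self.shared(torch.cat([x, z], -1))   # (batch, 1)

# Usage: joint training over mini-batches drawn from all tasks
# (dimensions here are placeholders, not those of School or SARCOS).
model = LatentVariableMTL(n_tasks=10, in_dim=20)
x = torch.randn(32, 20)
task_id = torch.randint(0, 10, (32,))
y_hat = model(x, task_id)
```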