In recent years, with growing concern about personal data privacy and the implementation of related privacy protection laws, it has become increasingly difficult for companies to collect sufficient data to train machine learning models. Federated learning provides a possible solution that makes data usable while preserving privacy. According to the data distribution, federated learning can be divided into horizontal federated learning, vertical federated learning, and federated transfer learning. Among them, vertical federated learning is mostly used between enterprises, and the current mainstream schemes for vertical federated learning are based on secure multi-party computation. Specifically, the participants' data and the model parameters are all secret-shared, and model training and prediction are performed directly on the encrypted shares. Although secure multi-party computation protects data privacy perfectly, it also incurs many communication rounds or a loss of computational precision. Secret sharing over an integer ring can only handle addition and multiplication efficiently, yet machine learning models contain many nonlinear operators, such as Sigmoid, ReLU, Exp, and Log. At present, nonlinear operators are mainly computed by converting them into multiple rounds of additions and multiplications through fitting or iterative approximation. However, such methods either lose too much accuracy, have a bounded convergence region, or require too many communication rounds, so the federated model either loses prediction accuracy or pays a higher communication cost.

The innovations proposed in this paper are as follows:

1. A mixed additive and multiplicative secret sharing mechanism over the real field. Since the operations defined on an integer ring are too simple to meet the requirements of the various operators in machine learning models, this paper replaces the domain of secret sharing with the real field and proposes multiplicative sharing on the basis of additive sharing. To combine the advantages of additive and multiplicative sharing, a conversion method based on homomorphic encryption is proposed; together these constitute a mixed secret sharing mechanism over the real field. Based on the proposed mechanism, the communication complexity of many nonlinear functions, such as Exp, Log, Sign, Sin, and Pow, can be reduced to O(1), and the precision is almost lossless. The upper bound on the privacy leakage caused by the blinding factor over the real field is proved in this paper.

2. A series of private algorithms for nonlinear machine learning functions based on the proposed mechanism. Building on the mechanism and the private algorithms for basic functions, this paper applies them to machine learning models and proposes private algorithms for ReLU, Sigmoid, Softmax, and their derivatives. A solution to the overflow of the exponential function in Sigmoid and Softmax is also given.

Experiments verify that, compared with the open-source federated learning frameworks CrypTen and Rosetta, the proposed method is two orders of magnitude more accurate on nonlinear operators, saves 74% to 87% of communication rounds, and increases single-sample prediction speed by 2.1 to 5.0 times. In general, this paper proposes a federated learning scheme that trades a small cost in privacy for efficiency.
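To illustrate why sharing over the real field makes operators like Exp and Log essentially communication-free, the sketch below uses the identities exp(a1 + a2) = exp(a1)·exp(a2) and log(m1·m2) = log(m1) + log(m2): additive shares of x become multiplicative shares of exp(x) through purely local computation, and vice versa for Log. This is an illustrative toy (small blinding ranges, no homomorphic-encryption-based conversion step), not the paper's full protocol:

```python
import math
import random

def additive_share(x):
    """Split x into two additive shares over the reals: x = a1 + a2."""
    a1 = random.uniform(-5.0, 5.0)  # blinding factor (toy range)
    return a1, x - a1

def multiplicative_share(x):
    """Split nonzero x into two multiplicative shares: x = m1 * m2."""
    m1 = random.uniform(0.5, 2.0)   # nonzero blinding factor (toy range)
    return m1, x / m1

# Exp: additive shares of x become multiplicative shares of exp(x)
# with no interaction, since exp(a1 + a2) = exp(a1) * exp(a2).
x = 1.7
a1, a2 = additive_share(x)
e1, e2 = math.exp(a1), math.exp(a2)   # each party computes locally
assert math.isclose(e1 * e2, math.exp(x), rel_tol=1e-9)

# Log: positive multiplicative shares of y > 0 become additive shares
# of log(y) locally, since log(m1 * m2) = log(m1) + log(m2).
y = 42.0
m1, m2 = multiplicative_share(y)
l1, l2 = math.log(m1), math.log(m2)
assert math.isclose(l1 + l2, math.log(y), rel_tol=1e-9)
```

The one interactive step in the real scheme is converting between the two sharing forms, which the paper realizes with homomorphic encryption; the local identities above are what bring the per-operator communication down to O(1).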
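The abstract mentions a fix for exponential overflow in Sigmoid and Softmax but does not detail it. For context, the standard plaintext remedy, shown here as an assumed baseline rather than the paper's secret-shared construction, is to arrange every exponent to be non-positive:

```python
import math

def stable_sigmoid(x):
    """Sigmoid without overflow: never exponentiate a large positive value."""
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)  # here x < 0, so exp(x) <= 1 and cannot overflow
    return z / (1.0 + z)

def stable_softmax(xs):
    """Softmax with the maximum subtracted, so every exponent is <= 0."""
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Naive exp(1000.0) would overflow a float; the stable forms do not.
assert math.isclose(stable_sigmoid(1000.0), 1.0)
assert math.isclose(stable_sigmoid(-1000.0), 0.0, abs_tol=1e-12)
probs = stable_softmax([1000.0, 1001.0, 1002.0])
assert math.isclose(sum(probs), 1.0)
```

Carrying this kind of range control into the secret-shared setting is the nontrivial part, since no party sees the plaintext input.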