The broad learning system has been widely applied to machine learning and time-series analysis in recent years. Whereas deep neural networks suffer from complicated structures, difficult theoretical analysis, and time-consuming training, the broad learning system offers an alternative with a broad, flat structure: only the weights of a single output layer need to be optimized, which yields fast training. Existing broad learning systems are centralized and require a powerful central node to store and process the whole dataset. In practical applications, however, data are often collected distributively by different nodes of a network, and limits on energy consumption, computation capability, and communication bandwidth make it infeasible to transmit all the data to a central node. Applying the broad learning system to distributed networks is therefore of practical significance, and this thesis studies distributed implementations of the broad learning system. The main work of this thesis is as follows.

Firstly, based on a consensus strategy, the objective function of the broad learning system is decentralized and solved with the alternating direction method of multipliers, yielding a consensus-based distributed broad learning system. To counter the redundancy and over-fitting caused by the expansion of the structure, the l2,1-norm is introduced as a sparse regularizer, and a sparse consensus-based distributed broad learning system is further proposed. Then, since online learning is preferred for streaming data, a diffusion recursive least squares method is used to update the weights, giving an online diffusion-based distributed broad learning system; an event-triggered communication strategy is also proposed to further reduce the communication cost while maintaining good learning performance. Finally, following the idea of local receptive fields, the broad learning system is extended through feature extraction with convolution and pooling layers, and the above distributed implementation methods yield several locally-connected distributed broad learning systems.

The proposed methods are tested on large-scale image classification problems to verify their effectiveness. Simulation results show that the proposed distributed broad learning systems perform well compared with several existing learning methods, and their performance approaches that of the corresponding centralized models.
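The single-layer training that gives the broad learning system its fast training speed can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the node counts, the linear feature mapping, the tanh enhancement activation, and the ridge parameter are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (sizes are illustrative, not from the thesis)
X = rng.standard_normal((200, 10))           # inputs
Y = rng.standard_normal((200, 3))            # targets

def bls_features(X, n_feature=20, n_enhance=40, rng=rng):
    """Map inputs to feature nodes, then enhancement nodes (random weights)."""
    Wf = rng.standard_normal((X.shape[1], n_feature))
    Z = X @ Wf                                # feature nodes (linear here)
    We = rng.standard_normal((n_feature, n_enhance))
    H = np.tanh(Z @ We)                       # enhancement nodes
    return np.hstack([Z, H])                  # broad expansion A = [Z | H]

A = bls_features(X)
lam = 1e-2                                    # assumed ridge parameter
# Only the single output layer W is trained:
#   min_W ||A W - Y||^2 + lam ||W||^2  (closed-form ridge solution)
W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)
```

Because the random feature and enhancement weights are fixed, the whole training reduces to one regularized least-squares solve, which is what makes the broad structure fast compared with iterative deep-network training.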
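The consensus-based decentralization with l2,1-norm sparse regularization could look roughly like the following ADMM sketch, where each node holds only its local feature matrix and targets and the nodes agree on a common output weight matrix. The node count, penalty parameter `rho`, regularization weight `lam`, and iteration count are illustrative assumptions, not the thesis's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup: N nodes, each holding local broad-expansion
# features A[k] and targets Y[k] for its own samples.
N, d, q = 4, 12, 2
A = [rng.standard_normal((50, d)) for _ in range(N)]
Y = [rng.standard_normal((50, q)) for _ in range(N)]

lam, rho, iters = 0.1, 1.0, 100              # assumed hyperparameters

def prox_l21(V, t):
    """Row-wise group soft-thresholding: proximal operator of t * ||V||_{2,1}."""
    norms = np.linalg.norm(V, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))
    return scale * V

W = [np.zeros((d, q)) for _ in range(N)]      # local weight estimates
U = [np.zeros((d, q)) for _ in range(N)]      # scaled dual variables
Z = np.zeros((d, q))                          # consensus variable

for _ in range(iters):
    # Local least-squares step at each node (uses only local data)
    for k in range(N):
        lhs = 2 * A[k].T @ A[k] + rho * np.eye(d)
        rhs = 2 * A[k].T @ Y[k] + rho * (Z - U[k])
        W[k] = np.linalg.solve(lhs, rhs)
    # Consensus step: average, then l2,1 shrinkage for row sparsity
    Z = prox_l21(sum(W[k] + U[k] for k in range(N)) / N, lam / (N * rho))
    # Dual update drives the local estimates toward consensus
    for k in range(N):
        U[k] += W[k] - Z
```

The l2,1 proximal step zeroes out whole rows of the consensus weight matrix, which is how this kind of regularizer prunes redundant feature or enhancement nodes created by structure expansion.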
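The online diffusion update with event-triggered communication might be sketched as below for a single-output model. The ring topology, uniform combination weights, trigger threshold, and noise level are assumptions for illustration; the thesis's actual combination rule and trigger condition may differ.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative ring network of N nodes streaming samples of a common model
N, d, T = 4, 8, 300
w_true = rng.standard_normal(d)
neighbors = {k: [(k - 1) % N, k, (k + 1) % N] for k in range(N)}  # ring + self

w = [np.zeros(d) for _ in range(N)]           # local RLS estimates
P = [np.eye(d) * 1e2 for _ in range(N)]       # inverse correlation matrices
last_sent = [np.zeros(d) for _ in range(N)]   # last broadcast state per node
threshold = 0.05                              # assumed trigger threshold

for t in range(T):
    # Adapt: one recursive least squares step per node on its own sample
    for k in range(N):
        a = rng.standard_normal(d)
        y = a @ w_true + 0.01 * rng.standard_normal()
        g = P[k] @ a / (1.0 + a @ P[k] @ a)   # RLS gain (forgetting factor 1)
        w[k] = w[k] + g * (y - a @ w[k])
        P[k] = P[k] - np.outer(g, a @ P[k])
    # Event-triggered broadcast: transmit only if the estimate moved enough
    for k in range(N):
        if np.linalg.norm(w[k] - last_sent[k]) > threshold:
            last_sent[k] = w[k].copy()
    # Combine: average the latest received neighbor states (uniform weights)
    w = [np.mean([last_sent[j] for j in neighbors[k]], axis=0) for k in range(N)]
```

The trigger skips transmissions whose information content is small, which is the mechanism behind the claimed communication savings: nodes fall silent once their estimates stop changing appreciably.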
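The local-receptive-field extension via convolution and pooling can be illustrated with random filters, as sketched below; the filter count, kernel size, and pool size are assumed, and "convolution" here is implemented as valid-mode 2-D correlation (no kernel flip) for simplicity.

```python
import numpy as np

rng = np.random.default_rng(3)

def lrf_features(img, n_filters=4, ksize=3, pool=2, rng=rng):
    """Random local-receptive-field filters followed by max pooling;
    the pooled maps are flattened into broad-expansion features."""
    H, W = img.shape
    filt = rng.standard_normal((n_filters, ksize, ksize))
    maps = []
    for f in filt:
        # valid-mode 2-D correlation with the random filter
        out = np.zeros((H - ksize + 1, W - ksize + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(img[i:i + ksize, j:j + ksize] * f)
        # non-overlapping max pooling
        ph, pw = out.shape[0] // pool, out.shape[1] // pool
        pooled = (out[:ph * pool, :pw * pool]
                  .reshape(ph, pool, pw, pool)
                  .max(axis=(1, 3)))
        maps.append(pooled.ravel())
    return np.concatenate(maps)

feat = lrf_features(rng.standard_normal((10, 10)))
```

Replacing the fully-connected random feature mapping with such locally-connected filters keeps spatial structure in image data while the output layer is still trained by a single least-squares solve, matching the locally-connected distributed variants described above.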