The Broad Learning System (BLS) is a novel flat feedforward neural network. Unlike deep neural networks, which gain capacity by stacking layers, BLS expands its hidden layer in a broad manner. Only the output weights between the hidden layer and the output layer are computed, via the pseudoinverse; all other network parameters are randomly generated, and the model can be updated quickly without retraining from scratch. Owing to this high learning efficiency, BLS has been applied in many fields, such as computer vision, complex industrial process control and optimization, and medical health care.

However, several problems limit its application. First, the generalization of a BLS model is easily degraded by noise and outliers. Second, a BLS network often contains many redundant nodes, which increase model complexity and can cause over-fitting, reducing generalization. To address these problems, this paper proposes the corresponding methods; the main work is as follows:

(1) To eliminate the disturbance of noise and outliers on the generalization of the BLS model, a Robust Broad Learning System (RBLS) is proposed. RBLS adaptively assigns appropriate weight factors to samples to control their contribution to modeling: normal samples receive higher weight factors to increase their contribution, while suspected outliers receive lower weight factors to reduce theirs, so the negative effect of noise and outliers is suppressed. To keep the incremental learning process free from this interference while still updating the model quickly and effectively, robust incremental learning algorithms are further proposed by applying the weight factors within the incremental updates. Several methods for computing the weight factors are provided, and a unified application framework is designed so that RBLS can switch among them easily. Experimental results show that the proposed RBLS and robust incremental learning algorithms are effective. (Minimal sketches of the plain BLS training step, the sample-weighting idea, and the incremental update follow this abstract.)

(2) To reduce the redundancy of the BLS network structure, a broad-network structure-sparsification technique based on a fusion penalty is presented. The proposed compression technique combines an l1 loss function, an l1-norm regularizer, and a fusion regularizer into a new objective function: the l1 loss suppresses the interference of noise and outliers on the output weights; the l1-norm regularizer sparsifies the output weights to select useful nodes; and the fusion regularizer smooths the output weights, compensating for the weakness of the l1-norm regularizer in selecting groups of correlated nodes. By iteratively optimizing the new objective, the technique selects the important nodes to build a compact model while preserving robustness. According to the fusion regularizer used, two different sparsification strategies are further proposed, and a unified implementation framework is designed for using these strategies flexibly. Simulation results show that the proposed technique achieves a better balance between model complexity and generalization. (A plausible form of the objective is sketched after this abstract.)
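To make the flat BLS structure concrete, the following is a minimal sketch of the training step under common assumptions: random linear feature nodes, tanh enhancement nodes, and output weights solved in closed form by a ridge-regularized pseudoinverse. The function and parameter names are illustrative; the original BLS additionally includes bias terms and fine-tunes the feature weights with a sparse autoencoder.

```python
import numpy as np

def bls_train(X, Y, n_feature=20, n_enhance=100, reg=1e-3, seed=0):
    """Minimal BLS sketch: random feature and enhancement nodes; only the
    output weights W are learned, via a ridge-regularized pseudoinverse."""
    rng = np.random.default_rng(seed)
    Wf = rng.standard_normal((X.shape[1], n_feature))   # random feature weights
    Z = X @ Wf                                          # feature nodes
    We = rng.standard_normal((n_feature, n_enhance))    # random enhancement weights
    H = np.tanh(Z @ We)                                 # enhancement nodes
    A = np.hstack([Z, H])                               # broad hidden layer
    # W = (A^T A + reg I)^{-1} A^T Y, a regularized pinv(A) @ Y
    W = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ Y)
    return Wf, We, W

def bls_predict(X, Wf, We, W):
    Z = X @ Wf
    A = np.hstack([Z, np.tanh(Z @ We)])
    return A @ W
```

Because every parameter except W is random and fixed, widening the network only appends columns to A, which is what makes broad expansion and fast updates possible.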
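The abstract states that RBLS assigns per-sample weight factors and offers several ways to compute them behind a unified framework, without fixing a particular rule. The sketch below embeds one illustrative choice, a Huber-style weight (an assumption, not necessarily one of the paper's methods), in an iteratively reweighted least-squares loop over the broad hidden layer A.

```python
import numpy as np

def rbls_fit(A, Y, reg=1e-3, n_iter=5, delta=1.345):
    """Illustrative robust fit: each sample's weight factor v_i shrinks as
    its residual grows, so suspected outliers contribute less to W.
    A: (n, m) node outputs; Y: (n, k) targets."""
    n = A.shape[0]
    v = np.ones(n)                              # per-sample weight factors
    W = None
    for _ in range(n_iter):
        AV = A * v[:, None]                     # rows of A scaled by diag(v)
        # Weighted ridge solution: W = (A^T V A + reg I)^{-1} A^T V Y
        W = np.linalg.solve(A.T @ AV + reg * np.eye(A.shape[1]), AV.T @ Y)
        r = np.linalg.norm(Y - A @ W, axis=1)   # per-sample residuals
        s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12  # robust scale
        v = np.where(r <= delta * s, 1.0, delta * s / r)  # downweight outliers
    return W, v
```

Swapping in a different weight rule only changes the last line of the loop, which mirrors the unified framework the abstract describes.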
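For the incremental learning the abstract mentions, the standard BLS mechanism appends new node outputs H as extra columns and updates the pseudoinverse in a Greville-style block step rather than retraining. The sketch below shows that plain update, assuming W = pinv(A) @ Y exactly (no ridge term); the paper's robust variants would additionally weave the weight factors into this step.

```python
import numpy as np

def bls_incremental_update(A, A_pinv, W, Y, H):
    """Block pseudoinverse update when nodes H are added to hidden layer A.
    A: (n, m); A_pinv: (m, n); W: (m, k); Y: (n, k); H: (n, p)."""
    D = A_pinv @ H                      # projection of H onto the old nodes
    C = H - A @ D                       # part of H outside the old column space
    if np.linalg.norm(C) > 1e-10:
        B = np.linalg.pinv(C)
    else:
        B = np.linalg.solve(np.eye(D.shape[1]) + D.T @ D, D.T @ A_pinv)
    A_new_pinv = np.vstack([A_pinv - D @ B, B])
    W_new = np.vstack([W - D @ (B @ Y), B @ Y])   # updated output weights
    return np.hstack([A, H]), A_new_pinv, W_new
```

The cost of the update depends on the number of new nodes, not on retraining the whole network, which is the source of BLS's fast model updates.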
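For contribution (2), the abstract names the three terms of the objective but not their exact form. A plausible instantiation, assuming a fused-lasso-type fusion penalty over a set E of correlated node pairs, is:

```latex
\min_{W}\; \lVert Y - AW \rVert_{1}
\;+\; \lambda_{1}\,\lVert W \rVert_{1}
\;+\; \lambda_{2}\!\!\sum_{(i,j)\in\mathcal{E}} \lVert W_{i\cdot} - W_{j\cdot} \rVert_{1}
```

Here A is the matrix of node outputs, W the output weights, W_{i.} the weight row of node i, and lambda_1, lambda_2 trade off sparsity against smoothness across correlated nodes. Since both the l1 loss and the penalties are non-smooth, the closed-form pseudoinverse no longer applies, which is consistent with the iterative optimization the abstract describes (ADMM-style splitting solvers are a common choice for such objectives, though the paper's exact solver is not given here).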