Weight decay training, fuzzy set techniques, and rule extraction for backpropagation: Learning from incomplete and imprecise data |
| Posted on:1995-08-02 | Degree:Ph.D | Type:Thesis |
| University:The University of Wisconsin - Madison | Candidate:Lam, Siuwa Monica | Full Text:PDF |
| GTID:2478390014991554 | Subject:Information Science |
| Abstract/Summary: |
| The major purpose of this thesis is to improve the backpropagation learning algorithm in the following aspects: (1) apply weight decay training to improve performance for undersized or oversized networks; (2) apply fuzzy set techniques to determine learning rates; (3) apply rule extraction to explain training results; (4) apply backpropagation to the reconstruction of missing values; (5) apply weight decay backpropagation to learning from noisy data; and (6) apply fuzzy set techniques to learning from borderline cases.

Experimental results confirm the superiority of weight decay training over standard backpropagation regardless of network architecture. The proposed RE algorithm provides a viable technique for extracting rules from a feedforward network trained with backpropagation. The RE algorithm has several advantages: (1) classification rules from RE are highly tractable and interpretable; (2) RE is computationally efficient; (3) RE does not require domain knowledge; and (4) because RE utilizes both the strengths of connection weights and the activation levels of hidden nodes for rule extraction, it avoids the problem of tampering with the learning process caused by converting the sigmoid function into a step function.

Fuzzy set techniques are applied to determine the backpropagated errors and the learning rate for each input pattern based on its class membership values. The technique of variable learning rates is extended from two-class problems to multi-class problems. Borderline cases are defined technically using the standard deviations of class membership values. Fuzzy set techniques are shown to be effective in improving correct classification rates for data sets with high percentages of borderline cases, as well as for cases whose given crisp targets are compatible with their fuzzy class membership values.

The standard backpropagation algorithm outperforms other statistical techniques for the reconstruction of missing values. Moreover, reconstructed data has a positive effect on improving correct classification rates. The percentage of missing values has a greater differential effect on reconstruction and classification methods than the randomness of missing values does. For noisy data, experimental results suggest: (1) weight decay training achieves correct classification rates better than or equivalent to those of standard backpropagation; (2) weight decay training tends to require fewer training epochs than standard backpropagation to converge; and (3) noise in future cases degrades correct classification rates more severely than noise in training cases does for backpropagation training. |
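The weight decay training discussed above can be sketched in a few lines. This is a minimal illustration, not the thesis's implementation: it assumes a single linear unit with squared-error loss and a hypothetical toy data set, and simply adds a `decay * w` penalty term to each weight's gradient so the weights are shrunk toward zero on every update.

```python
def train_weight_decay(data, lr=0.1, decay=0.01, epochs=100):
    """Gradient descent with weight decay on one linear unit.

    data: list of (x, target) pairs; returns the learned weight.
    (Hypothetical toy setup for illustration only.)
    """
    w = 0.0
    for _ in range(epochs):
        for x, t in data:
            y = w * x                      # forward pass (linear unit)
            grad = (y - t) * x             # dE/dw for squared error
            w -= lr * (grad + decay * w)   # decay term shrinks w each step
    return w

# With decay=0 the weight fits the single training point exactly;
# a positive decay keeps the learned weight slightly smaller.
w_plain = train_weight_decay([(1.0, 2.0)], decay=0.0)
w_decay = train_weight_decay([(1.0, 2.0)], decay=0.05)
```

In a full backpropagation network the same penalty is applied to every connection weight, which is what biases undersized and oversized architectures alike toward smaller, smoother solutions.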
| Keywords/Search Tags: | Backpropagation, Training, Fuzzy set techniques, Correct classification rates, Rule extraction, Data, Cases, Apply |