A backpropagation scheme with and without boosting, 1999

 Item — Call Number: MU Thesis Pro
Identifier: b2088058

Scope and Contents

From the Collection:

The collection consists of theses written by students enrolled in the Monmouth College and Monmouth University graduate Electronic Engineering programs. The holdings are bound print documents that were submitted in partial fulfillment of requirements for the Master of Science degree.

Dates

  • Creation: 1999

Conditions Governing Access

All analog collection holdings are limited to library use only.

Researchers seeking to photocopy collection materials must complete an Application to Photocopy Form.

In some cases, photocopying of collection materials may be performed by the Monmouth University Library staff.

The Monmouth University Library reserves the right to limit or refuse duplication requests subject to the condition of collection materials and/or restrictions imposed by the collection creators or by the United States Copyright Act.

Permission to examine, or copy, collection materials does not imply permission to publish or quote. It is the responsibility of the researcher to obtain such permissions from both the copyright holder and Monmouth University.

Extent

1 item (print book) : 104 pages ; 8.5 x 11.0 inches (28 cm).

Language of Materials

English

Abstract

Neural networks, simply stated, are machines that learn. A neural network consists of many neurons, modeled loosely on those of the human brain, and stores acquired knowledge for later use in its synaptic weights, or interneuron connection strengths. One class of neural network, the feedforward network, uses multilayer perceptrons: layered sets of neurons comprising an input layer, one or more hidden layers, and an output layer. Multilayer perceptrons can be trained in a supervised manner using the backpropagation algorithm, which takes the output of a forward pass through the network with fixed synaptic weights and adjusts those weights according to an error-correction rule. The forward and backward passes are repeated until no further improvement is realized on the training (or test) set. The error rate of such a network can be further reduced by a method known as boosting. In each round, boosting uses the weak learners from the previous round to construct a new training set for the next set of forward and backward passes; again, the passes are repeated until no further improvement is realized.
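The training procedure described above can be sketched in a few lines. The sketch below is illustrative only: the network size, learning rate, activation function, and XOR toy data are assumptions for demonstration, not the configuration used in the thesis (which was implemented in MATLAB).

```python
import numpy as np

# Minimal sketch of supervised training with backpropagation, assuming a
# single hidden layer, sigmoid activations, and squared-error loss.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: XOR, a classic problem that needs a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Initial synaptic weights (input -> hidden -> output); sizes are arbitrary.
W1 = rng.normal(size=(2, 4))
W2 = rng.normal(size=(4, 1))
lr = 1.0  # illustrative learning rate

losses = []
for epoch in range(2000):
    # Forward pass with the current (fixed) weights.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    err = out - y                      # error-correction term
    losses.append(float((err ** 2).mean()))
    # Backward pass: propagate the error and adjust the weights.
    d_out = err * out * (1 - out)      # derivative of the sigmoid output
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

In practice the loop would stop when the loss on a held-out set stops improving, as the abstract describes, rather than after a fixed number of epochs.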

For this paper, schemes for backpropagation alone and for backpropagation with boosting are constructed using the MATLAB technical computing language. Two architectures are used to test the algorithms, and three variations of each architecture are tested several times to obtain statistically significant results. Comparisons between the two schemes reveal that backpropagation with boosting performs drastically better than backpropagation alone, although the degree of improvement that boosting provides decreases as the test variation becomes more difficult.
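The round-by-round boosting idea summarized above can be illustrated with a short sketch. Note the hedge: the code uses AdaBoost-style reweighting with decision stumps, a standard formulation that differs in detail from the resampling scheme the abstract describes, and the 1-D interval dataset and round count are assumptions for demonstration.

```python
import numpy as np

# Sketch of boosting: combine many weak learners, re-emphasizing the
# examples the previous learner got wrong (AdaBoost-style reweighting).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=200)
y = np.where(np.abs(X) < 0.5, 1, -1)   # label: inside/outside an interval

def stump_fit(X, y, w):
    """Pick the threshold/sign pair minimizing weighted error."""
    best = None
    for thr in np.unique(X):
        for sign in (1, -1):
            pred = np.where(X < thr, sign, -sign)
            err = w[pred != y].sum()
            if best is None or err < best[0]:
                best = (err, thr, sign)
    return best

w = np.full(len(X), 1 / len(X))        # uniform weights in round 1
ensemble = []
for t in range(20):
    err, thr, sign = stump_fit(X, y, w)
    err = max(err, 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)
    pred = np.where(X < thr, sign, -sign)
    # Up-weight misclassified examples; this plays the role of
    # "constructing a new training set" for the next round.
    w *= np.exp(-alpha * y * pred)
    w /= w.sum()
    ensemble.append((alpha, thr, sign))

def predict(x):
    score = sum(a * np.where(x < t, s, -s) for a, t, s in ensemble)
    return np.where(score >= 0, 1, -1)

acc = (predict(X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

No single stump can represent the interval concept, but the weighted vote over many rounds can, which mirrors the abstract's finding that boosting reduces the error rate of a single learner.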

Partial Contents

1. Background -- 2. Network architecture -- 3. Software development -- 4. Discussion of results -- 5. Conclusion -- 6. References -- Appendix A. Sample figures of three circles: all cases -- Appendix B. Sample figures for concentric circles: all cases -- Appendix C. Tabular results for three circles: all cases -- Appendix D. Tabular results for concentric circles: all cases -- Appendix E. MATLAB program "project_tp2.m" -- Appendix F. MATLAB program "concentric_circles1.m" -- Appendix G. MATLAB function "train_circles2.m".

Repository Details

Part of the Monmouth University Library Archives Repository

Contact:
Monmouth University Library
400 Cedar Avenue
West Long Branch, New Jersey 07764, United States
732-923-4526