AN IMPROVED DESIGN FOR BACK PROPAGATION NEURAL NETWORKS USING A HIERARCHAL SUBNET DESIGN (HSD) APPROACH
Original Publication Date: 1994-Oct-01
Included in the Prior Art Database: 2002-Mar-12
The common approach to the sizing and training of Back Propagation neural networks is somewhat ad hoc. The optimal architecture and number of processing elements necessary to efficiently represent the important features of a particular data set depend on the nature of the data. The rate of network convergence and its optimality depend on the presentation sequence of the data, the data features, and the network architecture. The Hierarchal Subnet Design (HSD) approach described here is an attempt to address these factors in a structured way. In the HSD approach the network is built and trained in an automatic, systematic, hierarchal manner such that the coarse features of the data are learned first, followed by the finer details. Experimental results show that this approach can result in a faster rate of convergence and a lower minimum error than is attained with networks built and trained in a conventional manner.
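The coarse-to-fine idea described above can be illustrated with a minimal sketch: train a small back-propagation subnet first, then grow the hidden layer with additional units (keeping the weights already learned) and continue training on the same data. This is an illustrative interpretation, not the disclosure's actual procedure; the data set, layer sizes, learning rate, and the helper names `train` and `grow` are all assumptions introduced here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data set standing in for "a particular data set": the XOR mapping.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init(n_in, n_hid, n_out):
    # One hidden layer; weights drawn from a small random distribution.
    return [rng.normal(0, 0.5, (n_in, n_hid)),
            rng.normal(0, 0.5, (n_hid, n_out))]

def forward(W, X):
    H = sigmoid(X @ W[0])
    O = sigmoid(H @ W[1])
    return H, O

def train(W, X, Y, epochs, lr=0.5):
    # Plain full-batch back propagation with the sigmoid delta rule.
    for _ in range(epochs):
        H, O = forward(W, X)
        dO = (O - Y) * O * (1 - O)         # output-layer delta
        dH = (dO @ W[1].T) * H * (1 - H)   # hidden-layer delta
        W[1] -= lr * (H.T @ dO)
        W[0] -= lr * (X.T @ dH)
    return W

def grow(W, extra):
    # Append `extra` hidden units with small random weights, preserving
    # the coarse features already captured by the existing units.
    n_in = W[0].shape[0]
    n_out = W[1].shape[1]
    W0 = np.hstack([W[0], rng.normal(0, 0.1, (n_in, extra))])
    W1 = np.vstack([W[1], rng.normal(0, 0.1, (extra, n_out))])
    return [W0, W1]

# Stage 1: a small subnet learns the coarse structure of the data.
W = init(2, 2, 1)
_, O0 = forward(W, X)
mse0 = float(np.mean((O0 - Y) ** 2))   # error before any training
W = train(W, X, Y, 2000)

# Stage 2: grow the network and refine on the finer details.
W = grow(W, 2)
W = train(W, X, Y, 2000)

_, O = forward(W, X)
mse = float(np.mean((O - Y) ** 2))
print(mse0, mse)
```

After the two stages the mean squared error should be well below its value at random initialization; the point of the sketch is only the build-then-grow training schedule, not the particular data set or layer sizes.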