Acceleration of back propagation through initial weight pre-training with delta rule

Gang Li, Hussein Alnuweiri, Yuejian Wu, Hongbing Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A new training strategy for Back Propagation (BP) neural networks, named Delta Pre-Training (DPT), is proposed. The core of the new training strategy is to pre-train the initial weights for BP networks using the Delta rule, instead of using random values. After pre-training, the normal BP training procedure is carried out to complete network training. With the DPT, the convergence rate for training BP networks can be significantly improved. With regard to on-chip learning in VLSI implementations, only a small amount of additional circuitry is required for the pre-training phase with the DPT. Simulation results using the proposed training method show its superiority over previous methods.
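The abstract's idea can be sketched in code. The following is a minimal illustration, not the paper's actual procedure: the toy AND task, layer sizes, learning rates, and the choice of hidden-layer targets are all assumptions. Each layer's weights are first trained with the Widrow-Hoff Delta rule (dW = lr * (t - y) * x) instead of being left at random values, and ordinary back propagation then starts from those pre-trained weights.

```python
# Hypothetical sketch of Delta Pre-Training (DPT): pre-train each layer
# with the single-layer Delta rule, then run normal back propagation.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy linearly separable task (AND gate) -- illustrative only
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [0.], [0.], [1.]])

def delta_pretrain(inp, targets, n_out, epochs=500, lr=0.5):
    """Single-layer Delta-rule training: W += lr * outer(x, t - y)."""
    W = rng.normal(0.0, 0.1, size=(inp.shape[1], n_out))
    b = np.zeros(n_out)
    for _ in range(epochs):
        for x, t in zip(inp, targets):
            err = t - sigmoid(x @ W + b)
            W += lr * np.outer(x, err)
            b += lr * err
    return W, b

# Pre-train the hidden layer (hypothetical choice: use the output labels,
# tiled to the hidden width, as Delta-rule targets), then the output layer.
n_hidden = 3
W1, b1 = delta_pretrain(X, np.tile(Y, (1, n_hidden)), n_hidden)
W2, b2 = delta_pretrain(sigmoid(X @ W1 + b1), Y, 1)

# Normal BP training, started from the pre-trained weights
lr = 0.5
for _ in range(1000):
    H = sigmoid(X @ W1 + b1)
    O = sigmoid(H @ W2 + b2)
    dO = (O - Y) * O * (1.0 - O)          # output-layer error term
    dH = (dO @ W2.T) * H * (1.0 - H)      # back-propagated hidden error term
    W2 -= lr * H.T @ dO;  b2 -= lr * dO.sum(axis=0)
    W1 -= lr * X.T @ dH;  b1 -= lr * dH.sum(axis=0)

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print((pred > 0.5).astype(int).ravel())
```

In this sketch the only change from plain BP is the initialisation: replacing `delta_pretrain` with random weights recovers the baseline the paper compares against.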

Original language: English
Title of host publication: 1993 IEEE International Conference on Neural Networks
Editors: Anon
Publisher: Publ by IEEE
Pages: 580-585
Number of pages: 6
ISBN (Print): 0780312007
Publication status: Published - 1 Jan 1993
Externally published: Yes
Event: 1993 IEEE International Conference on Neural Networks - San Francisco, CA, USA
Duration: 28 Mar 1993 - 1 Apr 1993

Other

Other: 1993 IEEE International Conference on Neural Networks
City: San Francisco, CA, USA
Period: 28/3/93 - 1/4/93


ASJC Scopus subject areas

  • Engineering (all)

Cite this

Li, G., Alnuweiri, H., Wu, Y., & Li, H. (1993). Acceleration of back propagation through initial weight pre-training with delta rule. In Anon (Ed.), 1993 IEEE International Conference on Neural Networks (pp. 580-585). Publ by IEEE.