Acceleration of back propagation through initial weight pre-training with delta rule

Gang Li, Hussein Alnuweiri, Yuejian Wu, Hongbing Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A new training strategy for Back Propagation (BP) neural networks, named Delta Pre-Training (DPT), is proposed. The core of the strategy is to pre-train the initial weights of a BP network using the Delta rule, instead of starting from random values. After pre-training, the normal BP training procedure is carried out to complete network training. With the DPT, the convergence rate of BP training can be significantly improved. With regard to on-chip learning in VLSI implementations, only a small amount of additional circuitry is required for the pre-training phase. Simulation results show the superiority of the proposed training method over previous approaches.
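
The abstract describes the idea but not the exact pre-training procedure. The sketch below is one plausible reading of DPT, written for illustration only: the hidden weights of a small two-layer network are initialized randomly, the output-layer weights are first fitted to the targets with the single-layer delta rule, and normal back propagation then starts from that point. The network size, learning rates, XOR task, and helper names (delta_rule_pretrain, train_bp) are assumptions made for this sketch, not details taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def delta_rule_pretrain(H, T, epochs=100, lr=0.5, seed=0):
    """Single-layer delta-rule (Widrow-Hoff style) fit of weights W so that
    sigmoid(H @ W) approximates the targets T; used only to give the output
    layer a non-random starting point (assumed interpretation of DPT)."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-0.1, 0.1, size=(H.shape[1], T.shape[1]))
    for _ in range(epochs):
        Y = sigmoid(H @ W)
        # Delta rule with the sigmoid derivative: dW = lr * H^T [(T - Y) Y (1 - Y)]
        W += lr * H.T @ ((T - Y) * Y * (1.0 - Y))
    return W

def train_bp(X, T, n_hidden=4, pretrain=True, epochs=5000, lr=1.0, seed=1):
    """Two-layer sigmoid network trained with plain batch back propagation;
    if `pretrain` is set, the output weights are first fitted with the delta
    rule before BP starts. Returns the number of epochs used."""
    rng = np.random.default_rng(seed)
    W1 = rng.uniform(-0.5, 0.5, size=(X.shape[1], n_hidden))
    W2 = rng.uniform(-0.5, 0.5, size=(n_hidden, T.shape[1]))

    if pretrain:
        # Pre-training phase: hidden weights stay random, output weights are
        # pre-trained with the delta rule instead of being left random.
        W2 = delta_rule_pretrain(sigmoid(X @ W1), T)

    for epoch in range(epochs):
        # Forward pass
        H = sigmoid(X @ W1)
        Y = sigmoid(H @ W2)
        if np.mean((T - Y) ** 2) < 0.01:      # simple convergence criterion
            return epoch
        # Backward pass and weight update (standard BP, squared-error loss)
        dY = (T - Y) * Y * (1.0 - Y)
        dH = (dY @ W2.T) * H * (1.0 - H)
        W2 += lr * H.T @ dY
        W1 += lr * X.T @ dH
    return epochs

# Toy XOR task: compare epochs to convergence with and without pre-training.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
print("random init :", train_bp(X, T, pretrain=False), "epochs")
print("DPT init    :", train_bp(X, T, pretrain=True), "epochs")
```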

Original language: English
Title of host publication: 1993 IEEE International Conference on Neural Networks
Editors: Anon
Publisher: Publ by IEEE
Pages: 580-585
Number of pages: 6
ISBN (Print): 0780312007
Publication status: Published - 1 Jan 1993
Externally published: Yes
Event: 1993 IEEE International Conference on Neural Networks - San Francisco, CA, USA
Duration: 28 Mar 1993 - 1 Apr 1993

Other

Other: 1993 IEEE International Conference on Neural Networks
City: San Francisco, CA, USA
Period: 28/3/93 - 1/4/93

Fingerprint

  • Backpropagation
  • Neural networks

ASJC Scopus subject areas

  • Engineering(all)

Cite this

Li, G., Alnuweiri, H., Wu, Y., & Li, H. (1993). Acceleration of back propagation through initial weight pre-training with delta rule. In Anon (Ed.), 1993 IEEE International Conference on Neural Networks (pp. 580-585). Publ by IEEE.
