Acceleration of back propagations through initial weight pre-training with delta rule

Gang Li, Hussein Alnuweiri, Yuejian Wu, Hongbing Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

21 Citations (Scopus)

Abstract

A new training strategy for Back Propagation (BP) neural networks, named Delta Pre-Training (DPT), is proposed. The core of the new training strategy is based on pre-training the initial weights for BP networks using the Delta rule, instead of using random values. After pre-training, the normal BP training procedure is carried out to complete network training. With the DPT, the convergence rate for training BP networks can be significantly improved. Since the DPT deals only with initial weight settings, most variations of the standard BP algorithm (aiming at increasing convergence rate) can be combined with the DPT so as to further speed up convergence. With regards to on-chip learning in VLSI implementations, only a little additional circuitry is required for the pre-training phase with the DPT. Simulation results using the proposed training method show its superiority over previous methods.
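The idea in the abstract can be sketched in a few lines of code. The paper does not spell out implementation details here, so everything below is an illustrative assumption: the network size, the XOR task, the learning rates, and the choice to apply the delta rule only to the output-layer weights (with the hidden layer frozen at its random initialization) before handing all weights to standard backpropagation. The published DPT procedure may differ in these respects.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(W1, W2):
    """Two-layer sigmoid network: returns hidden activations and output."""
    H = sigmoid(X @ W1)
    return H, sigmoid(H @ W2)

def mse(W1, W2):
    _, Y = forward(W1, W2)
    return float(np.mean((T - Y) ** 2))

# Illustrative task (not from the paper): XOR with a 2-4-1 network.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.uniform(-0.5, 0.5, (2, 4))   # hidden weights, random init
W2 = rng.uniform(-0.5, 0.5, (4, 1))   # output weights, random init
mse0 = mse(W1, W2)

# --- Delta Pre-Training phase (assumed form): delta-rule updates on the
# output layer only, treating it as a single-layer perceptron whose
# inputs are the (frozen) random hidden activations.
for _ in range(200):
    H, Y = forward(W1, W2)
    d = (T - Y) * Y * (1 - Y)         # delta-rule error term
    W2 += 0.5 * H.T @ d               # single-layer weight update
mse_pre = mse(W1, W2)

# --- Normal BP phase, starting from the pre-trained weights instead of
# a purely random initialization.
for _ in range(2000):
    H, Y = forward(W1, W2)
    d2 = (T - Y) * Y * (1 - Y)
    d1 = (d2 @ W2.T) * H * (1 - H)    # error backpropagated to hidden layer
    W2 += 0.5 * H.T @ d2
    W1 += 0.5 * X.T @ d1
mse_final = mse(W1, W2)

print(mse0, mse_pre, mse_final)
```

Because the pre-training phase only adjusts a single layer, it needs no backpropagated error signal, which is consistent with the abstract's remark that on-chip VLSI learning would require only a little extra circuitry for this phase.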

Original language: English
Title of host publication: 1993 IEEE International Conference on Neural Networks, ICNN 1993
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 580-585
Number of pages: 6
Volume: 1993-January
ISBN (Electronic): 0780309995
DOI: 10.1109/ICNN.1993.298622
Publication status: Published - 1 Jan 1993
Externally published: Yes
Event: IEEE International Conference on Neural Networks, ICNN 1993 - San Francisco, United States
Duration: 28 Mar 1993 - 1 Apr 1993

Other

IEEE International Conference on Neural Networks, ICNN 1993
Country: United States
City: San Francisco
Period: 28/3/93 - 1/4/93

Fingerprint

Backpropagation
Backpropagation algorithms
Neural networks

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Artificial Intelligence

Cite this

Li, G., Alnuweiri, H., Wu, Y., & Li, H. (1993). Acceleration of back propagations through initial weight pre-training with delta rule. In 1993 IEEE International Conference on Neural Networks, ICNN 1993 (Vol. 1993-January, pp. 580-585). [298622] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICNN.1993.298622

@inproceedings{d0450e2a15fc4566896ba03ccbcac0d5,
title = "Acceleration of back propagations through initial weight pre-training with delta rule",
abstract = "A new training strategy for Back Propagation (BP) neural networks, named Delta Pre-Training (DPT), is proposed. The core of the new training strategy is based on pre-training the initial weights for BP networks using the Delta rule, instead of using random values. After pre-training, the normal BP training procedure is carried out to complete network training. With the DPT, the convergence rate for training BP networks can be significantly improved. Since the DPT deals only with initial weight settings, most variations of the standard BP algorithm (aiming at increasing convergence rate) can be combined with the DPT so as to further speed up convergence. With regards to on-chip learning in VLSI implementations, only a little additional circuitry is required for the pre-training phase with the DPT. Simulation results using the proposed training method show its superiority over previous methods.",
author = "Gang Li and Hussein Alnuweiri and Yuejian Wu and Hongbing Li",
year = "1993",
month = "1",
day = "1",
doi = "10.1109/ICNN.1993.298622",
language = "English",
volume = "1993-January",
pages = "580--585",
booktitle = "1993 IEEE International Conference on Neural Networks, ICNN 1993",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

TY  - GEN
T1  - Acceleration of back propagations through initial weight pre-training with delta rule
AU  - Li, Gang
AU  - Alnuweiri, Hussein
AU  - Wu, Yuejian
AU  - Li, Hongbing
PY  - 1993/1/1
Y1  - 1993/1/1
N2  - A new training strategy for Back Propagation (BP) neural networks, named Delta Pre-Training (DPT), is proposed. The core of the new training strategy is based on pre-training the initial weights for BP networks using the Delta rule, instead of using random values. After pre-training, the normal BP training procedure is carried out to complete network training. With the DPT, the convergence rate for training BP networks can be significantly improved. Since the DPT deals only with initial weight settings, most variations of the standard BP algorithm (aiming at increasing convergence rate) can be combined with the DPT so as to further speed up convergence. With regards to on-chip learning in VLSI implementations, only a little additional circuitry is required for the pre-training phase with the DPT. Simulation results using the proposed training method show its superiority over previous methods.
AB  - A new training strategy for Back Propagation (BP) neural networks, named Delta Pre-Training (DPT), is proposed. The core of the new training strategy is based on pre-training the initial weights for BP networks using the Delta rule, instead of using random values. After pre-training, the normal BP training procedure is carried out to complete network training. With the DPT, the convergence rate for training BP networks can be significantly improved. Since the DPT deals only with initial weight settings, most variations of the standard BP algorithm (aiming at increasing convergence rate) can be combined with the DPT so as to further speed up convergence. With regards to on-chip learning in VLSI implementations, only a little additional circuitry is required for the pre-training phase with the DPT. Simulation results using the proposed training method show its superiority over previous methods.
UR  - http://www.scopus.com/inward/record.url?scp=84943245391&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=84943245391&partnerID=8YFLogxK
U2  - 10.1109/ICNN.1993.298622
DO  - 10.1109/ICNN.1993.298622
M3  - Conference contribution
VL  - 1993-January
SP  - 580
EP  - 585
BT  - 1993 IEEE International Conference on Neural Networks, ICNN 1993
PB  - Institute of Electrical and Electronics Engineers Inc.
ER  -