Abstract:
Artificial neural networks (ANNs) are universal approximators that make it possible to express the relationship between input data and output data. Learning in an ANN consists of adapting the network's free parameters by iteratively varying their values, starting from arbitrary initial values until final values are reached. These final values should, in principle, give the network optimal behavior according to certain pre-established criteria.
With the advent of neural methods, the initialization of ANNs has become a central problem and has been studied by many researchers. Although old, the Nguyen-Widrow method (1990) has established itself as a reference: it is widely recognized and remains the most commonly used.
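As a point of reference, the Nguyen-Widrow scheme for a single hidden layer can be sketched as follows. This is a minimal illustration, not the implementation studied in this work; the function name and NumPy-based interface are choices made here for clarity.

```python
import numpy as np

def nguyen_widrow_init(n_in, n_hidden, rng=None):
    """Sketch of Nguyen-Widrow (1990) initialization for one hidden layer.

    Weights are first drawn uniformly in [-1, 1]; each hidden unit's
    weight vector is then rescaled to magnitude
    beta = 0.7 * n_hidden**(1/n_in), and biases are drawn uniformly
    in [-beta, beta], so the units' active regions spread over the
    input space instead of clustering.
    """
    rng = np.random.default_rng(rng)
    beta = 0.7 * n_hidden ** (1.0 / n_in)
    w = rng.uniform(-1.0, 1.0, size=(n_hidden, n_in))
    norms = np.linalg.norm(w, axis=1, keepdims=True)
    w = beta * w / norms                      # each row now has norm beta
    b = rng.uniform(-beta, beta, size=n_hidden)
    return w, b
```

After this step, training proceeds from these values rather than from purely random ones, which is what usually shortens convergence.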
Unfortunately, the benefits of the Nguyen-Widrow method are not always guaranteed. Repeating certain experiments under apparently identical conditions sometimes yields satisfactory results and sometimes disappointing ones, both in learning time and in the precision of the estimates. For this reason, it has long been recommended to make several training attempts and keep the one that produces the best results.
The purpose of this work is to reveal the insidious causes of these unsuccessful trials, which prevent taking full advantage of the Nguyen-Widrow method. It exposes a defect that is imperceptible at initialization but sets in at the very beginning of training and degrades its quality, in terms of both execution time and performance.