Initializes network weights using equal weighting.

Namespace: Imsl.DataMining.Neural
Assembly: ImslCS (in ImslCS.dll) Version: 6.5.0.0

Syntax

C#
public virtual void SetEqualWeights(
	double[,] xData
)
Visual Basic (Declaration)
Public Overridable Sub SetEqualWeights ( _
	xData As Double(,) _
)
Visual C++
public:
virtual void SetEqualWeights(
	array<double,2>^ xData
)

Parameters

xData
Type: array<System.Double, 2>
An input double matrix containing training patterns. The number of columns in xData must equal the number of nodes in the InputLayer.

Remarks

The equal weights approach starts by assigning equal values to each Perceptron's input weights. For example, if a Perceptron has 4 inputs, each of its input weights is initially assigned the value 1/4. The bias weight is initially assigned a value of zero.
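As an illustration of this rule only (a sketch, not the library's implementation), the unscaled starting values for a Perceptron with four inputs could be computed as follows:

C#
using System;

class EqualWeightsSketch
{
    static void Main()
    {
        // Sketch only: unscaled starting values for a Perceptron with four inputs.
        int nInputs = 4;
        double[] weights = new double[nInputs];
        double biasWeight = 0.0;               // bias weight starts at zero
        for (int i = 0; i < nInputs; i++)
        {
            weights[i] = 1.0 / nInputs;        // each input weight = 1/4
            Console.WriteLine("weight[" + i + "] = " + weights[i]);
        }
        Console.WriteLine("bias = " + biasWeight);
    }
}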

The weights for the first layer of Perceptrons, either the first HiddenLayer if the number of layers is greater than 1 or the OutputLayer otherwise, are then scaled using the training patterns. Scaling is accomplished by dividing the initial weights for each Perceptron in the first layer by the standard deviation, s, of that Perceptron's potential. The bias weight is set to -avg/s, where avg is the average potential for that Perceptron. This makes the average potential for the Perceptrons in this first layer approximately 0 and their standard deviation approximately 1.
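The scaling step for a single first-layer Perceptron can be sketched as shown below. This is an illustration of the description above, not the library's source; in particular, the use of the sample standard deviation is an assumption.

C#
using System;

class FirstLayerScalingSketch
{
    // Scales one first-layer Perceptron's equal initial weights using the
    // training patterns in xData, following the description above.
    static void Scale(double[,] xData, double[] weights, out double biasWeight)
    {
        int nPatterns = xData.GetLength(0);
        int nInputs = xData.GetLength(1);

        // Potential of the Perceptron for each training pattern.
        double[] potential = new double[nPatterns];
        for (int p = 0; p < nPatterns; p++)
        {
            double z = 0.0;
            for (int i = 0; i < nInputs; i++)
            {
                z += weights[i] * xData[p, i];
            }
            potential[p] = z;
        }

        // Average and standard deviation of the potential.
        double avg = 0.0;
        for (int p = 0; p < nPatterns; p++) avg += potential[p];
        avg /= nPatterns;

        double ss = 0.0;
        for (int p = 0; p < nPatterns; p++) ss += (potential[p] - avg) * (potential[p] - avg);
        double s = Math.Sqrt(ss / (nPatterns - 1));   // sample standard deviation (assumed)

        // Divide the input weights by s and set the bias to -avg/s so the
        // scaled potential has mean ~0 and standard deviation ~1.
        for (int i = 0; i < nInputs; i++)
        {
            weights[i] /= s;
        }
        biasWeight = -avg / s;
    }

    static void Main()
    {
        double[,] xData = { { 1.0, 2.0 }, { 3.0, 4.0 }, { 5.0, 6.0 } };
        double[] weights = { 0.5, 0.5 };   // equal initial weights, 1/2 each
        double bias;
        Scale(xData, weights, out bias);
        Console.WriteLine("scaled weights: " + weights[0] + ", " + weights[1] + "; bias = " + bias);
    }
}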

This reduces the possibility of saturation during network training resulting from very large or very small values of the Perceptron's potential. During training, random noise is added to these initial values at each training stage. If the EpochTrainer is used, noise is added to these initial values at the start of each epoch.
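A minimal usage sketch follows. It assumes this method is called on a fully constructed FeedForwardNetwork with a three-node InputLayer; the xData values are illustrative, and BuildNetwork is a placeholder for your own construction code, not a library method (see the FeedForwardNetwork class topic for the actual construction calls).

C#
using Imsl.DataMining.Neural;

public class SetEqualWeightsExample
{
    public static void Main(string[] args)
    {
        // Training patterns: four patterns with three inputs each.  The
        // number of columns must equal the number of InputLayer nodes.
        double[,] xData = {
            { 0.1, 0.2, 0.3 },
            { 0.4, 0.5, 0.6 },
            { 0.7, 0.8, 0.9 },
            { 1.0, 1.1, 1.2 }
        };

        // Assumed: a network with a three-node InputLayer, built and
        // linked elsewhere.  BuildNetwork is a placeholder, not part of
        // the library.
        FeedForwardNetwork network = BuildNetwork();

        // Initialize all weights using the equal weighting method before
        // training; the trainer adds noise to these initial values.
        network.SetEqualWeights(xData);
    }

    private static FeedForwardNetwork BuildNetwork()
    {
        // Construct layers, create nodes, and link them here.
        throw new System.NotImplementedException();
    }
}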

See Also