Initializes the network weights using random values.

Namespace: Imsl.DataMining.Neural
Assembly: ImslCS (in ImslCS.dll) Version: 6.5.0.0

Syntax

C#
public virtual void SetRandomWeights(
	double[,] xData,
	Random random
)
Visual Basic (Declaration)
Public Overridable Sub SetRandomWeights ( _
	xData As Double(,), _
	random As Random _
)
Visual C++
public:
virtual void SetRandomWeights(
	array<double,2>^ xData, 
	Random^ random
)

Parameters

xData
Type: System.Double[,]
An input double matrix containing training patterns. The number of columns in xData must equal the number of nodes in the InputLayer.
random
Type: System.Random
The Random object used to generate the initial weight values.

Remarks

The random weights algorithm assigns equal weights to all Perceptrons except those in the first layer, which are connected directly to the InputLayer. As in the equal weights algorithm, Perceptrons not in the first layer are assigned weights of 1/k, where k is the number of inputs connected to that Perceptron.

For the first layer, Perceptron weights are initially assigned values from the uniform random distribution on the interval [-0.5, +0.5]. These are then scaled using the training patterns: the random weights for a Perceptron are divided by s, the standard deviation of that Perceptron's potential calculated using the initial random values, and its bias is set to -avg/s, where avg is the average potential for that Perceptron. This makes the average potential of each first-layer Perceptron approximately 0 and its standard deviation approximately 1.

This reduces the possibility of saturation during network training caused by very large or very small values of a Perceptron's potential. During training, random noise is added to these initial values at each training stage. If the EpochTrainer is used, the noise is added to these initial values at the start of each epoch.

See Also