Neural Network Settings



The toolbox comes with fully configurable neural networks, so you can change almost any aspect of them if you want to. This is not required, however, as the defaults are more than enough to get you started. The following functions can be called from AFL to change the various settings:


SetPercentTestingData(Percent) - This function instructs the neural network training algorithm to set aside the given percentage of the data purely for testing the network's performance. The performance on the training and testing data sets is then displayed as training progresses. If the MSE on the training set keeps improving while the MSE on the testing set stagnates or worsens, and there is a big gap between the two, then you have a serious curve-fitting problem.
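For example, to hold out a fifth of the data for out-of-sample evaluation (20 is just an illustrative value):

```
// Reserve 20% of the data purely for testing the network's performance
SetPercentTestingData( 20 );
```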


EnableNetworkToAFL() - When neural network training is complete the neural network is saved to a file in WiseTraderToolbox\NeuralNetworks in C:\Program Files\AmiBroker\ . If this option is enabled, the neural network will be saved along with a converted version, which is an AmiBroker formula. This formula can be pasted into any AmiBroker formula and used directly, with no dependency on the plugin. All that is required is that the inputs to the neural network be in variables called input0, input1, etc. The output of the neural network will be assigned to output0.
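Using the generated formula might look like the sketch below. The input indicators (RSI, ROC) and the 0.5 threshold are illustrative assumptions, not part of the toolbox; only the input0/input1/output0 naming convention comes from the description above:

```
// Provide the inputs the pasted network formula expects
input0 = RSI( 14 );
input1 = ROC( Close, 10 );

// ... paste the generated neural network formula here ...

// The generated formula assigns its result to output0
Buy = output0 > 0.5;
```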


DisableNetworkToAFL() - No AFL code will be generated along with the saved neural network.


DisableProgress() - Don't display a progress dialog box. Useful when training on multiple stocks via the Exploration window. (Use with care: if you disable the progress window you will not be able to terminate long training sessions.)


EnableProgress() - Display the progress dialog box during training.


SetLearningAlgorithm(Algorithm) - This function sets the type of learning algorithm used for training the neural networks.


0 is for standard backpropagation.

1 is for an adaptive algorithm that learns faster but is somewhat unstable. When choosing the adaptive algorithm the recommended starting learning rate is 0.3; momentum is ignored, as it is not needed.

2 RPROP - Resilient Backpropagation.

3 SARPROP - Resilient Backpropagation with simulated annealing to decrease the chances of getting stuck in a local minimum. (This is probably the best algorithm you can choose and is highly recommended.)

4 IRPROP - Resilient Backpropagation with back tracking.

5 ARPROP - Resilient Backpropagation with global error back tracking.
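For instance, to train with the recommended SARPROP algorithm, or with the adaptive algorithm and its suggested starting learning rate:

```
SetLearningAlgorithm( 3 );   // SARPROP (recommended above)

// Or, for the adaptive algorithm (momentum is ignored):
// SetLearningAlgorithm( 1 );
// SetLearningRate( 0.3 );
```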


RestoreDefaults() - Restore all default values. If you use any of these settings functions, don't forget to call this at the end of the AFL script.


SetSeed(SeedVal) - When a neural network is created it is randomly initialized using this seed value. The default seed is 1, but it can be changed with this function. A neural network can get stuck in a local minimum during training, so changing the seed can sometimes improve the results.
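If training appears stuck, re-running with a different seed is a cheap experiment (the seed values here are arbitrary):

```
SetSeed( 1 );   // default initialization
// If results are poor, retrain with a different initialization, e.g.:
// SetSeed( 7 );
```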


SetMomentum(Momentum) - The momentum of a neural network affects how fast it trains. Momentum can speed up convergence, but setting it too large can hurt performance by causing the network to jump over the optimum point.


SetLearningRate(LearningRate) - The learning rate interacts with momentum and defaults to 0.1; it can be set much lower to increase the network's sensitivity, or much higher. As with momentum, setting this value too high can make the neural network jump over the optimum.


SetMaximumEpochs(Epochs) - This function sets the number of training passes for the neural network. The more epochs, the more thoroughly the network is trained, but setting this value too high can lead to curve fitting, so experiment to find the best number of epochs.
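A typical tuning block combining these three settings might look like this (the specific values are illustrative starting points, not recommendations from the toolbox):

```
SetLearningRate( 0.1 );    // the default; lower increases sensitivity
SetMomentum( 0.5 );        // illustrative value; too high can overshoot the optimum
SetMaximumEpochs( 500 );   // illustrative; too many epochs risks curve fitting
```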


SetMaximumThreads(NoThreads) - Sets the number of processor threads to use for neural network training. The maximum is the number of CPU cores available on your computer. By default all CPU cores are utilized.


SetErrorAlgorithm(Algorithm) - Sets the neural network's error algorithm. This is the algorithm that compares your desired output with the actual neural network output and determines the error. There are currently two options. When Algorithm is set to 1 the error algorithm is linear: it simply subtracts the actual neural network output from the desired output. When Algorithm is set to 2 the error algorithm is TanH, which treats small errors as less significant and large errors as more significant.


SetSarpropTemperature(Temperature) - This function sets the temperature of the simulated annealing backpropagation (SARPROP) algorithm. The temperature determines how quickly the random simulated-annealing weight adjustment is reduced. Good values are 0.01, 0.015, 0.02, 0.03, 0.04, 0.05 and 0.07; you will need to determine which value is appropriate for your problem.
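Since the temperature only applies to SARPROP, it is naturally paired with selecting that algorithm (0.015 is one of the suggested values above):

```
SetLearningAlgorithm( 3 );       // SARPROP
SetSarpropTemperature( 0.015 );  // one of the suggested temperatures
```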


EnableShuffleData() - This function enables the random shuffling of all the data.


DisableShuffleData() - This function disables the random shuffling of all data.
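Putting it together, a complete settings block in an AFL script might look like the following sketch, using only the functions described on this page (the specific values are illustrative):

```
SetLearningAlgorithm( 3 );       // SARPROP
SetSarpropTemperature( 0.02 );
SetPercentTestingData( 20 );     // hold out 20% of the data for testing
SetMaximumEpochs( 500 );
SetSeed( 1 );
EnableShuffleData();
EnableNetworkToAFL();            // also emit an AFL version of the trained network

// ... train the neural network here ...

RestoreDefaults();               // reset all settings at the end of the script
```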