C# - Encog Neural Network Error Never Changing


I'm getting started working with neural networks. I have adapted the XOR example provided with Encog for my own purposes, but when I run it the error never changes.

The function I'm trying to approximate takes 4 doubles and outputs 1 double. The 4 inputs are always positive; the output can be negative or positive (the majority are positive). For starters I am using 50 records of training data.

Working XOR (the error goes down each iteration):

    public static double[][] XorInput = {
        new[] {0.0, 0.0},
        new[] {1.0, 0.0},
        new[] {0.0, 1.0},
        new[] {1.0, 1.0}
    };

    public static double[][] XorIdeal = {
        new[] {0.0},
        new[] {1.0},
        new[] {1.0},
        new[] {0.0}
    };

    BasicNetwork network = new BasicNetwork();
    network.AddLayer(new BasicLayer(null, true, 2));
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 3));
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
    network.Structure.FinalizeStructure();
    network.Reset();

    IMLDataSet trainingData = new BasicMLDataSet(XorInput, XorIdeal);

    IMLTrain train = new ResilientPropagation(network, trainingData);
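The training loop itself isn't shown in the snippets; for reference, the trainer is driven by repeated calls to `Iteration()` along these lines (the `0.01` stopping threshold is an arbitrary choice, not from the original post):

    // Train until the error drops below a chosen threshold,
    // printing the error so its progress is visible per epoch.
    int epoch = 1;
    do
    {
        train.Iteration();
        Console.WriteLine("Epoch #" + epoch + " Error: " + train.Error);
        epoch++;
    } while (train.Error > 0.01);

With the XOR data this error shrinks each epoch; with the unnormalised data below it stays flat.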

Not working (the error never goes down):

    BasicNetwork network = new BasicNetwork();
    network.AddLayer(new BasicLayer(null, true, 4));
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 6));
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
    network.Structure.FinalizeStructure();
    network.Reset();

    IMLDataSet trainingData = new BasicMLDataSet(myInput, myExpectedOutput);

    IMLTrain train = new ResilientPropagation(network, trainingData);

a few sample records of training data:

    Input:
    2.54, 3.15, 3.4, 1.73
    5.3,  1.78, 3.9, 2.04
    1.71, 5.4,  4.3, 2.26
    1.62, 6.4,  4,   1.89
    1.45, 8.4,  5.2, 2.14

    Output:
    5.59
    11.05
    6.89
    10.4
    -0.56

I believe the problem is that the activation function isn't firing. I thought it might be because ActivationSigmoid() is inappropriate for this problem, but I have tried ActivationTANH() with the exact same results.

The problem was that the values weren't being normalised.

For the activation functions to work, all of the inputs and outputs must be between 0 and 1 (ActivationSigmoid) or between -1 and 1 (ActivationTANH). You need a function to normalise the values into whichever range they need to be in.

This link was of great help to me:

http://www.heatonresearch.com/wiki/range_normalization
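The range normalisation described on that page maps a value linearly from its original data range into the target range of the activation function. A minimal C# sketch of the formula (the method name and the example ranges are illustrative, not from the original post):

    // Range normalisation: map x from [dataLow, dataHigh]
    // into [normLow, normHigh].
    public static double Normalize(double x,
        double dataLow, double dataHigh,
        double normLow, double normHigh)
    {
        return (x - dataLow) * (normHigh - normLow)
            / (dataHigh - dataLow) + normLow;
    }

For example, the sample outputs above span roughly -0.56 to 11.05, so each output would be mapped into [0, 1] before training with ActivationSigmoid, and the inverse mapping applied to denormalise the network's predictions.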
