Thread Subject:
Unexpected neural network output

Subject: Unexpected neural network output

From: Ton Schomaker

Date: 1 May, 2013 09:31:10

Message: 1 of 6

I have trained a very good feedforward (ff) neural network under the restriction that an increase in any of the input parameters A, B, C, ... must give a higher output value (R2 = 0.94). However, when I use the best trained NN with better (i.e. increased) input data, the output is sometimes lower than the original target value.
Does anyone have a solution for this problem?
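
For illustration only, a minimal MATLAB sketch of how one might test that restriction on an already trained network; the names net and x and the 5% step size are assumptions, not taken from the thread:

    % Nudge each input variable upward and count samples where the output
    % decreases, i.e. where the monotonicity restriction is violated.
    % Assumed: net = trained feedforward net, x = inputs with one row per
    % variable A, B, C, ... and one column per sample.
    y0 = net(x);                                   % baseline outputs
    for i = 1:size(x,1)
        xp = x;
        step = 0.05*(max(x(i,:)) - min(x(i,:)));   % small increase of variable i
        xp(i,:) = xp(i,:) + step;
        yp = net(xp);
        nviol = sum(yp < y0);                      % output dropped despite larger input
        fprintf('Input %d: %d of %d samples violate monotonicity\n', ...
                i, nviol, size(x,2));
    end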

Subject: Unexpected neural network output

From: Greg Heath

Date: 1 May, 2013 09:51:08

Message: 2 of 6

"Ton Schomaker" <t.schomaker@royalhaskoning.com> wrote in message <klqncu$s1o$1@newscl01ah.mathworks.com>...
> I have trained a very good feedforward (ff) neural network under the restriction that an increase in any of the input parameters A, B, C, ... must give a higher output value (R2 = 0.94). However, when I use the best trained NN with better (i.e. increased) input data, the output is sometimes lower than the original target value.
> Does anyone have a solution for this problem?

Good solutions have a mean error of 0. Therefore, ~50% of the answers should be above the target value.

Hope this helps.

Greg
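
A minimal MATLAB sketch of that check, assuming net, x and t are the trained network, inputs and targets from the original training run (names are illustrative):

    % With a zero-mean error, roughly half of the outputs should land above
    % their targets and half below.
    y = net(x);
    e = t - y;
    fprintf('mean error            : %g\n', mean(e));
    fprintf('fraction above target : %.2f\n', mean(y > t));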

Subject: Unexpected neural network output

From: Ton Schomaker

Date: 1 May, 2013 10:08:10

Message: 3 of 6

"Greg Heath" <heath@alumni.brown.edu> wrote in message <klqoic$1km$1@newscl01ah.mathworks.com>...
> Good solutions have a mean error of 0. Therefore, ~50% of the answers should be above the target value.
>
> Hope this helps.
>
> Greg

You are right, Greg, but that is not my problem. Let me try to explain. I am using an NN to improve the water quality of natural swimming waters, so I use data from many of these waters, including, among other inputs, the number of sewer outlets. With fewer outlets as input to the trained NN I sometimes get a worse water quality when I expect a better one.
How can I train a new network that avoids this error?

Kind regards, Ton

Subject: Unexpected neural network output

From: Greg Heath

Date: 2 May, 2013 05:05:09

Message: 4 of 6

"Ton Schomaker" <t.schomaker@royalhaskoning.com> wrote in message <klqpia$41h$1@newscl01ah.mathworks.com>...
> You are right, Greg, but that is not my problem. Let me try to explain. I am using an NN to improve the water quality of natural swimming waters, so I use data from many of these waters, including, among other inputs, the number of sewer outlets. With fewer outlets as input to the trained NN I sometimes get a worse water quality when I expect a better one.
> How can I train a new network that avoids this error?
>
> Kind regards, Ton

I'm sorry I don't fully understand your problem. However, the basic assumption
is that the training data adequately characterizes the probability distribution of the
operational data. If that is not true, then you have to add simulated data which
contains the correct characteristics.

Well trained NNs can be excellent interpolators BUT terrible extrapolators.

I would expect that if you used the smallest possible number of hidden nodes,
you might get better extrapolation; however, don't bet the kids' tuition
on it.

Hope this helps

Greg
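
A minimal MATLAB sketch of an interpolation/extrapolation check along these lines; xtrain and xnew are assumed names for the training inputs and the operational inputs (one row per variable, one column per sample):

    % Flag operational samples where the net is extrapolating, i.e. where at
    % least one input falls outside the range seen during training.
    lo = min(xtrain, [], 2);
    hi = max(xtrain, [], 2);
    outside = any(bsxfun(@lt, xnew, lo) | bsxfun(@gt, xnew, hi), 1);
    fprintf('%d of %d operational samples lie outside the training range\n', ...
            sum(outside), size(xnew,2));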

Subject: Unexpected neural network output

From: Ton Schomaker

Date: 2 May, 2013 11:20:09

Message: 5 of 6

"Greg Heath" <heath@alumni.brown.edu> wrote in message <klss65$j8m$1@newscl01ah.mathworks.com>...
> "Ton Schomaker" <t.schomaker@royalhaskoning.com> wrote in message <klqpia$41h$1@newscl01ah.mathworks.com>...
> > "Greg Heath" <heath@alumni.brown.edu> wrote in message <klqoic$1km$1@newscl01ah.mathworks.com>...
> > > "Ton Schomaker" <t.schomaker@royalhaskoning.com> wrote in message <klqncu$s1o$1@newscl01ah.mathworks.com>...
> > > > I have trained a very good ff neural network with the restiction that any increase of each input parameters A, B, C... must give higher output values (R2 = 0.94). However when I use the best trained NN using better (i.e. increased) input data the output sometimes is lower than the original target value.
> > > > Does someone have a solution for this problem?
> > >
> > > Good solutions have a mean error of 0. Therefore, ~50% of the answers should be above the target value.
> > >
> > > Hope this helps.
> > >
> > > Greg
> >
> > You are right Greg, but that is not my problem. Let me try to explain. I am using NN to improve water quality of natural swimming waters. So I use data from a lot of these water with amongst others the amount of sewer outlets. With less outlets as input for the trained NN I sometimes get worse water quality while expecting a better one.
> > How can I train a new networks that avoids this error?
> >
> > Kind regads, Ton
>
> I'm sorry I don't fully understand your problem. However, the basic assumption
> is that the training data adequately characterizes the probability distribution of the
> operational data. If that is not true, then you have to add simulated data which
> contains the correct characteristics.
>
> Well trained NNs can be excellent interpolators BUT terrible extrapolators.
>
> I would expect that if you used the smallest possible number of hidden nodes,
> you might get better extrapolation; however, don't bet the kids' tuition
> on it.
>
> Hope this helps
>
> Greg

Thanks, Greg. I already follow your suggestions (good data quality and only one hidden layer). Maybe the problem occurs because I have only 4 discrete target values: 1, 2, 3 and 4. My NN is interpolating, giving outputs between those values as well. Maybe I should force the NN to give only the 4 discrete values. How can I achieve this?

Ton

Subject: Unexpected neural network output

From: Greg Heath

Date: 3 May, 2013 02:39:09

Message: 6 of 6

"Ton Schomaker" <t.schomaker@royalhaskoning.com> wrote in message <klti59$fp7$1@newscl01ah.mathworks.com>...
 
> Thanks, Greg. I already follow your suggestions (good data quality and only one hidden layer). Maybe the problem occurs because I have only 4 discrete target values: 1, 2, 3 and 4. My NN is interpolating, giving outputs between those values as well. Maybe I should force the NN to give only the 4 discrete values. How can I achieve this?

Treat it like a classifier:
Take target columns from eye(4)'

Hope this helps.

Greg
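
A minimal MATLAB sketch of this idea, assuming x holds the inputs and tind the original targets coded 1..4 (one column per sample; the names and the hidden layer size are assumptions, not from the thread):

    % Give each of the 4 classes a column of the 4-by-4 identity matrix as
    % its target, train a pattern-recognition net, and map the outputs back
    % to the discrete labels 1..4.
    I4 = eye(4);
    t = I4(:, tind);            % one-hot target columns, same as full(ind2vec(tind))
    net = patternnet(10);       % one hidden layer; 10 nodes is only an example
    net = train(net, x, t);
    y = net(x);
    labels = vec2ind(y);        % back to the discrete values 1, 2, 3, 4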
