Gradient descent on the inputs of a pre-trained neural network to achieve a target y-value

I have a trained neural network that suitably maps my inputs to my outputs. Is it then possible to specify a desired y output and use a gradient descent method to determine the optimum input values that produce that output?
In backpropagation, the partial derivative of the error function with respect to each weight is used to proportionally adjust the weights; is there a way to do something similar with the input values themselves and a target y value?
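Yes, this is the same backpropagation machinery, just stopped one step earlier: backpropagate the error through the network all the way to the input layer, then take a gradient step on x while the weights stay frozen. A minimal sketch in Python/NumPy (the 2-4-1 architecture, weight values, target, and learning rate are all made up for illustration; a real use would load the trained network's weights):

```python
import numpy as np

# Hypothetical "pre-trained" 2-4-1 network; these weights are illustrative only.
W1 = np.array([[1.0, -0.5], [0.3, 0.8], [-0.7, 0.2], [0.5, 0.5]])
b1 = np.zeros(4)
W2 = np.array([[0.6, -0.4, 0.9, 0.3]])
b2 = np.array([0.1])

def forward(x):
    """Return the network output y and the hidden activations h."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2, h

y_target = np.array([0.5])   # desired output
x = np.zeros(2)              # initial guess for the input
lr = 0.5                     # step size for the input update

for _ in range(1000):
    y, h = forward(x)
    dy = y - y_target            # dE/dy for E = 0.5 * (y - y_target)^2
    dh = (W2.T @ dy) * (1 - h**2)  # backprop through W2 and tanh
    dx = W1.T @ dh               # gradient of E w.r.t. the INPUT, not the weights
    x -= lr * dx                 # gradient-descent step on x

print(x, forward(x)[0])
```

Two caveats worth keeping in mind: the result depends on the initial guess (the loss surface over x can have multiple minima), and if the target is outside the network's reachable output range the iteration will stall at the nearest achievable y. In MATLAB, `defaultderiv('dy_dx', net, x, t)` from the Neural Network Toolbox can supply the same input-gradient, or the update can be written by hand as above.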
1 Comment
Greg Heath on 14 Jan 2015
So far, I don't know what you are doing or why. What are the dimensions of your input (I-dimensional x) and output (O-dimensional output y and target t) vectors? How many examples (N) and how many hidden nodes (H)?
Can you explain the problem in terms of the I-H-O network and what you want as a final answer?
desired answer = ?


Answers (0)
