Explanation for the output you’re getting:
Your patternnet is learning a mapping between the inputs and the targets. With the sim command, you’re predicting values for every data point in inputs. Since inputs is, I presume, of size 4x2200, your predictions are of size 5x2200: the network produces one 5-length prediction for each 4-length input column.
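You can confirm this by checking the sizes yourself (the 4x2200 / 5-class sizes here are assumptions based on your description):

```matlab
% Assumed sizes: inputs is 4x2200, targets is 5x2200 (one-hot, 5 classes)
outputs = sim(net, inputs);  % one 5-length prediction per input column
size(outputs)                % expect [5 2200], matching the targets
```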
What you want is for the network to predict future values. You haven’t mentioned whether there is temporal information in your dataset, that is, whether these data points are sequential in some sense and carry enough information among them to predict future target values. I’m going to assume they do. There are at least two ways to achieve this.
The first way is to train your network to predict the target k steps into the future, which can be done with the following statement for some constant k:
[net, tr] = train(net, inputs(:, 1:end-k), targets(:, 1+k:end));
In this approach, you’re not learning or using any temporal relation among the data points. The network only learns a mapping between the input at a given time instant and the target k steps into the future. This has the shortcoming that you’ll have to construct a different network for each value of k if you build your networks with patternnet. You could improve on this by constructing an equivalent network using fullyConnectedLayers of appropriate sizes. This has the advantage that you can drop the one-hot vectors: you can replace each target with a 5-20 length vector containing the categorical target information for the next 5-20 time steps. Here’s an example to help you understand how fullyConnectedLayers are used: https://www.mathworks.com/help/deeplearning/ref/nnet.cnn.layer.fullyconnectedlayer.html#mw_bad3166a-93e7-42c0-8bd2-740cd2a842ad

The second approach is to use an LSTM, a type of network which can learn relations between consecutive samples. It can be set up like the previous approach, except that it also exploits temporal information: you train the network to learn a temporal relation between the current input and a 5-20 length target vector of future categorical information. Here’s a simple example you can model your network on: https://www.mathworks.com/help/deeplearning/examples/time-series-forecasting-using-deep-learning.html
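To make the first (fullyConnectedLayer) idea concrete, here is a rough sketch of a layer array. The feature count, class count, horizon, and hidden size are all illustrative assumptions, and how you turn the stacked outputs into per-step class predictions (reshaping, loss choice) is up to your training setup:

```matlab
% Sketch only: predict the next 'horizon' categorical targets at once.
numFeatures = 4;   % assumed input length
numClasses  = 5;   % assumed number of classes
horizon     = 5;   % assumed number of future steps to predict

layers = [
    featureInputLayer(numFeatures)          % needs a recent release
    fullyConnectedLayer(64)                 % hidden size chosen arbitrarily
    reluLayer
    fullyConnectedLayer(horizon*numClasses) % 25 outputs: 5 steps x 5 classes
    regressionLayer];                       % placeholder loss; reshape the
                                            % outputs per step as needed
```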
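And a minimal sequence-to-sequence sketch of the LSTM approach, modeled on the linked example. XTrain and YTrain are hypothetical names: XTrain would be your 4xT input sequence, and YTrain the categorical labels shifted k steps ahead so the network learns to predict future classes:

```matlab
numFeatures = 4;   % assumed input length
numClasses  = 5;   % assumed number of classes

layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(100, 'OutputMode', 'sequence')  % 100 hidden units, arbitrary
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', 'MaxEpochs', 30);

% Train on the shifted pairs so each time step predicts a future label.
net = trainNetwork(XTrain, YTrain, layers, options);
```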