How do you make predictions with a trained Neural Network (NAR)?

Hello. I’m new here and I have a problem with the Neural Network Toolbox that is unbearably frustrating, because it should be the easiest thing in the world but I’m not getting it. My general MATLAB skills are very moderate and I have never used the neural network functions until now.
I want to create a neural network that, based on an input data series, can predict values in the future. From what I understand, the nonlinear autoregressive (NAR) neural network should be perfect for this, and I have tried for hours and hours to watch all of MATLAB’s own tutorials on how to use the neural network toolbox and to read about it, but it seems like all the tutorials basically stop after the network has been trained and refuse to tell you how to make any useful predictions at all with it. Or I am simply missing something that is really obvious to everyone else except me.
I read in a MATLAB tutorial that they recommend using the GUI way of making a neural network first, until you get the hang of it and can code it. So here is pretty much what I do when I follow the guides:
Write nnstart -> go to the Time series app -> select NAR and choose my 1x1826 data series as a matrix row (imported from an Excel sheet) -> select 10 hidden neurons & 2 delays.
Then I train the network using Levenberg-Marquardt, and after the training it seems to have worked great. I click Next, save the results, and Finish.
My workspace is now full of things generated by the neural network, for example a data series called “output” which of course sounds promising, but it is 1x1824 cells, in other words 2 values shorter than the original data series (because delay was set to 2 I guess).
And at this point I am totally lost. How do I make any predictions with this? I have also tried using the example input data in the toolbox instead of my own data series but it makes no difference. So after the training and everything is done, how in god’s name do I get the network to tell me the 1825th value of my data series?!
I have tried generating the script and trying to understand it from there, and if I’m lucky the script can tell me things like “perfc = 1.5278” and “stepAheadPerformance = 5.8328e-04”. I’m not sure what that means, but it feels like MATLAB is bragging about how well it could theoretically predict the next values, but it sure doesn’t like to give up its predicted values without a fight! :)
So can anyone help me out here? How do I predict values after the network has been trained?
Thanks in advance!
  5 Comments
NM
NM on 5 Dec 2017
I am facing the same problem. Did anyone find out? I am using the following command but it gives an error: "Number of inputs does not match net.numInputs." Command: yPred = sim(net,X')';
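For what it's worth, that error usually appears when a time-series network is called on the raw matrix without its delay states. A hedged sketch of the open-loop call (assuming `net` is the trained open-loop NAR net and `X` is the row-vector series):

```matlab
% Hedged sketch: a narnet needs the delay states from preparets,
% not the raw matrix passed straight to sim.
T = con2seq(X);                        % convert row vector to 1xN cell array
[Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T);
Ys = net(Xs, Xi, Ai);                  % equivalent to sim(net, Xs, Xi, Ai)
```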


Accepted Answer

Greg Heath
Greg Heath on 5 Sep 2014
Edited: Greg Heath on 23 Nov 2017
%Peta on 2 Sep 2014 at 18:01
% Thanks for answering, I’m not sure what the ”autocorrelation function” is
Suggestions, not necessarily in order:
1. Use the command lookfor
lookfor autocorrelation
then use the help and doc commands on functions that are listed. For example
help nncorr
doc nncorr
2. Search in wikipedia
3. Search in the NEWSGROUP and ANSWERS. To reduce the results to a manageable size, use it in combination with other reasonable search words, e.g., some SUBSET of these:
autocorrelation neural narnet nncorr greg
% but when I change the number of hidden layers in the GUI and plot the “Error Autocorrelation” it basically looks exactly the same regardless of how many hidden layers I have. So I should set it to 1 in that case then?
I only use 1 hidden layer and try to minimize the number of hidden nodes subject to an upper bound on the MSE.
net = narnet(FD,H);
For efficient predictions, FD should be a subset of the significant lags of the TARGET autocorrelation function. This calculation is independent of H. See some of my previous posts for examples.
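Based on Greg's other posts, one way to find those significant lags can be sketched as follows (a hedged sketch: the zscore normalization, the 'biased' flag, and the 1.96/sqrt(N) significance bound are assumptions taken from his examples, not toolbox defaults):

```matlab
% Hedged sketch: candidate feedback delays FD from the target autocorrelation.
T  = cell2mat(simplenar_dataset);      % example target series (1xN double)
N  = length(T);
zt = zscore(T, 1);                     % zero-mean, unit-variance target
autocorrT = nncorr(zt, zt, N-1, 'biased');
autocorrT = autocorrT(N:end);          % keep nonnegative lags 0 .. N-1
sigthresh = 1.96/sqrt(N);              % approximate 95% significance bound
siglags   = find(abs(autocorrT(2:end)) >= sigthresh);  % candidate FD
```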
% I found the net.divideFcn in the script code and changed it like you said. But at point 4 and 5 I’m afraid I have pretty much no idea what you are talking about. I found nothing about Ntrails in the script, is this something I need to write myself? And if so is there any kind of example somewhere on how to do it? I’m afraid it is new to me.
Ntrials, not Ntrails. The number of trials for random initial weights given the number of hidden nodes.
I don't find the MATLAB help/doc script to be very helpful. For example, it says nothing about
1. Using capitals to indicate cell variables.
2. Using nncorr to find a reasonable subset of feedback delays
3. Using a for loop to find a reasonable value for H (e.g., Hmin:dH:Hmax)
4. Not using the default 'dividerand' because it destroys the correlations between the output and feedback signals
5. For each of numH candidate values for H, training success depends on starting with a good set of random initial weights. The best way to find one or more is to have an inner for loop over Ntrial random weight initializations that are created by the configure function.
6. Explicitly initializing the random number generator before the outer loop so that you can duplicate any of the numH*Ntrials designs
7. Often closing the loop on an openloop design to obtain netc does not yield acceptable results when inputted with original data. Therefore, the closeloop net should be trained beginning with the weights obtained from the openloop design to obtain netc2.
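Point 7 can be sketched in a few lines, assuming the open-loop net `net` and the original target cell array `T` are in the workspace (the variable names Xsc, Tsc, etc. are mine, not toolbox output):

```matlab
% Hedged sketch: close the loop, then retrain the closed-loop net
% starting from the open-loop weights.
netc = closeloop(net);
[Xsc, Xic, Aic, Tsc] = preparets(netc, {}, {}, T);  % T = target cell array
netc2 = train(netc, Xsc, Tsc, Xic, Aic);            % retrained closed-loop net
```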
% And I’m not sure what to do with the LHS syntax thing. I did have Xs, Ts, Xi and Ai in my workspace, so I tried adding the piece of code you wrote:
% [ net tr Ys Es Xf Af ] = train( net, Xs, Ts, Xi, Ai );
This code replaces the 3 step script
[ net tr ] = train( net, Xs, Ts, Xi, Ai );
[ Ys Xf Af ] = net( Xs, Xi, Ai );
Es = gsubtract( Ts, Ys );
Finally, to predict into the future M timesteps beyond the end of the target data
Xic2 = Xf;
Aic2 = Af;
Ypred = netc2( cell(1,M), Xic2, Aic2);
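Putting Greg's pieces together, a minimal end-to-end sketch using the toolbox example dataset (FD = 1:2 and H = 10 mirror the GUI settings in the question; all variable names are assumptions):

```matlab
T = simplenar_dataset;                 % example 1xN cell-array target series
net = narnet(1:2, 10);                 % feedback delays 1:2, 10 hidden nodes
net.divideFcn = 'divideblock';         % preserve serial correlations
[Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T);
[net, tr, Ys, Es, Xf, Af] = train(net, Xs, Ts, Xi, Ai);
% Close the loop, carrying over the final delay states:
[netc, Xic, Aic] = closeloop(net, Xf, Af);
% Predict M steps past the end of the target data:
M = 5;
Ypred = netc(cell(1, M), Xic, Aic);
```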
Hope this helps.
Thank you for formally accepting my answer
Greg
  3 Comments
Greg Heath
Greg Heath on 6 Sep 2014
Edited: Greg Heath on 15 Sep 2017
The most surprising thing that I found is that just closing the loop is often not sufficient. You may have to train netc! This is never mentioned in any of the documentation.
Probably, the closest you will come is one of my posts. Try searching in the NEWSGROUP and ANSWERS with
greg narnet closeloop
greg nar closeloop
However, none of them predict ahead with the empty cell. I just found out about that recently.
If none of them help, continue searching
without "greg"
Good Luck
Greg
Mahboubeh Molavi-Arabshahi
Hi everybody,
is it possible to tell more about netc2 in "Ypred = netc2( cell(1,M), Xic2, Aic2);"?
Can you send the function? I cannot find it.
Best


More Answers (2)

desomon yang
desomon yang on 21 Apr 2016
Dear Peta and Greg, I use a nonlinear autoregressive neural network with the number of cars in Texas from 2006 to 2015 to predict the number of cars from 2016 to 2020. The code is as follows:
x = [500 900 1200 1700 2100 2500 3100 4200 5400 7000];  % cars, 2006-2015
lag = 3;                        % number of past values used as inputs
iinput = x;
n = length(iinput);
inputs = zeros(lag, n-lag);
for i = 1:n-lag
    inputs(:,i) = iinput(i:i+lag-1)';   % sliding window of lagged values
end
targets = x(lag+1:end);
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize);
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
[net,tr] = train(net, inputs, targets);
yn = net(inputs);
errors = targets - yn;
figure, ploterrcorr(errors)
figure, parcorr(errors)
%[h,pValue,stat,cValue] = lbqtest(errors)
figure, plotresponse(con2seq(targets), con2seq(yn))
%figure, ploterrhist(errors)
%figure, plotperform(tr)
fn = 5;                         % predict 5 steps ahead (2016-2020)
f_in = iinput(n-lag+1:end)';    % last lag known values seed the loop
f_out = zeros(1, fn);
for i = 1:fn
    f_out(i) = net(f_in);               % one-step prediction
    f_in = [f_in(2:end); f_out(i)];     % feed the prediction back in
end
figure, plot(2006:2015, iinput, 'b', 2015:2020, [iinput(end), f_out], 'r')
  1 Comment
Hau Dang Trung
Hau Dang Trung on 21 Jun 2016
Edited: Hau Dang Trung on 21 Jun 2016
Dear Desomon yang, your answer is very useful and clear. Thank you so much. In my problem, I use a nonlinear autoregressive neural network with external input (NARX). For training, no problem. However, I can't predict after training. Please help me!
This is my code:
clear all; clc;
load magdata;
n = 2000;
iinputs = u(1:n);
ttargets = y(1:n);
lag_int = 3;
inputs = zeros(lag_int, n-lag_int+1);
for i = 1:n-lag_int+1
    inputs(:,i) = iinputs(i:i+lag_int-1)';
end
lag_out = 3;
targets = zeros(lag_out, n-lag_out+1);
for i = 1:n-lag_out+1
    targets(:,i) = ttargets(i:i+lag_out-1)';
end
targets = targets(:, lag_int-lag_out+1:end);
inputs = [inputs; targets];
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize);
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
[net,tr] = train(net, inputs, targets(end,:));
yn = net(inputs);
errors = targets(1,:) - yn(1,:);
figure, ploterrcorr(errors)
figure, parcorr(errors)
figure, plotresponse(con2seq(targets), con2seq(yn))
fn = 100;
inputs_test = u(n+1:n+fn)';
targets_test = y(n+1:n+fn)';
f_in = iinputs(n-lag_int+1:end)';
f_ou = ttargets(n-lag_out+1:end)';
f_ann = zeros(1, fn);
for i = 1:fn
    f_in = [f_in(2:end); inputs_test(i)];
    fin = [f_in; f_ou];
    f_ann(i) = net(fin);
    f_ou = [f_ou(2:end); f_ann(i)];
end
figure; plot(y(n+1:n+fn), 'r'); hold on; plot(f_ann, '--');



Greg Heath
Greg Heath on 2 Sep 2014
Edited: Greg Heath on 15 Sep 2017
1. Use the autocorrelation function to find a subset of statistically significant lags to use in narnet. The default may be the wrong choice.
2. Start with the default H = 10.
a. If any of the 10 random initial weight trial designs are successful, try to find the smallest successful value of H to improve the robustness of the design.
b. Otherwise, increase H.
3. Choose
net.divideFcn = 'divideblock'
the default 'dividerand' destroys correlations.
4. Use a loop and the function configure to obtain multiple (typically, I use Ntrials = 10) designs from random initial weights.
The last 30% of the series is not used directly to estimate weights.
Therefore, those outputs are predictions.
The test output (last 15%) is unbiased. If validation stopping ended training, the validation output cannot be assumed to be unbiased. Regardless, I do look at it, compare it to both training and test results, and use it to determine the best of multiple designs.
5. Use the LHS syntax
[ net tr Ys Es Xf Af ] = train( net, Xs, Ts, Xi, Ai );
to get {X2i = Xf, A2i = Af } for new data
ynew = netc(xnew,X2i,A2i);
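Steps 2 and 4 above can be sketched as a double loop over H and Ntrials (a hedged sketch: the FD, Hmin, dH, Hmax values and the choice of tr.best_vperf as the selection criterion are my assumptions, not something Greg specifies):

```matlab
% Hedged sketch: select H and initial weights over Ntrials random restarts.
rng(0)                                    % reproducible weight initializations
T = simplenar_dataset;                    % example target series
FD = 1:2;  Hmin = 1;  dH = 1;  Hmax = 10; % placeholders, tune for your data
Ntrials = 10;
bestperf = Inf;
for H = Hmin:dH:Hmax
    for trial = 1:Ntrials
        net = narnet(FD, H);
        net.divideFcn = 'divideblock';
        [Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T);
        net = configure(net, Xs, Ts);     % new random initial weights
        [net, tr] = train(net, Xs, Ts, Xi, Ai);
        if tr.best_vperf < bestperf       % keep the best validation result
            bestperf = tr.best_vperf;
            bestnet  = net;
        end
    end
end
```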
  6 Comments

