What are the Inputs, Outputs and Target in ANN

I am getting confused about the input data set, outputs, and targets. I am studying Artificial Neural Networks in Matlab; my purpose is to use historical data (I have rainfall and water levels for the last 20 years) to predict the water level in the future (for example, 2014). So what are my inputs, targets, and outputs? For example, I have an Excel sheet with data as [Column1-Date | Column2-Rainfall | Column3-Water level].

I am using the code below for prediction, but it cannot predict into the future. Can anyone help me fix it? Thank you.

%% 1. Importing data
Data_Inputs=xlsread('demo.xls'); % Import file

Training_Set=Data_Inputs(1:end,2); % training (input) set: rainfall

Target_Set=Data_Inputs(1:end,3); % target set: water level

Input=Training_Set'; %Convert to row

Target=Target_Set'; %Convert to row

X = con2seq(Input); %Convert to cell

T = con2seq(Target); %Convert to cell

%% 2. Data preparation

N = 365; % number of steps for multi-step-ahead prediction

% Input and target series are divided in two groups of data:
% 1st group: used to train the network

inputSeries  = X(1:end-N);

targetSeries = T(1:end-N);

inputSeriesVal  = X(end-N+1:end);

targetSeriesVal = T(end-N+1:end); 
% Create a Nonlinear Autoregressive Network with External Input
delay = 2;
inputDelays = 1:2;
feedbackDelays = 1:2;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);

% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original time series data unchanged, while
% easily customizing it for networks with differing numbers of delays, with
% open loop or closed loop feedback modes.

[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);

% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;

% Train the Network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);

% Test the Network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)

% View the Network

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotregression(targets,outputs)
%figure, plotresponse(targets,outputs)
%figure, ploterrcorr(errors)
%figure, plotinerrcorr(inputs,errors)

% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
[xc,xic,aic,tc] = preparets(netc,inputSeries,{},targetSeries);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(netc,tc,yc)

% Early Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is given y(t+1).
% For some applications such as decision making, it would help to have predicted
% y(t+1) once y(t) is available, but before the actual y(t+1) occurs.
% The network can be made to return its output a timestep early by removing one delay
% so that its minimal tap delay is now 0 instead of 1.  The new network returns the
% same outputs as the original network, but outputs are shifted left one timestep.
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
[xs,xis,ais,ts] = preparets(nets,inputSeries,{},targetSeries);
ys = nets(xs,xis,ais);
earlyPredictPerformance = perform(nets,ts,ys)

%% 5. Multi-step ahead prediction

inputSeriesPred  = [inputSeries(end-delay+1:end),inputSeriesVal];

targetSeriesPred = [targetSeries(end-delay+1:end), con2seq(nan(1,N))];

[Xs,Xi,Ai,Ts] = preparets(netc,inputSeriesPred,{},targetSeriesPred);

yPred = netc(Xs,Xi,Ai);

perf = perform(netc,targetSeriesVal,yPred) % PERFORM expects (net,targets,outputs)

figure;
plot([cell2mat(targetSeries), nan(1,N); ...
      nan(1,length(targetSeries)), cell2mat(yPred); ...
      nan(1,length(targetSeries)), cell2mat(targetSeriesVal)]');
legend('Original Targets','Network Predictions','Expected Outputs');


Inputs and targets are the data you use to train the net, and both are known, correct data. After you have trained the net, you feed it only inputs, and the output is the prediction based on the inputs and targets you supplied during training. So your targets are the correct outputs for the data you already know.
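To make the mapping concrete, here is a minimal sketch tying the Excel columns from the question to these terms (assuming the same `demo.xls` layout as in your code):

```matlab
% Hedged sketch: mapping the spreadsheet columns to ANN terminology.
data   = xlsread('demo.xls');  % [date | rainfall | water level]
input  = data(:,2)';           % known rainfall      -> inputs
target = data(:,3)';           % known water level   -> targets
% After training, simulating the net on inputs gives the outputs:
% output = net(someInputs);    % predicted water level
```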

As I understand it, you are trying to predict the future, and about the future you only know the date? Correct me if I am wrong. In that case:

Before training:

input1 = date; input2 = rainFall;
input = [input1; input2];
target = waterLevel;

Because you want the net to return the water level, your targets should also be water level. Now you train the net:

..train(net, input, target..

After training: now, as you said, you want to predict the water level, but you only give a date, for example 2015-11-11. In this case prediction is impossible, because you also need the rainfall. So if you still want to predict the water level from the date alone, you either need to predict the rainfall too, or eliminate it from the inputs, because it no longer helps once you don't know it.
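A minimal sketch of this setup, assuming the same `demo.xls` as in the question and using a plain feedforward net purely for illustration:

```matlab
% Hedged sketch: training with date + rainfall as a 2-row input matrix.
data   = xlsread('demo.xls');
input1 = data(:,1)';            % date (as serial numbers)
input2 = data(:,2)';            % rainfall
input  = [input1; input2];      % one column per sample, one row per input
target = data(:,3)';            % water level
net = feedforwardnet(10);       % illustrative network, not the NARX net above
net = train(net, input, target);
% Predicting for 2015-11-11 needs values for BOTH rows, so either a
% rainfall value (e.g. a forecast) must be supplied, or the rainfall
% row must be removed from 'input' before training.
```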

I'd say your inputs are both the rainfall and the water level, the target is the water level for the next year, and the output is the predicted water level.

In other words, when training, your inputs should be rainfall(k-2:k-1) (direct input) and waterlevel(k-2:k-1) (feedback). Your target is waterlevel(k). That should output an estimate of the water level for year k, waterlevel_hat(k). You can compute the error e = waterlevel_hat(k) - waterlevel(k) and use it to train the network. Repeat the same process for all k > 2 (because you have 2 input delays and 2 feedback delays).
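The alignment described above can be sketched as a loop (toy data; in practice PREPARETS builds these shifted samples for the NARX net automatically):

```matlab
% Hedged sketch: how one training sample is formed with 2 input delays
% and 2 feedback delays. Toy row vectors stand in for the real series.
rainfall   = [3 5 2 8 4 6];
waterlevel = [10 12 11 15 13 14];
for k = 3:numel(waterlevel)     % k starts at 3: first index with full history
    x = [rainfall(k-2:k-1), waterlevel(k-2:k-1)]; % delayed input + feedback
    t = waterlevel(k);                            % target at time k
    % the network maps x -> waterlevel_hat(k); the training error is
    % e = waterlevel_hat(k) - t
end
```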
