Performance in backpropagation algorithm

Posted by Taban on Stack Overflow
Published on 2013-11-12T21:07:02Z

Filed under: algorithm | matlab

I've written a MATLAB program for the standard backpropagation algorithm. It is my homework and I'm not allowed to use the MATLAB toolbox, so I wrote the entire code myself. This link helped me with the backpropagation algorithm. I have a data set of 40 random numbers, and the initial weights are chosen randomly. As output, I want to see a diagram that shows the performance. I used the mse and plot functions to track performance over 20 epochs, but the result is this:

(image: the plotted per-epoch MSE curve)
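For reference, the four-cluster Gaussian training set described above (10 points per cluster, 40 in total) can be sketched outside MATLAB as well; this NumPy version mirrors the mvnrnd construction in the code below, with the seed chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(0)      # arbitrary seed for reproducibility

s, N = 2, 10                        # covariance scale, points per cluster
m = np.array([[5, -5, 5,  5],       # cluster means, one per column
              [-5, -5, 5, -5]], dtype=float)
S = s * np.eye(2)

# Draw N points from each Gaussian, analogous to mvnrnd(m(:,i)',S,N)'
x = np.hstack([rng.multivariate_normal(m[:, i], S, N).T
               for i in range(m.shape[1])])
toutput = np.concatenate([np.ones(N), np.zeros(N), np.ones(N), np.zeros(N)])

print(x.shape, toutput.shape)       # (2, 40) (40,)
```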

I've heard that performance should improve as backpropagation runs, so I want to know whether there is a problem with my code, or whether this result is normal because of local minima.
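As a sanity check on that expectation: when the weights persist across updates, gradient descent on a learnable toy problem does drive the mean squared error down over epochs. A minimal single-neuron sketch (hypothetical toy data, not the homework set):

```python
import numpy as np

# Tiny separable data set: positives in one quadrant, negatives opposite.
X = np.array([[1.0, 1.0], [2.0, 2.0], [-1.0, -1.0], [-2.0, -2.0]])
t = np.array([1.0, 1.0, 0.0, 0.0])

w = np.zeros(2)     # weights persist across all epochs
b = 0.0
lr = 0.1
mse_hist = []
for epoch in range(50):
    y = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # logistic output for all patterns
    err = t - y
    grad = err * y * (1.0 - y)               # delta rule for a logistic unit
    w += lr * (X.T @ grad)                   # batch gradient update
    b += lr * grad.sum()
    mse_hist.append(np.mean(err ** 2))       # MSE over the whole set, per epoch

print(mse_hist[0], mse_hist[-1])             # the error shrinks over epochs
```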

This is my code:

    Hidden_node=inputdlg('Enter the number of Hidden nodes');
    a=0.5; %initialize learning rate
    hiddenn=str2num(Hidden_node{1,1});

    randn('seed',0);

    %create the data set
    s=2;
    N=10;
    m=[5 -5 5 5;-5 -5 5 -5];
    S=s*eye(2);
    [l,c]=size(m);
    x=[]; %the training set
    for i=1:c
        x=[x mvnrnd(m(:,i)',S,N)'];
    end

    %target values
    toutput=[ones(1,N) zeros(1,N) ones(1,N) zeros(1,N)];

    for epoch=1:20 %number of epochs
        for kk=1:40 %number of patterns
            %initial weights of the hidden layer
            for ii=1:2
                for jj=1:hiddenn
                    whidden{ii,jj}=rand(1);
                end
            end
            %initial weights of the output layer
            for ii=1:hiddenn
                woutput{ii,1}=rand(1);
            end
            %hidden-node activations
            for ii=1:hiddenn
                x1=x(1,kk);
                x2=x(2,kk);
                w1=whidden{1,ii};
                w2=whidden{2,ii};
                activation{1,ii}=(x1(1,1)*w1(1,1))+(x2(1,1)*w2(1,1));
            end
            %calculate the output of the hidden nodes
            for ii=1:hiddenn
                hidden_to_out{1,ii}=logsig(activation{1,ii});
            end
            activation_O{1,1}=0;
            for jj=1:hiddenn
                activation_O{1,1}=activation_O{1,1}+(hidden_to_out{1,jj}*woutput{jj,1});
            end
            %calculate the output
            out{1,1}=logsig(activation_O{1,1});
            out_for_plot(1,kk)=out{1,1};
            %calculate the error of the output node
            delta_out{1,1}=toutput(1,kk)-out{1,1};
            %update the weights of the output node
            for ii=1:hiddenn
                woutput{ii,1}=woutput{ii,1}+delta_out{1,1}*hidden_to_out{1,ii}*dlogsig(activation_O{1,1},logsig(activation_O{1,1}))*a;
            end
            %calculate the error of the hidden nodes
            for ii=1:hiddenn
                delta_hidden{1,ii}=woutput{ii,1}*delta_out{1,1};
            end
            %update the weights of the hidden nodes
            for ii=1:hiddenn
                for jj=1:2
                    whidden{jj,ii}=whidden{jj,ii}+(delta_hidden{1,ii}*dlogsig(activation{1,ii},logsig(activation{1,ii}))*x(jj,kk)*a);
                end
            end
            a=a/1.1; %decrease the learning rate
        end
        %calculate performance
        e=toutput(1,kk)-out_for_plot(1,1);
        perf(1,epoch)=mse(e);
    end
    plot(perf);
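For comparison, here is a minimal NumPy sketch of the same network structured the way backpropagation is usually formulated: the weights are initialized once, before the epoch loop (rather than re-randomized for every pattern), and the per-epoch performance is the squared error averaged over all 40 patterns rather than computed from a single one. The seed, hidden-layer size, and learning rate are assumptions, not values from the original code:

```python
import numpy as np

rng = np.random.default_rng(0)       # assumed seed

# Same four Gaussian clusters as in the MATLAB code (s=2, N=10 per cluster).
s, N = 2, 10
m = np.array([[5, -5, 5, 5], [-5, -5, 5, -5]], dtype=float)
X = np.hstack([rng.multivariate_normal(m[:, i], s * np.eye(2), N).T
               for i in range(m.shape[1])])              # shape (2, 40)
t = np.concatenate([np.ones(N), np.zeros(N), np.ones(N), np.zeros(N)])

def logsig(z):
    return 1.0 / (1.0 + np.exp(-z))

hiddenn = 4                          # assumed hidden-layer size
wh = rng.random((2, hiddenn))        # input -> hidden weights, set ONCE
wo = rng.random(hiddenn)             # hidden -> output weights, set ONCE

lr = 0.5                             # assumed fixed learning rate
perf = []
for epoch in range(20):
    sq_err = 0.0
    for k in range(X.shape[1]):
        xk = X[:, k]
        h = logsig(wh.T @ xk)                 # hidden outputs
        y = logsig(wo @ h)                    # network output
        d_out = (t[k] - y) * y * (1.0 - y)    # output delta; dlogsig gives y*(1-y)
        d_hid = wo * d_out * h * (1.0 - h)    # hidden deltas (uses pre-update wo)
        wo = wo + lr * d_out * h              # update output weights
        wh = wh + lr * np.outer(xk, d_hid)    # update hidden weights
        sq_err += (t[k] - y) ** 2
    perf.append(sq_err / X.shape[1])          # MSE over ALL patterns this epoch
```

Plotting perf with matplotlib then shows how the error evolves once learning is allowed to accumulate across patterns instead of being reset.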

Thanks a lot.

