Elman Neural Network: MATLAB Implementation
by:Z.H.Gao
1. Input samples
Use sin(xt), sin(2xt), sin(0.5xt) and the time t to predict cos(xt).
FIG. 1. Raw data
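The scripts below load a pre-built sine.mat file that is not included in the post. For reference, a minimal sketch of how such a dataset could be generated is shown here; the sampling range, the number of points, the choice x = 1, and the column order [t, sin(t), sin(2t), sin(0.5t), cos(t)] are assumptions, not taken from the original.
% Hypothetical construction of sine.mat (x = 1 and the column order are assumptions)
t = linspace(0,4*pi,100)';                        % time column
sine = [t, sin(t), sin(2*t), sin(0.5*t), cos(t)]; % inputs in the middle, label (cos) last
save('sine.mat','sine');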
2. MATLAB implementation code
clear all;close all;clc
%sine,'tanh',Lrate=0.02,Nohidden=12;
%sine,'sigmoid',Lrate=0.2,Nohidden=12;
load sine
sine=sine';
sine=mapminmax(sine,0,1);   % mapminmax scales each row to [0,1], hence the transposes
sine=sine';
t=sine(:,1);
inst=sine(:,2:end-1);
label=sine(:,end);
%%
datalength=90;
trainx=inst(1:datalength,:);
trainy=label(1:datalength,:);
testx=inst(datalength+1:end,:);
testy=label(datalength+1:end,:);
%%
epoch=1000;
Lrate=0.8;
momentum=1;   % declared but not used in the weight updates below
backstep=20;  % truncation depth of the BPTT window
ActivationF='sigmoid';
Nohidden=12;                                 % number of hidden nodes; should not be set too small
inputW=2*rand(size(trainx,2),Nohidden)-1;    % input-to-hidden weights
inputB=rand(1,Nohidden);                     % hidden-layer bias
inputBW=[inputB;inputW];
outputW=2*rand(Nohidden,size(trainy,2))-1;   % hidden-to-output weights
outputB=rand(1,size(trainy,2));              % output bias
outputBW=[outputB;outputW];
hiddenW=2*rand(Nohidden,Nohidden)-1;         % recurrent (context) weights
stateH=zeros(datalength,Nohidden);           % history of hidden states
%%
for v=1:1:epoch
%%
%forward pass: feed the samples sequentially from 1 to n
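% The loop below implements the Elman recurrence (f = chosen activation):
%   h(i) = f( [1,x(i)]*inputBW + h(i-1)*hiddenW )
%   y(i) = f( [1,h(i)]*outputBW )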
for i=1:1:datalength
x=[1,trainx(i,:)];
if i==1
tempH(i,:)=x*inputBW;
else
tempH(i,:)=x*inputBW+stateH(i-1,:)*hiddenW;
end
H = ActivationFunction(tempH(i,:),ActivationF);
stateH(i,:) = H;
tempY(i,1) = [1,H]*outputBW;
end
trainResult = ActivationFunction(tempY,ActivationF);
Error=trainResult-trainy;
trainMSE(v,1)=sum(sum(Error.^2))/datalength;
%%
%backward pass (truncated BPTT); the backtracking window must not be too short
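% For each time step i the output error is propagated back through at most
% 'backstep' earlier states: with delta(j) the error at the pre-activation of
% step j, the loop accumulates
%   DinputBW += [1,x(j)]' * delta(j)
%   DhiddenW += h(j-1)'   * delta(j)
%   delta(j-1) = (delta(j)*hiddenW') .* f'(tempH(j-1))
% over the truncated window j = i, i-1, ..., max(1,i-backstep).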
DinputBW=zeros(size(inputBW));
DhiddenW=zeros(size(hiddenW));
Dout=Error.*GradientValue(tempY,ActivationF);
DoutputBW=[ones(datalength,1),stateH]'*Dout;
DH=Dout*outputBW';
DH=DH(:,2:end);
for i = datalength:-1:1
DtempH = DH(i,:).*GradientValue(tempH(i,:),ActivationF);
for bptt_i = i:-1:max(1,i-backstep)
DinputBW=DinputBW+[1,trainx(bptt_i,:)]'*DtempH;
if bptt_i-1>0
DhiddenW=DhiddenW+stateH(bptt_i-1,:)'*DtempH;
DtempH=DtempH*hiddenW'.*GradientValue(tempH(bptt_i-1,:),ActivationF);
end
end
end
%%
inputBW=inputBW-Lrate*DinputBW;
hiddenW=hiddenW-Lrate*DhiddenW;
outputBW=outputBW-Lrate*DoutputBW;
% Lrate=0.9999*Lrate;   % optional learning-rate decay
%%
end
%%
%test phase: feed the test samples sequentially, continuing the hidden state from the end of training
for i=1:1:size(testx,1)
x=[1,testx(i,:)];
tempH(i+datalength,:)=x*inputBW+stateH(i+datalength-1,:)*hiddenW;
H = ActivationFunction(tempH(i+datalength,:),ActivationF);
stateH(i+datalength,:) = H;
tempResult(i,1) = [1,H]*outputBW;
end
testResult = ActivationFunction(tempResult,ActivationF);
Error=testResult-testy;
testMSE=sum(sum(Error.^2))/size(testx,1)
%%
t1=t(1:datalength,:);t2=t(datalength+1:end,:);
figure(1);plot(trainMSE);
figure(2);plot(t1,trainy,'-*b');hold on;plot(t1,trainResult,'-or');
hold on;plot(t2,testy,'-*k');hold on;plot(t2,testResult,'-og');
FIG. 2. Training MSE
FIG. 3. Elman prediction results
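The listing above calls two helper functions, ActivationFunction and GradientValue, whose source is not included in the post. A minimal sketch consistent with how they are called (covering the 'sigmoid' and 'tanh' options mentioned in the header comments) might look like the following; the author's originals may differ. They can be saved as ActivationFunction.m and GradientValue.m on the MATLAB path, or kept as local functions at the end of the script in R2016b and later.
function y = ActivationFunction(x,type)
% Apply the chosen activation element-wise to the pre-activation x.
switch type
    case 'sigmoid'
        y = 1./(1+exp(-x));
    case 'tanh'
        y = tanh(x);
    otherwise
        error('Unknown activation: %s',type);
end
end
function g = GradientValue(x,type)
% Derivative of the activation, evaluated at the pre-activation x.
switch type
    case 'sigmoid'
        s = 1./(1+exp(-x));
        g = s.*(1-s);
    case 'tanh'
        g = 1-tanh(x).^2;
    otherwise
        error('Unknown activation: %s',type);
end
end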
3. Implementing Elman with the MATLAB toolbox
clear all;close all;clc
%%%%%%%%%%%%%%%%%%%% type, open, or edit can be used to view the toolbox source code %%%%%%%%%%%%%%%%%%%%
load sine
t=sine(:,1);
inst=sine(:,2:end-1);
label=sine(:,end);
%%
datalength=90;
trainx=inst(1:datalength,:)';
trainy=label(1:datalength,:)';
testx=inst(datalength+1:end,:)';
testy=label(datalength+1:end,:)';
%%
TF1='tansig';TF2='tansig';%'tansig','purelin','logsig'
net=newelm(trainx,trainy,[6,4],{TF1 TF2},'traingda');
net.trainParam.epochs=1000;
net.trainParam.goal=1e-7;
net.trainParam.lr=0.5;
net.trainParam.mc=0.9;%momentum factor; default is 0.9
net.trainParam.show=25;%number of epochs between progress displays
net.trainFcn='traingda';
net.divideFcn='';%use all samples for training (no automatic train/val/test split)
[net,tr]=train(net,trainx,trainy);
[trainoutput,trainPerf]=sim(net,trainx,[],[],trainy);%sim(net, inputs, initial input delays, initial layer delays, targets)
[testoutput,testPerf]=sim(net,testx,[],[],testy);%network output on the test data
%%
MSE=mse(testoutput-testy)
figure(1)
t1=t(1:datalength,:);t2=t(datalength+1:end,:);
plot(t1,trainy,'-k*');hold on;plot(t1,trainoutput,'-g*');
plot(t2,testy,'-b*');hold on;plot(t2,testoutput,'-r*');
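Note that newelm is an older toolbox function and has been removed from recent MATLAB releases in favor of elmannet (and layrecnet). The following is a rough sketch of how a similar setup might look with elmannet; the con2seq/preparets preprocessing and the parameter choices are assumptions rather than part of the original post.
% Rough sketch with the newer elmannet API (illustrative only)
X = con2seq(trainx);                  % matrix columns -> cell-array time sequence
T = con2seq(trainy);
net = elmannet(1,[6 4],'traingda');   % 1-step layer delay, two hidden layers
net.trainParam.epochs = 1000;
net.trainParam.lr = 0.5;
[Xs,Xi,Ai,Ts] = preparets(net,X,T);   % align sequences with the layer delay
net = train(net,Xs,Ts,Xi,Ai);
Y = net(Xs,Xi,Ai);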