MATLAB neural network results vary between runs: why does a GA-BP network give very different training results each time?
The code is pasted below:
function [tsp mint maxt net]=GABPNET(XX,YY,YJ,SCC)
%XX    input training and test samples
%YY    training-sample target values
%YJ    number of hidden-layer nodes
%SCC   number of output-layer nodes
%tsp   network outputs for the training and test samples, column by column
%mint,maxt  de-normalization parameters (y=[x,mint,maxt])
%net   the trained neural network
%--------------------------------------------------------------------------
% GABPNET.m
% Optimize the BP network's weights and thresholds with a genetic
% algorithm, then train the network with the BP algorithm.
%--------------------------------------------------------------------------
clc
global S1;
YJ=10;SCC=1;  %hard-coded here, overriding the function arguments
%normalize the data
day=subfv;
XX=day(:,1:5)';
[XX,minp,maxp] = premnmx(XX);
%**************** other users must modify this section ************
YY=day(:,6)';
[YY,mint,maxt] = premnmx(YY);
%create the network
S1=YJ;
net=newff(minmax(XX),[S1,SCC],{'tansig','purelin'},'trainlm');  %two layers, so two transfer functions
%optimize the network with a genetic algorithm
P=XX;
T=YY;
R=size(P,1);
S2=size(T,1);
S=R*S1+S1*S2+S1+S2;  %GA chromosome length: input weights (R*S1) + output weights (S1*S2) + hidden biases (S1) + output biases (S2); here 5*10+10*1+10+1 = 71
aa=ones(S,1)*[-1,1];
popu=50;  %population size
save data2 XX YY  %save XX and YY into the MAT-file data2 for use by gabpEval
initPpp=initializega(popu,aa,'gabpEval');  %initialize the population
gen=100;  %number of generations
%call the gaot toolbox; the objective function is gabpEval
[x,endPop,bPop,trace]=ga(aa,'gabpEval',[],initPpp,[1e-6 1 1],'maxGenTerm',gen,...
    'normGeomSelect',[0.09],'arithXover',[2],'nonUnifMutation',[2 gen 3]);
%plot the convergence curves
figure(1)
plot(trace(:,1),1./trace(:,3),'r-');
hold on
plot(trace(:,1),1./trace(:,2),'b-');
xlabel('Generation');
ylabel('Sum-Squared Error');
figure(2)
plot(trace(:,1),trace(:,3),'r-');
hold on
plot(trace(:,1),trace(:,2),'b-');
xlabel('Generation');
ylabel('Fitness');
%assign the GA-optimized weights and biases to the untrained BP network
[W1,B1,W2,B2,P,T,A1,A2,SE,val]=gadecod(x);
net.IW{1,1}=W1;
net.LW{2,1}=W2;
net.b{1,1}=B1;
net.b{2,1}=B2;
XX=P;
YY=T;
%set the training parameters
net.trainParam.show=100;
net.trainParam.lr=0.1;
net.trainParam.epochs=5000;
net.trainParam.goal=0.0001;
%train the network
net=train(net,XX,YY);
y=sim(net,XX);
p2=[0.0909 0.6500 2.3895 0.8983 0.8520
    0.1429 0.7500 2.0571 0.9855 0.8724
    0.0769 0.6500 1.8432 1.0372 0.8653
    0.2000 0.7500 1.8432 1.0372 0.8653
    0.0769 0.6700 2.3256 0.9151 0.8802
    0.0625 0.7300 2.5762 0.8501 0.7776
    0.1429 0.7000 1.6233 1.0780 0.5834
    0.1761 0.7270 2.1100 0.9719 0.8728
    0.2000 0.6800 2.2200 0.9430 0.9263
    0.1250 0.6000 2.0533 0.9865 0.8029
    0.1429 0.7200 2.0840 0.9786 0.7771
    0.1351 0.6910 2.2916 0.9241 0.8106
    0.1429 0.6000 2.4423 0.8845 0.6881
]';
%p2=p2';
p2n=tramnmx(p2,minp,maxp);
a2n=sim(net,p2n);
a2=postmnmx(a2n,mint,maxt)
t2=[3.3000
3.3300
0.7400
12.0300
0.4500
0.7562
2.0167
3.8670
1.7100
0.7700
2.1540
1.7100
0.7600
]';
xxxx=1:13;
figure(3)
plot(xxxx,t2,'*b-')
hold on
plot(xxxx,a2,'or-')
hold off
figure(4)
wuc=t2-a2  %prediction error (target minus network output)
plot(1:13,wuc,'*--')
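As for the question itself: the listing above draws fresh random numbers on every run. `newff` initializes the network weights and biases randomly, and `initializega` builds a random initial GA population, so both the GA search and the subsequent BP training start from a different point each time and can settle into different local minima. To make runs reproducible, pin MATLAB's global random number generator before calling the function. A minimal sketch, assuming the legacy RNG interface that matches the `premnmx`/`newff`-era toolbox used here (the seed value 42 is arbitrary):

```matlab
% Fix the random seed so newff's weight initialization and
% initializega's initial population are identical across runs.
rand('state',42);    % legacy interface, matching older toolbox versions
randn('state',42);
% rng(42);           % equivalent single call on newer MATLAB versions
[tsp,mint,maxt,net]=GABPNET(XX,YY,10,1);
```

Note that fixing the seed only makes the variance repeatable, not smaller. If results differ wildly across seeds, the network is likely over-parameterized for the amount of training data (71 free parameters fitted here), so reducing the hidden-layer size or adding more samples is the more fundamental fix.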