Expectation Maximization (EM): Algorithm and Source Code
In statistical computing, the expectation-maximization (EM) algorithm is a method for finding maximum likelihood estimates of the parameters of a probabilistic model that depends on unobserved latent variables. EM is widely used for data clustering in machine learning and computer vision. The algorithm alternates between two steps. The expectation (E) step uses the current parameter estimates to compute the expected log-likelihood over the latent variables. The maximization (M) step then maximizes the expected log-likelihood found in the E step to obtain new parameter estimates. The parameters found in the M step are fed into the next E step, and the two steps are repeated until convergence.
The EM algorithm was introduced by Arthur Dempster, Nan Laird and Donald Rubin in their classic 1977 paper. They pointed out that the method had already been "proposed many times in special circumstances" by earlier authors.
We write $\mathbf{y}$ for the observed, incomplete data and $\mathbf{z}$ for the unobserved values; together $\mathbf{y}$ and $\mathbf{z}$ make up the complete data. $\mathbf{z}$ may be data that is actually missing from the measurements, or a latent variable that would simplify the problem if its value were known. For example, in a mixture model the likelihood takes a much more convenient form when the mixture component that "generated" each sample is known (see the example below).
Estimating the unobserved data
Let $p(\mathbf{y}, \mathbf{z} \mid \theta)$ denote the probability density (in the continuous case) or probability mass function (in the discrete case) of the complete data, with parameters given by the vector $\theta$. The likelihood of the complete data is then $L(\theta; \mathbf{y}, \mathbf{z}) = p(\mathbf{y}, \mathbf{z} \mid \theta)$, and the conditional distribution of the unobserved data given the observed data can be written as

$$p(\mathbf{z} \mid \mathbf{y}, \theta) = \frac{p(\mathbf{y}, \mathbf{z} \mid \theta)}{p(\mathbf{y} \mid \theta)}.$$
The EM algorithm consists of the two steps E and M. Given the current estimate $\theta^{(t)}$, the E step computes the expected complete-data log-likelihood under the conditional distribution of the unobserved data,

$$Q(\theta \mid \theta^{(t)}) = \mathbb{E}_{\mathbf{z} \mid \mathbf{y}, \theta^{(t)}}\big[\log L(\theta; \mathbf{y}, \mathbf{z})\big],$$

and the M step chooses the parameters that maximize it,

$$\theta^{(t+1)} = \arg\max_{\theta} Q(\theta \mid \theta^{(t)}).$$
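In code, the iteration has the following overall shape. This is only a generic MATLAB sketch: estep, mstep, y, theta0, maxiter and tol are placeholder names standing in for the model-specific pieces, not functions defined anywhere in this post.

% Generic EM loop sketch (placeholder functions estep/mstep).
theta = theta0;                       % initial parameter guess
for t = 1:maxiter
    T = estep(y, theta);              % E-step: responsibilities / expected sufficient statistics
    thetaNew = mstep(y, T);           % M-step: parameters maximizing the expected log-likelihood
    if norm(thetaNew - theta) < tol   % stop once the update becomes negligible
        theta = thetaNew;
        break
    end
    theta = thetaNew;
end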
An example: Gaussian mixtures
Suppose $\mathbf{x} = (x_1, x_2, \ldots, x_n)$ is a sample of independent observations drawn from a mixture of two $d$-dimensional multivariate normal distributions, and let $\mathbf{z} = (z_1, z_2, \ldots, z_n)$ be the latent variables that determine which component each observation comes from.
That is,

$$X_i \mid (Z_i = 1) \sim \mathcal{N}_d(\mu_1, \Sigma_1) \quad \text{and} \quad X_i \mid (Z_i = 2) \sim \mathcal{N}_d(\mu_2, \Sigma_2),$$
where

$$P(Z_i = 1) = \tau_1 \quad \text{and} \quad P(Z_i = 2) = \tau_2 = 1 - \tau_1.$$
The goal is to estimate the unknown parameters, namely the mixing proportion together with the means and covariances of the Gaussians:

$$\theta = (\tau, \mu_1, \mu_2, \Sigma_1, \Sigma_2).$$
The likelihood function is

$$L(\theta; \mathbf{x}, \mathbf{z}) = P(\mathbf{x}, \mathbf{z} \mid \theta) = \prod_{i=1}^{n} \sum_{j=1}^{2} \mathbb{I}(z_i = j)\, \tau_j\, f(x_i; \mu_j, \Sigma_j),$$
where $\mathbb{I}$ is an indicator function and $f$ is the probability density function of a multivariate normal distribution. This can be rewritten in exponential form as

$$L(\theta; \mathbf{x}, \mathbf{z}) = \exp\left\{ \sum_{i=1}^{n} \sum_{j=1}^{2} \mathbb{I}(z_i = j)\Big[ \log\tau_j - \tfrac{1}{2}\log|\Sigma_j| - \tfrac{1}{2}(x_i - \mu_j)^{\top}\Sigma_j^{-1}(x_i - \mu_j) - \tfrac{d}{2}\log(2\pi)\Big]\right\}.$$
Now we come to the two main steps.
E-step
Given the current parameter estimate $\theta^{(t)}$, the conditional distribution of $Z_i$ follows from Bayes' theorem, weighting each Gaussian density by its mixing parameter $\tau$:

$$T_{j,i}^{(t)} := P(Z_i = j \mid X_i = x_i; \theta^{(t)}) = \frac{\tau_j^{(t)}\, f(x_i; \mu_j^{(t)}, \Sigma_j^{(t)})}{\tau_1^{(t)}\, f(x_i; \mu_1^{(t)}, \Sigma_1^{(t)}) + \tau_2^{(t)}\, f(x_i; \mu_2^{(t)}, \Sigma_2^{(t)})}.$$
The E step therefore yields

$$Q(\theta \mid \theta^{(t)}) = \mathbb{E}_{\mathbf{Z} \mid \mathbf{x}, \theta^{(t)}}\big[\log L(\theta; \mathbf{x}, \mathbf{Z})\big] = \sum_{i=1}^{n} \sum_{j=1}^{2} T_{j,i}^{(t)}\Big[ \log\tau_j - \tfrac{1}{2}\log|\Sigma_j| - \tfrac{1}{2}(x_i - \mu_j)^{\top}\Sigma_j^{-1}(x_i - \mu_j) - \tfrac{d}{2}\log(2\pi)\Big].$$
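As a concrete illustration, here is a minimal MATLAB sketch of this E step for the two-component case. The names X, tau, Mu, Sig and T are my own choices for this sketch; they are not identifiers from the EM_GM.m code quoted later.

% E-step sketch: T(i,j) = P(Z_i = j | x_i; theta^(t)) for a 2-component mixture.
% X is n-by-d data; tau is 1-by-2; Mu is d-by-2; Sig is d-by-d-by-2.
[n, d] = size(X);
T = zeros(n, 2);
for j = 1:2
    dX = X - repmat(Mu(:,j)', n, 1);          % center the data on component j
    q  = sum((dX / Sig(:,:,j)) .* dX, 2);     % quadratic forms (x_i-mu_j)' * inv(Sigma_j) * (x_i-mu_j)
    T(:,j) = tau(j) * exp(-0.5 * q) / sqrt((2*pi)^d * det(Sig(:,:,j)));
end
T = T ./ repmat(sum(T, 2), 1, 2);             % normalize so each row sums to 1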
M-step
The quadratic form of $Q(\theta \mid \theta^{(t)})$ makes maximizing over $\theta$ relatively simple: $\tau$, $(\mu_1, \Sigma_1)$ and $(\mu_2, \Sigma_2)$ can each be maximized independently.
First consider $\tau$, subject to the constraint $\tau_1 + \tau_2 = 1$:

$$\tau^{(t+1)} = \arg\max_{\tau} \left\{ \Big[\sum_{i=1}^{n} T_{1,i}^{(t)}\Big]\log\tau_1 + \Big[\sum_{i=1}^{n} T_{2,i}^{(t)}\Big]\log\tau_2 \right\}.$$
This has the same form as the MLE for the binomial distribution, so

$$\tau_j^{(t+1)} = \frac{\sum_{i=1}^{n} T_{j,i}^{(t)}}{\sum_{i=1}^{n}\big(T_{1,i}^{(t)} + T_{2,i}^{(t)}\big)} = \frac{1}{n}\sum_{i=1}^{n} T_{j,i}^{(t)}.$$
Next estimate $(\mu_1, \Sigma_1)$:

$$(\mu_1^{(t+1)}, \Sigma_1^{(t+1)}) = \arg\max_{\mu_1, \Sigma_1} \sum_{i=1}^{n} T_{1,i}^{(t)}\Big\{ -\tfrac{1}{2}\log|\Sigma_1| - \tfrac{1}{2}(x_i - \mu_1)^{\top}\Sigma_1^{-1}(x_i - \mu_1)\Big\}.$$
This has the same form as a weighted MLE for a normal distribution, so

$$\mu_1^{(t+1)} = \frac{\sum_{i=1}^{n} T_{1,i}^{(t)}\, x_i}{\sum_{i=1}^{n} T_{1,i}^{(t)}} \qquad \text{and} \qquad \Sigma_1^{(t+1)} = \frac{\sum_{i=1}^{n} T_{1,i}^{(t)}\, (x_i - \mu_1^{(t+1)})(x_i - \mu_1^{(t+1)})^{\top}}{\sum_{i=1}^{n} T_{1,i}^{(t)}},$$
and, by symmetry,

$$\mu_2^{(t+1)} = \frac{\sum_{i=1}^{n} T_{2,i}^{(t)}\, x_i}{\sum_{i=1}^{n} T_{2,i}^{(t)}} \qquad \text{and} \qquad \Sigma_2^{(t+1)} = \frac{\sum_{i=1}^{n} T_{2,i}^{(t)}\, (x_i - \mu_2^{(t+1)})(x_i - \mu_2^{(t+1)})^{\top}}{\sum_{i=1}^{n} T_{2,i}^{(t)}}.$$
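Continuing the E-step sketch above, these weighted-MLE formulas translate directly into a few lines of MATLAB. Again, T, tau, Mu, Sig, n and d are the hypothetical names from that sketch rather than identifiers from EM_GM.m.

% M-step sketch: closed-form updates from the responsibilities T of the E-step.
Nj  = sum(T, 1);                                             % effective number of points per component
tau = Nj / n;                                                % tau_j^(t+1) = (1/n) * sum_i T_{j,i}
for j = 1:2
    Mu(:,j) = X' * T(:,j) / Nj(j);                           % weighted mean
    dX = X - repmat(Mu(:,j)', n, 1);
    Sig(:,:,j) = dX' * (dX .* repmat(T(:,j), 1, d)) / Nj(j); % weighted covariance
end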
This example comes from the Expectation-maximization algorithm entry on Answers.com. Since I have not yet used EM in any depth, I cannot put it in more accessible terms yet; once I have studied and applied it, I expect to understand and digest it better. liuxqsmile has also written some notes and a translation.
============
There is not much source code available online. One good implementation is EM_GM.m, written by Patrick P. C. Tsui of the University of Waterloo, which I share here.
When running it, you can initialize it as follows.
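For example (the data, the number of components and the initial values below are only an illustrative sketch, chosen to match the interface documented in the code):

% Illustrative call, assuming a data matrix X of size n-by-d.
X = [randn(200,2); randn(200,2) + 4];   % toy 2-D data drawn around two centers

% Let EM_GM pick its own starting point via k-means ...
[W, M, V, L] = EM_GM(X, 2, [], [], 1, []);

% ... or supply an explicit initial guess through the Init structure.
Init.W = [0.5 0.5];                     % initial weights,      1-by-k
Init.M = [0 0; 4 4]';                   % initial means,        d-by-k
Init.V = cat(3, eye(2), eye(2));        % initial covariances,  d-by-d-by-k
[W, M, V, L] = EM_GM(X, 2, 0.1, 1000, 1, Init);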
The program source code follows:

% matlab code
function [W,M,V,L] = EM_GM(X,k,ltol,maxiter,pflag,Init)
% [W,M,V,L] = EM_GM(X,k,ltol,maxiter,pflag,Init)
%
% EM algorithm for k multidimensional Gaussian mixture estimation
%
% Inputs:
%   X(n,d)  - input data, n=number of observations, d=dimension of variable
%   k       - maximum number of Gaussian components allowed
%   ltol    - percentage of the log likelihood difference between 2 iterations ([] for none)
%   maxiter - maximum number of iteration allowed ([] for none)
%   pflag   - 1 for plotting GM for 1D or 2D cases only, 0 otherwise ([] for none)
%   Init    - structure of initial W, M, V: Init.W, Init.M, Init.V ([] for none)
%
% Outputs:
%   W(1,k)   - estimated weights of GM
%   M(d,k)   - estimated mean vectors of GM
%   V(d,d,k) - estimated covariance matrices of GM
%   L        - log likelihood of estimates
%
% Written by
%   Patrick P. C. Tsui,
%   PAMI research group
%   Department of Electrical and Computer Engineering
%   University of Waterloo,
%   March, 2006

%%%% Validate inputs %%%%
if nargin <= 1,
    disp('EM_GM must have at least 2 inputs: X,k!/n')
    return
elseif nargin == 2,
    ltol = 0.1; maxiter = 1000; pflag = 0; Init = [];
    err_X = Verify_X(X);
    err_k = Verify_k(k);
    if err_X | err_k, return; end
elseif nargin == 3,
    maxiter = 1000; pflag = 0; Init = [];
    err_X = Verify_X(X);
    err_k = Verify_k(k);
    [ltol,err_ltol] = Verify_ltol(ltol);
    if err_X | err_k | err_ltol, return; end
elseif nargin == 4,
    pflag = 0; Init = [];
    err_X = Verify_X(X);
    err_k = Verify_k(k);
    [ltol,err_ltol] = Verify_ltol(ltol);
    [maxiter,err_maxiter] = Verify_maxiter(maxiter);
    if err_X | err_k | err_ltol | err_maxiter, return; end
elseif nargin == 5,
    Init = [];
    err_X = Verify_X(X);
    err_k = Verify_k(k);
    [ltol,err_ltol] = Verify_ltol(ltol);
    [maxiter,err_maxiter] = Verify_maxiter(maxiter);
    [pflag,err_pflag] = Verify_pflag(pflag);
    if err_X | err_k | err_ltol | err_maxiter | err_pflag, return; end
elseif nargin == 6,
    err_X = Verify_X(X);
    err_k = Verify_k(k);
    [ltol,err_ltol] = Verify_ltol(ltol);
    [maxiter,err_maxiter] = Verify_maxiter(maxiter);
    [pflag,err_pflag] = Verify_pflag(pflag);
    [Init,err_Init] = Verify_Init(Init);
    if err_X | err_k | err_ltol | err_maxiter | err_pflag | err_Init, return; end
else
    disp('EM_GM must have 2 to 6 inputs!');
    return
end

%%%% Initialize W, M, V, L %%%%
t = cputime;
if isempty(Init),
    [W,M,V] = Init_EM(X,k); L = 0;
else
    W = Init.W;
    M = Init.M;
    V = Init.V;
end
Ln = Likelihood(X,k,W,M,V); % Initialize log likelihood
Lo = 2*Ln;

%%%% EM algorithm %%%%
niter = 0;
while (abs(100*(Ln-Lo)/Lo)>ltol) & (niter<=maxiter),
    E = Expectation(X,k,W,M,V);     % E-step
    [W,M,V] = Maximization(X,k,E);  % M-step
    Lo = Ln;
    Ln = Likelihood(X,k,W,M,V);
    niter = niter + 1;
end
L = Ln;

%%%% Plot 1D or 2D %%%%
if pflag==1,
    [n,d] = size(X);
    if d>2,
        disp('Can only plot 1 or 2 dimensional applications!/n');
    else
        Plot_GM(X,k,W,M,V);
    end
    elapsed_time = sprintf('CPU time used for EM_GM: %5.2fs',cputime-t);
    disp(elapsed_time);
    disp(sprintf('Number of iterations: %d',niter-1));
end
%%%%%%%%%%%%%%%%%%%%%%
%%%% End of EM_GM %%%%
%%%%%%%%%%%%%%%%%%%%%%

function E = Expectation(X,k,W,M,V)
[n,d] = size(X);
a = (2*pi)^(0.5*d);
S = zeros(1,k);
iV = zeros(d,d,k);
for j=1:k,
    if V(:,:,j)==zeros(d,d), V(:,:,j)=ones(d,d)*eps; end
    S(j) = sqrt(det(V(:,:,j)));
    iV(:,:,j) = inv(V(:,:,j));
end
E = zeros(n,k);
for i=1:n,
    for j=1:k,
        dXM = X(i,:)'-M(:,j);
        pl = exp(-0.5*dXM'*iV(:,:,j)*dXM)/(a*S(j));
        E(i,j) = W(j)*pl;
    end
    E(i,:) = E(i,:)/sum(E(i,:));
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Expectation %%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function [W,M,V] = Maximization(X,k,E)
[n,d] = size(X);
W = zeros(1,k); M = zeros(d,k); V = zeros(d,d,k);
for i=1:k,  % Compute weights
    for j=1:n,
        W(i) = W(i) + E(j,i);
        M(:,i) = M(:,i) + E(j,i)*X(j,:)';
    end
    M(:,i) = M(:,i)/W(i);
end
for i=1:k,
    for j=1:n,
        dXM = X(j,:)'-M(:,i);
        V(:,:,i) = V(:,:,i) + E(j,i)*dXM*dXM';
    end
    V(:,:,i) = V(:,:,i)/W(i);
end
W = W/n;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Maximization %%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function L = Likelihood(X,k,W,M,V)
% Compute L based on K. V. Mardia, "Multivariate Analysis", Academic Press, 1979, PP. 96-97
% to enhance computational speed
[n,d] = size(X);
U = mean(X)';
S = cov(X);
L = 0;
for i=1:k,
    iV = inv(V(:,:,i));
    L = L + W(i)*(-0.5*n*log(det(2*pi*V(:,:,i))) ...
        -0.5*(n-1)*(trace(iV*S)+(U-M(:,i))'*iV*(U-M(:,i))));
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Likelihood %%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%

function err_X = Verify_X(X)
err_X = 1;
[n,d] = size(X);
if n<d,
    disp('Input data must be n x d!/n');
    return
end
err_X = 0;
%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Verify_X %%%%
%%%%%%%%%%%%%%%%%%%%%%%%%

function err_k = Verify_k(k)
err_k = 1;
if ~isnumeric(k) | ~isreal(k) | k<1,
    disp('k must be a real integer >= 1!/n');
    return
end
err_k = 0;
%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Verify_k %%%%
%%%%%%%%%%%%%%%%%%%%%%%%%

function [ltol,err_ltol] = Verify_ltol(ltol)
err_ltol = 1;
if isempty(ltol),
    ltol = 0.1;
elseif ~isreal(ltol) | ltol<=0,
    disp('ltol must be a positive real number!');
    return
end
err_ltol = 0;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Verify_ltol %%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function [maxiter,err_maxiter] = Verify_maxiter(maxiter)
err_maxiter = 1;
if isempty(maxiter),
    maxiter = 1000;
elseif ~isreal(maxiter) | maxiter<=0,
    disp('maxiter must be a positive real number!');
    return
end
err_maxiter = 0;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Verify_maxiter %%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function [pflag,err_pflag] = Verify_pflag(pflag)
err_pflag = 1;
if isempty(pflag),
    pflag = 0;
elseif pflag~=0 & pflag~=1,
    disp('Plot flag must be either 0 or 1!/n');
    return
end
err_pflag = 0;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Verify_pflag %%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function [Init,err_Init] = Verify_Init(Init)
err_Init = 1;
if isempty(Init),
    % Do nothing
elseif isstruct(Init),
    [Wd,Wk] = size(Init.W);
    [Md,Mk] = size(Init.M);
    [Vd1,Vd2,Vk] = size(Init.V);
    if Wk~=Mk | Wk~=Vk | Mk~=Vk,
        disp('k in Init.W(1,k), Init.M(d,k) and Init.V(d,d,k) must equal!/n')
        return
    end
    if Md~=Vd1 | Md~=Vd2 | Vd1~=Vd2,
        disp('d in Init.W(1,k), Init.M(d,k) and Init.V(d,d,k) must equal!/n')
        return
    end
else
    disp('Init must be a structure: W(1,k), M(d,k), V(d,d,k) or []!');
    return
end
err_Init = 0;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Verify_Init %%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function [W,M,V] = Init_EM(X,k)
[n,d] = size(X);
[Ci,C] = kmeans(X,k,'Start','cluster', ...
    'Maxiter',100, ...
    'EmptyAction','drop', ...
    'Display','off'); % Ci(nx1) - cluster indices; C(k,d) - cluster centroid (i.e. mean)
while sum(isnan(C))>0,
    [Ci,C] = kmeans(X,k,'Start','cluster', ...
        'Maxiter',100, ...
        'EmptyAction','drop', ...
        'Display','off');
end
M = C';
Vp = repmat(struct('count',0,'X',zeros(n,d)),1,k);
for i=1:n,  % Separate cluster points
    Vp(Ci(i)).count = Vp(Ci(i)).count + 1;
    Vp(Ci(i)).X(Vp(Ci(i)).count,:) = X(i,:);
end
V = zeros(d,d,k);
for i=1:k,
    W(i) = Vp(i).count/n;
    V(:,:,i) = cov(Vp(i).X(1:Vp(i).count,:));
end
%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Init_EM %%%%
%%%%%%%%%%%%%%%%%%%%%%%%

function Plot_GM(X,k,W,M,V)
[n,d] = size(X);
if d>2,
    disp('Can only plot 1 or 2 dimensional applications!/n');
    return
end
S = zeros(d,k);
R1 = zeros(d,k);
R2 = zeros(d,k);
for i=1:k,  % Determine plot range as 4 x standard deviations
    S(:,i) = sqrt(diag(V(:,:,i)));
    R1(:,i) = M(:,i)-4*S(:,i);
    R2(:,i) = M(:,i)+4*S(:,i);
end
Rmin = min(min(R1));
Rmax = max(max(R2));
R = [Rmin:0.001*(Rmax-Rmin):Rmax];
clf, hold on
if d==1,
    Q = zeros(size(R));
    for i=1:k,
        P = W(i)*normpdf(R,M(:,i),sqrt(V(:,:,i)));
        Q = Q + P;
        plot(R,P,'r-'); grid on,
    end
    plot(R,Q,'k-');
    xlabel('X');
    ylabel('Probability density');
else % d==2
    plot(X(:,1),X(:,2),'r.');
    for i=1:k,
        Plot_Std_Ellipse(M(:,i),V(:,:,i));
    end
    xlabel('1^{st} dimension');
    ylabel('2^{nd} dimension');
    axis([Rmin Rmax Rmin Rmax])
end
title('Gaussian Mixture estimated by EM');
%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Plot_GM %%%%
%%%%%%%%%%%%%%%%%%%%%%%%

function Plot_Std_Ellipse(M,V)
[Ev,D] = eig(V);
d = length(M);
if V(:,:)==zeros(d,d),
    V(:,:) = ones(d,d)*eps;
end
iV = inv(V);
% Find the larger projection
P = [1,0;0,0];  % X-axis projection operator
P1 = P * 2*sqrt(D(1,1)) * Ev(:,1);
P2 = P * 2*sqrt(D(2,2)) * Ev(:,2);
if abs(P1(1)) >= abs(P2(1)),
    Plen = P1(1);
else
    Plen = P2(1);
end
count = 1;
step = 0.001*Plen;
Contour1 = zeros(2001,2);
Contour2 = zeros(2001,2);
for x = -Plen:step:Plen,
    a = iV(2,2);
    b = x * (iV(1,2)+iV(2,1));
    c = (x^2) * iV(1,1) - 1;
    Root1 = (-b + sqrt(b^2 - 4*a*c))/(2*a);
    Root2 = (-b - sqrt(b^2 - 4*a*c))/(2*a);
    if isreal(Root1),
        Contour1(count,:) = [x,Root1] + M';
        Contour2(count,:) = [x,Root2] + M';
        count = count + 1;
    end
end
Contour1 = Contour1(1:count-1,:);
Contour2 = [Contour1(1,:);Contour2(1:count-1,:);Contour1(count-1,:)];
plot(M(1),M(2),'k+');
plot(Contour1(:,1),Contour1(:,2),'k-');
plot(Contour2(:,1),Contour2(:,2),'k-');
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% End of Plot_Std_Ellipse %%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
from: http://www.zhizhihu.com/html/y2010/2109.html