UA MATH566 Statistical Theory QE Practice 2.2
Problem 5
This is Problem 5 from the May 2014 exam.
Part (a)
The joint density of the bivariate normal distribution is
$$f_{Y_1,Y_2}(y_1,y_2) = \frac{1}{2\pi \sigma^2\sqrt{1-\rho^2}}\exp \left(-\frac{1}{2(1-\rho^2)} \left[ \frac{(y_1-\mu)^2}{\sigma^2} + \frac{(y_2-\mu)^2}{\sigma^2} - \frac{2\rho(y_1-\mu)(y_2-\mu)}{\sigma^2} \right]\right)$$
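As a quick sanity check of this density formula, we can compare it against scipy's multivariate normal (the numerical values of $\mu$, $\sigma^2$, $\rho$ below are arbitrary test choices, not part of the problem):

```python
# Check the bivariate normal density formula against scipy's implementation.
import numpy as np
from scipy.stats import multivariate_normal

def bvn_density(y1, y2, mu, sigma2, rho):
    """Density of (Y1, Y2) bivariate normal with common mean mu and common variance sigma2."""
    q = ((y1 - mu)**2 + (y2 - mu)**2 - 2*rho*(y1 - mu)*(y2 - mu)) / sigma2
    return np.exp(-q / (2*(1 - rho**2))) / (2*np.pi*sigma2*np.sqrt(1 - rho**2))

# Arbitrary test values for the parameters and the evaluation point.
mu, sigma2, rho = 1.0, 2.0, 0.6
cov = sigma2 * np.array([[1.0, rho], [rho, 1.0]])
ref = multivariate_normal(mean=[mu, mu], cov=cov).pdf([0.3, -0.5])
val = bvn_density(0.3, -0.5, mu, sigma2, rho)
assert np.isclose(val, ref)
```

The covariance matrix $\sigma^2\begin{pmatrix}1&\rho\\\rho&1\end{pmatrix}$ has determinant $\sigma^4(1-\rho^2)$, which is where the normalizing constant $2\pi\sigma^2\sqrt{1-\rho^2}$ comes from.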
The joint likelihood for the random sample, with the common mean $\mu$ replaced by the regression mean $\alpha + \beta x_i$, is
$$L(\alpha,\beta,\sigma^2,\rho) = \prod_{i=1}^n f_{Y_1,Y_2}(Y_{1i},Y_{2i}) = (2\pi)^{-n}\sigma^{-2n}(1-\rho^2)^{-n/2} \times \\ \exp \left( -\frac{1}{2\sigma^2(1-\rho^2)} \left[ \sum_{i=1}^n (Y_{1i}-\alpha - \beta x_i)^2 + \sum_{i=1}^n (Y_{2i}-\alpha - \beta x_i)^2 - 2\rho\sum_{i=1}^n (Y_{1i}-\alpha - \beta x_i)(Y_{2i}-\alpha - \beta x_i) \right] \right)$$
Let us expand the bracketed quadratic form:
$$\begin{aligned}
&\sum_{i=1}^n (Y_{1i}-\alpha - \beta x_i)^2 + \sum_{i=1}^n (Y_{2i}-\alpha - \beta x_i)^2 - 2\rho\sum_{i=1}^n (Y_{1i}-\alpha - \beta x_i)(Y_{2i}-\alpha - \beta x_i) \\
&= \sum_{i=1}^n Y_{1i}^2 - 2\sum_{i=1}^n Y_{1i}(\alpha+\beta x_i) + \sum_{i=1}^n (\alpha+\beta x_i)^2 + \sum_{i=1}^n Y_{2i}^2 - 2\sum_{i=1}^n Y_{2i}(\alpha+\beta x_i) + \sum_{i=1}^n (\alpha+\beta x_i)^2 \\
&\quad - 2\rho \left[ \sum_{i=1}^n Y_{1i}Y_{2i} -\sum_{i=1}^n(\alpha+\beta x_i)Y_{1i} - \sum_{i=1}^n (\alpha+\beta x_i)Y_{2i} + \sum_{i=1}^n (\alpha+\beta x_i)^2 \right] \\
&= \sum_{i=1}^n \left(Y_{1i}^2+ Y_{2i}^2\right) + 2n\alpha^2 - 2\alpha\sum_{i=1}^n \left( Y_{1i} + Y_{2i}\right) - 2\beta \sum_{i=1}^n x_i(Y_{1i} + Y_{2i}) + 4\alpha\beta\sum_{i=1}^n x_i + 2\beta^2 \sum_{i=1}^n x_i^2 \\
&\quad - 2\rho \left[\sum_{i=1}^n Y_{1i}Y_{2i} -\alpha\sum_{i=1}^n \left( Y_{1i} + Y_{2i}\right) - \beta \sum_{i=1}^n x_i(Y_{1i} + Y_{2i}) + n\alpha^2 + 2\alpha\beta \sum_{i=1}^n x_i + \beta^2\sum_{i=1}^n x_i^2 \right]
\end{aligned}$$
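This expansion is easy to get wrong by hand, so it is worth verifying symbolically; a minimal sympy check for a small sample (n = 3 is an arbitrary choice):

```python
# Symbolically verify the quadratic-form expansion above for n = 3.
import sympy as sp

n = 3
a, b, rho = sp.symbols('alpha beta rho')
x  = sp.symbols('x:3')
Y1 = sp.symbols('Y1_:3')
Y2 = sp.symbols('Y2_:3')

# Residuals of the two responses from the common regression line.
r1 = [Y1[i] - a - b*x[i] for i in range(n)]
r2 = [Y2[i] - a - b*x[i] for i in range(n)]
lhs = (sum(e**2 for e in r1) + sum(e**2 for e in r2)
       - 2*rho*sum(r1[i]*r2[i] for i in range(n)))

# The collected form in terms of the data summaries.
S0  = sum(Y1[i]**2 + Y2[i]**2 for i in range(n))
S1  = sum(Y1[i] + Y2[i] for i in range(n))
S2  = sum(x[i]*(Y1[i] + Y2[i]) for i in range(n))
S3  = sum(Y1[i]*Y2[i] for i in range(n))
Sx, Sxx = sum(x), sum(xi**2 for xi in x)

rhs = (S0 + 2*n*a**2 - 2*a*S1 - 2*b*S2 + 4*a*b*Sx + 2*b**2*Sxx
       - 2*rho*(S3 - a*S1 - b*S2 + n*a**2 + 2*a*b*Sx + b**2*Sxx))
assert sp.expand(lhs - rhs) == 0
```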
Define $T_0(Y) = \sum_{i=1}^n \left(Y_{1i}^2 + Y_{2i}^2\right)$, $T_1(Y) = \sum_{i=1}^n \left( Y_{1i} + Y_{2i}\right)$, $T_2(Y) = \sum_{i=1}^n x_i(Y_{1i} + Y_{2i})$, $T_3(Y) = \sum_{i=1}^n Y_{1i}Y_{2i}$. (Note that $T_0$ is needed as well, because $\sigma^2$ and $\rho$ are unknown and multiply the sum of squares in the exponent.) By the Fisher–Neyman factorization theorem, $(T_0,T_1,T_2,T_3)$ is a sufficient statistic. Now consider a second random sample $\{(Z_{1i},Z_{2i})\}$ and form the likelihood ratio:
$$\frac{L(\alpha,\beta,\sigma^2,\rho\mid \mathbf{Y})}{L(\alpha,\beta,\sigma^2,\rho\mid \mathbf{Z})} = \exp \left( -\frac{1}{2\sigma^2(1-\rho^2)} \Big[ \big(T_0(Y) - T_0(Z)\big) - 2\alpha \big(T_1(Y) - T_1(Z)\big) - 2\beta\big(T_2(Y) - T_2(Z)\big) \\ - 2\rho \big[\big(T_3(Y)-T_3(Z)\big) - \alpha\big(T_1(Y)-T_1(Z)\big) - \beta\big(T_2(Y)-T_2(Z)\big)\big] \Big] \right)$$
This ratio is free of the parameters if and only if $T_0(Y)=T_0(Z)$, $T_1(Y) = T_1(Z)$, $T_2(Y)=T_2(Z)$, and $T_3(Y) = T_3(Z)$. So $(T_0,T_1,T_2,T_3)$ is a minimal sufficient statistic.
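A quick numerical illustration of the "if" direction: swapping $Y_{1i}$ and $Y_{2i}$ within each pair leaves all four statistics unchanged, so the likelihood ratio between the original and swapped samples must be constant in the parameters (here it is exactly 1, since the quadratic form is symmetric in the two responses). The simulated data and parameter values are arbitrary:

```python
# Two samples with identical (T0, T1, T2, T3) give a likelihood ratio
# that does not depend on the parameters.
import numpy as np

rng = np.random.default_rng(0)
n = 5
x = rng.normal(size=n)
Y1, Y2 = rng.normal(size=n), rng.normal(size=n)
Z1, Z2 = Y2.copy(), Y1.copy()   # swapped sample: same T0, T1, T2, T3

def loglik(U1, U2, alpha, beta, sigma2, rho):
    r1, r2 = U1 - alpha - beta*x, U2 - alpha - beta*x
    quad = np.sum(r1**2) + np.sum(r2**2) - 2*rho*np.sum(r1*r2)
    return (-n*np.log(2*np.pi) - n*np.log(sigma2)
            - n/2*np.log(1 - rho**2) - quad/(2*sigma2*(1 - rho**2)))

# The log-likelihood difference is 0 at every parameter value tried.
for alpha, beta, sigma2, rho in [(0, 1, 1, 0.3), (2, -1, 0.5, -0.7)]:
    diff = loglik(Y1, Y2, alpha, beta, sigma2, rho) - loglik(Z1, Z2, alpha, beta, sigma2, rho)
    assert np.isclose(diff, 0.0)
```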
Part (b)
The log-likelihood function is
$$\ell(\alpha,\beta,\sigma^2,\rho) = -n\log (2\pi) - n\log(\sigma^2) - \frac{n}{2}\log(1-\rho^2) \\ - \frac{1}{2\sigma^2(1-\rho^2)} \left[ \sum_{i=1}^n (Y_{1i}-\alpha - \beta x_i)^2 + \sum_{i=1}^n (Y_{2i}-\alpha - \beta x_i)^2 - 2\rho\sum_{i=1}^n (Y_{1i}-\alpha - \beta x_i)(Y_{2i}-\alpha - \beta x_i) \right]$$
$$\frac{\partial \ell}{\partial \beta} = - \frac{1}{2\sigma^2(1-\rho^2)} \left[ \sum_{i=1}^n -2x_i(Y_{1i}-\alpha - \beta x_i) + \sum_{i=1}^n -2x_i(Y_{2i}-\alpha - \beta x_i) + 2\rho\sum_{i=1}^n x_i\big[(Y_{1i}-\alpha - \beta x_i) + (Y_{2i}-\alpha - \beta x_i)\big] \right] \\ = \frac{1}{\sigma^2(1+\rho)} \sum_{i=1}^n x_i \left( Y_{1i} + Y_{2i} - 2\alpha - 2\beta x_i \right)$$
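Rather than solving the remaining score equations in closed form, one can check the MLE numerically by maximizing the log-likelihood above. A sketch using scipy; the true parameter values are arbitrary choices used only to simulate test data, and $\sigma^2 > 0$, $|\rho| < 1$ are enforced by a log/tanh reparameterization:

```python
# Numerically maximize the log-likelihood instead of solving the score
# equations by hand.  True parameters are arbitrary simulation choices.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 500
x = rng.uniform(-1, 1, size=n)
alpha0, beta0, sigma2_0, rho0 = 1.0, 2.0, 0.5, 0.4
cov = sigma2_0 * np.array([[1.0, rho0], [rho0, 1.0]])
eps = rng.multivariate_normal([0.0, 0.0], cov, size=n)
Y1 = alpha0 + beta0*x + eps[:, 0]
Y2 = alpha0 + beta0*x + eps[:, 1]

def negloglik(theta):
    alpha, beta, log_sigma2, z = theta
    sigma2, rho = np.exp(log_sigma2), np.tanh(z)   # keep sigma2 > 0, |rho| < 1
    r1, r2 = Y1 - alpha - beta*x, Y2 - alpha - beta*x
    quad = np.sum(r1**2) + np.sum(r2**2) - 2*rho*np.sum(r1*r2)
    return (n*np.log(2*np.pi) + n*np.log(sigma2)
            + n/2*np.log(1 - rho**2) + quad/(2*sigma2*(1 - rho**2)))

res = minimize(negloglik, x0=np.zeros(4), method='BFGS')
alpha_hat, beta_hat = res.x[:2]
sigma2_hat, rho_hat = np.exp(res.x[2]), np.tanh(res.x[3])
```

With n = 500 the estimates land close to the simulated truth, which is a useful cross-check against whatever closed-form answer the hand computation produces.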
At this point I honestly do not want to grind through the rest of the algebra, so I will stop here and defer to the posted answer. My advice: with only four hours for the exam, it is worth skipping problems like this one, where the idea is not hard but the computation is tedious; after all, you only need to answer five of the six problems.