R Learning - Beginner Notes 05
----Regression Diagnostics----
#Does the sample satisfy the normality assumption?
#Are there outliers causing large model errors?
#Is a linear model reasonable?
#Do the errors satisfy the assumptions of independence, equal variance, and normality?
#Is there multicollinearity?
----Normality Test----
#Normality test: shapiro.test()
#If p > 0.05, we cannot reject the assumption of normality
#Generate the data
>num=seq(10378001,10378100);num
>x1=round(runif(100,min=80,max=100));x1 #generated from a uniform distribution
>x2=round(rnorm(100,mean=80,sd=7));x2
>x3=round(rnorm(100,mean=83,sd=18));x3
>x3[which(x3>100)]=100;x3
>x=data.frame(num,x1,x2,x3);x
>shapiro.test(x$x1)
 Shapiro-Wilk normality test
data: x$x1
W = 0.94894, p-value = 0.0007066 #the p-value is very small (statistically significant), so we reject the null hypothesis: x1 was not generated from a normal distribution
>shapiro.test(x$x2)
 Shapiro-Wilk normality test
data: x$x2
W = 0.97989, p-value = 0.1302 #the p-value is not significant, so we cannot reject the null hypothesis; there is no evidence that x2 is not normally distributed
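#As an extra visual check (my own addition, not in the original notes), a normal Q-Q plot of x2 backs up the test result:
>qqnorm(x$x2)
>qqline(x$x2) #points lying close to the line support normality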
----Visual Check with a Scatter Plot----
#Xue Yi, Example 6.11
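#The textbook example is not reproduced here; below is a minimal sketch of the idea (my own addition), reusing the simulated data frame x from the section above:
>plot(x$x2,x$x3) #eyeball whether the relationship looks roughly linear
>abline(lm(x3~x2,data=x)) #overlay the least-squares line for reference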
----Residuals----
#Residuals are computed with residuals()
#Run a normality test on the residuals
#Residual plots
Xue Yi, Example 6.14: standardized residual plot
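#A minimal sketch of the residual checks listed above (my own addition, not the textbook's Example 6.14; R's built-in cars data set is used as stand-in data):
>fit=lm(dist~speed,data=cars) #fit a simple linear model
>res=residuals(fit) #ordinary residuals
>shapiro.test(res) #normality test on the residuals
>plot(fitted(fit),rstandard(fit)) #standardized residual plot
>abline(h=c(-2,2),lty=2) #standardized residuals beyond +/-2 deserve a closer look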
----Multicollinearity----
#What is multicollinearity?
#How does multicollinearity affect a regression model? -- matrix inversion becomes numerically unstable
#Detect multicollinearity from the eigenvalues
#The kappa() function
Xue Yi, Example 6.18
#Read in the data
>collinear<-data.frame(
   Y=c(10.006, 9.737, 15.087, 8.422, 8.625, 16.289,
     5.958, 9.313, 12.960, 5.541, 8.756, 10.937),
   X1=rep(c(8, 0, 2, 0), c(3, 3, 3, 3)),
   X2=rep(c(1, 0, 7, 0), c(3, 3, 3, 3)),
   X3=rep(c(1, 9, 0), c(3, 3, 6)),
   X4=rep(c(1, 0, 1, 10), c(1, 2, 6, 3)),
   X5=c(0.541, 0.130, 2.116, -2.397, -0.046, 0.365,
     1.996, 0.228, 1.38, -0.798, 0.257, 0.440),
   X6=c(-0.099, 0.070, 0.115, 0.252, 0.017, 1.504,
     -0.865, -0.055, 0.502, -0.399, 0.101, 0.432))
#Collinearity analysis of the predictors
>XX<-cor(collinear[2:7])
#Compute the condition number k
>kappa(XX,exact=TRUE)
[1] 2195.908 #k = 2195.908 > 1000, so there is severe multicollinearity
#To find which variables are collinear, compute the eigenvalues and eigenvectors of the correlation matrix
>eigen(XX)
$values
[1] 2.428787365 1.546152096 0.922077664 0.793984690 0.307892134 0.001106051
$vectors
 [,1] [,2] [,3] [,4] [,5] [,6]
[1,] -0.3907189 0.33968212 0.67980398 -0.07990398 0.2510370 -0.447679719
[2,] -0.4556030 0.05392140 -0.70012501 -0.05768633 0.3444655 -0.421140280
[3,] 0.4826405 0.45332584 -0.16077736 -0.19102517 -0.4536372 -0.541689124
[4,] 0.1876590 -0.73546592 0.13587323 0.27645223 -0.0152087 -0.573371872
[5,] -0.4977330 0.09713874 -0.03185053 0.56356440 -0.6512834 -0.006052127
[6,] 0.3519499 0.35476494 -0.04864335 0.74817535 0.4337463 -0.002166594
This gives:
 λmin = 0.001106, with corresponding eigenvector (up to sign) φ = (0.4476, 0.4211, 0.5417, 0.5734, 0.006052, 0.002167)^T
That is, for the standardized predictors,
 0.4476*X1 + 0.4211*X2 + 0.5417*X3 + 0.5734*X4 + 0.006052*X5 + 0.002167*X6 ≈ 0
Since the coefficients of X5 and X6 are close to 0, drop those terms:
 0.4476*X1 + 0.4211*X2 + 0.5417*X3 + 0.5734*X4 ≈ 0
So there exist constants c0, c1, c2, c3, c4 such that
 c1*X1 + c2*X2 + c3*X3 + c4*X4 ≈ c0
which shows that X1, X2, X3, and X4 are multicollinear.
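#A quick cross-check (my own note, not in the textbook): for the symmetric correlation matrix XX, kappa(XX,exact=TRUE) equals the ratio of the largest to the smallest eigenvalue, which reproduces the value above:
>ev=eigen(XX)$values
>max(ev)/min(ev) #about 2195.9, matching kappa(XX,exact=TRUE)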
----Generalized Linear Models----
Xue Yi, Example 6.19
#Goal: model the relationship between current intensity and whether a cow opens its mouth
#Difficulty: whether the cow opens its mouth is a 0-1 variable, not a continuous one, so an ordinary linear regression model cannot be fitted
#Reformulation: the probability that a cow opens its mouth is a continuous variable
>a=0:5 #the six current levels
>b=c(0,0.129,0.3,0.671,0.857,0.9) #observed proportions of cows opening their mouths
>plot(a,b) #the plot shows the S-shaped curve characteristic of a logistic regression model
#The model-fitting function for generalized linear models: glm()
 fm <- glm(formula, family = binomial(link = logit),data=data.frame)
#Enter the data as a data frame
>norell<-data.frame(x=0:5, n=rep(70,6), success=c(0,9,21,47,60,63))
#Build the response matrix (successes, failures)
>norell$Ymat<-cbind(norell$success, norell$n-norell$success)
#Fit the model
>glm.sol<-glm(Ymat~x, family=binomial, data=norell)
>summary(glm.sol)
Call:
glm(formula = Ymat ~ x, family = binomial, data = norell)
Deviance Residuals: 
 1 2 3 4 5 6 
-2.2507 0.3892 -0.1466 1.1080 0.3234 -1.6679 
Coefficients:
 Estimate Std. Error z value Pr(>|z|) 
(Intercept) -3.3010 0.3238 -10.20 <2e-16 ***
x 1.2459 0.1119 11.13 <2e-16 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
(Dispersion parameter for binomial family taken to be 1)
 Null deviance: 250.4866 on 5 degrees of freedom
Residual deviance: 9.3526 on 4 degrees of freedom
AIC: 34.093
Number of Fisher Scoring iterations: 4
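#A short follow-up sketch (my own addition, not part of the original note): use the fitted model to predict the probability of opening the mouth over a fine grid of current levels
>d=seq(0,5,by=0.1)
>pre=predict(glm.sol,data.frame(x=d),type="response")
>plot(d,pre,type="l") #fitted logistic curve
>points(norell$x,norell$success/norell$n) #observed proportions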
----Nonlinear Models----
#Example: sales volume x versus circulation-expense rate y
>x=c(1.5,2.8,4.5,7.5,10.5,13.5,15.1,16.5,19.5,22.5,24.5,26.5)
>y=c(7.0,5.5,4.6,3.6,2.9,2.7,2.5,2.4,2.2,2.1,1.9,1.8)
>plot(x,y) #the scatter plot is clearly nonlinear
----------------------------------
#Straight-line regression (the R-squared is unsatisfactory)
----------------------------------
>lm.1=lm(y~x)
>summary(lm.1)
Call:
lm(formula = y ~ x)
Residuals:
 Min 1Q Median 3Q Max 
-0.9179 -0.5537 -0.1628 0.3953 1.6519 
Coefficients:
 Estimate Std. Error t value Pr(>|t|) 
(Intercept) 5.60316 0.43474 12.889 1.49e-07 ***
x -0.17003 0.02719 -6.254 9.46e-05 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 0.7701 on 10 degrees of freedom
Multiple R-squared: 0.7964, Adjusted R-squared: 0.776 
F-statistic: 39.11 on 1 and 10 DF, p-value: 9.456e-05
---------------------------
#Logarithmic model, y = a + b*log(x)
----------------------------
>lm.log=lm(y~log(x))
>summary(lm.log)
Call:
lm(formula = y ~ log(x))
Residuals:
 Min 1Q Median 3Q Max 
-0.33291 -0.10133 -0.04693 0.16512 0.34844 
Coefficients:
 Estimate Std. Error t value Pr(>|t|) 
(Intercept) 7.3639 0.1688 43.64 9.60e-13 ***
log(x) -1.7568 0.0677 -25.95 1.66e-10 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 0.2064 on 10 degrees of freedom
Multiple R-squared: 0.9854, Adjusted R-squared: 0.9839 
F-statistic: 673.5 on 1 and 10 DF, p-value: 1.66e-10
>plot(x,y)
>lines(x,fitted(lm.log)) #overlay the fitted logarithmic curve
--------------------------
#Exponential model, y = a*exp(b*x)
--------------------------
>lm.exp=lm(log(y)~x)
>summary(lm.exp)
Call:
lm(formula = log(y) ~ x)
Residuals:
 Min 1Q Median 3Q Max 
-0.18246 -0.10664 -0.01670 0.08079 0.25946 
Coefficients:
 Estimate Std. Error t value Pr(>|t|) 
(Intercept) 1.759664 0.075101 23.43 4.54e-10 ***
x -0.048809 0.004697 -10.39 1.12e-06 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 0.133 on 10 degrees of freedom
Multiple R-squared: 0.9153, Adjusted R-squared: 0.9068 
F-statistic: 108 on 1 and 10 DF, p-value: 1.116e-06
>plot(x,y)
>lines(x,exp(fitted(lm.exp)))
---------------------------
#Power-function model, y = a*x^b
---------------------------
>lm.pow=lm(log(y)~log(x))
>summary(lm.pow)
Call:
lm(formula = log(y) ~ log(x))
Residuals:
 Min 1Q Median 3Q Max 
-0.054727 -0.020805 0.004548 0.024617 0.045896 
Coefficients:
 Estimate Std. Error t value Pr(>|t|) 
(Intercept) 2.19073 0.02951 74.23 4.81e-15 ***
log(x) -0.47243 0.01184 -39.90 2.34e-12 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 0.0361 on 10 degrees of freedom
Multiple R-squared: 0.9938, Adjusted R-squared: 0.9931 
F-statistic: 1592 on 1 and 10 DF, p-value: 2.337e-12
>plot(x,y)
>lines(x,exp(fitted(lm.pow)))
**Comparing the fits above, the power-function model gives the best fit (highest R-squared, 0.9938).
-----------------------
#Polynomial regression models
-----------------------
#Orthogonal polynomial regression; see the sketch below
#Drawback: a poorly chosen degree easily leads to overfitting or underfitting
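#A minimal sketch of an orthogonal polynomial fit with poly() (my own addition; degree 2 is an arbitrary choice), reusing x and y from above:
>lm.poly=lm(y~poly(x,degree=2))
>summary(lm.poly)
>plot(x,y)
>lines(x,fitted(lm.poly)) #compare with the fits above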
----Nonlinear Least Squares----
#The nls() function
Xue Yi, Example 6.23
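#A minimal sketch (my own addition, not the textbook's Example 6.23): fit the power model y = a*x^b directly by nonlinear least squares, with rough starting values taken from the log-log fit above (a = exp(2.19), b = -0.47):
>nls.sol=nls(y~a*x^b,start=list(a=exp(2.19),b=-0.47))
>summary(nls.sol)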
Reposted from: https://www.cnblogs.com/EleanorInHarbin/p/8268439.html