ML之PLiR之LARS: Using the LARS algorithm to solve an ElasticNet-type regression problem (real-valued score prediction)
Contents
Design approach
Output
1. LARS
2. 10-fold cross validation
Implementation code
Design approach
To be updated……
Output
['"alcohol"', '"volatile acidity"', '"sulphates"', '"total sulfur dioxide"', '"chlorides"', '"fixed acidity"', '"pH"', '"free sulfur dioxide"', '"citric acid"', '"residual sugar"', '"density"']
1. LARS
2. 10-fold cross validation
Minimum Mean Square Error 0.5873018933136459
Index of Minimum Mean Square Error 311
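The article does not show the cross-validation code that produced the two numbers above. Below is a minimal sketch of how a 10-fold cross validation over the incremental-step (LARS-style) coefficient path could yield them, assuming the same xNormalized, labelNormalized, nrows and ncols variables as the implementation code further down; the simple modulo fold split and helper names are illustrative, not the author's original code. The reported index is simply the step count at which the cross-validated MSE is lowest.

# Sketch of 10-fold CV over the incremental-step path.
# Assumes xNormalized, labelNormalized, nrows, ncols already hold the normalized wine data.
nxval = 10          # number of cross-validation folds
nSteps = 350        # same step budget as the full-data run
stepSize = 0.004

errors = [[] for _ in range(nSteps)]   # held-out squared errors collected per step

for ixval in range(nxval):
    # split row indices into a held-out test fold and a training set (illustrative modulo split)
    idxTest = [r for r in range(nrows) if r % nxval == ixval]
    idxTrain = [r for r in range(nrows) if r % nxval != ixval]
    xTrain = [xNormalized[r] for r in idxTrain]
    yTrain = [labelNormalized[r] for r in idxTrain]
    xTest = [xNormalized[r] for r in idxTest]
    yTest = [labelNormalized[r] for r in idxTest]
    nTrain = len(xTrain)

    beta = [0.0] * ncols
    for iStep in range(nSteps):
        # residuals on the training fold
        residuals = [yTrain[j] - sum(xTrain[j][k] * beta[k] for k in range(ncols))
                     for j in range(nTrain)]
        # correlation of each attribute column with the residuals
        corr = [sum(xTrain[j][k] * residuals[j] for j in range(nTrain)) / nTrain
                for k in range(ncols)]
        # take a small step on the attribute most correlated with the residuals
        iStar = max(range(ncols), key=lambda k: abs(corr[k]))
        beta[iStar] += stepSize * (1.0 if corr[iStar] > 0 else -1.0)
        # out-of-sample squared errors at this step
        for j in range(len(xTest)):
            pred = sum(xTest[j][k] * beta[k] for k in range(ncols))
            errors[iStep].append((yTest[j] - pred) ** 2)

mse = [sum(e) / len(e) for e in errors]
minMse = min(mse)
minIdx = mse.index(minMse)
print("Minimum Mean Square Error", minMse)
print("Index of Minimum Mean Square Error", minIdx)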
Implementation code
import matplotlib.pyplot as plot

# xNormalized, labelNormalized, names, nrows and ncols are assumed to already hold
# the normalized wine data (one way to build them is sketched after this block).

# initialize a vector of coefficients beta
beta = [0.0] * ncols

# initialize matrix of betas at each step
betaMat = []
betaMat.append(list(beta))

# number of steps to take
nSteps = 350
stepSize = 0.004
nzList = []

for i in range(nSteps):
    # calculate residuals
    residuals = [0.0] * nrows
    for j in range(nrows):
        labelsHat = sum([xNormalized[j][k] * beta[k] for k in range(ncols)])
        residuals[j] = labelNormalized[j] - labelsHat

    # calculate correlation between attribute columns from normalized wine and residual
    corr = [0.0] * ncols
    for j in range(ncols):
        corr[j] = sum([xNormalized[k][j] * residuals[k] for k in range(nrows)]) / nrows

    # find the attribute most correlated with the residuals
    iStar = 0
    corrStar = corr[0]
    for j in range(1, ncols):
        if abs(corrStar) < abs(corr[j]):
            iStar = j
            corrStar = corr[j]

    # take a small step in the direction of the sign of that correlation
    beta[iStar] += stepSize * corrStar / abs(corrStar)
    betaMat.append(list(beta))

    # record the order in which attributes acquire non-zero coefficients
    nzBeta = [index for index in range(ncols) if beta[index] != 0.0]
    for q in nzBeta:
        if q not in nzList:
            nzList.append(q)

nameList = [names[nzList[i]] for i in range(len(nzList))]
print(nameList)

# plot the coefficient curve of each attribute against the number of steps taken
for i in range(ncols):
    coefCurve = [betaMat[k][i] for k in range(nSteps)]
    xaxis = range(nSteps)
    plot.plot(xaxis, coefCurve)
plot.show()
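The block above assumes that xNormalized, labelNormalized, names, nrows and ncols already hold the normalized wine data. A minimal preprocessing sketch is given below, assuming the UCI red-wine quality file winequality-red.csv (semicolon-separated, header row, quality score in the last column) is available locally; the file name and layout are assumptions, not part of the original article.

# Sketch: load the wine data and normalize attributes and labels to zero mean, unit variance.
with open("winequality-red.csv") as f:          # illustrative local file name
    lines = [line.strip() for line in f if line.strip()]

names = lines[0].split(";")[:-1]                # attribute names, label column dropped
rows = [[float(v) for v in line.split(";")] for line in lines[1:]]
xList = [row[:-1] for row in rows]
labels = [row[-1] for row in rows]
nrows = len(xList)
ncols = len(xList[0])

# column-wise normalization of the attributes
means = [sum(xList[r][c] for r in range(nrows)) / nrows for c in range(ncols)]
sds = [(sum((xList[r][c] - means[c]) ** 2 for r in range(nrows)) / nrows) ** 0.5
       for c in range(ncols)]
xNormalized = [[(xList[r][c] - means[c]) / sds[c] for c in range(ncols)]
               for r in range(nrows)]

# normalization of the labels (wine quality scores)
meanLabel = sum(labels) / nrows
sdLabel = (sum((y - meanLabel) ** 2 for y in labels) / nrows) ** 0.5
labelNormalized = [(y - meanLabel) / sdLabel for y in labels]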