ML — XGBoost vs. GBM: training on the Higgs Boson dataset (Kaggle competition) with XGBoost and sklearn's GBM for binary classification, with a performance comparison of the two models
Contents
Output
Design approach
Core code
Output
finish loading from csv
weight statistics: wpos=1522.37, wneg=904200, ratio=593.94
loading data end, start to boost trees
training GBM from sklearn
      Iter       Train Loss   Remaining Time
         1           1.2069           49.52s
         2           1.1437           43.51s
         3           1.0909           37.43s
         4           1.0471           30.96s
         5           1.0096           25.09s
         6           0.9775           19.90s
         7           0.9505           15.22s
         8           0.9264            9.94s
         9           0.9058            4.88s
        10           0.8878            0.00s
sklearn.GBM total costs: 50.88141202926636 seconds
training xgboost
[0] train-ams@0.15:3.69849
[1] train-ams@0.15:3.96339
[2] train-ams@0.15:4.26978
[3] train-ams@0.15:4.32619
[4] train-ams@0.15:4.41415
[5] train-ams@0.15:4.49395
[6] train-ams@0.15:4.64614
[7] train-ams@0.15:4.64058
[8] train-ams@0.15:4.73064
[9] train-ams@0.15:4.79447
XGBoost with 1 thread costs: 24.5108642578125 seconds
[0] train-ams@0.15:3.69849
[1] train-ams@0.15:3.96339
[2] train-ams@0.15:4.26978
[3] train-ams@0.15:4.32619
[4] train-ams@0.15:4.41415
[5] train-ams@0.15:4.49395
[6] train-ams@0.15:4.64614
[7] train-ams@0.15:4.64058
[8] train-ams@0.15:4.73064
[9] train-ams@0.15:4.79447
XGBoost with 2 thread costs: 11.449955940246582 seconds
[0] train-ams@0.15:3.69849
[1] train-ams@0.15:3.96339
[2] train-ams@0.15:4.26978
[3] train-ams@0.15:4.32619
[4] train-ams@0.15:4.41415
[5] train-ams@0.15:4.49395
[6] train-ams@0.15:4.64614
[7] train-ams@0.15:4.64058
[8] train-ams@0.15:4.73064
[9] train-ams@0.15:4.79447
XGBoost with 4 thread costs: 8.809934616088867 seconds
[0] train-ams@0.15:3.69849
[1] train-ams@0.15:3.96339
[2] train-ams@0.15:4.26978
[3] train-ams@0.15:4.32619
[4] train-ams@0.15:4.41415
[5] train-ams@0.15:4.49395
[6] train-ams@0.15:4.64614
[7] train-ams@0.15:4.64058
[8] train-ams@0.15:4.73064
[9] train-ams@0.15:4.79447
XGBoost with 8 thread costs: 7.875434875488281 seconds
XGBoost total costs: 52.64618968963623 seconds
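The train-ams@0.15 values in the logs are the Approximate Median Significance, the official Higgs Boson challenge metric, evaluated with the top 15% of ranked events treated as selected. The formula itself is small enough to write down directly (the 15% threshold logic lives in the ranking step, not here):

```python
import math

def ams(s, b, b_r=10.0):
    """Approximate Median Significance from the Higgs Boson challenge.

    s   -- sum of weights of signal events in the selected region
    b   -- sum of weights of background events in the selected region
    b_r -- constant regularization term (fixed at 10 in the competition)
    """
    return math.sqrt(2.0 * ((s + b + b_r) * math.log(1.0 + s / (b + b_r)) - s))

# More selected signal at fixed background raises the significance
print(ams(100.0, 1000.0))
print(ams(200.0, 1000.0))
```

Higher is better, which is why the logged train-ams@0.15 climbing from 3.70 to 4.79 over ten rounds indicates the booster is improving.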
Design approach
Core code
Summary
With the same 10 boosting rounds, sklearn's GBM took about 50.9 s, while XGBoost reproduced the identical train-ams@0.15 curve in 24.5 s on one thread and 7.9 s on eight, showing strong gains from multithreading.