Ensemble Learning 01: XGBoost Parameters Explained with a Hands-On Example
This chapter is organized into the following parts:
I. Introduction to the XGBoost model parameters
II. Two ways to implement XGBoost
III. Grid search for the optimal XGBoost parameters
I. XGBoost parameters
The XGBoost authors divide the parameters into three categories; only the commonly used ones are covered here.
General parameters: control the overall behaviour of the booster.
Booster parameters: control the individual booster (tree/linear) at each step.
Learning task parameters: control the training objective and how performance is measured.
1 General parameters
1) booster [default=gbtree]
- Selects the type of model used at each iteration; two options:
gbtree: tree-based models
gblinear: linear models
2) silent [default=0]
- When set to 1, silent mode is enabled and no messages are printed.
- Usually keep the default 0, because the printed information helps us understand the model.
3) nthread [default = maximum number of available threads]
- Controls multithreading; set it to the number of CPU cores in the system.
- If you want to use all CPU cores, simply leave it unset and the algorithm will detect the number automatically.
4)num_feature [set automatically by xgboost, no need to be set by user]
- The number of feature dimensions used during boosting; set to the number of features.
- XGBoost sets this automatically; there is no need to set it by hand.
2 Booster parameters
Although two boosters are available, only the tree booster is described here: it performs far better than the linear booster, so the linear booster is rarely used.
1) eta [default=0.3]
- Analogous to the learning rate in GBM.
- Shrinking the weight of each step makes the model more robust.
- Typical values: 0.01-0.2.
2) min_child_weight [default=1]
- Determines the minimum sum of instance weights required in a leaf node.
- Similar to GBM's min_child_leaf, but not identical: in XGBoost it is the minimum sum of instance weights, whereas the GBM parameter is the minimum number of samples.
- Used to prevent overfitting: a larger value stops the model from learning patterns specific to a few local samples.
- If the value is too high, however, it causes underfitting; tune it with CV (see the xgb.cv sketch at the end of this booster-parameter list).
3) max_depth [default=6]
- Same as in GBM: the maximum depth of a tree.
- Also used to prevent overfitting: the larger max_depth is, the more specific and local the patterns the model can learn.
- Tune it with CV.
- Typical values: 3-10
4) max_leaf_nodes
- The maximum number of terminal nodes (leaves) in a tree.
- Can be used instead of max_depth: since the trees grown are binary, a tree of depth n produces at most 2^n leaves.
- If this parameter is defined, GBM ignores max_depth.
5) gamma [default=0]
- A node is split only when the split reduces the loss function; gamma specifies the minimum loss reduction required to make a split.
- The larger gamma is, the more conservative the algorithm becomes. Because its effect depends on the loss function, it should be tuned for the model at hand.
6) max_delta_step [default=0]
- Limits the maximum step size for each tree's weight estimate. A value of 0 means no constraint; a positive value makes the algorithm more conservative.
- Usually this parameter is not needed, but it can help logistic regression when the classes are extremely imbalanced.
- It is rarely used, although you may find further uses for it.
7) subsample [default=1]
- Identical to GBM's subsample parameter: the fraction of the training instances randomly sampled for each tree.
- Lowering it makes the algorithm more conservative and helps avoid overfitting, but setting it too small can cause underfitting.
- Typical values: 0.5-1
8) colsample_bytree [default=1]
- Similar to GBM's max_features: the fraction of columns (features) randomly sampled for each tree.
- Typical values: 0.5-1
9) colsample_bylevel [default=1]
- The fraction of columns sampled for each split, at each level of the tree.
- Rarely used, because subsample and colsample_bytree usually achieve the same effect, but it is worth exploring if you are interested.
10) lambda [default=1]
- L2 regularization term on the weights (analogous to Ridge regression).
- Controls the regularization part of XGBoost.
11) alpha [default=0]
- L1 regularization term on the weights (analogous to Lasso regression).
- Useful in very high-dimensional settings, where it can make the algorithm faster.
12) scale_pos_weight [default=1]
- When the classes are highly imbalanced, setting this to a value greater than 0 helps the model handle the imbalance and converge faster.
13) Parameter for the Linear Booster
lambda_bias
- L2 regularization on the bias term; default 0. (There is no L1 regularization on the bias, because the bias is not important for L1.)
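Several of the tree-booster parameters above (min_child_weight, max_depth, gamma) are flagged as "tune with CV". Below is a minimal sketch of such a check with xgboost's built-in xgb.cv helper; the synthetic data and the concrete parameter values are illustrative assumptions, not part of the original text.

import numpy as np
import xgboost as xgb

# toy binary-classification data standing in for a real training set
X = np.random.rand(500, 10)
y = np.random.randint(0, 2, 500)
dtrain = xgb.DMatrix(X, label=y)

params = {'objective': 'binary:logistic', 'eta': 0.1,
          'max_depth': 5, 'min_child_weight': 3, 'gamma': 0.5}

# 5-fold CV: returns per-round train/test logloss, so over- or underfitting
# caused by the candidate parameter values shows up directly
cv_results = xgb.cv(params, dtrain, num_boost_round=200, nfold=5,
                    metrics='logloss', early_stopping_rounds=20, seed=0)
print(cv_results.tail())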
3 Learning task parameters
These parameters control the optimization objective and the metric evaluated at each step.
1) objective [default=reg:linear]
- Defines the loss function to be minimized, i.e. the learning task and its objective. The available objective functions include:
- “reg:linear” – linear regression.
- “reg:logistic” – logistic regression.
- “binary:logistic” – logistic regression for binary classification; the output is a probability.
- “binary:logitraw” – logistic regression for binary classification; the output is the raw score w^T x before the logistic transformation.
- “count:poisson” – Poisson regression for count data; the output is the mean of the Poisson distribution.
- In Poisson regression, max_delta_step defaults to 0.7 (used to safeguard optimization).
- “multi:softmax” – multiclass classification with the softmax objective; requires num_class (the number of classes) to be set.
- “multi:softprob” – same as softmax, but outputs a vector of ndata * nclass probabilities, which can be reshaped into an ndata-by-nclass matrix; each row gives the probability of that sample belonging to each class.
- “rank:pairwise” – set XGBoost to do a ranking task by minimizing the pairwise loss.
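A small sketch to make the multi:softmax / multi:softprob distinction concrete; the synthetic three-class data and the parameter values are made up for illustration.

import numpy as np
import xgboost as xgb

X = np.random.rand(300, 4)
y = np.random.randint(0, 3, 300)              # 3 classes
dtrain = xgb.DMatrix(X, label=y)

params = {'objective': 'multi:softprob', 'num_class': 3, 'eta': 0.3}
bst = xgb.train(params, dtrain, num_boost_round=10)

pred = bst.predict(dtrain)
prob = np.asarray(pred).reshape(len(y), 3)    # ndata rows, nclass columns of probabilities
labels = prob.argmax(axis=1)                  # the hard labels that multi:softmax would return
print(prob[:3], labels[:3])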
2) eval_metric [default depends on the objective]
- The evaluation metric for validation data.
- The default is rmse for regression and error for classification.
- Typical values:
- rmse: root mean squared error
- mae: mean absolute error
- logloss: negative log-likelihood
- error: binary classification error rate (threshold 0.5)
- merror: multiclass classification error rate
- mlogloss: multiclass logloss
- auc: area under the ROC curve
3) seed (default 0)
- The random seed.
- Set it to make results reproducible, and when tuning parameters.
- If you are more comfortable with scikit-learn-style parameters, the XGBoost Python package also provides an sklearn-compatible interface, XGBClassifier.
4) sklearn parameter mapping
The sklearn interface uses sklearn-style parameter names; the correspondence is:
- 1. eta -> learning_rate
- 2. lambda -> reg_lambda
- 3. alpha -> reg_alpha
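A brief hedged sketch of the same configuration written both ways; the concrete values are arbitrary illustrations.

import xgboost as xgb
from xgboost.sklearn import XGBClassifier

# sklearn-style names
clf = XGBClassifier(learning_rate=0.1, reg_lambda=1.0, reg_alpha=0.0,
                    max_depth=6, n_estimators=100)

# equivalent native-API names
params = {'eta': 0.1, 'lambda': 1.0, 'alpha': 0.0,
          'max_depth': 6, 'objective': 'binary:logistic'}
# n_estimators has no key in the params dict; with the native API it corresponds to
# num_boost_round in xgb.train(params, dtrain, num_boost_round=100)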
4. Console parameters
The following parameters are only used in the console (command line) version of xgboost.
- use_buffer [default=1]
Whether to create a binary buffer file for the input data; the buffer speeds up computation.
- num_round
The number of boosting iterations.
- data
Path to the training data.
- test:data
Path to the test data.
- save_period [default=0]
Save the model every save_period iterations; for example, save_period=10 makes XGBoost save an intermediate model every 10 iterations, while 0 means no intermediate models are saved during training.
- task [default=train] options: train, pred, eval, dump
train: train the model
pred: predict on the test data
eval: evaluate using metrics defined via eval[name]=filename
dump: dump the learned model to a text file
- model_in [default=NULL]
Path to the input model, used by pred, eval and dump; if given during training, XGBoost continues training from this model.
- model_out [default=NULL]
Path where the model is saved after training; if not specified, the output is named like 0003.model, where 0003 refers to the training round that produced it.
- model_dir [default=models]
Directory where the output models are saved.
- fmap
Feature map, used for dumping the model.
- name_dump [default=dump.txt]
Name of the model dump file.
- name_pred [default=pred.txt]
Name of the prediction results file.
- pred_margin [default=0]
Output the prediction margin instead of the transformed probability.
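A hedged sketch of what a console-mode run could look like, using only the keys listed above; the file names, values and the exact invocation are assumptions based on the xgboost CLI demos rather than something taken from this text.

# train.conf  (illustrative console configuration)
booster = gbtree
objective = binary:logistic
eta = 0.1
max_depth = 6
num_round = 100
save_period = 10
data = "train.libsvm"
eval[test] = "test.libsvm"
test:data = "test.libsvm"
model_dir = "models"

# then, roughly:  ./xgboost train.conf                         (task defaults to train)
#                 ./xgboost train.conf task=pred model_in=0100.model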
II. Implementing XGBoost
This section uses a coupon-recommendation data set to implement XGBoost both through its sklearn wrapper and through the native xgboost API.
1. Import the required packages
import pandas as pd, numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn import metrics
import catboost as cb
import xgboost as xgb
from xgboost.sklearn import XGBClassifier
import os
import joblib
from sklearn.preprocessing import LabelEncoder
from collections import defaultdict

data = pd.read_excel('car_coupon.xlsx')
data.head(5)

| 11263 | No Urgent Place | Friend(s) | 0 | 0 | 0 | 1 | Male | 55 | Widowed | ... | 0 | 0 | 1 | 1 | 1 | Sunny | 14 | Coffee House | 24 | 1 |
| 20136 | Work | Alone | 1 | 0 | 1 | 0 | Female | 26 | Married partner | ... | 0 | 0 | 3 | 3 | 3 | Sunny | 7 | Bar | 24 | 0 |
| 14763 | Work | Alone | 1 | 0 | 0 | 1 | Female | 55 | Single | ... | 0 | 0 | 1 | 1 | 1 | Sunny | 7 | Coffee House | 24 | 0 |
| 12612 | No Urgent Place | Kid(s) | 1 | 0 | 0 | 1 | Female | 41 | Married partner | ... | 0 | 3 | 3 | 3 | 3 | Sunny | 10 | Carry out & Take away | 2 | 0 |
| 17850 | No Urgent Place | Partner | 1 | 0 | 0 | 1 | Female | 31 | Married partner | ... | 1 | 1 | 10 | 10 | 10 | Snowy | 14 | Coffee House | 2 | 0 |
5 rows × 23 columns
2. Data preprocessing
- Encode the categorical columns (a sketch of one possible encoding follows the table below).
| 11263 | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 55 | 4 | ... | 0 | 0 | 1 | 1 | 1 | 2 | 14 | 2 | 24 | 1 |
| 20136 | 2 | 0 | 1 | 0 | 1 | 0 | 0 | 26 | 1 | ... | 0 | 0 | 3 | 3 | 3 | 2 | 7 | 0 | 24 | 0 |
| 14763 | 2 | 0 | 1 | 0 | 0 | 1 | 0 | 55 | 2 | ... | 0 | 0 | 1 | 1 | 1 | 2 | 7 | 2 | 24 | 0 |
| 12612 | 1 | 2 | 1 | 0 | 0 | 1 | 0 | 41 | 1 | ... | 0 | 3 | 3 | 3 | 3 | 2 | 10 | 1 | 2 | 0 |
| 17850 | 1 | 3 | 1 | 0 | 0 | 1 | 0 | 31 | 1 | ... | 1 | 1 | 10 | 10 | 10 | 1 | 14 | 2 | 2 | 0 |
5 rows × 23 columns
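The encoding cell itself is not included in the text; the following is a minimal sketch of one common approach that is consistent with the LabelEncoder and defaultdict imports above. Which columns are categorical (here: the object-dtype columns) is an assumption.

from collections import defaultdict
from sklearn.preprocessing import LabelEncoder

# one LabelEncoder per categorical column, created on demand
encoders = defaultdict(LabelEncoder)

cat_cols = data.select_dtypes(include='object').columns
data[cat_cols] = data[cat_cols].apply(
    lambda col: encoders[col.name].fit_transform(col))
data.head(5)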
- Split the data into training and test sets.
- Pay attention to the formats of data, train, test, y_train and y_test.
- Write the evaluation function (a sketch of both steps is given right after this list).
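Neither the split nor the evaluation helper is shown in the text. Below is a minimal sketch that matches how model_eval2 is called later and the metrics it prints; the label column name 'Y', the 70/30 split and the use of predict_proba for the AUC are assumptions.

from sklearn.model_selection import train_test_split
from sklearn import metrics

X = data.drop(columns=['Y'])      # 'Y' assumed to be the coupon-acceptance label
y = data['Y']
train, test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)
# train/test are DataFrames and y_train/y_test are Series, hence the .values calls later

def model_eval2(model, train_x, test_x):
    """Print train/test ROC-AUC, accuracy, precision, recall and F1."""
    prob_train, prob_test = model.predict_proba(train_x)[:, 1], model.predict_proba(test_x)[:, 1]
    pred_train, pred_test = model.predict(train_x), model.predict(test_x)
    print('train_roc_auc_score:', metrics.roc_auc_score(y_train, prob_train))
    print('test_roc_auc_score:', metrics.roc_auc_score(y_test, prob_test))
    print('train_accuracy_score:', metrics.accuracy_score(y_train, pred_train))
    print('test_accuracy_score:', metrics.accuracy_score(y_test, pred_test))
    print('train_precision_score:', metrics.precision_score(y_train, pred_train))
    print('test_precision_score:', metrics.precision_score(y_test, pred_test))
    print('train_recall_score:', metrics.recall_score(y_train, pred_train))
    print('test_recall_score:', metrics.recall_score(y_test, pred_test))
    print('train_f1_score:', metrics.f1_score(y_train, pred_train))
    print('test_f1_score:', metrics.f1_score(y_test, pred_test))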
3. XGBoost through the sklearn interface
step01 - fit the model
from xgboost.sklearn import XGBClassifier
xgboost_model = XGBClassifier()
eval_set = [(test.values, y_test.values)]
# fit the model; logloss is the usual loss for classification
xgboost_model.fit(train.values, y_train.values,
                  early_stopping_rounds=300,
                  eval_metric="logloss",
                  eval_set=eval_set,
                  verbose=False)

D:\dprograme\Anaconda3\lib\site-packages\xgboost\sklearn.py:793: UserWarning: `eval_metric` in `fit` method is deprecated for better compatibility with scikit-learn, use `eval_metric` in constructor or `set_params` instead.
D:\dprograme\Anaconda3\lib\site-packages\xgboost\sklearn.py:793: UserWarning: `early_stopping_rounds` in `fit` method is deprecated for better compatibility with scikit-learn, use `early_stopping_rounds` in constructor or `set_params` instead.

XGBClassifier(base_score=0.5, booster='gbtree', callbacks=None,
              colsample_bylevel=1, colsample_bynode=1, colsample_bytree=1,
              early_stopping_rounds=None, enable_categorical=False,
              eval_metric=None, gamma=0, gpu_id=-1, grow_policy='depthwise',
              importance_type=None, interaction_constraints='',
              learning_rate=0.300000012, max_bin=256, max_cat_to_onehot=4,
              max_delta_step=0, max_depth=6, max_leaves=0, min_child_weight=1,
              missing=nan, monotone_constraints='()', n_estimators=100,
              n_jobs=0, num_parallel_tree=1, predictor='auto', random_state=0,
              reg_alpha=0, reg_lambda=1, ...)

step02 - evaluate the model
model_eval2(xgboost_model, train.values, test.values)

train_roc_auc_score: 0.890295988831706
test_roc_auc_score: 0.7178983466569767
train_accuracy_score: 0.8007142857142857
test_accuracy_score: 0.6683333333333333
train_precision_score: 0.7965116279069767
test_precision_score: 0.704225352112676
train_recall_score: 0.8681875792141952
test_recall_score: 0.7267441860465116
train_f1_score: 0.8308065494238933
test_f1_score: 0.7153075822603719

step03 - predict with the model
- xgboost_model.predict returns integer predictions (0 or 1).
- xgboost_model.predict_proba returns floating-point probabilities between 0 and 1.
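A short illustration of the two prediction interfaces, continuing with the fitted xgboost_model and the test set from above.

pred_label = xgboost_model.predict(test.values)        # int array of 0/1 class labels
pred_proba = xgboost_model.predict_proba(test.values)  # float array, shape (n_samples, 2)
print(pred_label[:5])
print(pred_proba[:5, 1])                               # probability of the positive class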
step04 - save and reload the model
joblib.dump(xgboost_model, r'D:\Ensemble_Learning\xgboostinfo\xgboostsklearnsingle.model')
load_model = joblib.load(r'D:\Ensemble_Learning\xgboostinfo\xgboostsklearnsingle.model')
load_model.predict(test.values)

array([0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 1, 1, 0, 1, ...])

Points to note
- xgboost_model.fit above is passed train.values and y_train.values, i.e. numpy.ndarray inputs.
- xgboost_model.predict and xgboost_model.predict_proba are likewise passed numpy.ndarray inputs.
4. Using the native xgboost API
step-01 build the parameter dict
params = {'alpha': 0.09,
          'booster': 'gbtree',
          'colsample_bylevel': 0.4,
          'colsample_bytree': 0.7,
          'eval_metric': 'logloss',
          'gamma': 0.85,
          'learning_rate': 0.1,
          'max_depth': 7,
          'min_child_weight': 20,
          'n_estimator': 40,   # not a native-API key; it triggers the "might not be used" warning below
          'objective': 'binary:logistic',
          'reg_lambda': 0.1,
          'seed': 1,
          'subsample': 0.6}

step-02 prepare the data
dtrain = xgb.DMatrix(train, label=y_train, feature_names=list(train.columns))
dtest = xgb.DMatrix(test)
validation = xgb.DMatrix(test, y_test)
watchlist = [(validation, 'train')]   # the evaluation set here is the held-out data but is labelled 'train', so the log below reports it as train-logloss

step-03 fit the model
model = xgb.train(params,
                  dtrain,
                  num_boost_round=2000,   # number of boosting rounds, i.e. the number of weak learners
                  evals=watchlist)

[21:06:30] WARNING: C:/Users/administrator/workspace/xgboost-win64_release_1.6.0/src/learner.cc:627:
Parameters: { "n_estimator" } might not be used.
This could be a false alarm, with some parameters getting used by language bindings but
then being mistakenly passed down to XGBoost core, or some parameter actually being used
but getting flagged wrongly here. Please open an issue if you find any such cases.

[0] train-logloss:0.68835
[1] train-logloss:0.68565
[2] train-logloss:0.68298
[3] train-logloss:0.67752
[4] train-logloss:0.67465
[5] train-logloss:0.67235
...
[280] train-logloss:0.59425
[281] train-logloss:0.59337
...
[1850] train-logloss:0.63175
[1851] train-logloss:0.63204
...
train-logloss:0.63210 [1853] train-logloss:0.63177 [1854] train-logloss:0.63243 [1855] train-logloss:0.63226 [1856] train-logloss:0.63271 [1857] train-logloss:0.63206 [1858] train-logloss:0.63206 [1859] train-logloss:0.63191 [1860] train-logloss:0.63220 [1861] train-logloss:0.63236 [1862] train-logloss:0.63214 [1863] train-logloss:0.63248 [1864] train-logloss:0.63216 [1865] train-logloss:0.63245 [1866] train-logloss:0.63247 [1867] train-logloss:0.63262 [1868] train-logloss:0.63261 [1869] train-logloss:0.63266 [1870] train-logloss:0.63278 [1871] train-logloss:0.63256 [1872] train-logloss:0.63322 [1873] train-logloss:0.63320 [1874] train-logloss:0.63290 [1875] train-logloss:0.63291 [1876] train-logloss:0.63290 [1877] train-logloss:0.63275 [1878] train-logloss:0.63277 [1879] train-logloss:0.63280 [1880] train-logloss:0.63254 [1881] train-logloss:0.63225 [1882] train-logloss:0.63286 [1883] train-logloss:0.63271 [1884] train-logloss:0.63270 [1885] train-logloss:0.63268 [1886] train-logloss:0.63268 [1887] train-logloss:0.63276 [1888] train-logloss:0.63250 [1889] train-logloss:0.63276 [1890] train-logloss:0.63270 [1891] train-logloss:0.63247 [1892] train-logloss:0.63222 [1893] train-logloss:0.63252 [1894] train-logloss:0.63280 [1895] train-logloss:0.63284 [1896] train-logloss:0.63253 [1897] train-logloss:0.63241 [1898] train-logloss:0.63218 [1899] train-logloss:0.63219 [1900] train-logloss:0.63192 [1901] train-logloss:0.63223 [1902] train-logloss:0.63201 [1903] train-logloss:0.63173 [1904] train-logloss:0.63202 [1905] train-logloss:0.63222 [1906] train-logloss:0.63181 [1907] train-logloss:0.63178 [1908] train-logloss:0.63213 [1909] train-logloss:0.63178 [1910] train-logloss:0.63225 [1911] train-logloss:0.63274 [1912] train-logloss:0.63294 [1913] train-logloss:0.63338 [1914] train-logloss:0.63338 [1915] train-logloss:0.63338 [1916] train-logloss:0.63341 [1917] train-logloss:0.63340 [1918] train-logloss:0.63349 [1919] train-logloss:0.63310 [1920] train-logloss:0.63315 [1921] train-logloss:0.63328 [1922] train-logloss:0.63319 [1923] train-logloss:0.63287 [1924] train-logloss:0.63251 [1925] train-logloss:0.63272 [1926] train-logloss:0.63240 [1927] train-logloss:0.63280 [1928] train-logloss:0.63241 [1929] train-logloss:0.63241 [1930] train-logloss:0.63241 [1931] train-logloss:0.63229 [1932] train-logloss:0.63205 [1933] train-logloss:0.63170 [1934] train-logloss:0.63269 [1935] train-logloss:0.63312 [1936] train-logloss:0.63253 [1937] train-logloss:0.63222 [1938] train-logloss:0.63223 [1939] train-logloss:0.63224 [1940] train-logloss:0.63252 [1941] train-logloss:0.63260 [1942] train-logloss:0.63329 [1943] train-logloss:0.63331 [1944] train-logloss:0.63432 [1945] train-logloss:0.63457 [1946] train-logloss:0.63454 [1947] train-logloss:0.63421 [1948] train-logloss:0.63418 [1949] train-logloss:0.63412 [1950] train-logloss:0.63373 [1951] train-logloss:0.63307 [1952] train-logloss:0.63306 [1953] train-logloss:0.63307 [1954] train-logloss:0.63296 [1955] train-logloss:0.63289 [1956] train-logloss:0.63286 [1957] train-logloss:0.63286 [1958] train-logloss:0.63286 [1959] train-logloss:0.63268 [1960] train-logloss:0.63289 [1961] train-logloss:0.63299 [1962] train-logloss:0.63288 [1963] train-logloss:0.63288 [1964] train-logloss:0.63280 [1965] train-logloss:0.63254 [1966] train-logloss:0.63272 [1967] train-logloss:0.63287 [1968] train-logloss:0.63327 [1969] train-logloss:0.63324 [1970] train-logloss:0.63324 [1971] train-logloss:0.63336 [1972] train-logloss:0.63382 [1973] train-logloss:0.63386 [1974] 
train-logloss:0.63427 [1975] train-logloss:0.63428 [1976] train-logloss:0.63462 [1977] train-logloss:0.63443 [1978] train-logloss:0.63445 [1979] train-logloss:0.63453 [1980] train-logloss:0.63466 [1981] train-logloss:0.63527 [1982] train-logloss:0.63546 [1983] train-logloss:0.63513 [1984] train-logloss:0.63484 [1985] train-logloss:0.63482 [1986] train-logloss:0.63484 [1987] train-logloss:0.63513 [1988] train-logloss:0.63536 [1989] train-logloss:0.63516 [1990] train-logloss:0.63468 [1991] train-logloss:0.63452 [1992] train-logloss:0.63448 [1993] train-logloss:0.63460 [1994] train-logloss:0.63451 [1995] train-logloss:0.63422 [1996] train-logloss:0.63409 [1997] train-logloss:0.63412 [1998] train-logloss:0.63406 [1999] train-logloss:0.63402step-04 用模型預(yù)測(cè)
ytrain = model.predict(dtrain)
Note:
- Here model.predict() returns probability scores, not 0/1 class labels.
- The probabilities are converted to 0/1 below (see the sketch that follows).
step-05 評(píng)價(jià)模型效果
print('train_roc_auc_score:', metrics.roc_auc_score(y_train, ytrain))
print('test_roc_auc_score:', metrics.roc_auc_score(y_test, ytest))
print('train_accuracy_score:', metrics.accuracy_score(y_train, ytrain_class))
print('test_accuracy_score:', metrics.accuracy_score(y_test, y_pred))
step-06 Save the model and reload it
joblib.dump(model, r'D:\Ensemble_Learning\xgboostinfo\xgboost01.model')
load_model = joblib.load(r'D:\Ensemble_Learning\xgboostinfo\xgboost01.model')
ytest = load_model.predict(dtest)
ytest[0:5]
array([0.265046, 0.39359182, 0.82298654, 0.07664716, 0.28468448], dtype=float32)
三. Grid search for the best xgboost parameters
step-01 Configure the parameter list
from sklearn.model_selection import GridSearchCV
## define the value ranges for each parameter
learning_rate = [0.1]        # 0.15, 0.11
subsample = [0.65]           # 0.7, 0.8
colsample_bytree = [0.6]     # 0.7, 0.5
colsample_bylevel = [0.7]    # 0.8
colsample_bynode = [0.7]     # 0.8
max_depth = [6]              # 7
n_estimators = [1000]        # 900
gamma = [0, 0.1]
reg_alpha = [1, 2]
reg_lambda = [2, 3]
min_child_weight = [30, 50]
max_bin = [12, 16]
base_score = [0.4, 0.5, 0.6]
parameters = {
    'learning_rate': learning_rate,
    'subsample': subsample,
    'colsample_bytree': colsample_bytree,
    'colsample_bylevel': colsample_bylevel,
    'colsample_bynode': colsample_bynode,
    'max_depth': max_depth,
    'n_estimators': n_estimators,
    'gamma': gamma,
    'reg_alpha': reg_alpha,
    'reg_lambda': reg_lambda,
    'min_child_weight': min_child_weight,
    'max_bin': max_bin,
    'base_score': base_score,
}
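Only gamma, reg_alpha, reg_lambda, min_child_weight, max_bin and base_score get more than one value, so the grid contains 2*2*2*2*2*3 = 96 candidate settings, which is exactly the "96 candidates, totalling 192 fits" message printed by GridSearchCV below (with cv=2). A quick sanity check on the grid defined above, as a sketch:

from sklearn.model_selection import ParameterGrid
print(len(ParameterGrid(parameters)))   # 96 combinations; with cv=2 this means 192 model fits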
step-02 Choose the model to optimize
model = XGBClassifier(eval_metric="logloss")
step-03 Run the grid search and fit the model
clf = GridSearchCV(model, parameters, cv=2, scoring='accuracy', verbose=1, n_jobs=-1)
clf = clf.fit(train.values, y_train.values, eval_set=eval_set)
Fitting 2 folds for each of 96 candidates, totalling 192 fits
(per-round evaluation output omitted: validation_0-logloss falls from 0.68822 at round 0 to a minimum of about 0.603 around round 140, then oscillates between 0.60 and 0.61 up to round 999, ending at 0.60554)
step-04 Confirm the best parameters
print(clf.best_params_)
{'base_score': 0.5, 'colsample_bylevel': 0.7, 'colsample_bynode': 0.7, 'colsample_bytree': 0.6, 'gamma': 0, 'learning_rate': 0.1, 'max_bin': 12, 'max_depth': 6, 'min_child_weight': 30, 'n_estimators': 1000, 'reg_alpha': 2, 'reg_lambda': 3, 'subsample': 0.65}
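Besides best_params_, it can be worth checking the cross-validated score of the winning candidate and the top rows of the full search results. A small sketch, assuming pandas is available in the session:

import pandas as pd
print(clf.best_score_)   # mean cross-validated accuracy of the best candidate
cv_results = pd.DataFrame(clf.cv_results_)
print(cv_results.sort_values('rank_test_score')[['params', 'mean_test_score', 'std_test_score']].head())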
step-05 Select the best model
best_model = clf.best_estimator_
step-06 Evaluate the best model
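The evaluation below uses model_eval2, a helper presumably defined earlier in the notebook and not reproduced in this section. A rough reconstruction that is consistent with the metrics it prints (its exact signature and its use of the global y_train / y_test are assumptions):

from sklearn import metrics

def model_eval2(model, X_train, X_test):
    # assumed reconstruction: ROC-AUC from class-1 probabilities, the other metrics from hard predictions
    p_tr, p_te = model.predict_proba(X_train)[:, 1], model.predict_proba(X_test)[:, 1]
    c_tr, c_te = model.predict(X_train), model.predict(X_test)
    print('train_roc_auc_score:', metrics.roc_auc_score(y_train, p_tr))
    print('test_roc_auc_score:', metrics.roc_auc_score(y_test, p_te))
    print('train_accuracy_score:', metrics.accuracy_score(y_train, c_tr))
    print('test_accuracy_score:', metrics.accuracy_score(y_test, c_te))
    print('train_precision_score:', metrics.precision_score(y_train, c_tr))
    print('test_precision_score:', metrics.precision_score(y_test, c_te))
    print('train_recall_score:', metrics.recall_score(y_train, c_tr))
    print('test_recall_score:', metrics.recall_score(y_test, c_te))
    print('train_f1_score:', metrics.f1_score(y_train, c_tr))
    print('test_f1_score:', metrics.f1_score(y_test, c_te))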
model_eval2(best_model, train.values, test.values)
train_roc_auc_score: 0.8766644056264636
test_roc_auc_score: 0.7278343023255814
train_accuracy_score: 0.8
test_accuracy_score: 0.6833333333333333
train_precision_score: 0.8069963811821471
test_precision_score: 0.7162921348314607
train_recall_score: 0.8479087452471483
test_recall_score: 0.7412790697674418
train_f1_score: 0.8269468479604452
test_f1_score: 0.7285714285714285
step-07 Save and reload the model
joblib.dump(best_model, r'D:\Ensemble_Learning\xgboostinfo\xgboostgridbest.model')
best_model = joblib.load(r'D:\Ensemble_Learning\xgboostinfo\xgboostgridbest.model')
model_eval2(best_model, train.values, test.values)
train_roc_auc_score: 0.8766644056264636
test_roc_auc_score: 0.7278343023255814
train_accuracy_score: 0.8
test_accuracy_score: 0.6833333333333333
train_precision_score: 0.8069963811821471
test_precision_score: 0.7162921348314607
train_recall_score: 0.8479087452471483
test_recall_score: 0.7412790697674418
train_f1_score: 0.8269468479604452
test_f1_score: 0.7285714285714285
Summary
The reloaded model reproduces exactly the same scores as the freshly selected best_estimator_, confirming that the grid-searched model was persisted and restored correctly.