Condensed Notes on Tianqi Chen's "Introduction to Boosted Trees" PPT
I went through Tianqi Chen's Boosted Tree slides in some depth and took brief notes; this is essentially a condensed version of the PPT:
the framework is laid out, with the important figures and formulas captured.
Brief as it is, it is enough to study how a master thinks through the problem.
Review of key concepts of supervised learning
- Elements in Supervised Learning
- Model
- Parameters
- Objective function
- Putting known knowledge into context
- Objective and Bias Variance Trade-off
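As a reading aid for the bias-variance item above (this is the standard formulation the slides use, not a transcription of them), the objective is always training loss plus regularization: the loss term drives bias down, while the regularization term keeps the model simple and the variance in check.

```latex
\mathrm{Obj}(\Theta) = L(\Theta) + \Omega(\Theta)
                     = \sum_{i=1}^{n} l\bigl(y_i, \hat{y}_i\bigr) + \Omega(\Theta)
```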
Regression Tree and Ensemble (What are we Learning)
- Regression Tree (CART)
- Regression Tree Ensemble
- Tree Ensemble methods (some advantages of tree-based ensemble methods)
- Put into context: Model and Parameters (model: an additive ensemble of trees; parameters: the trees/functions themselves; see the formula sketch after this list)
- Learning a tree on single variable (example: how much I like romantic music as a function of time)
- Learning a step function
- Learning step function (visually)
- Coming back: Objective for Tree Ensemble
- Objective vs Heuristic
- Regression Tree is not just for regression!
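A minimal sketch of the model and objective referred to in "Put into context" and "Objective for Tree Ensemble", in the usual notation: the prediction is the sum of K regression trees drawn from the function space of trees, and the objective penalizes each tree's complexity.

```latex
\hat{y}_i = \sum_{k=1}^{K} f_k(x_i), \quad f_k \in \mathcal{F}
\qquad\Longrightarrow\qquad
\mathrm{Obj} = \sum_{i=1}^{n} l\bigl(y_i, \hat{y}_i\bigr) + \sum_{k=1}^{K} \Omega(f_k)
```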
Gradient Boosting (How do we Learn)
- Take Home Message for this section (really a summary of Part 2)
- So How do we Learn? (SGD cannot be used for trees, so additive training is proposed)
- Additive Training (derivation of the per-round objective; the residual appears)
- Taylor Expansion Approximation of Loss (second-order Taylor expansion, which introduces g_i and h_i; the key formulas are sketched after this list)
- Our New Goal (the simplified objective)
- Refine the definition of tree (mathematical representation of a tree)
- Define Complexity of a Tree (cont’) (defining the complexity of a tree)
- Revisit the Objectives (combining the previous two slides to restate the objective)
- The Structure Score
- The Structure Score Calculation
- Searching Algorithm for Single Tree
- Greedy Learning of the Tree
- Efficient Finding of the Best Split
- An Algorithm for Split Finding
- What about Categorical Variables?
- Pruning and Regularization
- Recap: Boosted Tree Algorithm (summary of Part 3; the derivation formulas and a split-finding sketch are given below)
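The key formulas behind the "Additive Training", "Taylor Expansion", and "Our New Goal" items, written out in the standard notation as a reference sketch: each round adds one tree f_t, and a second-order Taylor expansion of the loss around the previous prediction turns the objective into one driven by the gradient and hessian statistics g_i and h_i.

```latex
\hat{y}_i^{(t)} = \hat{y}_i^{(t-1)} + f_t(x_i), \qquad
\mathrm{Obj}^{(t)} \simeq \sum_{i=1}^{n}\Bigl[ g_i f_t(x_i) + \tfrac{1}{2} h_i f_t^{2}(x_i) \Bigr] + \Omega(f_t) + \mathrm{const}

g_i = \partial_{\hat{y}^{(t-1)}}\, l\bigl(y_i, \hat{y}^{(t-1)}\bigr), \qquad
h_i = \partial^{2}_{\hat{y}^{(t-1)}}\, l\bigl(y_i, \hat{y}^{(t-1)}\bigr)
```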
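Likewise, the tree definition, complexity, structure score, and split gain from the later items, in the usual notation: a tree with T leaves maps x to a leaf index q(x) with leaf weights w, and G_j, H_j are the per-leaf sums of g_i and h_i.

```latex
f_t(x) = w_{q(x)}, \qquad
\Omega(f_t) = \gamma T + \tfrac{1}{2}\lambda \sum_{j=1}^{T} w_j^{2}

w_j^{*} = -\frac{G_j}{H_j + \lambda}, \qquad
\mathrm{Obj}^{*} = -\tfrac{1}{2}\sum_{j=1}^{T} \frac{G_j^{2}}{H_j + \lambda} + \gamma T

\mathrm{Gain} = \tfrac{1}{2}\left[ \frac{G_L^{2}}{H_L + \lambda} + \frac{G_R^{2}}{H_R + \lambda} - \frac{(G_L + G_R)^{2}}{H_L + H_R + \lambda} \right] - \gamma
```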
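Finally, a minimal Python sketch of the exact greedy split search described under "Efficient Finding of the Best Split" and "An Algorithm for Split Finding": sort the instances by feature value, sweep left to right accumulating G_L and H_L, and score every candidate split with the gain formula above. It assumes g and h are precomputed NumPy arrays; the function name, signature, and default hyperparameters are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def best_split_on_feature(x, g, h, lam=1.0, gamma=0.0):
    """Exact greedy search for the best split threshold on one feature.

    x: feature values of the instances in the current node
    g, h: first- and second-order gradient statistics of those instances
    lam, gamma: L2 leaf-weight penalty and per-leaf cost (lambda, gamma)
    """
    order = np.argsort(x)                 # sort instances by feature value
    g_sorted, h_sorted = g[order], h[order]
    G, H = g.sum(), h.sum()               # node totals
    parent_score = G * G / (H + lam)      # score if we do not split

    G_L = H_L = 0.0
    best_gain, best_threshold = 0.0, None
    for i in range(len(x) - 1):           # candidate split after position i
        G_L += g_sorted[i]
        H_L += h_sorted[i]
        G_R, H_R = G - G_L, H - H_L
        gain = 0.5 * (G_L**2 / (H_L + lam)
                      + G_R**2 / (H_R + lam)
                      - parent_score) - gamma
        if gain > best_gain:
            best_gain = gain
            # place the threshold between consecutive sorted feature values
            best_threshold = 0.5 * (x[order[i]] + x[order[i + 1]])
    return best_gain, best_threshold
```

Sorting once and sweeping with running sums keeps the per-feature cost at O(n log n); in practice the same sweep is repeated over every feature and the best (feature, threshold) pair is chosen.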