15-388/688 - Practical Data Science: Decision trees and interpretable models
- J. Zico Kolter
Carnegie Mellon University Spring 2018
Outline
- Decision trees
- Training (classification) decision trees
- Interpreting predictions
- Boosting
where ℎ_0 and ℎ_1 are the predictions on each side of the split (with the positive sets 𝒴_0^+ and 𝒴_1^+ defined similarly to before), and we would choose

    ℎ_0 = |𝒴_0^+| / |𝒴_0|,    ℎ_1 = |𝒴_1^+| / |𝒴_1|

i.e., the fraction of positive examples on each side, with each side's loss weighted by its share of the data, |X_0|/|X| and |X_1|/|X|.
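The weighted-split rule above can be sketched for a single numeric feature and binary labels; this is a minimal illustration (the function names `entropy` and `best_split` are mine, and entropy is used as the per-side loss):

```python
import numpy as np

def entropy(p):
    """Binary entropy of predicting the positive fraction p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def best_split(x, y):
    """Choose the threshold t minimizing the size-weighted loss
    |X_0|/|X| * H(h_0) + |X_1|/|X| * H(h_1), where h_0, h_1 are the
    positive fractions on each side of the split."""
    best_t, best_loss = None, np.inf
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        h0, h1 = left.mean(), right.mean()   # positive fraction on each side
        loss = (len(left) * entropy(h0) + len(right) * entropy(h1)) / len(y)
        if loss < best_loss:
            best_t, best_loss = t, loss
    return best_t, best_loss
```

A real decision tree learner repeats this search over every feature and recurses on the two resulting subsets.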
Gradient boosting. Given data (𝑥_𝑖, 𝑦_𝑖), 𝑖 = 1,…,𝑚, number of trees 𝑇, loss ℓ, step size 𝛽:
1. Initialize predictions 𝑦̂_𝑖 ← 0 for 𝑖 = 1,…,𝑚
2. For 𝑡 = 1,…,𝑇:
   a. Compute gradients 𝑔_𝑖 ← ∂ℓ(𝑦̂_𝑖, 𝑦_𝑖)/∂𝑦̂_𝑖 for 𝑖 = 1,…,𝑚
   b. Fit a decision tree 𝑓_𝑡 to the negative gradients (𝑥_𝑖, −𝑔_𝑖)
   c. Update 𝑦̂_𝑖 ← 𝑦̂_𝑖 + 𝛽 𝑓_𝑡(𝑥_𝑖)
3. Final predictor: 𝑓(𝑥) = Σ_{𝑡=1}^{𝑇} 𝛽 𝑓_𝑡(𝑥)
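This loop can be written out in a few lines; the sketch below assumes squared loss ℓ(𝑦̂, 𝑦) = ½(𝑦̂ − 𝑦)², so the gradient is simply 𝑦̂ − 𝑦 (the function names `gradient_boost` and `predict` are mine):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost(X, y, T=100, beta=0.1, max_depth=2):
    """Gradient boosting with squared loss: each round fits a small
    tree to the negative gradients -(y_hat - y), then steps by beta."""
    y_hat = np.zeros(len(y))
    trees = []
    for _ in range(T):
        g = y_hat - y                                  # gradient of 0.5*(y_hat - y)^2
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, -g)
        y_hat += beta * tree.predict(X)                # update running predictions
        trees.append(tree)
    return trees

def predict(trees, X, beta=0.1):
    """Final predictor: f(x) = sum_t beta * f_t(x)."""
    return beta * sum(t.predict(X) for t in trees)
```

With other losses only the gradient line changes; the tree-fitting and update steps stay the same.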
# Decision tree classifier in scikit-learn
from sklearn.tree import DecisionTreeClassifier
clf = DecisionTreeClassifier(criterion='entropy', max_depth=5)
clf.fit(X, y)

# Gradient boosting classifier ('deviance' is the logistic loss)
from sklearn.ensemble import GradientBoostingClassifier
clf = GradientBoostingClassifier(loss='deviance', max_depth=3, n_estimators=100)
clf.fit(X, y)
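An end-to-end run of the two classifiers above might look like the following; the toy dataset from `make_classification` and the train/test split are my own additions (I leave the boosting loss at its default, since the `'deviance'` name was renamed in newer scikit-learn versions):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic binary classification problem
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(criterion='entropy', max_depth=5).fit(X_tr, y_tr)
gbm = GradientBoostingClassifier(max_depth=3, n_estimators=100).fit(X_tr, y_tr)

print(tree.score(X_te, y_te), gbm.score(X_te, y_te))
```

On held-out data the boosted ensemble typically edges out the single tree, at the cost of interpretability.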