
In machine learning and computational learning theory, LogitBoost is a boosting algorithm formulated by Jerome Friedman, Trevor Hastie, and Robert Tibshirani. The original paper casts the AdaBoost algorithm into a statistical framework.[1] Specifically, if one considers AdaBoost as a generalized additive model and then applies the cost function of logistic regression, one can derive the LogitBoost algorithm.
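Concretely, for labels \( y\in \{-1,+1\} \) the logistic model puts \( P(y\mid x)=1/\left(1+e^{-yf(x)}\right) \), so the negative log-likelihood of the sample is \( \sum_{i}\log \left(1+e^{-y_{i}f(x_{i})}\right) \), the cost function minimized in the next section; AdaBoost instead minimizes the exponential loss \( \sum_{i}e^{-y_{i}f(x_{i})} \), which upper-bounds the logistic loss pointwise.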
Minimizing the LogitBoost cost function

LogitBoost can be viewed as a convex optimization problem. Specifically, given that we seek an additive model of the form

\( f = \sum_{t} \alpha_{t} h_{t} \)

the LogitBoost algorithm minimizes the logistic loss:

\( \sum_{i} \log \left(1 + e^{-y_{i} f(x_{i})}\right) \)
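
In practice this minimization is carried out greedily, one base learner per round, via Newton steps: the probabilities implied by the current \( f \) give per-example weights and working responses, and a weighted least-squares learner is fit to those responses. What follows is a minimal two-class sketch of that procedure, based on the Newton-step formulation in the Friedman, Hastie, and Tibshirani paper; it assumes labels recoded to {0, 1}, uses scikit-learn's DecisionTreeRegressor (a depth-1 stump) as the weighted base learner, and its function names are illustrative rather than standard.

# Minimal two-class LogitBoost sketch (assumptions as stated above).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def logitboost_fit(X, y, n_rounds=100):
    """Fit an additive model f as a sum of stumps; y holds 0/1 labels."""
    f = np.zeros(X.shape[0])                     # current model values f(x_i)
    learners = []
    for _ in range(n_rounds):
        # The paper parameterizes p(x) = e^f / (e^f + e^{-f}) = 1 / (1 + e^{-2f}).
        p = 1.0 / (1.0 + np.exp(-2.0 * f))
        w = np.clip(p * (1.0 - p), 1e-10, None)  # Newton weights p(1 - p)
        z = np.clip((y - p) / w, -4.0, 4.0)      # working response, clipped at 4
                                                 # as the paper recommends
        stump = DecisionTreeRegressor(max_depth=1)
        stump.fit(X, z, sample_weight=w)         # weighted least-squares fit
        learners.append(stump)
        f += 0.5 * stump.predict(X)              # half Newton step, per the paper
    return learners

def logitboost_predict(learners, X):
    f = sum(0.5 * h.predict(X) for h in learners)
    return (f > 0).astype(int)                   # classify by the sign of f

Each round is a single Newton update of the logistic loss in function space: \( p(1-p) \) is the loss's second-derivative information at the current \( f \), and the working response carries the (scaled) first derivative, so fitting a weighted regression to it decreases the loss greedily.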

See also

Gradient boosting
Logistic model tree

References

Friedman, Jerome; Hastie, Trevor; Tibshirani, Robert (2000). "Additive logistic regression: a statistical view of boosting". Annals of Statistics. 28 (2): 337–407. CiteSeerX 10.1.1.51.9525. doi:10.1214/aos/1016218223.
