Bibliography

[1] Blackard, J. A., and D. J. Dean. Comparative accuracies of artificial neural networks and discriminant analysis in predicting forest cover types from cartographic variables. Computers and Electronics in Agriculture, Vol. 24, pp. 131–151, 1999.

[2] Bottou, L., and C.-J. Lin. Support Vector Machine Solvers. Available at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.64.4209&rep=rep1&type=pdf.

[3] Breiman, L. Bagging Predictors. Machine Learning, Vol. 24, No. 2, pp. 123–140, 1996.

[4] Breiman, L. Random Forests. Machine Learning, Vol. 45, pp. 5–32, 2001.

[5] Breiman, L. http://www.stat.berkeley.edu/~breiman/RandomForests/

[6] Breiman, L., J. Friedman, R. Olshen, and C. Stone. Classification and Regression Trees. Chapman & Hall, Boca Raton, 1993.

[7] Cristianini, N., and J. Shawe-Taylor. An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge University Press, Cambridge, UK, 2000.

[8] Fan, R.-E., P.-H. Chen, and C.-J. Lin. Working set selection using second order information for training support vector machines. Journal of Machine Learning Research, Vol. 6, pp. 1889–1918, 2005.

[9] Freund, Y. A more robust boosting algorithm. arXiv:0905.2138v1, 2009.

[10] Freund, Y., and R. E. Schapire. A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting. Journal of Computer and System Sciences, Vol. 55, pp. 119–139, 1997.

[11] Friedman, J. Greedy function approximation: A gradient boosting machine. Annals of Statistics, Vol. 29, No. 5, pp. 1189–1232, 2001.

[12] Friedman, J., T. Hastie, and R. Tibshirani. Additive logistic regression: A statistical view of boosting. Annals of Statistics, Vol. 28, No. 2, pp. 337–407, 2000.

[13] Hastie, T., R. Tibshirani, and J. Friedman. The Elements of Statistical Learning, second edition. Springer, New York, 2008.

[14] Ho, T. K. The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 20, No. 8, pp. 832–844, 1998.

[15] Hsu, C.-W., C.-C. Chang, and C.-J. Lin. A Practical Guide to Support Vector Classification. Available at http://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf.

[16] Kecman, V., T.-M. Huang, and M. Vogt. Iterative Single Data Algorithm for Training Kernel Machines from Huge Data Sets: Theory and Performance. In Support Vector Machines: Theory and Applications, L. Wang, Ed., pp. 255–274. Springer-Verlag, Berlin, 2005.

[17] Schapire, R. E., Y. Freund, P. Bartlett, and W. S. Lee. Boosting the margin: A new explanation for the effectiveness of voting methods. Annals of Statistics, Vol. 26, No. 5, pp. 1651–1686, 1998.

[18] Schapire, R. E., and Y. Singer. Improved boosting algorithms using confidence-rated predictions. Machine Learning, Vol. 37, No. 3, pp. 297–336, 1999.

[19] Seiffert, C., T. Khoshgoftaar, J. Van Hulse, and A. Napolitano. RUSBoost: Improving classification performance when training data is skewed. 19th International Conference on Pattern Recognition, pp. 1–4, 2008.

[20] Warmuth, M., J. Liao, and G. Rätsch. Totally corrective boosting algorithms that maximize the margin. Proceedings of the 23rd International Conference on Machine Learning, ACM, New York, pp. 1001–1008, 2006.

[21] Zadrozny, B., J. Langford, and N. Abe. Cost-Sensitive Learning by Cost-Proportionate Example Weighting. 2003. Available at http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.5.9780.

[22] Zhou, Z.-H., and X.-Y. Liu. On Multi-Class Cost-Sensitive Learning. 2006. Available at http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.92.9999.
