Optimal aggregation of classifiers in statistical learning

Views: 147

Author: A. B. Tsybakov

Abstract:

Classification can be considered as nonparametric estimation of sets, where the risk is defined by means of a specific distance between sets associated with misclassification error. It is shown that the rates of convergence of classifiers depend on two parameters: the complexity of the class of candidate sets and the margin parameter. The dependence is explicitly given, indicating that optimal fast rates approaching O(n^{-1}) can be attained, where n is the sample size, and that the proposed classifiers have the property of robustness to the margin. The main result of the paper concerns optimal aggregation of classifiers: we suggest a classifier that automatically adapts both to the complexity and to the margin, and attains the optimal fast rates, up to a logarithmic factor.
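
The two parameters named in the abstract are commonly formalized as below. This is a hedged sketch of the standard formulation, not a quotation from the paper: here η(x) = P(Y = 1 | X = x) is the regression function, α > 0 is the margin parameter, ρ ∈ (0, 1) is the entropy exponent measuring the complexity of the candidate class, and the exact assumptions and constants should be checked against the paper itself.

    \[
    P_X\left( 0 < \left| \eta(X) - \tfrac{1}{2} \right| \le t \right) \le C\, t^{\alpha}
    \quad \text{for all } t > 0,
    \]
    \[
    \mathbb{E}\, R(\hat{G}_n) - R(G^{*}) = O\!\left( n^{-\frac{1+\alpha}{2+\alpha+\alpha\rho}} \right),
    \]

where the exponent tends to 1, i.e., the fast rate O(n^{-1}), as α → ∞ and ρ → 0.

Purely as an illustration of the model-selection flavor of aggregation, here is a minimal Python sketch: the candidate classifiers are assumed to be already-fitted callables, and the aggregate picks the one with the smallest empirical 0-1 risk on a held-out split. The names are hypothetical and this is not the paper's exact construction.

    import numpy as np

    def select_by_empirical_risk(candidates, X_val, y_val):
        """Return the candidate with the smallest empirical 0-1 risk
        on a held-out validation sample (model-selection aggregation)."""
        risks = [np.mean(clf(X_val) != y_val) for clf in candidates]
        return candidates[int(np.argmin(risks))]

    # Hypothetical usage: threshold classifiers on 1-D data.
    rng = np.random.default_rng(0)
    X_val = rng.uniform(-1.0, 1.0, size=200)
    y_val = (X_val > 0.1).astype(int)  # true decision boundary at 0.1
    candidates = [lambda x, t=t: (x > t).astype(int)
                  for t in (-0.5, 0.0, 0.1, 0.5)]
    best = select_by_empirical_risk(candidates, X_val, y_val)
    print(np.mean(best(X_val) != y_val))  # empirical risk of the selected classifier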

DOI: 10.1214/aos/1079120131

Cited by: 1027

Year: 2004

Journal: The Annals of Statistics

[Citation trend chart; 103 citations recorded in 2011]


