Improved boosting performance by exclusion of ambiguous positive examples
Kobetski, Miroslav; Sullivan, Josephine
KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP
2013 (English). In: ICPRAM 2013 - Proceedings of the 2nd International Conference on Pattern Recognition Applications and Methods, 2013, p. 11-21. Conference paper, Published paper (Refereed)
Abstract [en]

In visual object class recognition it is difficult to densely sample the set of positive examples. Therefore, frequently there will be areas of the feature space that are sparsely populated, in which uncommon examples are hard to disambiguate from surrounding negatives without overfitting. Boosting in particular struggles to learn optimal decision boundaries in the presence of such hard and ambiguous examples. We propose a two-pass dataset pruning method for identifying ambiguous examples and subjecting them to an exclusion function, in order to obtain more optimal decision boundaries for existing boosting algorithms. We also provide an experimental comparison of different boosting algorithms on the VOC2007 dataset, training them with and without our proposed extension. Using our exclusion extension improves the performance of all the tested boosting algorithms except TangentBoost, without adding any additional test-time cost. In our experiments LogitBoost performs best overall and is also significantly improved by our extension. Our results also suggest that outlier exclusion is complementary to positive jittering and hard negative mining.

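As a rough illustration of the pruning idea described in the abstract (not the authors' implementation, which is not reproduced in this record), the sketch below uses scikit-learn's GradientBoostingClassifier as a stand-in for a LogitBoost-style learner. The function name, the margin-based ambiguity criterion, and the exclusion fraction are all illustrative assumptions: a first pass trains on the full data, the lowest-scoring positives are flagged as ambiguous, and a second pass retrains with them excluded.

```python
# Illustrative sketch (NOT the paper's implementation) of two-pass
# dataset pruning for boosting: train once, flag the lowest-scoring
# positives as ambiguous, exclude them, and retrain.
# GradientBoostingClassifier stands in for a LogitBoost-style learner;
# the exclusion fraction is an assumed hyperparameter.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

def two_pass_boost(X, y, exclude_frac=0.1, random_state=0):
    # Pass 1: fit a boosted classifier on the full training set.
    clf = GradientBoostingClassifier(random_state=random_state).fit(X, y)
    # Score every positive example with the learned decision function;
    # low margins indicate examples the ensemble finds ambiguous.
    pos_idx = np.flatnonzero(y == 1)
    margins = clf.decision_function(X[pos_idx])
    cutoff = np.quantile(margins, exclude_frac)
    ambiguous = np.zeros(y.shape[0], dtype=bool)
    ambiguous[pos_idx[margins < cutoff]] = True
    # Pass 2: retrain with the ambiguous positives excluded.
    keep = ~ambiguous
    clf2 = GradientBoostingClassifier(random_state=random_state).fit(X[keep], y[keep])
    return clf2, ambiguous

# Synthetic demo data with some label noise to create ambiguous positives.
X, y = make_classification(n_samples=500, flip_y=0.05, random_state=0)
model, excluded = two_pass_boost(X, y)
print(f"excluded {excluded.sum()} of {(y == 1).sum()} positive examples")
```

Note that the exclusion happens only during training; the final classifier is an ordinary boosted model, consistent with the abstract's claim that the extension adds no test-time cost.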
Place, publisher, year, edition, pages
2013. 11-21 p.
Series
ICPRAM 2013 - Proceedings of the 2nd International Conference on Pattern Recognition Applications and Methods
Keyword [en]
Algorithm evaluation, Boosting, Dataset pruning, Image classification, VOC2007, Boosting algorithm, Experimental comparison, Optimal decision boundary, Positive examples, Adaptive boosting, Optimization, Pattern recognition, Algorithms
National Category
Computer and Information Science
Identifiers
URN: urn:nbn:se:kth:diva-134459
Scopus ID: 2-s2.0-84877932599
ISBN: 9789898565419 (print)
OAI: oai:DiVA.org:kth-134459
DiVA: diva2:669043
Conference
2nd International Conference on Pattern Recognition Applications and Methods, ICPRAM 2013, 15 February 2013 through 18 February 2013, Barcelona
Note

QC 20131202

Available from: 2013-12-02. Created: 2013-11-25. Last updated: 2013-12-02. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Scopus: Improved boosting performance by exclusion of ambiguous positive examples

Search in DiVA

By author/editor
Kobetski, Miroslav; Sullivan, Josephine
By organisation
Computer Vision and Active Perception, CVAP
Computer and Information Science
