Improved boosting performance by explicit handling of ambiguous positive examples
2015 (English). In: Pattern Recognition: Applications and Methods, Springer Berlin/Heidelberg, 2015, pp. 17-37. Conference paper (Refereed)
Visual classes naturally contain ambiguous examples, which vary with the choice of feature and classifier and are hard to disambiguate from surrounding negatives without overfitting. Boosting in particular tends to overfit to such hard and ambiguous examples, due to its flexibility and typically aggressive loss functions. We propose a two-pass learning method for identifying ambiguous examples and relearning, either subjecting them to an exclusion function or using them in a later stage of an inverted cascade. We provide an experimental comparison of different boosting algorithms on the VOC2007 dataset, training them with and without our proposed extension. Our exclusion extension improves the performance of almost all of the tested boosting algorithms without adding any test-time cost. Our proposed inverted cascade adds some test-time cost but gives additional improvements in performance. Our results also suggest that outlier exclusion is complementary to positive jittering and hard negative mining.
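The two-pass exclusion scheme summarized in the abstract could be sketched roughly as follows. This is a minimal illustration using scikit-learn's AdaBoost, not the authors' implementation; the 10% exclusion fraction, the margin-based ambiguity criterion, and all function names are assumptions for the demo.

```python
# Sketch of the two-pass "exclusion" idea: train a boosted classifier,
# treat the lowest-margin positives as ambiguous, drop them, and retrain
# on the cleaned training set. Not the paper's actual algorithm or code.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

def train_with_exclusion(X, y, exclude_frac=0.1, seed=0):
    # Pass 1: fit boosting on all examples.
    first = AdaBoostClassifier(n_estimators=50, random_state=seed).fit(X, y)
    # Score the positives; the lowest-margin ones are taken as ambiguous.
    pos = np.where(y == 1)[0]
    margins = first.decision_function(X[pos])
    n_drop = int(len(pos) * exclude_frac)
    ambiguous = pos[np.argsort(margins)[:n_drop]]
    # Pass 2: relearn with the ambiguous positives excluded.
    keep = np.setdiff1d(np.arange(len(y)), ambiguous)
    second = AdaBoostClassifier(n_estimators=50, random_state=seed).fit(X[keep], y[keep])
    return second, ambiguous

# Toy data with some label noise standing in for ambiguous positives.
X, y = make_classification(n_samples=400, flip_y=0.05, random_state=0)
model, dropped = train_with_exclusion(X, y)
```

The inverted-cascade variant would instead route the excluded examples to a later classification stage rather than discarding them, at some extra test-time cost.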
Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2015. pp. 17-37.
Series: Advances in Intelligent Systems and Computing, ISSN 2194-5357; 318
Keywords: Algorithm evaluation, Boosting, Dataset pruning, Image classification, VOC2007, Pattern recognition, Boosting algorithm, Experimental comparison, Outlier exclusion, Positive examples, Classification (of information)
Research subject: Computer and Information Science
Identifiers: URN: urn:nbn:se:kth:diva-167371; DOI: 10.1007/978-3-319-12610-4_2; ISI: 000364822300002; Scopus ID: 2-s2.0-84914145602; ISBN: 9783319126098; OAI: oai:DiVA.org:kth-167371; DiVA: diva2:815524
Conference: 2nd International Conference on Pattern Recognition Applications and Methods, ICPRAM 2013; Barcelona, Spain; 15 February 2013 through 18 February 2013
QC 20150601. Available from: 2015-06-01; Created: 2015-05-22; Last updated: 2015-12-17. Bibliographically approved.