NIPS 2013
Direct 0-1 Loss Minimization and Margin Maximization with Boosting


We propose DirectBoost, a boosting method that uses a greedy coordinate descent algorithm to build an ensemble of weak classifiers by directly minimizing the empirical classification error over labeled training examples. Once the training error reaches a coordinatewise local minimum, DirectBoost switches to a greedy coordinate ascent algorithm that continues adding weak classifiers to maximize any targeted, arbitrarily defined margin, until it reaches a coordinatewise local maximum of that margin in a certain sense. Experimental results on a collection of machine-learning benchmark datasets show that DirectBoost consistently outperforms AdaBoost, LogitBoost, LPBoost with column generation, and BrownBoost, and that it is noise tolerant when it maximizes an n-th order bottom sample margin.
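
The abstract outlines a two-phase procedure: greedy coordinate descent on the 0-1 training error, followed by greedy coordinate ascent on a margin objective. The Python sketch below illustrates that structure under simplifying assumptions; it is not the authors' implementation. The name `directboost_sketch`, the coarse weight grid standing in for the paper's exact line search, and the choice of the minimum normalized margin (the simplest of the bottom-sample margins the abstract mentions) are all illustrative.

```python
import numpy as np

def directboost_sketch(X, y, weak_learners, rounds=50):
    """Illustrative two-phase boosting sketch in the spirit of the abstract:
    Phase 1: greedy coordinate descent on the empirical 0-1 error.
    Phase 2: greedy coordinate ascent on the minimum normalized margin.
    A coarse weight grid stands in for the paper's exact line search."""
    grid = np.linspace(0.05, 2.0, 40)              # candidate weight steps
    H = np.array([h(X) for h in weak_learners])    # (m, n) matrix of +/-1 votes
    alphas = np.zeros(len(weak_learners))          # ensemble weights
    score = np.zeros(len(y))                       # current ensemble score f(x)

    def err(s):                                    # empirical 0-1 loss
        return np.mean(np.where(s >= 0, 1, -1) != y)

    def min_margin(s, total):                      # min_i y_i f(x_i) / sum(alpha)
        return np.min(y * s) / total if total > 0 else -np.inf

    # Phase 1: add weight to whichever coordinate most reduces training error;
    # stop at a coordinatewise local minimum (no single step improves).
    for _ in range(rounds):
        j_best, a_best, e_best = None, None, err(score)
        for j in range(len(H)):
            for a in grid:
                e = err(score + a * H[j])
                if e < e_best:
                    j_best, a_best, e_best = j, a, e
        if j_best is None:
            break
        alphas[j_best] += a_best
        score += a_best * H[j_best]

    # Phase 2: keep adding weak-classifier weight, now to raise the minimum
    # normalized margin; stop at a coordinatewise local maximum.
    for _ in range(rounds):
        j_best, a_best, m_best = None, None, min_margin(score, alphas.sum())
        for j in range(len(H)):
            for a in grid:
                m = min_margin(score + a * H[j], alphas.sum() + a)
                if m > m_best:
                    j_best, a_best, m_best = j, a, m
        if j_best is None:
            break
        alphas[j_best] += a_best
        score += a_best * H[j_best]

    return alphas

# Hypothetical usage: a pool of threshold stumps on two synthetic features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
stumps = [lambda Z, k=k, t=t: np.where(Z[:, k] > t, 1, -1)
          for k in range(2) for t in np.linspace(-1, 1, 9)]
alphas = directboost_sketch(X, y, stumps)
```

Because every weight step is positive, the normalized margin y_i f(x_i) / Σα is what a single coordinate move can trade off between samples; Phase 2 halts exactly when no single weak classifier can raise the worst-case margin further, mirroring the coordinatewise local maximum described in the abstract.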


Sunday December 8, 2013 2:00pm - 6:00pm PST
Harrah's Special Events Center, 2nd Floor
  Posters
  Poster #Sun86
