Efficient Multiclass Boosting Classification with Active Learning

2007-01-01
Huang, Jian
Ertekin Bolelli, Şeyda
Song, Yang
Zha, Hongyuan
Giles, C. Lee
We propose a novel multiclass classification algorithm, Gentle Adaptive Multiclass Boosting Learning (GAMBLE). The algorithm naturally extends the two-class Gentle AdaBoost algorithm to multiclass classification by using the multiclass exponential loss and the multiclass response encoding scheme. Unlike other multiclass algorithms, which reduce the K-class classification task to K binary classifications, GAMBLE handles the task directly and symmetrically, with only one committee classifier. We formally derive the GAMBLE algorithm with the quasi-Newton method, and prove the structural equivalence of the two regression trees in each boosting step.
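The multiclass response encoding mentioned in the abstract can be illustrated with a minimal sketch. This assumes the standard symmetric K-class coding (a class-c sample is mapped to a K-vector with 1 in position c and -1/(K-1) elsewhere, so components sum to zero) and the associated multiclass exponential loss; the function names below are illustrative, not from the paper.

```python
import numpy as np

def encode_response(labels, K):
    """Symmetric multiclass response encoding (assumed standard form):
    a sample of class c maps to a K-vector with 1 at position c and
    -1/(K-1) elsewhere, so each row sums to zero."""
    Y = np.full((len(labels), K), -1.0 / (K - 1))
    Y[np.arange(len(labels)), labels] = 1.0
    return Y

def multiclass_exponential_loss(Y, F):
    """Multiclass exponential loss exp(-(1/K) * y . f),
    averaged over the samples; F holds the committee outputs."""
    K = Y.shape[1]
    return float(np.mean(np.exp(-np.sum(Y * F, axis=1) / K)))
```

For example, with K = 3 a class-0 sample encodes to [1, -0.5, -0.5], and a zero committee output F gives a loss of exp(0) = 1 for every sample.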

Suggestions

Efficient adaptive regression spline algorithms based on mapping approach with a case study on finance
Koc, Elcin Kartal; İyigün, Cem; Batmaz, İnci; Weber, Gerhard-Wilhelm (2014-09-01)
Multivariate adaptive regression splines (MARS) has become a popular data mining (DM) tool due to its flexible model-building strategy for high-dimensional data. Compared to other well-known methods, it performs better in many areas such as finance, informatics, technology and science. Many studies have been conducted on improving its performance. For this purpose, an alternative backward stepwise algorithm is proposed through the Conic-MARS (CMARS) method, which uses a penalized residual sum of squares for MARS as a T...
Robust multiobjective evolutionary feature subset selection algorithm for binary classification using machine learning techniques
Deniz, Ayca; Kiziloz, Hakan Ezgi; Dokeroglu, Tansel; Coşar, Ahmet (2017-06-07)
This study investigates the success of a multiobjective genetic algorithm (GA) combined with state-of-the-art machine learning (ML) techniques for feature subset selection (FSS) in the binary classification problem (BCP). Recent studies have focused on improving the accuracy of BCP by including all of the features, neglecting to determine the best performing subset of features. However, for some problems, the number of features may reach thousands, which will cause too much computation power to be consumed ...
An experimental comparison of symbolic and neural learning algorithms
Baykal, Nazife (1998-04-23)
In this paper, the comparative strengths and weaknesses of symbolic and neural learning algorithms are analysed. Experiments comparing new-generation symbolic algorithms and neural network algorithms have been performed using twelve large, real-world data sets.
Extended Target Tracking using a Gaussian-Mixture PHD Filter
Granstrom, Karl; Lundquist, Christian; Orguner, Umut (2012-10-01)
This paper presents a Gaussian-mixture (GM) implementation of the probability hypothesis density (PHD) filter for tracking extended targets. The exact filter requires processing of all possible measurement set partitions, which is generally infeasible to implement. A method is proposed for limiting the number of considered partitions and possible alternatives are discussed. The implementation is used on simulated data and in experiments with real laser data, and the advantage of the filter is illustrated. S...
Compact Frequency Memory for Reinforcement Learning with Hidden States.
Polat, Faruk; Cilden, Erkin (2019-10-28)
Memory-based reinforcement learning approaches keep track of past experiences of the agent in environments with hidden states. This may require extensive use of memory, which limits the practicality of these methods in real-life problems. The motivation behind this study is the observation that less frequent transitions provide more reliable information about the current state of the agent in ambiguous environments. In this work, a selective memory approach based on the frequencies of transitions is proposed to ...
Citation Formats
J. Huang, Ş. Ertekin Bolelli, Y. Song, H. Zha, and C. L. Giles, “Efficient Multiclass Boosting Classification with Active Learning,” 2007, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/54186.