Adaboost.MRT: Boosting regression for multivariate estimation

Nikolai Kummer, Homayoun Najjaran


Adaboost.RT is a well-known extension of Adaboost to regression problems, which achieves increased accuracy by iterative training of weak learners on different subsets of the data. At each iteration, the prediction error is compared against a threshold, which is used to increase or decrease the weight of each sample for the next iteration. Adaboost.RT is susceptible to noise and contains a singularity in its misclassification function, which reduces accuracy for output values near zero. We propose Adaboost.MRT, which extends Adaboost.RT to multivariate output, addresses the singularity in the misclassification function, and reduces noise sensitivity. A singularity-free, variance-scaled misclassification function is proposed that generates diversity in the training sets. Adaboost.MRT boosts multivariate regression by assigning each output variable a weight for each sample in the training data. To avoid fitting to outliers, the sampling weights for the training sets are averaged across all output variables. The threshold parameter is extended to accommodate the multivariate output, and experiments suggest that for small numbers of output variables the threshold can be tuned for each output variable individually. Comparisons on six single-variate output datasets show that the proposed Adaboost.MRT outperforms Adaboost.RT on datasets with values near zero or with large noise, and displays similar accuracy otherwise. Experiments with three multivariate output datasets show that Adaboost.MRT performs similarly to or better than bagging and a simple averaging ensemble.
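The mechanism described above — per-output-variable sample weights, a variance-scaled misclassification test against a threshold, and resampling from weights averaged across output variables — can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact formulation: the threshold `phi`, the weight-update rule, the residual-based variance scaling, and the use of decision-tree weak learners are all assumptions made for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaboost_mrt_sketch(X, Y, T=10, phi=0.3, rng=None):
    """Simplified multivariate boosted regression in the spirit of
    Adaboost.MRT. Y has shape (n_samples, n_outputs)."""
    rng = np.random.default_rng(rng)
    n, m = X.shape[0], Y.shape[1]
    D = np.full((m, n), 1.0 / n)          # one weight vector per output variable
    learners, log_inv_betas = [], []
    for t in range(T):
        # Sample the training set from the weights averaged over output
        # variables, so no single output's outliers dominate the sample.
        p = D.mean(axis=0)
        idx = rng.choice(n, size=n, replace=True, p=p)
        model = DecisionTreeRegressor(max_depth=4).fit(X[idx], Y[idx])
        pred = model.predict(X).reshape(n, m)
        # Variance-scaled misclassification: the error is scaled by the
        # spread of the residuals, so outputs near zero cause no singularity
        # (unlike a relative error |pred - y| / y).
        resid = np.abs(pred - Y)
        sigma = resid.std(axis=0) + 1e-12
        miscl = (resid / sigma).T > phi   # shape (m, n): True = "misclassified"
        eps = np.array([D[k, miscl[k]].sum() for k in range(m)])
        beta = np.clip(eps / (1.0 - eps + 1e-12), 1e-3, None)
        # Down-weight correctly predicted samples for each output variable,
        # then renormalize each weight vector to a distribution.
        for k in range(m):
            D[k, ~miscl[k]] *= beta[k]
            D[k] /= D[k].sum()
        learners.append(model)
        log_inv_betas.append(np.log(1.0 / beta))
    W = np.array(log_inv_betas)           # (T, m) per-learner, per-output weights
    def predict(Xq):
        preds = np.stack([mdl.predict(Xq).reshape(len(Xq), m)
                          for mdl in learners])      # (T, n_query, m)
        return (preds * W[:, None, :]).sum(axis=0) / W.sum(axis=0)
    return predict
```

The weighted-median or weighted-mean combination at prediction time varies between boosting variants; a weighted mean is used here purely for brevity.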


Artificial Intelligence Research

ISSN 1927-6974 (Print)   ISSN 1927-6982 (Online)

Copyright © Sciedu Press 