adaboost {fastAdaboost}	R Documentation

Adaboost.M1 algorithm

Description

Implements Freund and Schapire's Adaboost.M1 algorithm.

Usage

adaboost(formula, data, nIter, ...)

Arguments

formula

Formula describing the model to fit, e.g. Y ~ X.

data

Input data frame containing the predictors and the response variable.

nIter

Number of weak classifiers (boosting iterations) to train.

...

Other optional arguments; currently not implemented.

Details

This function implements the Adaboost.M1 algorithm for binary classification tasks. The target variable must be a factor with exactly two levels. The final classifier is a weighted linear combination of weak decision tree classifiers.
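
As a rough sketch of the procedure described above (an illustration only, not the package's internal implementation), the following R code trains one-split rpart trees as weak learners, reweights the observations after each round, and records the classifier weights used in the final linear combination. The helper name adaboost_sketch and its return structure are assumptions made for this example.

library(rpart)

## Illustrative sketch only -- not fastAdaboost's internal implementation.
adaboost_sketch <- function(formula, data, nIter) {
  y <- model.response(model.frame(formula, data))
  n <- nrow(data)
  w <- rep(1 / n, n)                        # start with uniform observation weights
  learners <- vector("list", nIter)
  alphas <- numeric(nIter)
  for (m in seq_len(nIter)) {
    fit <- rpart(formula, data, weights = w, method = "class",
                 control = rpart.control(maxdepth = 1))  # weak learner: a decision stump
    pred <- predict(fit, data, type = "class")
    err <- sum(w * (pred != y)) / sum(w)    # weighted training error
    err <- min(max(err, 1e-10), 1 - 1e-10)  # guard against an error of exactly 0 or 1
    alphas[m] <- log((1 - err) / err)       # weight of this classifier in the vote
    w <- w * exp(alphas[m] * (pred != y))   # up-weight misclassified observations
    w <- w / sum(w)
    learners[[m]] <- fit
  }
  list(learners = learners, alphas = alphas, classes = levels(y))
}

Predictions from such an ensemble are a weighted vote of the nIter weak classifiers, with each vote counted in proportion to its alpha value.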

Value

An object of class adaboost, which can be passed to predict.adaboost to classify new observations.

References

Freund, Y. and Schapire, R.E. (1996): "Experiments with a new boosting algorithm". In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148–156, Morgan Kaufmann.

See Also

real_adaboost, predict.adaboost

Examples

# Simulate a two-class dataset: class 0 drawn from N(0, 1), class 1 from N(1, 1)
fakedata <- data.frame(X = c(rnorm(100, 0, 1), rnorm(100, 1, 1)),
                       Y = factor(c(rep(0, 100), rep(1, 100))))

# Fit an Adaboost.M1 ensemble of 10 weak tree classifiers
test_adaboost <- adaboost(Y ~ X, data = fakedata, nIter = 10)
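
The fitted object can then be passed to predict.adaboost (see its help page for the authoritative return value); the component names used below, class and error, are assumptions made for this illustration.

## Classify the training data with the fitted ensemble
pred <- predict(test_adaboost, newdata = fakedata)
table(Predicted = pred$class, Actual = fakedata$Y)  # confusion matrix (assumed $class component)
pred$error                                          # assumed training-error component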
