predefinedClassifiers {TunePareto}    R Documentation
Description

Creates TunePareto classifier objects for the k-Nearest Neighbour classifier, support vector machines, classification trees, random forests, and naive Bayes classifiers.
Usage

tunePareto.knn()
tunePareto.svm()
tunePareto.tree()
tunePareto.randomForest()
tunePareto.NaiveBayes()
Details

tunePareto.knn encapsulates a k-Nearest Neighbour classifier as defined in knn in package class. The classifier allows for supplying and tuning the following parameters of knn:

k, l, use.all
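For illustration, a minimal sketch of tuning k together with use.all on the iris data, assuming that logical parameters can be supplied as candidate vectors in the same way as numeric ones (candidate values are arbitrary examples):

# a sketch: tune 'k' and 'use.all' of the k-NN classifier on 'iris'
# (candidate values are arbitrary examples, not recommendations)
print(tunePareto(classifier = tunePareto.knn(),
                 data = iris[, -ncol(iris)],
                 labels = iris[, ncol(iris)],
                 k = c(3, 5, 7),
                 use.all = c(TRUE, FALSE),
                 objectiveFunctions = list(cvError(10, 10),
                                           cvSensitivity(10, 10, caseClass = "setosa"))))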
tunePareto.svm encapsulates the support vector machine classifier svm in package e1071. The classifier allows for supplying and tuning the following parameters:

kernel, degree, gamma, coef0, cost, nu, class.weights, cachesize, tolerance, epsilon, scale, shrinking, fitted, subset, na.action
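For illustration, a sketch of a joint grid over cost and gamma, following the same calling pattern as the examples below (candidate values are arbitrary):

# a sketch: tune 'cost' and 'gamma' of the SVM jointly on 'iris'
# (candidate values are arbitrary examples)
print(tunePareto(classifier = tunePareto.svm(),
                 data = iris[, -ncol(iris)],
                 labels = iris[, ncol(iris)],
                 cost = c(0.01, 1, 100),
                 gamma = c(0.1, 1),
                 objectiveFunctions = list(cvWeightedError(10, 10),
                                           cvSensitivity(10, 10, caseClass = "setosa"))))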
tunePareto.tree encapsulates the CART classifier tree in package tree. The classifier allows for supplying and tuning the following parameters:

weights, subset, na.action, method, split, mincut, minsize, mindev

as well as the type parameter of predict.tree.
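For illustration, a sketch that tunes the splitting criterion together with mincut (candidate values are arbitrary):

# a sketch: tune the splitting criterion and 'mincut' of the tree classifier on 'iris'
# (candidate values are arbitrary examples)
print(tunePareto(classifier = tunePareto.tree(),
                 data = iris[, -ncol(iris)],
                 labels = iris[, ncol(iris)],
                 split = c("deviance", "gini"),
                 mincut = c(2, 5, 10),
                 objectiveFunctions = list(cvError(10, 10),
                                           cvErrorVariance(10, 10))))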
tunePareto.randomForest encapsulates the randomForest classifier in package randomForest. The classifier allows for supplying and tuning the following parameters:

subset, na.action, ntree, mtry, replace, classwt, cutoff, strata, sampsize, nodesize, maxnodes
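For illustration, a sketch that tunes ntree together with mtry (candidate values are arbitrary):

# a sketch: tune 'ntree' and 'mtry' of the random forest on 'iris'
# (candidate values are arbitrary examples)
print(tunePareto(classifier = tunePareto.randomForest(),
                 data = iris[, -ncol(iris)],
                 labels = iris[, ncol(iris)],
                 ntree = c(100, 500),
                 mtry = c(1, 2, 4),
                 objectiveFunctions = list(cvError(10, 10),
                                           cvSpecificity(10, 10, caseClass = "setosa"))))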
tunePareto.NaiveBayes encapsulates the NaiveBayes classifier in package klaR. The classifier allows for supplying and tuning the following parameters:

prior, usekernel, fL, subset, na.action, bw, adjust, kernel, weights, window, width, give.Rkern, n, from, to, cut, na.rm
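For illustration, a sketch that tunes usekernel together with the Laplace correction fL, assuming that logical parameters can be supplied as candidate vectors (values are arbitrary):

# a sketch: tune 'usekernel' and the Laplace correction 'fL' on 'iris'
# (candidate values are arbitrary examples)
print(tunePareto(classifier = tunePareto.NaiveBayes(),
                 data = iris[, -ncol(iris)],
                 labels = iris[, ncol(iris)],
                 usekernel = c(TRUE, FALSE),
                 fL = c(0, 1),
                 objectiveFunctions = list(cvError(10, 10),
                                           cvSpecificity(10, 10, caseClass = "setosa"))))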
Value

Returns objects of class TuneParetoClassifier as described in tuneParetoClassifier. These can be passed to functions like tunePareto or trainTuneParetoClassifier.
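For illustration, a sketch of training a returned classifier object directly with fixed parameter values; the argument order of trainTuneParetoClassifier (classifier, training data, training labels) is assumed here:

# a sketch: train the k-NN wrapper directly with a fixed 'k'
# (the argument order classifier, training data, training labels is assumed)
model <- trainTuneParetoClassifier(tunePareto.knn(),
                                   iris[, -ncol(iris)],
                                   iris[, ncol(iris)],
                                   k = 5)
# predict on the training data (for illustration only)
print(predict(model, iris[, -ncol(iris)]))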
See Also

tuneParetoClassifier, tunePareto, trainTuneParetoClassifier
Examples

# tune a k-NN classifier with different 'k' and 'l'
# on the 'iris' data set
print(tunePareto(classifier = tunePareto.knn(),
                 data = iris[, -ncol(iris)],
                 labels = iris[, ncol(iris)],
                 k = c(5, 7, 9),
                 l = c(1, 2, 3),
                 objectiveFunctions = list(cvError(10, 10),
                                           cvSpecificity(10, 10, caseClass = "setosa"))))

# tune an SVM with different costs on
# the 'iris' data set
# using Halton sequences for sampling
print(tunePareto(classifier = tunePareto.svm(),
                 data = iris[, -ncol(iris)],
                 labels = iris[, ncol(iris)],
                 cost = as.interval(0.001, 10),
                 sampleType = "halton",
                 numCombinations = 20,
                 objectiveFunctions = list(cvWeightedError(10, 10),
                                           cvSensitivity(10, 10, caseClass = "setosa"))))

# tune a CART classifier with different
# splitting criteria on the 'iris' data set
print(tunePareto(classifier = tunePareto.tree(),
                 data = iris[, -ncol(iris)],
                 labels = iris[, ncol(iris)],
                 split = c("deviance", "gini"),
                 objectiveFunctions = list(cvError(10, 10),
                                           cvErrorVariance(10, 10))))

# tune a Random Forest with different numbers of trees
# on the 'iris' data set
print(tunePareto(classifier = tunePareto.randomForest(),
                 data = iris[, -ncol(iris)],
                 labels = iris[, ncol(iris)],
                 ntree = seq(50, 300, 50),
                 objectiveFunctions = list(cvError(10, 10),
                                           cvSpecificity(10, 10, caseClass = "setosa"))))

# tune a Naive Bayes classifier with different kernels
# on the 'iris' data set
print(tunePareto(classifier = tunePareto.NaiveBayes(),
                 data = iris[, -ncol(iris)],
                 labels = iris[, ncol(iris)],
                 kernel = c("gaussian", "epanechnikov", "rectangular",
                            "triangular", "biweight", "cosine", "optcosine"),
                 objectiveFunctions = list(cvError(10, 10),
                                           cvSpecificity(10, 10, caseClass = "setosa"))))