(no description)
File size: 252 lines (7 KB)
Included or required: 0 times
Referenced: 0 times
Includes or requires: 0 files
AdaBoost:: (9 methods):
__construct()
setBaseClassifier()
train()
predictSample()
getBestClassifier()
resample()
evaluateClassifier()
calculateAlpha()
updateWeights()
__construct(int $maxIterations = 50) X-Ref
ADAptive BOOSTing (AdaBoost) is an ensemble algorithm that improves the classification performance of 'weak' classifiers such as DecisionStump, the default base classifier of AdaBoost. The $maxIterations argument caps the number of boosting iterations (default: 50).
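A minimal usage sketch, assuming the php-ml namespace Phpml\Classification\Ensemble\AdaBoost and the public predict() wrapper that php-ml classifiers usually expose around predictSample(); both are assumptions, not stated on this page:

<?php
require_once __DIR__ . '/vendor/autoload.php';

// Assumed namespace for php-ml's AdaBoost; adjust to the installed version.
use Phpml\Classification\Ensemble\AdaBoost;

// A tiny, linearly separable toy set with binary labels.
$samples = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]];
$targets = [0, 0, 1, 1];

// Boost weak learners for at most 50 iterations (the constructor default).
$classifier = new AdaBoost(50);
$classifier->train($samples, $targets);

// predict() is assumed to be the public wrapper around predictSample();
// this should print the label of the positive cluster.
var_dump($classifier->predict([0.85, 0.95]));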
setBaseClassifier(string $baseClassifier = DecisionStump::class, array $classifierOptions = []) X-Ref
Sets the base classifier that will be used for boosting (default: DecisionStump). The $classifierOptions array holds the options the base classifier is instantiated with.
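A short sketch of swapping the base learner. The DecisionTree class name, its namespace, and the meaning of the options array (constructor arguments, here a max depth of 2) are assumptions for illustration; whether a given classifier is a valid base learner depends on the implementation.

<?php
require_once __DIR__ . '/vendor/autoload.php';

// Both namespaces are assumptions based on php-ml's usual layout.
use Phpml\Classification\DecisionTree;
use Phpml\Classification\Ensemble\AdaBoost;

$classifier = new AdaBoost(30);

// Swap the default DecisionStump for a shallow decision tree; the options
// array is assumed to be forwarded to the base classifier's constructor
// (here, a hypothetical max depth of 2).
$classifier->setBaseClassifier(DecisionTree::class, [2]);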
train(array $samples, array $targets) X-Ref
Trains the boosting ensemble on the given samples and targets: each iteration fits a weak classifier on the weighted data, computes its error rate and alpha, and updates the sample weights.
predictSample(array $sample) X-Ref
Predicts the label of a single sample by a weighted vote of the trained weak classifiers, each vote scaled by the classifier's alpha.
return: mixed
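Standard AdaBoost predicts with a weighted vote, taking the sign of the sum over t of alpha_t * h_t(x). A standalone sketch of that rule (not the library's actual code), with weak learners modelled as plain callables:

<?php
/**
 * Weighted-vote prediction in the spirit of predictSample(): the final label
 * is the sign of sum_t alpha_t * h_t(x). Weak learners are modelled here as
 * callables returning -1 or +1, an illustrative simplification.
 */
function weightedVote(array $weakLearners, array $alphas, array $sample): int
{
    $sum = 0.0;
    foreach ($weakLearners as $t => $h) {
        $sum += $alphas[$t] * $h($sample); // each h() returns -1 or +1
    }

    return $sum >= 0 ? 1 : 0; // map the sign of the sum back to a 0/1 label
}

// Two toy stumps thresholding the first feature, with unequal confidence.
$stumps = [
    fn (array $x): int => $x[0] > 0.5 ? 1 : -1,
    fn (array $x): int => $x[0] > 0.3 ? 1 : -1,
];
var_dump(weightedVote($stumps, [0.8, 0.4], [0.6, 0.0])); // int(1)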
getBestClassifier() X-Ref
Returns the classifier with the lowest error rate, taking the current sample weights into account.
resample() X-Ref
Resamples the dataset in accordance with the weights and returns the new dataset.
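A minimal sketch of weight-proportional (roulette-wheel) resampling, under the assumption that the weights are normalized; the helper name and the use of mt_rand() are illustrative, not the library's code:

<?php
/**
 * Draws a new dataset of the original size by picking indices with probability
 * proportional to the current sample weights (roulette-wheel selection).
 * The weights are assumed to be normalized so they sum to 1.0.
 */
function weightedResample(array $samples, array $targets, array $weights): array
{
    $newSamples = [];
    $newTargets = [];
    $n = count($samples);

    for ($i = 0; $i < $n; ++$i) {
        $r = mt_rand() / mt_getrandmax();   // uniform draw in [0, 1]
        $chosen = array_key_last($weights); // fallback guards float round-off
        $cumulative = 0.0;

        // Walk the cumulative weights until the random threshold is crossed.
        foreach ($weights as $index => $w) {
            $cumulative += $w;
            if ($r <= $cumulative) {
                $chosen = $index;
                break;
            }
        }

        $newSamples[] = $samples[$chosen];
        $newTargets[] = $targets[$chosen];
    }

    return [$newSamples, $newTargets];
}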
evaluateClassifier(Classifier $classifier) X-Ref
Evaluates the classifier and returns the classification error rate.
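In standard AdaBoost the error rate is the weighted misclassification rate, i.e. the total weight of the wrongly classified samples divided by the total weight. A standalone sketch of that calculation:

<?php
/**
 * Weighted classification error: the total weight of the misclassified samples
 * divided by the total weight. Predictions are passed in as a plain array to
 * keep the sketch independent of any particular classifier API.
 */
function weightedErrorRate(array $predictions, array $targets, array $weights): float
{
    $errorWeight = 0.0;
    foreach ($targets as $i => $target) {
        if ($predictions[$i] !== $target) {
            $errorWeight += $weights[$i];
        }
    }

    return $errorWeight / array_sum($weights);
}

// One of four equally weighted samples is misclassified, so the error is 0.25.
echo weightedErrorRate([0, 0, 1, 0], [0, 0, 1, 1], [0.25, 0.25, 0.25, 0.25]);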
calculateAlpha(float $errorRate) X-Ref
Calculates the weight (alpha) of a classifier from its classification error rate.
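In the textbook algorithm the weight of classifier t is alpha_t = 0.5 * ln((1 - error_t) / error_t), so accurate weak learners get large positive weights and near-random ones get weights close to zero. A sketch of that formula; the clamp on a zero error rate is an added safeguard, not necessarily what the library does:

<?php
/**
 * Classifier weight (alpha) from its weighted error rate, following the
 * textbook AdaBoost rule alpha = 0.5 * ln((1 - error) / error).
 */
function alphaFromErrorRate(float $errorRate): float
{
    // Clamp a perfect classifier slightly above zero error to avoid log of infinity.
    $errorRate = max($errorRate, 1e-10);

    return 0.5 * log((1 - $errorRate) / $errorRate);
}

echo alphaFromErrorRate(0.10) . PHP_EOL; // ~1.099: a strong weak learner
echo alphaFromErrorRate(0.45) . PHP_EOL; // ~0.100: barely better than chance
echo alphaFromErrorRate(0.50) . PHP_EOL; // 0: no better than random guessing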
updateWeights(Classifier $classifier, float $alpha) X-Ref
Updates the sample weights using the given classifier's predictions and its alpha.
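In the standard update rule each weight is multiplied by exp(-alpha * y_i * h(x_i)) with labels in {-1, +1}, after which the weights are renormalized; whether the library maps its labels this way is an assumption. A standalone sketch:

<?php
/**
 * AdaBoost weight update: w_i <- w_i * exp(-alpha * y_i * h(x_i)), with targets
 * and predictions both in {-1, +1}, followed by normalization so the weights
 * keep summing to 1.0. Misclassified samples gain weight, correct ones lose it.
 */
function updateSampleWeights(array $weights, array $targets, array $predictions, float $alpha): array
{
    foreach ($weights as $i => $w) {
        $weights[$i] = $w * exp(-$alpha * $targets[$i] * $predictions[$i]);
    }

    $total = array_sum($weights);

    return array_map(fn (float $w): float => $w / $total, $weights);
}

// The misclassified third sample (target +1, predicted -1) gains weight.
print_r(updateSampleWeights([0.25, 0.25, 0.25, 0.25], [-1, -1, 1, 1], [-1, -1, -1, 1], 0.5));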