Boosting is a technique in machine learning (a branch of artificial intelligence). It is a principle that unifies many algorithms relying on sets of binary classifiers: boosting optimizes their performance.
The principle comes from combining classifiers (also called hypotheses). Over successive iterations, the knowledge of each weak classifier is added to the final classifier, called the strong classifier.
A weak learner is an algorithm that produces weak classifiers: classifiers able to recognize two classes at least as well as chance would (i.e. wrong no more than half the time on average, if the class distribution is balanced). Each classifier produced is weighted by the quality of its classification: the better it classifies, the more weight it carries in the final vote. Misclassified examples are boosted so that they matter more to the weak learner in the next round, letting it compensate for the earlier mistakes.
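The loop described above can be sketched in a few lines of NumPy. This is a minimal, illustrative implementation (the function names and the choice of decision stumps as weak classifiers are assumptions for the example, not prescribed by the text): each round picks the best weighted stump, weights it by its quality, and boosts the weights of misclassified examples.

```python
import numpy as np

def best_stump(X, y, w):
    """Exhaustively pick the threshold stump with the lowest weighted error."""
    best = (None, None, None, np.inf)  # (feature, threshold, sign, error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] <= t, s, -s)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, s, err)
    j, t, s, _ = best
    return (j, t, s), np.where(X[:, j] <= t, s, -s)

def train_boosting(X, y, n_rounds=10):
    """Labels y in {-1, +1}. Returns a list of (alpha, stump) pairs."""
    n = len(y)
    w = np.full(n, 1.0 / n)                 # uniform example weights at the start
    ensemble = []
    for _ in range(n_rounds):
        stump, pred = best_stump(X, y, w)
        err = w[pred != y].sum()            # weighted error of the weak classifier
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # classifier weight: better => larger
        ensemble.append((alpha, stump))
        w *= np.exp(-alpha * y * pred)      # boost misclassified examples
        w /= w.sum()
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all weak classifiers in the ensemble."""
    score = sum(alpha * np.where(X[:, j] <= t, s, -s)
                for alpha, (j, t, s) in ensemble)
    return np.sign(score)
```

The weight update `w *= exp(-alpha * y * pred)` is the AdaBoost-style rule: correctly classified examples shrink, misclassified ones grow, so the next weak classifier focuses on the hard cases.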
One of the most widely used boosting algorithms is AdaBoost, short for adaptive boosting.
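In practice, AdaBoost is available off the shelf; for instance, scikit-learn ships an `AdaBoostClassifier`. A quick usage sketch (the dataset and parameters here are illustrative choices, not from the article):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Synthetic binary classification problem for demonstration.
X, y = make_classification(n_samples=200, random_state=0)

# 50 weak classifiers (decision stumps by default), combined adaptively.
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))  # training accuracy of the boosted ensemble
```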
Boosting is grounded in the theory of PAC (probably approximately correct) learning.