ADABOOST

AIMA3e

function ADABOOST(examples, L, K) returns a weighted-majority hypothesis
inputs: examples, set of N labeled examples (x1, y1),…,(xN,yN)
    L, a learning algorithm
    K, the number of hypotheses in the ensemble
local variables: w, a vector of N example weights, initially 1 ⁄ N
        h, a vector of K hypotheses
        z, a vector of K hypothesis weights

for k = 1 to K do
   h[k] ← L(examples, w)
   error ← 0
   for j = 1 to N do
     if h[k](xj) ≠ yj then error ← error + w[j]
   for j = 1 to N do
     if h[k](xj) = yj then w[j] ← w[j] · error ⁄ (1 − error)
   w ← NORMALIZE(w)
   z[k] ← log((1 − error) ⁄ error)
return WEIGHTED-MAJORITY(h, z)


Figure ?? The ADABOOST variant of the boosting method for ensemble learning. The algorithm generates hypotheses by successively reweighting the training examples. The function WEIGHTED-MAJORITY generates a hypothesis that returns the output value with the highest vote from the hypotheses in h, with votes weighted by z.
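The pseudocode above can be sketched in Python. This is a minimal illustration, not the book's reference implementation: the decision-stump learner `learn_stump`, the ±1 label encoding, and the clamping of `error` away from 0 and 1 (to keep the log and division well-defined) are assumptions added here; `adaboost` itself follows the figure line by line, with WEIGHTED-MAJORITY realized as a sign of the z-weighted vote.

```python
import math

def learn_stump(examples, w):
    """A simple stand-in for the learner L: a weighted decision stump on
    1-D inputs, choosing the threshold and sign with lowest weighted error."""
    xs = sorted(set(x for x, _ in examples))
    thresholds = [xs[0] - 1] + [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    best = None
    for t in thresholds:
        for sign in (+1, -1):
            err = sum(wj for (x, y), wj in zip(examples, w)
                      if (sign if x > t else -sign) != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: sign if x > t else -sign

def adaboost(examples, L, K):
    """ADABOOST from the figure: K rounds of learning on reweighted examples."""
    N = len(examples)
    w = [1 / N] * N                                  # example weights, initially 1/N
    h, z = [], []                                    # hypotheses and their weights
    for _ in range(K):
        hk = L(examples, w)
        error = sum(wj for (x, y), wj in zip(examples, w) if hk(x) != y)
        error = min(max(error, 1e-10), 1 - 1e-10)    # guard the log and division
        for j, (x, y) in enumerate(examples):
            if hk(x) == y:                           # downweight correct examples
                w[j] *= error / (1 - error)
        total = sum(w)                               # NORMALIZE
        w = [wj / total for wj in w]
        h.append(hk)
        z.append(math.log((1 - error) / error))
    def weighted_majority(x):                        # sign of the z-weighted vote
        return 1 if sum(zk * hk(x) for hk, zk in zip(h, z)) >= 0 else -1
    return weighted_majority

# Toy data: label +1 iff x > 3, which a single stump can already separate
data = [(x, 1 if x > 3 else -1) for x in range(1, 9)]
H = adaboost(data, learn_stump, K=5)
print([H(x) for x, _ in data])
```

With ±1 labels, summing z-weighted predictions and taking the sign is exactly a weighted-majority vote between the two output values.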