AdaBoost works by iteratively training a sequence of weak classifiers on the same dataset. Initially, all training instances are given equal weight. After each round, the weights of misclassified instances are increased, focusing the next classifier on the harder-to-classify instances. The final model is a weighted vote of all the classifiers, with each classifier's vote weighted by its accuracy on the weighted training data, which results in a stronger overall model.
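The loop above can be sketched in a few dozen lines. The following is a minimal illustration of discrete AdaBoost using one-feature threshold "stumps" as the weak classifiers; the stump search, the toy dataset, and all function names here are choices made for this sketch, not part of any particular library.

```python
import numpy as np

def fit_stump(X, y, w):
    """Exhaustively find the best single-feature threshold classifier
    under instance weights w. Labels y are assumed to be in {-1, +1}."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] > t, 1, -1)
                err = np.sum(w[pred != y])   # weighted error
                if err < best_err:
                    best, best_err = (j, t, sign), err
    return best, best_err

def adaboost(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)                  # equal initial weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        (j, t, sign), err = fit_stump(X, y, w)
        err = max(err, 1e-10)                # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)  # classifier's vote weight
        pred = sign * np.where(X[:, j] > t, 1, -1)
        w *= np.exp(-alpha * y * pred)       # up-weight misclassified points
        w /= w.sum()                         # renormalize to a distribution
        stumps.append((j, t, sign))
        alphas.append(alpha)
    return stumps, alphas

def predict(X, stumps, alphas):
    """Weighted vote: sign of the alpha-weighted sum of stump outputs."""
    score = np.zeros(len(X))
    for (j, t, sign), alpha in zip(stumps, alphas):
        score += alpha * sign * np.where(X[:, j] > t, 1, -1)
    return np.sign(score)

# Toy 1-D example: the positive class is an interval, which no single
# stump can separate, but a weighted vote of three stumps can.
X = np.array([[0.], [1.], [2.], [3.], [4.], [5.]])
y = np.array([-1, -1, 1, 1, -1, -1])
stumps, alphas = adaboost(X, y, n_rounds=3)
print(predict(X, stumps, alphas))  # matches y after three rounds
```

Note how no individual stump can label the interval correctly, yet the accuracy-weighted combination does; this is the essence of boosting a weak learner into a strong one.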