Concept:
In machine learning, ensemble methods combine multiple models to improve predictive performance. One such method is Bagging (Bootstrap Aggregating).
Bagging reduces variance by training multiple models (often decision trees) on bootstrap samples of the training data and then aggregating their predictions.
Step 1: Understand Bagging.
Bagging works by creating several training datasets using bootstrap sampling (sampling with replacement, typically drawing as many examples as the original dataset contains). Each dataset is used to train a separate model; on average, a bootstrap sample contains about 63% of the unique original examples.
- Each model sees a slightly different subset of the data.
- The models are trained independently.
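The sampling step above can be sketched in a few lines. This is a minimal illustration using NumPy on a toy dataset of ten points; the dataset and the number of samples are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
X = np.arange(10)  # toy "dataset" of 10 examples

# Draw 3 bootstrap samples: each has the same size as X and is
# drawn with replacement, so duplicates appear and some original
# points are left out.
bootstraps = [rng.choice(X, size=len(X), replace=True) for _ in range(3)]
for b in bootstraps:
    print(sorted(b))
```

Each printed sample has length 10 but usually omits a few of the original values, which is what gives each model a slightly different view of the data.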
Step 2: Combine predictions.
After training multiple models:
- For classification, predictions are combined using majority voting.
- For regression, predictions are combined using averaging.
This reduces model variance and improves stability.
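Both aggregation rules can be shown concretely. The predictions below are made-up numbers standing in for the outputs of independently trained models; only the vote and the average are the point of the sketch.

```python
from collections import Counter
import numpy as np

# Hypothetical per-example predictions from 5 classifiers and 3 regressors.
clf_preds = [["cat", "dog", "cat"],
             ["cat", "cat", "cat"],
             ["dog", "dog", "cat"],
             ["cat", "dog", "dog"],
             ["cat", "dog", "cat"]]
reg_preds = [[2.0, 3.1],
             [1.8, 2.9],
             [2.2, 3.0]]

# Classification: majority vote per example across models.
voted = [Counter(col).most_common(1)[0][0] for col in zip(*clf_preds)]
print(voted)  # -> ['cat', 'dog', 'cat']

# Regression: average per example across models.
averaged = np.mean(reg_preds, axis=0)
print(averaged)  # averages to 2.0 and 3.0
```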
Step 3: Example.
A popular algorithm based on bagging is the Random Forest, where many decision trees are trained on different bootstrap samples of the dataset. Random Forests additionally consider only a random subset of features at each split, which further decorrelates the trees.
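To tie the steps together, here is a minimal from-scratch sketch of a bagged ensemble, assuming only NumPy. It uses decision stumps (single-threshold classifiers) instead of full trees, on a toy 1-D problem with 10% label noise; the data, the number of stumps, and the threshold grid are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Toy 1-D data: class 1 when x > 0.5, with 10% of labels flipped.
X = rng.uniform(0, 1, size=200)
y = (X > 0.5).astype(int)
flip = rng.random(200) < 0.1
y[flip] ^= 1

def fit_stump(X, y):
    """Pick the threshold on a fixed grid that best separates the classes."""
    best_t, best_acc = 0.0, -1.0
    for t in np.linspace(0, 1, 51):
        acc = np.mean((X > t).astype(int) == y)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# Bagging: train each stump on its own bootstrap sample.
thresholds = []
for _ in range(25):
    idx = rng.integers(0, len(X), size=len(X))  # sample with replacement
    thresholds.append(fit_stump(X[idx], y[idx]))

# Combine by majority vote across the ensemble.
votes = np.stack([(X > t).astype(int) for t in thresholds])
ensemble_pred = (votes.mean(axis=0) > 0.5).astype(int)
print("ensemble accuracy:", np.mean(ensemble_pred == y))
```

The accuracy is limited by the injected label noise, but the vote over many bootstrap-trained stumps is more stable than any single stump, which is the variance reduction bagging is after.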
Step 4: Conclusion.
Since the technique described trains multiple models on bootstrap samples of the data and aggregates their predictions to reduce variance, the correct answer is:
\[
\text{Bagging}
\]