Press "Enter" to skip to content

An Explanation of Boosting, Bagging, and Stacking Ensembles

Ivan Palomares Carrascosa disambiguates three terms:

Unity makes strength. This well-known motto perfectly captures the essence of ensemble methods: one of the most powerful machine learning (ML) approaches (with due respect to deep neural networks) for effectively tackling complex problems built on complex data, by combining multiple models to address a single predictive task. This article describes three common ways to build ensemble models: boosting, bagging, and stacking. Let’s get started!

My explanation, which makes sense for people who grew up during the 1980s: bagging is Voltron, boosting is Rocky, and stacking is three raccoons in a trench coat.
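To make the analogy concrete, here is a minimal sketch of all three approaches using scikit-learn (my choice of library; the toy dataset and parameters are illustrative, not from the article):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    AdaBoostClassifier,
    BaggingClassifier,
    RandomForestClassifier,
    StackingClassifier,
)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# A synthetic binary classification problem to compare the three ensembles on.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Bagging (Voltron): identical learners trained in parallel on bootstrap
# samples, combined by voting. Defaults to decision-tree base learners.
bagging = BaggingClassifier(n_estimators=50, random_state=42)

# Boosting (Rocky): weak learners trained sequentially, each one focusing
# on the examples its predecessors got wrong.
boosting = AdaBoostClassifier(n_estimators=50, random_state=42)

# Stacking (three raccoons in a trench coat): heterogeneous models whose
# predictions feed a meta-learner that produces the final output.
stacking = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=42)),
        ("forest", RandomForestClassifier(n_estimators=25, random_state=42)),
    ],
    final_estimator=LogisticRegression(),
)

for name, model in [("bagging", bagging), ("boosting", boosting), ("stacking", stacking)]:
    print(f"{name}: {cross_val_score(model, X, y, cv=5).mean():.3f}")
```

Roughly speaking, bagging mainly reduces variance, boosting mainly reduces bias, and stacking lets dissimilar models cover each other's blind spots.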
