Fisseha Berhane explains how to implement Extreme Gradient Boosting in R:
What makes it so popular is its speed and performance: it delivers some of the best results in many machine learning applications. xgboost is an optimized gradient-boosting machine learning library whose core algorithm is parallelizable, so it can use all the processing power of your machine, or of the machines in your cluster. According to the R package documentation, because the package can automatically run parallel computation on a single machine, it can be more than 10 times faster than existing gradient boosting packages.
xgboost shines when we have lots of training data and the features are numeric or a mixture of numeric and categorical fields. It is also important to note that xgboost is not the best algorithm out there when all the features are categorical or when the number of rows is less than the number of fields (columns).
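As a minimal sketch of the workflow described above, the following fits an xgboost model in R on the mushroom classification data bundled with the package, using the `nthread` parameter to enable parallel computation on a single machine (assumes the `xgboost` package is installed):

```r
library(xgboost)

# Example data shipped with the package: sparse numeric feature matrices.
data(agaricus.train, package = "xgboost")
data(agaricus.test, package = "xgboost")

# xgboost() accepts a (sparse) numeric matrix; purely categorical fields
# would first be one-hot encoded, e.g. via model.matrix().
bst <- xgboost(
  data      = agaricus.train$data,
  label     = agaricus.train$label,
  nrounds   = 10,
  objective = "binary:logistic",
  nthread   = 2   # parallel computation on a single machine
)

pred <- predict(bst, agaricus.test$data)
head(pred)
```

The `nrounds` and `nthread` values here are illustrative; in practice they would be tuned to the data set and the hardware available.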
xgboost is a nice complement to neural networks, as the two tend to excel at different things: gradient boosting on structured, tabular data, and neural networks on unstructured data such as images and text.