Koos van Strien moves from Python to R to run an xgboost algorithm:
Note that the parameters of xgboost used here fall into the following categories:
- General parameters
  - nthread (number of threads used; here 8 = the number of cores in my laptop)
- Booster parameters
  - max.depth (maximum depth of a tree)
  - eta (step size shrinkage, i.e. the learning rate)
- Learning task parameters
  - objective: type of learning task (softmax for multiclass classification)
  - num_class: needed for the “softmax” objective: how many classes to predict?
- Command line parameters
  - nround: number of rounds for boosting
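
To make the parameter list concrete, here is a minimal R sketch of how those settings might be passed to xgboost. This is not Koos's code: the built-in iris data stands in for his data set, and the values for max.depth, eta, num_class, and nrounds are illustrative (only nthread = 8 comes from the list above).

```r
library(xgboost)

# Placeholder data: a numeric feature matrix plus integer class labels 0..(k-1),
# which is what the multiclass softmax objective expects
train_matrix <- as.matrix(iris[, 1:4])
train_label  <- as.integer(iris$Species) - 1

bst <- xgboost(
  data      = train_matrix,
  label     = train_label,
  nthread   = 8,                 # general parameter: number of threads
  max.depth = 6,                 # booster parameter: maximum tree depth (illustrative)
  eta       = 0.3,               # booster parameter: learning rate (illustrative)
  objective = "multi:softmax",   # learning task parameter: multiclass classification
  num_class = 3,                 # number of classes in this placeholder data set
  nrounds   = 10                 # number of boosting rounds (nround in the docs)
)

# Predictions come back as class indices 0..(num_class - 1)
pred <- predict(bst, train_matrix)
```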
Read the whole thing.