XGBoost

Koos van Strien moves from Python to R to run an xgboost algorithm:

Note that the parameters of xgboost used here fall into four categories:

  • General parameters

    • nthread (number of threads used, here 8 = the number of cores in my laptop)
  • Booster parameters

    • max.depth (of tree)
    • eta
  • Learning task parameters

    • objective: type of learning task (softmax for multiclass classification)
    • num_class: needed for the “softmax” algorithm: how many classes to predict?
  • Command line parameters

    • nround: number of rounds for boosting
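As a rough sketch of how those four categories come together in an R call (this is not the post's exact code — the data, parameter values, and `nrounds` spelling here are illustrative; recent versions of the R package use `nrounds` rather than `nround`):

```r
library(xgboost)

# Illustrative training data: a numeric feature matrix and integer
# class labels starting at 0, as xgboost expects for multiclass tasks
train_matrix <- as.matrix(iris[, 1:4])
train_labels <- as.integer(iris$Species) - 1

model <- xgboost(
  data      = train_matrix,
  label     = train_labels,
  nthread   = 8,               # general parameter: threads (match your core count)
  max.depth = 6,               # booster parameter: maximum tree depth
  eta       = 0.3,             # booster parameter: learning rate / shrinkage
  objective = "multi:softmax", # learning task parameter: multiclass classification
  num_class = 3,               # learning task parameter: number of classes
  nrounds   = 10               # number of boosting rounds
)

# With multi:softmax, predict() returns class indices 0..(num_class - 1)
preds <- predict(model, train_matrix)
```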

Read the whole thing.
