Ramandeep Kaur explains how decision trees work:
Simply put, a decision tree is a tree in which each branch node represents a choice between a number of alternatives, and each leaf node represents a decision.
It is a type of supervised learning algorithm (having a pre-defined target variable) that is mostly used in classification problems and works for both categorical and continuous input and output variables. It is one of the most widely used and practical methods for Inductive Inference. (Inductive inference is the process of reaching a general conclusion from specific examples.)
Decision trees learn and train themselves from given examples and then make predictions for unseen examples.
Click through for an example of implementing the ID3 algorithm and generating a decision tree from a data set.
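To give a flavor of what the linked post covers, here is a minimal sketch of the ID3 idea in Python: compute entropy, pick the attribute with the highest information gain, and split recursively. This is not Kaur's implementation; the toy dataset, column names, and helper functions below are illustrative assumptions.

```python
# Minimal ID3 sketch: entropy, information gain, recursive splitting.
# The tiny dataset and attribute names are hypothetical, for illustration only.
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total) for n in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Reduction in entropy after splitting the rows on attribute attr."""
    remainder = 0.0
    for value in set(row[attr] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
        remainder += len(subset) / len(labels) * entropy(subset)
    return entropy(labels) - remainder

def id3(rows, labels, attributes):
    """Build a decision tree as nested dicts; leaves are class labels."""
    if len(set(labels)) == 1:           # pure node: every example has the same class
        return labels[0]
    if not attributes:                  # no attributes left: fall back to majority vote
        return Counter(labels).most_common(1)[0][0]
    best = max(attributes, key=lambda a: information_gain(rows, labels, a))
    tree = {best: {}}
    for value in set(row[best] for row in rows):
        idx = [i for i, row in enumerate(rows) if row[best] == value]
        tree[best][value] = id3([rows[i] for i in idx],
                                [labels[i] for i in idx],
                                [a for a in attributes if a != best])
    return tree

# Hypothetical "play outside?" examples
rows = [
    {"outlook": "sunny",    "windy": "false"},
    {"outlook": "sunny",    "windy": "true"},
    {"outlook": "overcast", "windy": "false"},
    {"outlook": "rainy",    "windy": "false"},
    {"outlook": "rainy",    "windy": "true"},
]
labels = ["no", "no", "yes", "yes", "no"]
print(id3(rows, labels, ["outlook", "windy"]))
```

On this toy data the algorithm splits first on outlook (highest information gain) and only then on windy within the rainy branch, which is the same greedy, entropy-driven behavior the full ID3 walkthrough demonstrates.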