Recursive partitioning
Statistical method for decision tree construction
Recursive partitioning is a statistical method used for constructing decision trees. It is a fundamental technique in machine learning and data mining for classification and regression tasks. The method involves splitting a dataset into subsets, which are then split into further subsets, recursively, to form a tree structure. This process continues until the subsets are sufficiently homogeneous or meet other stopping criteria.
Overview
Recursive partitioning is used to create a model that predicts the value of a target variable based on several input variables. The process begins with the entire dataset and involves the following steps:
1. Splitting: The dataset is split into two or more subsets based on a splitting criterion, so that each subset is more homogeneous than its parent.
2. Stopping: The process stops when a stopping criterion is met, such as a minimum number of samples in a node or a maximum tree depth.
3. Pruning: After the tree is fully grown, it may be pruned to remove branches that contribute little to predictive accuracy, which helps to prevent overfitting.
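A minimal sketch of this procedure in Python follows; the function and parameter names (build_tree, best_split, min_samples, max_depth) are illustrative only and do not come from any standard library.

# Minimal illustrative sketch of recursive partitioning for classification.
from collections import Counter

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(rows, labels):
    """Find the (feature index, threshold) pair that minimises weighted Gini impurity."""
    best, best_score = None, float("inf")
    for f in range(len(rows[0])):
        for t in sorted(set(r[f] for r in rows)):
            left = [y for r, y in zip(rows, labels) if r[f] <= t]
            right = [y for r, y in zip(rows, labels) if r[f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
            if score < best_score:
                best, best_score = (f, t), score
    return best

def build_tree(rows, labels, depth=0, max_depth=3, min_samples=2):
    """Recursively partition the data until a stopping criterion is met."""
    # Stopping: pure node, too few samples, or maximum depth reached.
    if len(set(labels)) == 1 or len(labels) < min_samples or depth >= max_depth:
        return Counter(labels).most_common(1)[0][0]  # leaf: majority class
    split = best_split(rows, labels)
    if split is None:
        return Counter(labels).most_common(1)[0][0]
    f, t = split
    left = [(r, y) for r, y in zip(rows, labels) if r[f] <= t]
    right = [(r, y) for r, y in zip(rows, labels) if r[f] > t]
    return {
        "feature": f,
        "threshold": t,
        "left": build_tree([r for r, _ in left], [y for _, y in left], depth + 1, max_depth, min_samples),
        "right": build_tree([r for r, _ in right], [y for _, y in right], depth + 1, max_depth, min_samples),
    }

# Example: two numeric features, binary labels.
X = [[2.0, 3.0], [1.0, 1.5], [3.5, 2.0], [4.0, 4.5], [1.2, 0.5], [3.8, 3.9]]
y = [0, 0, 1, 1, 0, 1]
print(build_tree(X, y))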
Splitting Criteria
The choice of splitting criterion is crucial for the performance of the decision tree. Common criteria include:
- Gini impurity: Measures the impurity of a node; used for classification splits in the classification and regression tree (CART) algorithm.
- Information gain: Used in the ID3 and C4.5 algorithms, it measures the reduction in entropy produced by a split.
- Variance reduction: Used for regression trees, it measures the reduction in variance of the target variable.
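As an illustration, these three criteria can be computed as in the following Python sketch; the function names are illustrative and not taken from any particular library.

# Illustrative computations of common splitting criteria.
import math
from collections import Counter

def gini_impurity(labels):
    """Gini impurity: 1 minus the sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Shannon entropy of the class distribution, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent, left, right):
    """Reduction in entropy achieved by splitting parent into left and right."""
    n = len(parent)
    return entropy(parent) - (len(left) / n) * entropy(left) - (len(right) / n) * entropy(right)

def variance(values):
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def variance_reduction(parent, left, right):
    """Reduction in variance of a numeric target, used for regression trees."""
    n = len(parent)
    return variance(parent) - (len(left) / n) * variance(left) - (len(right) / n) * variance(right)

# Example: a split that separates the classes perfectly yields maximal information gain.
parent = [0, 0, 1, 1]
print(gini_impurity(parent))                      # 0.5
print(information_gain(parent, [0, 0], [1, 1]))   # 1.0 bit
print(variance_reduction([1.0, 2.0, 9.0, 10.0], [1.0, 2.0], [9.0, 10.0]))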
Applications
Recursive partitioning is widely used in various fields, including:
- Medicine: For diagnostic and prognostic models.
- Finance: For credit scoring and risk assessment.
- Marketing: For customer segmentation and targeting.
Advantages and Disadvantages
Advantages
- Interpretability: Decision trees are easy to interpret and visualize.
- Non-parametric: They do not assume any underlying distribution of the data.
- Versatility: Can handle both numerical and categorical data.
Disadvantages
- Overfitting: Trees can become overly complex and fit the noise in the data.
- Instability: Small changes in the data can result in a completely different tree.
Example

A well-known example is a decision tree constructed by recursive partitioning to predict the survival of passengers on the RMS Titanic. The tree splits the data on features such as sex, age, and passenger class, illustrating how recursive partitioning can be used to model complex relationships in data.
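A similar tree can be fitted with a library implementation. The sketch below uses scikit-learn's DecisionTreeClassifier on a small synthetic, Titanic-like table; the data values and column names are invented for illustration and are not the actual passenger records.

# Sketch only: the data below are synthetic stand-ins for the real Titanic records.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

data = pd.DataFrame({
    "sex":      [0, 1, 1, 0, 0, 1, 1, 0],   # 0 = male, 1 = female (encoded)
    "age":      [22, 38, 26, 35, 54, 2, 27, 40],
    "pclass":   [3, 1, 3, 1, 1, 3, 2, 3],
    "survived": [0, 1, 1, 0, 0, 1, 1, 0],
})

X = data[["sex", "age", "pclass"]]
y = data["survived"]

# Recursive partitioning with Gini impurity and a depth limit as the stopping rule.
tree = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
tree.fit(X, y)

# Print the learned splits as text rules.
print(export_text(tree, feature_names=list(X.columns)))

Limiting max_depth here plays the role of the stopping criterion described in the Overview; pruning could be applied afterwards to simplify the grown tree further.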