decision tree



de·ci·sion tree

(dē-sizh'ŭn trē),
A graphic construct showing available choices at each decision node of managing a clinical problem along with probabilities (if known) of possible outcomes for patient's freedom from disability, life expectancy, and mortality.
Farlex Partner Medical Dictionary © Farlex 2012

decision tree

Decision-making A schematic representation of the major steps in a clinical decision algorithm; a DT begins with the statement of a clinical problem, proceeds along branches based on the presence or absence of certain objective features, and eventually arrives at a conclusion
McGraw-Hill Concise Dictionary of Modern Medicine. © 2002 by The McGraw-Hill Companies, Inc.

References in periodicals archive
Our previous decision tree model for detecting diabetes comprises five risk factors: age, waist/hip ratio (WHR), waist, duration of hypertension, and weight, achieving an AUC of 0.731.
Input: an attribute set dataset D
Output: a decision tree
(a) Tree = {}
(b) if D is "pure" or other end conditions are met, then
(c)     terminate
(d) end if
(e) for each attribute a ∈ D do
(f)     compute the information gain ratio (InGR)
(g) end for
(h) a_best = the attribute with the highest InGR
(i) Tree = create a tree with only the node a_best in the root
(j) D_v = generate a subset from D except a_best
(k) for all D_v do
(l)     subtree = C4.5(D_v)
(m)     set the subtree to the corresponding branch of Tree according to the InGR
(n) end for
The training steps of the LVQ algorithm are as follows.
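The steps above can be sketched as a short Python function. This is a minimal illustration under my own assumptions (rows as dicts of categorical attributes, helper names like `gain_ratio` invented for clarity; no pruning or continuous-attribute handling, so it is not a full C4.5 implementation):

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of the class distribution in `labels`
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(rows, labels, attr):
    # Information gain ratio of splitting on `attr` (steps (e)-(g))
    n = len(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr], []).append(y)
    gain = entropy(labels) - sum(len(g) / n * entropy(g) for g in groups.values())
    split_info = -sum(len(g) / n * math.log2(len(g) / n) for g in groups.values())
    return gain / split_info if split_info > 0 else 0.0

def c45(rows, labels, attrs):
    # (b)-(d): stop when D is "pure" or no attributes remain; emit a leaf
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]
    # (h): a_best = the attribute with the highest gain ratio
    best = max(attrs, key=lambda a: gain_ratio(rows, labels, a))
    tree = {best: {}}  # (i): root node holding a_best
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[best], []).append((row, y))
    rest = [a for a in attrs if a != best]  # (j): D_v excludes a_best
    for value, subset in groups.items():   # (k)-(n): recurse on each branch
        srows, slabels = zip(*subset)
        tree[best][value] = c45(list(srows), list(slabels), rest)
    return tree

rows = [
    {"windy": True,  "hot": True},
    {"windy": True,  "hot": False},
    {"windy": False, "hot": True},
    {"windy": False, "hot": False},
]
labels = ["no", "no", "yes", "yes"]
print(c45(rows, labels, ["windy", "hot"]))  # {'windy': {True: 'no', False: 'yes'}}
```

On the toy data, "windy" perfectly separates the classes (gain ratio 1.0), so it is chosen as the root and both branches terminate in pure leaves.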
The core algorithm used in this paper for building the decision tree is based on ID3 and J48, which use entropy and information gain to construct the tree.
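Entropy and information gain, the quantities mentioned here, are the ID3 splitting criterion (J48/C4.5 additionally normalizes the gain by the split information). A minimal sketch, with function names and the list-of-groups input format chosen for illustration:

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy H(D) = -sum p_i * log2(p_i) over the class proportions
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, split):
    # Gain(D, a) = H(D) - sum |D_v|/|D| * H(D_v), where `split` is the list
    # of label groups produced by partitioning D on attribute a's values
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in split)

# A perfectly informative split of a balanced binary set yields gain 1.0:
labels = ["yes", "yes", "no", "no"]
print(information_gain(labels, [["yes", "yes"], ["no", "no"]]))  # 1.0
```

An uninformative split (each group keeps the original class mix) yields gain 0.0, so the attribute would never be selected.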
This section presents the results of the J48 decision tree algorithm for classification of skin disease, as shown in Figure 3, Table 4, and Table 5, respectively.
Comparison of artificial neural network and decision tree algorithms used for predicting live weight at post weaning period from some biometrical characteristics in Harnai sheep.
During the development of the Assistant learning algorithm, I intuitively developed a "simple statistical method", as I called it at the time, and compared its results with decision trees. The surprisingly simple method performed as well on the primary tumor problem as Assistant did.
The overall prediction accuracy of the decision tree model is (54+253)/347=88.5 percent, indicating an acceptable prediction model.
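The arithmetic quoted above is the standard accuracy computation: correctly classified cases (the diagonal of the confusion matrix, here presumably 54 and 253 for the two classes) divided by the total. A trivial sketch:

```python
def accuracy(true_pos, true_neg, total):
    # Overall accuracy = (correct positives + correct negatives) / all cases
    return (true_pos + true_neg) / total

# Reproducing the figure from the excerpt: (54 + 253) / 347
print(round(accuracy(54, 253, 347) * 100, 1))  # 88.5
```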
Min, Lee and Han (2006) discuss other pros and cons of decision trees and their applicability.
(1) Establishment of the decision tree model; (2) decision tree classification in ENVI; (3) accuracy evaluation in ENVI.
OECD countries were categorized into six groups according to the decision tree model.
A decision tree model provides an intuitive and visual interface useful for constructing an optimization ERM decision framework.
