C5 Causal Decision Tree
Keywords:
Decision tree, Causal relationship, Potential outcome model, Partial association
Abstract
A decision tree is built top-down from a root node and involves partitioning the data into subsets that contain instances with similar (homogeneous) values. If a sample is completely homogeneous its entropy is zero, and if it is equally divided between classes its entropy is one. In the comparison graph, one line shows the C5.0 algorithm and the blue line shows the proposed algorithm. The graph shows that the accuracy of the improved C5.0 is high when the data size is small, whereas the accuracy of the standard C5.0 algorithm decreases as the data size grows. For large data sizes, the accuracy of the proposed model is better than that of C5.0.
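As a concrete illustration of the entropy criterion described above, the following Python sketch computes the entropy of a labeled sample and the information gain of a candidate split. It is a minimal sketch for illustration only; the function names and the toy data are assumptions, not taken from the paper or from the C5.0 implementation.

```python
import math
from collections import Counter

def entropy(labels):
    """Entropy of a list of class labels, in bits.
    0.0 for a completely homogeneous sample,
    1.0 for a two-class sample that is equally divided."""
    counts = Counter(labels)
    if len(counts) <= 1:
        return 0.0
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(labels, groups):
    """Reduction in entropy obtained by partitioning `labels`
    into the subsets in `groups` (a candidate split)."""
    total = len(labels)
    weighted = sum(len(g) / total * entropy(g) for g in groups)
    return entropy(labels) - weighted

# Toy example: a homogeneous sample vs. an equally divided one.
print(entropy(["yes"] * 8))               # 0.0 (completely homogeneous)
print(entropy(["yes"] * 4 + ["no"] * 4))  # 1.0 (equally divided)

# Candidate split of an 8-instance sample into two subsets.
parent = ["yes"] * 5 + ["no"] * 3
split = [["yes"] * 4, ["yes"] + ["no"] * 3]
print(round(information_gain(parent, split), 3))  # ~0.549
```

In a tree-building pass, the split with the largest information gain (equivalently, the largest drop in entropy) would be chosen at each node before recursing into the resulting subsets.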
License
Copyright (c) IJSRCSEIT

This work is licensed under a Creative Commons Attribution 4.0 International License.