Data Information in Contingency Tables: A Fallacy of Hierarchical Loglinear Models
Volume 4, Issue 4 (2006), pp. 387–398
Pub. online: 4 August 2022
Type: Research Article
Open Access
Abstract
Information identities derived from entropy and relative entropy can be useful in statistical inference. For discrete data analyses, a recent study by the authors showed that the fundamental likelihood structure with categorical variables can be expressed in different yet equivalent information decompositions in terms of relative entropy. This clarifies an essential difference between the classical analysis of variance and the analysis of discrete data, revealing a fallacy in the analysis of hierarchical loglinear models. The discussion here is focused on the likelihood information of a three-way contingency table, without loss of generality. A classical three-way categorical data example is examined to illustrate the findings.
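The kind of identity the abstract refers to can be sketched numerically. The snippet below is an illustration only, not the paper's analysis: for a hypothetical 2×2×2 probability table it checks the standard chain-rule decomposition of relative entropy, in which the divergence from complete independence splits exactly into the divergence of X from (Y, Z) jointly plus the divergence within the (Y, Z) margin. All counts and variable names are invented for the example.

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p || q), summing over cells where p > 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical 2x2x2 table of counts (illustrative, not from the paper).
counts = np.array([[[10., 20.], [30., 40.]],
                   [[15., 25.], [35., 45.]]])
p = counts / counts.sum()

# One- and two-way marginals.
p_x = p.sum(axis=(1, 2))
p_y = p.sum(axis=(0, 2))
p_z = p.sum(axis=(0, 1))
p_yz = p.sum(axis=0)

# Fitted table under complete independence of X, Y, Z.
fit_xyz = p_x[:, None, None] * p_y[None, :, None] * p_z[None, None, :]
# Fitted table under X independent of (Y, Z) jointly.
fit_x_yz = p_x[:, None, None] * p_yz[None, :, :]
# Fitted (Y, Z) margin under independence of Y and Z.
fit_yz = p_y[:, None] * p_z[None, :]

# Chain rule: D(p || p_x p_y p_z) = D(p || p_x p_yz) + D(p_yz || p_y p_z).
total = kl(p, fit_xyz)
decomposed = kl(p, fit_x_yz) + kl(p_yz, fit_yz)
print(total, decomposed)  # the two sides agree
```

Conditioning in a different order (e.g., splitting off the (X, Z) margin instead) yields a different but numerically equivalent decomposition of the same total, which is the sense in which the likelihood information admits "different yet equivalent" decompositions.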