• Document: Lecture 22: Introduction to Log-linear Models


Lecture 22: Introduction to Log-linear Models
Dipankar Bandyopadhyay, Ph.D.
BMTRY 711: Analysis of Categorical Data, Spring 2011
Division of Biostatistics and Epidemiology, Medical University of South Carolina

Log-linear Models

• Log-linear models are a type of generalized linear model.
• A common use of a log-linear model is to model the cell counts of a contingency table.
• The systematic component of the model describes how the expected cell counts vary as a function of the explanatory variables.
• Since the response of a log-linear model is the cell count, no measured variables are considered the response.

Recap from Previous Lectures

• Suppose that we have an I × J × Z contingency table.
• That is, there are I rows, J columns, and Z layers. (picture of cube)

Conditional Independence

We want to explore the concept of independence using a log-linear model. But first, let's review some probability theory.

Recall that two events A and B are independent if and only if

  P(AB) = P(A) × P(B)

Also recall that Bayes' law states that for any two events,

  P(A|B) = P(AB) / P(B)

and thus, when A and B are independent,

  P(A|B) = P(A)P(B) / P(B) = P(A)

Definitions: In layer k, where k ∈ {1, 2, . . ., Z}, X and Y are conditionally independent at level k of Z when

  P(Y = j | X = i, Z = k) = P(Y = j | Z = k), for all i, j

If X and Y are conditionally independent at ALL levels of Z, then X and Y are CONDITIONALLY INDEPENDENT.
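The definition above can be checked numerically: a minimal sketch, assuming made-up probabilities for a 2 × 2 × 2 table built so that π_{ijk} = π_{i·k} π_{·jk} / π_{··k}, which then satisfies P(Y = j | X = i, Z = k) = P(Y = j | Z = k). All numbers and variable names here are illustrative, not from the lecture.

```python
import numpy as np

# Illustrative conditional distributions (invented numbers):
pz = np.array([0.4, 0.6])                    # pz[k]    = P(Z = k)
px_z = np.array([[0.3, 0.7], [0.6, 0.4]])    # px_z[k, i] = P(X = i | Z = k)
py_z = np.array([[0.2, 0.8], [0.5, 0.5]])    # py_z[k, j] = P(Y = j | Z = k)

# Build pi[i, j, k] under conditional independence of X and Y given Z:
# pi_{ijk} = P(Z = k) P(X = i | Z = k) P(Y = j | Z = k)
pi = np.einsum('k,ki,kj->ijk', pz, px_z, py_z)
assert np.isclose(pi.sum(), 1.0)  # a valid joint distribution

# P(Y = j | X = i, Z = k) = pi_{ijk} / pi_{i.k}
py_given_xz = pi / pi.sum(axis=1, keepdims=True)

# It does not depend on i: for every i it equals P(Y = j | Z = k)
for i in range(2):
    assert np.allclose(py_given_xz[i], py_z.T)
```

The `einsum` call encodes the factorization directly; the final loop verifies that conditioning on X adds nothing once Z is known, which is exactly the definition on the slide.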
Application of the Multinomial

Suppose that a single multinomial applies to the entire three-way table, with cell probabilities

  π_{ijk} = P(X = i, Y = j, Z = k)

Let

  π_{·jk} = Σ_i P(X = i, Y = j, Z = k) = P(Y = j, Z = k)

Then,

  π_{ijk} = P(X = i, Z = k) P(Y = j | X = i, Z = k)

by application of Bayes' law (take the event (Y = j) as A and (X = i, Z = k) as B).

Then, if X and Y are conditionally independent at each level k of Z,

  π_{ijk} = P(X = i, Z = k) P(Y = j | X = i, Z = k)
          = π_{i·k} P(Y = j | Z = k)
          = π_{i·k} P(Y = j, Z = k) / P(Z = k)
          = π_{i·k} π_{·jk} / π_{··k}

for all i, j, and k.

The (2 × 2) Table

• Let's consider a (2 × 2) table for the moment.
• Let X describe the row effect and Y describe the column effect.
• If X and Y are independent, then π_{ij} = π_{i·} π_{·j}.
• Then the expected count for the ij-th cell is

  μ_{ij} = n π_{ij} = n π_{i·} π_{·j}

  or, on the log scale,

  log μ_{ij} = λ + λ_i^X + λ_j^Y

• This model is called the log-linear model of independence.

Interaction Term

• In terms of a regression model, a significant interaction term indicates that the response varies as a function of the combination of X and Y.
• That is, changes in the response as a function of X require the specification of Y to explain the change.
• This implies that X and Y are NOT INDEPENDENT.
• Let λ_{ij}^XY denote the interaction term.
• Testing λ_{ij}^XY = 0 is a test of independence.

Log-linear Models for (2 × 2) Tables

• Unifies all probability models discussed.
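The independence model and the test of λ_{ij}^XY = 0 can be sketched numerically. Under independence the fitted counts are μ_{ij} = n π_{i·} π_{·j}, so log μ_{ij} is additive in row and column effects, and the familiar Pearson statistic compares observed counts to these fitted values. The counts below are invented for illustration; this is a sketch, not the lecture's worked example.

```python
import numpy as np

# Invented 2x2 table of observed counts n_{ij}
counts = np.array([[30, 20],
                   [15, 35]])
n = counts.sum()

pi_row = counts.sum(axis=1) / n   # pi_{i.}
pi_col = counts.sum(axis=0) / n   # pi_{.j}

# Fitted counts under the log-linear model of independence:
# mu_{ij} = n * pi_{i.} * pi_{.j}
mu = n * np.outer(pi_row, pi_col)
assert np.isclose(mu.sum(), n)

# Additivity check: log mu_{ij} = lambda + lambda_i^X + lambda_j^Y,
# so the difference between the two rows of log mu is constant across
# columns (no interaction term needed).
log_mu = np.log(mu)
row_diff = log_mu[0] - log_mu[1]
assert np.allclose(row_diff, row_diff[0])

# Pearson statistic comparing observed to independence-fitted counts;
# a large value is evidence that lambda_{ij}^XY != 0.
x2 = ((counts - mu) ** 2 / mu).sum()
print(x2)  # here exceeds 3.84, the 5% chi-square cutoff with 1 df
```

The additivity assertion is the numeric face of log μ_{ij} = λ + λ_i^X + λ_j^Y: once the model holds, row contrasts do not depend on the column, which is exactly what a zero interaction term means.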
