Abstract

Generalized estimating equations (GEEs) are a popular regression approach for correlated data that requires specification of the first two marginal moments of the data, along with a working correlation matrix capturing the covariation between responses, e.g., temporal correlations within clusters in longitudinal data. The majority of research on, and application of, GEEs has focused on estimation and inference for the regression coefficients in the marginal mean. When it comes to prediction using GEEs, practitioners often simply, and quite understandably, also base it on the regression model characterizing the marginal mean.

We propose a simple adjustment to predictions in GEEs that utilizes information in the assumed working correlation matrix. By viewing the GEE from the perspective of solving a working linear model, we borrow ideas from kriging to construct a "conditional" predictor that leverages correlations between current and future observations within the same cluster. We establish theoretical conditions under which the adjusted GEE predictor outperforms the standard (unadjusted) predictor. Simulations show that, even when the working correlation is misspecified, adjusted GEE predictors can achieve better performance than standard GEE predictors, the so-called "oracle" GEE predictor using all time points, and potentially even cluster-specific predictions from a generalized linear mixed model.
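The kriging-style "conditional" predictor described in the abstract can be sketched as a standard conditional-mean adjustment under an assumed working covariance: the marginal mean prediction plus correlation-weighted residuals from observed responses in the same cluster. The function name, arguments, and toy numbers below are illustrative assumptions for exposition, not the speakers' actual implementation:

```python
import numpy as np

def adjusted_gee_prediction(mu_obs, mu_new, y_obs, V):
    """Kriging-style adjustment (sketch): marginal mean for a future
    observation plus working-covariance-weighted residuals from the
    observed responses in the same cluster.

    V is the working covariance of (observed, new) responses, with the
    observed block first.
    """
    n = len(mu_obs)
    V_oo = V[:n, :n]   # working covariance among observed responses
    V_no = V[n:, :n]   # working covariance between new and observed
    # Conditional (kriging) mean: mu_new + V_no V_oo^{-1} (y_obs - mu_obs).
    # The standard GEE predictor keeps only the first term, mu_new.
    return mu_new + V_no @ np.linalg.solve(V_oo, y_obs - mu_obs)

# Toy example: AR(1)-style working correlation over 4 time points in one
# cluster, predicting the 4th response from the first 3.
rho = 0.6
t = np.arange(4)
V = rho ** np.abs(t[:, None] - t[None, :])
mu = np.zeros(4)                    # marginal mean fits (toy values)
y_obs = np.array([1.0, 0.5, 0.8])   # observed responses in the cluster
pred = adjusted_gee_prediction(mu[:3], mu[3:], y_obs, V)
```

Under this AR(1) working correlation the adjustment reduces, by the Markov property, to `rho` times the most recent residual, illustrating how the choice of working correlation drives the prediction adjustment.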

This is joint work with Samuel Mueller and A. H. Welsh.

Speaker

A/Prof. Francis K. C. Hui 

Research Area

Statistics seminar

Affiliation

Australian National University

Date

Friday, 13 Feb 2026, 4:00 pm

Venue

Microsoft Teams / Anita B. Lawrence 4082