Introduction
This post is the practical follow‑up to When Linear Mixed Models Don’t Converge, where I discussed the theoretical reasons why convergence issues often arise in linear mixed models. Here, we move from theory to practice: simulated examples, diagnostic tools, interpretation of singular fits, and practical modeling alternatives.
If you haven’t read the first part yet, you may want to start there for the conceptual foundations.
Figure 1. Conceptual map showing how model complexity and available information interact.
Legend: green = identifiable models; yellow = risk of singularity; red = non‑identifiable models.
A Concrete Example Using R: When a Random-Slope Model Becomes Singular
Let’s simulate a simple dataset with only 8 groups and a very small random‑slope variance. This is a classic scenario where the model converges numerically but is statistically non‑identifiable.
```r
set.seed(123)

n_groups    <- 8
n_per_group <- 20

group <- factor(rep(1:n_groups, each = n_per_group))
x     <- rnorm(n_groups * n_per_group)

beta0 <- 2
beta1 <- 1
u0 <- rnorm(n_groups, 0, 1)     # random intercepts
u1 <- rnorm(n_groups, 0, 0.05)  # very small random-slope variation

y  <- beta0 + u0[group] + (beta1 + u1[group]) * x +
  rnorm(n_groups * n_per_group, 0, 1)
df <- data.frame(y, x, group)

library(lme4)
m <- lmer(y ~ x + (x | group), data = df)
summary(m)
isSingular(m)
```
Typical output:
Random‑slope standard deviation ≈ 0.05 (essentially zero)
Correlation between intercept and slope estimated at 1.00
Message: boundary (singular) fit
isSingular(m) returns TRUE
This is not a numerical failure. It is a mathematical signal that the model is too complex for the available information.
How to Interpret a Singular Fit
A singular fit occurs when the estimated random‑effects variance–covariance matrix is not of full rank. This typically means:
A random‑effect variance is estimated as zero
A correlation hits ±1
The Hessian is not invertible
The model lies on the boundary of the parameter space
In practice, this means the data do not support the random‑effects structure you specified.
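You can see this boundary directly in the fitted object. As a sketch (assuming the model m from the simulation above), getME(m, "theta") returns the relative covariance parameters of the random effects; isSingular() simply checks whether any of them sits (numerically) on the zero boundary:

```r
library(lme4)

# theta holds the relative Cholesky factors of the random-effects
# covariance matrix; a (numerically) zero element means the fit
# lies on the boundary of the parameter space
theta <- getME(m, "theta")
print(theta)

# isSingular() applies essentially this check, up to a tolerance
any(theta < 1e-4)
isSingular(m, tol = 1e-4)
```

If the correlation is estimated at ±1, the same boundary shows up as a rank‑deficient Cholesky factor rather than a zero variance.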
Diagnostic Checklist
Red flags
Variance of a random effect is exactly zero
Correlation between random effects is ±1
isSingular(model) returns TRUE
Large gradients or warnings about the Hessian
Estimates change dramatically when switching optimizers
Useful tools
lme4::isSingular()
lmerControl(optimizer = ...) for diagnosis only
Likelihood‑ratio tests between nested models
Checking the number of levels per random effect
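The optimizer check from the list above can be sketched as follows (assuming the model m fitted earlier). If two different optimizers land on the same boundary estimates, the singularity is a property of the model, not a numerical failure:

```r
library(lme4)

# refit the same model with two different optimizers
m_bobyqa <- update(m, control = lmerControl(optimizer = "bobyqa"))
m_nm     <- update(m, control = lmerControl(optimizer = "Nelder_Mead"))

# compare fixed effects and variance components across optimizers
cbind(bobyqa = fixef(m_bobyqa), nelder_mead = fixef(m_nm))
VarCorr(m_bobyqa)
VarCorr(m_nm)

# number of levels per random-effects grouping factor
ngrps(m)
```

With only 8 groups, ngrps() makes the core problem visible: there is very little information from which to estimate a slope variance and a correlation.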
Practical Alternatives When the Model Is Too Complex
1. Simplify the random‑effects structure
This is often the most appropriate solution.
Examples:
Remove the random slope
Remove correlations using (1 | group) + (0 + x | group)
Fit a random‑intercept model first and evaluate the necessity of slopes
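A minimal sketch of this strategy, assuming the df from the simulation above: fit the progressively simpler structures and compare them with likelihood‑ratio tests (anova() refits with ML for the comparison):

```r
library(lme4)

m_full      <- lmer(y ~ x + (x | group), data = df)  # singular
m_uncorr    <- lmer(y ~ x + (1 | group) + (0 + x | group), data = df)
m_intercept <- lmer(y ~ x + (1 | group), data = df)

# likelihood-ratio tests between nested random-effects structures;
# note these tests are conservative because the null hypothesis
# (zero variance) lies on the boundary of the parameter space
anova(m_full, m_uncorr, m_intercept)
```

If the random slope adds essentially nothing to the likelihood, the random‑intercept model is the defensible choice.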
2. Bayesian models with weakly informative priors
Bayesian estimation stabilizes variance estimates near zero.
Example:
```r
library(brms)

# exponential(2) is a weakly informative prior on the random-effect
# standard deviations; it gently pulls them away from the hard
# zero boundary without ruling small values out
m_bayes <- brm(
  y ~ x + (x | group),
  data  = df,
  prior = prior(exponential(2), class = sd)
)
summary(m_bayes)
```
3. Penalized mixed models (e.g., glmmTMB)
Penalization shrinks unstable variance components.
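As a sketch (again assuming the simulated df), glmmTMB fits the same structure but estimates variance components on the log scale internally, so standard deviations stay strictly positive and near‑boundary fits appear as very small estimates rather than hard zeros:

```r
library(glmmTMB)

# same random-slope model; glmmTMB parameterizes log standard
# deviations, so variance components cannot hit exactly zero
m_tmb <- glmmTMB(y ~ x + (x | group), data = df)
summary(m_tmb)
VarCorr(m_tmb)
```

A slope standard deviation estimated at, say, 1e-5 here carries the same message as a singular lme4 fit: the data do not support that component.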
4. Marginal models (GEE)
When the focus is on fixed effects rather than subject‑specific effects.
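A GEE sketch using geepack (one possible implementation; assuming the simulated df): the model targets the population‑averaged effect of x and treats within‑group correlation as a nuisance, so no random‑effects variances need to be identified at all:

```r
library(geepack)

# geeglm requires rows to be ordered by cluster
df_ord <- df[order(df$group), ]

# exchangeable working correlation; the robust (sandwich) standard
# errors remain valid even if this correlation structure is wrong
m_gee <- geeglm(y ~ x,
                id     = group,
                data   = df_ord,
                corstr = "exchangeable")
summary(m_gee)
```

The price is that you lose group‑specific predictions; GEE is appropriate only when the fixed effects are the scientific target.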
Summary
Singular fits are not numerical accidents. They are statistical messages telling you that:
the model is over‑parameterized,
the data do not contain enough information,
and the random‑effects structure needs to be reconsidered.
Understanding this distinction leads to more robust modeling decisions and clearer scientific conclusions.
