Algorithms
Solving Linear Mixed Model Equations
lmt supports two types of solvers for the mixed model equations (MMEs): a direct solver and an iterative solver.
Iterative solver
The iterative solver uses the preconditioned conjugate gradient method and is lmt's default solver. It does not require the mixed model equation system to be constructed explicitly and is therefore less resource demanding than the direct solver: many models that cannot be solved with the direct solver can still be solved with the iterative solver. Even for small models, the iterative solver usually outperforms the direct solver in terms of total processing time.
The iterative solver is considered to have converged to a stable solution when $$\log_e\left(\sqrt{\frac{(Cx-b)'(Cx-b)}{b'b}}\right)<t,$$ where $$C$$ is the mixed model coefficient matrix, $$x$$ is the solution vector, $$b$$ is the right-hand side and $$t$$ is the convergence threshold. The default threshold is $$t=-18.42\approx\log_e(10^{-8})$$, which is equivalent to requiring $$\sqrt{\frac{\frac{1}{n}\sum_{i=1}^n \left((Cx)_i-b_i\right)^2}{\frac{1}{n}\sum_{i=1}^n b_i^2}}<10^{-8}.$$
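The following is a minimal sketch, in Python, of a matrix-free preconditioned conjugate gradient solver using a Jacobi (diagonal) preconditioner and the convergence criterion above. It is illustrative only and does not reflect lmt's internals; the function names, the choice of preconditioner and the zero starting value are assumptions.

```python
import numpy as np

def pcg_solve(matvec, b, diag_c, threshold=-18.42, max_iter=10_000):
    """Preconditioned conjugate gradients with a Jacobi (diagonal) preconditioner.

    matvec    -- function returning C @ v without C being stored explicitly
    b         -- right-hand side of the MME
    diag_c    -- diagonal of the coefficient matrix (the preconditioner)
    threshold -- bound on log_e of the relative residual norm (-18.42 ~ 1e-8)
    """
    b = np.asarray(b, dtype=float)
    x = np.zeros_like(b)
    r = b - matvec(x)            # residual b - C x
    z = r / diag_c               # preconditioned residual
    p = z.copy()
    rz = r @ z
    b_norm = np.sqrt(b @ b)
    for it in range(1, max_iter + 1):
        cp = matvec(p)
        alpha = rz / (p @ cp)
        x += alpha * p
        r -= alpha * cp
        # convergence: log_e( sqrt( (Cx-b)'(Cx-b) / b'b ) ) < threshold
        if np.log(np.sqrt(r @ r) / b_norm) < threshold:
            return x, it
        z = r / diag_c
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter
```

Because only products $$Cv$$ are required, the coefficient matrix never has to be stored; the product can be accumulated directly from the model matrices, which is what makes the iterative approach memory friendly.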
Direct solver
The direct solver requires the mixed model coefficient matrix to be built and all Kronecker products to be resolved. This can be quite memory demanding, so the direct solver should be used with care. It uses a Cholesky decomposition followed by forward and backward substitution to solve the mixed model equation system; the decomposition step in particular can be very resource demanding and time consuming.
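As an illustration of the two steps involved, the sketch below solves a small dense system with SciPy's cho_factor/cho_solve; the example matrix is a random stand-in for the mixed model coefficient matrix, not anything lmt actually builds.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
C = M @ M.T + 5.0 * np.eye(5)        # symmetric positive definite stand-in for the MME coefficient matrix
b = rng.standard_normal(5)           # stand-in right-hand side

L, lower = cho_factor(C, lower=True)  # Cholesky decomposition C = L L'
x = cho_solve((L, lower), b)          # forward and backward substitution

assert np.allclose(C @ x, b)
```

cho_factor stores the triangular factor and cho_solve performs the forward and backward substitution; for the large sparse systems typical of MMEs, a sparse Cholesky factorization would be used instead, but the principle is the same.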
Gibbs sampling of variance components
Single pass Gibbs sampling
Single pass Gibbs sampling requires solving the mixed model equation system once per round of sampling [1].
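As a hedged illustration of why one MME solve per round suffices, the sketch below implements a single round for a simple model $$y = Xb + Zu + e$$ with $$u \sim N(0, I\sigma_u^2)$$, $$e \sim N(0, I\sigma_e^2)$$ and scaled inverse chi-square priors on both variances. The model, the dense Cholesky-based draw of the location effects and all hyperparameter names are assumptions; this does not describe lmt's actual sampling scheme, which is covered by the reference.

```python
import numpy as np

def gibbs_round(X, Z, y, sigma_u2, sigma_e2,
                nu_u=4.0, S_u2=0.1, nu_e=4.0, S_e2=0.5, rng=None):
    """One round of single pass Gibbs sampling for y = Xb + Zu + e
    (illustrative model and hyperparameters, not lmt's scheme)."""
    rng = rng or np.random.default_rng()
    n, p = X.shape
    q = Z.shape[1]
    lam = sigma_e2 / sigma_u2

    # Build the MME coefficient matrix and right-hand side for this round
    W = np.hstack([X, Z])
    C = W.T @ W
    C[p:, p:] += lam * np.eye(q)
    rhs = W.T @ y

    # Solve the MME once, then draw all location effects jointly from
    # their full conditional N(C^{-1} rhs, C^{-1} sigma_e2)
    L = np.linalg.cholesky(C)
    theta_hat = np.linalg.solve(C, rhs)
    z = rng.standard_normal(p + q)
    theta = theta_hat + np.sqrt(sigma_e2) * np.linalg.solve(L.T, z)
    b, u = theta[:p], theta[p:]

    # Variance components from their scaled inverse chi-square full conditionals
    sigma_u2 = (u @ u + nu_u * S_u2) / rng.chisquare(q + nu_u)
    e = y - W @ theta
    sigma_e2 = (e @ e + nu_e * S_e2) / rng.chisquare(n + nu_e)
    return b, u, sigma_u2, sigma_e2
```

Repeating such a round many times yields draws from the joint posterior of the location effects and the variance components.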
Blocked Gibbs sampling
Prediction error variance sampling
REML variance component estimation
References
- ↑ D. Sorensen and D. Gianola, Likelihood, Bayesian, and MCMC Methods in Quantitative Genetics, Springer, 2002, pp. 584–588