
Algorithms

==Solving Linear Mixed Model Equations==

{{{lmt}}} supports two types of solvers for solving mixed model equations (MMEs): a direct solver and a preconditioned gradient solver.

===preconditioned gradient solver===

The preconditioned gradient solver is {{{lmt}}}'s default solver. It does not require the explicit construction of the mixed model equations and is therefore less resource demanding than the direct solver. That is, many models which cannot be solved with the direct solver can still be solved with the preconditioned gradient solver. The solver has converged to a stable solution if $$\log\left(\frac{|Ax-b|}{|b|}\right)<t,$$ where $A$ and $b$ are the coefficient matrix and right-hand side of the mixed model equations, $x$ is the current solution vector, and $t$ is the convergence threshold.
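
For illustration only, the sketch below shows a generic preconditioned conjugate gradient solver that stops when the criterion above is met. The Jacobi (diagonal) preconditioner, the function name <code>pcg</code>, and the threshold value are assumptions made for this example; they do not describe {{{lmt}}}'s internal implementation.

<syntaxhighlight lang="python">
# Minimal sketch of a preconditioned conjugate gradient (PCG) solver using the
# convergence criterion log(|Ax - b| / |b|) < t from the text.
# NOTE: this is not lmt's implementation; the Jacobi (diagonal) preconditioner
# and the default threshold t are illustrative assumptions.
import numpy as np

def pcg(A, b, t=-12.0, max_iter=1000):
    """Solve Ax = b for a symmetric positive definite matrix A."""
    x = np.zeros_like(b)
    M_inv = 1.0 / np.diag(A)      # Jacobi preconditioner (assumed)
    r = b - A @ x                  # initial residual
    z = M_inv * r                  # preconditioned residual
    p = z.copy()
    b_norm = np.linalg.norm(b)
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        # convergence check: log(|Ax - b| / |b|) < t
        if np.log(np.linalg.norm(r_new) / b_norm) < t:
            return x
        z_new = M_inv * r_new
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x

# Example usage on a small symmetric positive definite system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(pcg(A, b))
</syntaxhighlight>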