Algorithms
Solving Linear Mixed Model Equations
lmt supports two types of solvers for the mixed model equations (MMEs): a direct solver and an iterative solver.
Iterative solver
The iterative solver uses the preconditioned conjugate gradient method and is lmt's default solver. It does not require the explicit construction of the mixed model equations and is therefore less resource-demanding than the direct solver. That is, many models which cannot be solved using the direct solver can still be solved using the iterative solver.
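To illustrate the idea (this is a minimal sketch, not lmt's actual implementation), a matrix-free preconditioned conjugate gradient solver only needs a routine that returns the product $$Cx$$ and, here, a Jacobi (diagonal) preconditioner; the function names and arguments below are assumptions for the example:

<pre>
import numpy as np

def pcg(matvec, b, diag, tol_log=-18.42, max_iter=10_000):
    """Illustrative matrix-free preconditioned conjugate gradient sketch.

    matvec : callable returning C @ x without forming C explicitly
    b      : right-hand side of the mixed model equations
    diag   : diagonal of C, used as a Jacobi preconditioner
    tol_log: threshold t compared against log_e(sqrt((Cx-b)'(Cx-b) / (b'b)))
    """
    x = np.zeros_like(b)
    r = b - matvec(x)          # residual b - Cx
    z = r / diag               # preconditioned residual
    p = z.copy()
    rz = r @ z
    bb = b @ b
    for _ in range(max_iter):
        Cp = matvec(p)
        alpha = rz / (p @ Cp)
        x += alpha * p
        r -= alpha * Cp
        # convergence check on the log of the relative residual norm
        if np.log(np.sqrt((r @ r) / bb)) < tol_log:
            break
        z = r / diag
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
</pre>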
The iterative solver has converged to a stable solution if $$\log_e\left(\sqrt{\frac{(Cx-b)'(Cx-b)}{b'b}}\right)<t$$, where $$C$$ is the mixed-model coefficient matrix, $$x$$ is the solution vector, $$b$$ is the right-hand side and $$t$$ is the convergence threshold. The default convergence threshold is -18.42, which implies that iteration stops once the relative residual norm $$\sqrt{\frac{(Cx-b)'(Cx-b)}{b'b}}$$ falls below $$e^{-18.42}\approx 10^{-8}$$; equivalently, the average squared Euclidean distance between $$Cx$$ and $$b$$, scaled by the average squared right-hand side,
$$ \frac{\frac{\sum_{i=1}^n ((Cx)_i-b_i)^2}{n}}{\frac{\sum_{i=1}^n b_i^2}{n}}, $$
falls below $$e^{-36.84}\approx 10^{-16}$$.
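For intuition about the default threshold, a short check (illustrative only) shows the residual magnitudes it corresponds to:

<pre>
import numpy as np

t = -18.42
print(np.exp(t))       # ~1.0e-08: relative residual norm at convergence
print(np.exp(2 * t))   # ~1.0e-16: the squared ratio-of-means quantity above
</pre>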