Matrix determinant lemma
In mathematics, in particular linear algebra, the matrix determinant lemma computes the determinant of the sum of an invertible matrix A and the dyadic product uv^T of a column vector u and a row vector v^T.
Statement
Suppose A is an invertible square matrix and u, v are column vectors. Then the matrix determinant lemma states that

:\det(\mathbf{A} + \mathbf{uv}^\textsf{T}) = \left(1 + \mathbf{v}^\textsf{T}\mathbf{A}^{-1}\mathbf{u}\right)\det(\mathbf{A}).
Here, uv^T is the outer product of the two vectors u and v.
The theorem can also be stated in terms of the adjugate matrix of A:

:\det(\mathbf{A} + \mathbf{uv}^\textsf{T}) = \det(\mathbf{A}) + \mathbf{v}^\textsf{T}\,\mathrm{adj}(\mathbf{A})\,\mathbf{u},
in which case it applies whether or not the matrix A is invertible.
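Both forms of the statement can be checked numerically. The sketch below (using NumPy on a random example; it is not part of the original article) verifies the invertible-A form and the adjugate form, using the fact that adj(A) = det(A)·A⁻¹ when A is invertible:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

# Invertible form: det(A + u v^T) = (1 + v^T A^{-1} u) det(A)
lhs = np.linalg.det(A + u @ v.T)
rhs = (1.0 + (v.T @ np.linalg.solve(A, u)).item()) * np.linalg.det(A)
assert np.isclose(lhs, rhs)

# Adjugate form: det(A + u v^T) = det(A) + v^T adj(A) u,
# with adj(A) = det(A) * A^{-1} for invertible A.
adjA = np.linalg.det(A) * np.linalg.inv(A)
rhs_adj = np.linalg.det(A) + (v.T @ adjA @ u).item()
assert np.isclose(lhs, rhs_adj)
```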
Proof
First, the proof of the special case A = I follows from the equality (Ding, J.; Zhou, A. (2007). "Eigenvalues of rank-one updated matrices with some applications". Applied Mathematics Letters. 20 (12): 1223–1226. doi:10.1016/j.aml.2006.11.016):
:\begin{pmatrix} \mathbf{I} & 0 \\ \mathbf{v}^\textsf{T} & 1 \end{pmatrix} \begin{pmatrix} \mathbf{I} + \mathbf{uv}^\textsf{T} & \mathbf{u} \\ 0 & 1 \end{pmatrix} \begin{pmatrix} \mathbf{I} & 0 \\ -\mathbf{v}^\textsf{T} & 1 \end{pmatrix} = \begin{pmatrix} \mathbf{I} & \mathbf{u} \\ 0 & 1 + \mathbf{v}^\textsf{T}\mathbf{u} \end{pmatrix}.
The determinant of the left-hand side is the product of the determinants of the three matrices. Since the first and third matrices are triangular with unit diagonal, their determinants are both 1. The determinant of the middle matrix is the desired value. The determinant of the right-hand side is simply (1 + v^T u). So we have the result:
:\det(\mathbf{I} + \mathbf{uv}^\textsf{T}) = 1 + \mathbf{v}^\textsf{T}\mathbf{u}.
The general case then follows by writing \mathbf{A} + \mathbf{uv}^\textsf{T} = \mathbf{A}\left(\mathbf{I} + (\mathbf{A}^{-1}\mathbf{u})\mathbf{v}^\textsf{T}\right) and applying the special case with \mathbf{A}^{-1}\mathbf{u} in place of \mathbf{u}:

:\begin{align} \det(\mathbf{A} + \mathbf{uv}^\textsf{T}) &= \det(\mathbf{A}) \det(\mathbf{I} + (\mathbf{A}^{-1}\mathbf{u})\mathbf{v}^\textsf{T}) \\ &= \det(\mathbf{A}) \left(1 + \mathbf{v}^\textsf{T}(\mathbf{A}^{-1}\mathbf{u})\right). \end{align}
Application
If the determinant and inverse of A are already known, the formula provides a numerically cheap way to compute the determinant of A perturbed by the rank-one matrix uv^T: the update costs only O(n^2) operations, whereas computing det(A + uv^T) from scratch generally costs O(n^3). Using unit vectors for u and/or v, individual columns, rows, or elements of A may be modified and a correspondingly updated determinant computed relatively cheaply in this way.
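As an illustrative sketch (NumPy, with a hypothetical random matrix, not from the original article): replacing column j of A is the rank-one update A + u e_j^T, where u is the difference between the new and old columns. With A⁻¹ and det(A) precomputed, the updated determinant needs only a vector product, since v = e_j picks out row j of A⁻¹:

```python
import numpy as np

rng = np.random.default_rng(1)
n, j = 6, 2
A = rng.standard_normal((n, n))
A_inv = np.linalg.inv(A)   # assumed precomputed (expensive, done once)
detA = np.linalg.det(A)    # assumed precomputed

# Replace column j: A + u e_j^T, with u = new column - old column.
new_col = rng.standard_normal(n)
u = new_col - A[:, j]

# v = e_j, so v^T A^{-1} u is just row j of A^{-1} times u -- O(n) here,
# O(n^2) for a general rank-one update.
det_updated = (1.0 + A_inv[j, :] @ u) * detA

# Check against a from-scratch computation.
A_new = A.copy()
A_new[:, j] = new_col
assert np.isclose(det_updated, np.linalg.det(A_new))
```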
When the matrix determinant lemma is used in conjunction with the Sherman–Morrison formula, both the inverse and determinant may be conveniently updated together.
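A minimal sketch of the combined update (NumPy, random example, not from the original article): the shared quantity A⁻¹u feeds both the determinant lemma and the Sherman–Morrison inverse update, so both can be refreshed in one O(n^2) pass:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n))
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

A_inv = np.linalg.inv(A)   # assumed already known
detA = np.linalg.det(A)    # assumed already known

# One shared quantity drives both updates.
Ainv_u = A_inv @ u
denom = 1.0 + (v.T @ Ainv_u).item()

det_new = denom * detA                                 # matrix determinant lemma
A_inv_new = A_inv - (Ainv_u @ (v.T @ A_inv)) / denom   # Sherman–Morrison formula

assert np.isclose(det_new, np.linalg.det(A + u @ v.T))
assert np.allclose(A_inv_new, np.linalg.inv(A + u @ v.T))
```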
Generalization
Suppose A is an invertible n-by-n matrix and U, V are n-by-m matrices. Then

:\det(\mathbf{A} + \mathbf{UV}^\textsf{T}) = \det(\mathbf{I}_m + \mathbf{V}^\textsf{T}\mathbf{A}^{-1}\mathbf{U}) \det(\mathbf{A}).
In the special case \mathbf{A} = \mathbf{I}_n, this is the Weinstein–Aronszajn identity.
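The generalized identity reduces an n-by-n determinant to an m-by-m one, which is the point when m is much smaller than n. A quick numerical check (NumPy, random example, not from the original article):

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 6, 2
A = rng.standard_normal((n, n))
U = rng.standard_normal((n, m))
V = rng.standard_normal((n, m))

# det(A + U V^T) = det(I_m + V^T A^{-1} U) det(A):
# the right-hand side only needs an m-by-m determinant.
lhs = np.linalg.det(A + U @ V.T)
rhs = np.linalg.det(np.eye(m) + V.T @ np.linalg.solve(A, U)) * np.linalg.det(A)
assert np.isclose(lhs, rhs)
```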
Given additionally an invertible m-by-m matrix W, the relationship can also be expressed as

:\det(\mathbf{A} + \mathbf{UWV}^\textsf{T}) = \det(\mathbf{W}^{-1} + \mathbf{V}^\textsf{T}\mathbf{A}^{-1}\mathbf{U}) \det(\mathbf{W}) \det(\mathbf{A}).
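A similar numerical check for the form with W (NumPy, random example, not from the original article):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 6, 2
A = rng.standard_normal((n, n))
U = rng.standard_normal((n, m))
V = rng.standard_normal((n, m))
W = rng.standard_normal((m, m))

# det(A + U W V^T) = det(W^{-1} + V^T A^{-1} U) det(W) det(A)
lhs = np.linalg.det(A + U @ W @ V.T)
rhs = (np.linalg.det(np.linalg.inv(W) + V.T @ np.linalg.solve(A, U))
       * np.linalg.det(W) * np.linalg.det(A))
assert np.isclose(lhs, rhs)
```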
References
- Harville, D. A. (1997). Matrix Algebra From a Statistician's Perspective. Springer-Verlag.
- Brookes, M. (2005). The Matrix Reference Manual (online).
- Press, William H. (1992). Numerical Recipes in C: The Art of Scientific Computing. Cambridge University Press.
This article was imported from Wikipedia and is available under the Creative Commons Attribution-ShareAlike 4.0 License. Content has been adapted to SurfDoc format. Original contributors can be found on the article history page.