# Matrix Decompositions


### Cholesky Decomposition

If A is hermitian positive definite, there exists a non-singular upper triangular U with positive real diagonal entries such that U^H U = A. This is the Cholesky decomposition of A.

• If A is real, then U is unique and real.
• We can also decompose A as L^H L = A where L is lower triangular. If V^H V = B is the Cholesky decomposition of B = JAJ, then L^H L = A where L = JVJ. [Note: J is the exchange matrix.]
• A = S^H S iff S[m#n] = Q[U; 0] for some unitary Q.
• For A[n#n], the calculation of U requires n^3/6 flops and is numerically well conditioned.
• A numerically and computationally good way to solve Ax = b is to calculate U, then solve U^H y = b and finally solve Ux = y.
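The two-triangular-solve recipe above can be sketched with NumPy and SciPy; the positive definite matrix and right-hand side below are assumed example data, and note that numpy.linalg.cholesky returns the lower factor L = U^H:

```python
import numpy as np
from scipy.linalg import solve_triangular

# Hermitian positive definite example matrix (assumed data).
A = np.array([[4.0, 2.0, 2.0],
              [2.0, 5.0, 1.0],
              [2.0, 1.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

L = np.linalg.cholesky(A)   # lower triangular, A = L L^H
U = L.conj().T              # the upper Cholesky factor: A = U^H U

y = solve_triangular(L, b, lower=True)    # solve U^H y = b
x = solve_triangular(U, y, lower=False)   # solve U x = y
```

Once U is known, each additional right-hand side costs only two O(n^2) triangular solves.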

### Dual Conjunctive Diagonalization

[A,B:n#n, hermitian] If B is positive definite there exists X such that X^H B X = I and X^H A X = D where X and D may be obtained from the eigendecomposition B^-1 A = X D X^-1 with D = DIAG(d) a diagonal matrix of eigenvalues in non-increasing order. [4.10]

• If A and B are real then so are X and d. [4.10]
• If S is the +ve definite hermitian square root of B^-1 (i.e. S^2 B = I), then the eigendecomposition of the hermitian matrix SAS is SAS = W D W^H where W = S^-1 X is unitary. [4.10]
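A sketch of obtaining X and D with SciPy's generalized hermitian eigensolver (the matrices are assumed example data; scipy.linalg.eigh normalizes the eigenvectors so that X^H B X = I, but returns eigenvalues in ascending order, so they are reversed here to match the non-increasing convention above):

```python
import numpy as np
from scipy.linalg import eigh

# Hermitian A and positive definite B (assumed example data).
A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[4.0, 1.0], [1.0, 2.0]])

# eigh(A, B) solves A x = lambda B x with eigenvectors B-orthonormal.
d, X = eigh(A, B)
# Reverse to get eigenvalues in non-increasing order.
d, X = d[::-1], X[:, ::-1]
D = np.diag(d)
```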

### Hermite Normal Form or Row-Echelon Form

Every matrix A[m#n] can be expressed as A=BCP where B[m#m] is non-singular, P[n#n] is a permutation matrix and C[m#n] is of the form [I D;0] for some D. The matrix C is the row-echelon or Hermite-normal form of A.

• The matrix C is uniquely determined by A.
• The number of non-zero rows of C equals the rank of A.
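SymPy's rref() computes the reduced row-echelon form exactly; permuting the pivot columns to the front would put it in the [I D; 0] shape above. The matrix below is an assumed rank-2 example:

```python
from sympy import Matrix

# An example rank-2 matrix (assumed data).
A = Matrix([[1, 2, 3],
            [2, 4, 7],
            [3, 6, 10]])

# rref() returns the reduced row-echelon form and the pivot column indices.
C, pivots = A.rref()
rank = len(pivots)   # rank = number of non-zero rows of C
```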

### kth Root of a Hermitian Matrix

If A is hermitian +ve semidefinite, then for any integer k>0, there exists a unique +ve semidefinite B such that A = B^k.

• AB = BA
• rank(B) = rank(A)
• B is +ve definite iff A is.
• B is real iff A is.
• B=p(A) for some polynomial p().
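The root can be constructed from the eigendecomposition: if A = V DIAG(w) V^H, then B = V DIAG(w^(1/k)) V^H is the unique +ve semidefinite kth root. A sketch with NumPy (the matrix and k = 3 are assumed example values):

```python
import numpy as np

# Hermitian +ve semidefinite example matrix (assumed data).
A = np.array([[2.0, 1.0], [1.0, 2.0]])
k = 3

# Diagonalize A, then take real kth roots of the (non-negative) eigenvalues.
w, V = np.linalg.eigh(A)
B = V @ np.diag(np.maximum(w, 0.0) ** (1.0 / k)) @ V.conj().T
```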

### Jordan Normal Form

For any A[n#n], there exists X such that A = X^-1 B X where B is of the following form:

1. B is upper bidiagonal and its main diagonal consists of the eigenvalues of A repeated according to their algebraic multiplicities.
2. B = DIAG(B1, B2, …) where each square block Bi has one of A's eigenvalues repeated along its main diagonal, 1's along the diagonal above it and 0's elsewhere (these blocks are hypercompanion matrices).
• The matrix B is unique except for the ordering of the blocks in DIAG(B1, B2, …).
• Two matrices are similar iff they have the same Jordan form for some ordering of the blocks in DIAG(B1, B2, …).
• The matrix B may be complex even if A is real.
• Each distinct eigenvalue of A corresponds to one or more blocks in B that have that eigenvalue along their diagonal.
• The matrix B is diagonal iff A is non-defective, i.e. the geometric and algebraic multiplicities are equal for each eigenvalue, i.e. each block is of size 1.
• The minimum polynomial is equal to the characteristic polynomial iff all eigenvalues have a geometric multiplicity of 1, i.e. there is only one distinct eigenvector for each eigenvalue.
• Some authors call A irreducible iff B is a hypercompanion matrix (i.e. it consists of a single block), but the term reducible more commonly has a different meaning.

Finding the Jordan form of a defective or nearly defective matrix is numerically ill-conditioned.
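For exact (symbolic) arithmetic, where the ill-conditioning does not arise, SymPy can compute the Jordan form; it returns P and J with A = P J P^-1, i.e. X = P^-1 in the notation above. The defective matrix below is an assumed example:

```python
from sympy import Matrix

# A defective example: eigenvalue 2 has algebraic multiplicity 2
# but geometric multiplicity 1, so B is a single 2#2 Jordan block.
A = Matrix([[3, 1], [-1, 1]])

# jordan_form() returns P and J with A = P J P^{-1}.
P, J = A.jordan_form()
```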

### LDU Decomposition

Every non-singular square matrix A can be expressed as A=PLDU where P is a permutation matrix, L is unit lower triangular, D is diagonal and U is unit upper triangular.

• If A is hermitian then U = L^H.
• You can also decompose as A=PUDL by expressing  JAJ=(JPJ)(JUJ)(JDJ)(JLJ).  [Note: J is the exchange matrix.]
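The A = PLDU form can be recovered from SciPy's LU routine, which returns A = P L U1 with U1 upper triangular but not unit; dividing out the diagonal of U1 gives D and a unit upper triangular U. The matrix below is an assumed example:

```python
import numpy as np
from scipy.linalg import lu

# Non-singular example matrix (assumed data).
A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])

# scipy gives A = P L U1 with L unit lower triangular.
P, L, U1 = lu(A)

# Factor the diagonal out of U1: A = P L D U with U unit upper triangular.
D = np.diag(np.diag(U1))
U = np.linalg.solve(D, U1)
```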

### LU Decomposition

Every A[n#n] can be expressed as A=PLU where P is a permutation matrix, L is a unit lower triangular matrix and U is upper triangular.

• This decomposition is the standard way of solving the simultaneous equations Ax = b.
• You can also decompose as A=LUP where U is unit upper triangular by expressing A^T = P^T U^T L^T or alternatively A^H = P^H U^H L^H.
• You can also decompose as A=PUL where U is unit upper triangular by expressing JAJ=(JPJ)(JUJ)(JLJ).  [Note: J is the exchange matrix.]
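For the equation-solving use noted above, SciPy exposes the factor-once, solve-many pattern (example data assumed):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Example system (assumed data).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])

# lu_factor computes the PLU factorization once;
# lu_solve reuses it for each right-hand side.
lu_piv = lu_factor(A)
x = lu_solve(lu_piv, b)
```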

### Polar Factorisation

Every A[n#n] can be expressed as A = U H = K V with U, V unitary and H, K positive semidefinite hermitian. H and K are unique; U and V are unique iff A is nonsingular. If A is real then H and K are real and U and V may be taken to be real.

• H^2 = A^H A and K^2 = A A^H
• A is normal iff U H = H U iff K V = V K. If A is normal and nonsingular then U = V and H = K.
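A sketch with scipy.linalg.polar, which returns the unitary factor first: side='right' gives A = U H and side='left' gives A = K V. The matrix below is an assumed nonsingular example, so U = V:

```python
import numpy as np
from scipy.linalg import polar

# Nonsingular example matrix (assumed data).
A = np.array([[1.0, 2.0], [3.0, 4.0]])

U, H = polar(A, side='right')   # A = U H
V, K = polar(A, side='left')    # A = K V
```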

### QR Decomposition

For any A[m#n], A=QR for Q[m#m] unitary and R[m#n] upper triangular. If A is real then Q and R may be taken to be real [1.9].

• For any A[m#n] with m>=n, A = Q[m#n] R[n#n] with Q^H Q = I and R upper triangular. If A is real then Q and R may be taken to be real [1.9].
• You can also decompose as A=RQ by decomposing J A^T J = (J Q^T J)(J R^T J). [Note: J is the exchange matrix.]
• You can also decompose as A=QL where L is lower triangular by decomposing JAJ=(JQJ)(JLJ).
• You can also decompose as A=LQ where L is lower triangular by decomposing A^T = Q^T L^T.
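Both the square (full) and the m>=n (reduced) variants are available in NumPy (the tall matrix below is assumed example data):

```python
import numpy as np

# Example matrix with m > n (assumed data).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# 'complete' gives Q[m#m] unitary and R[m#n];
# 'reduced' gives Q[m#n] with Q^H Q = I and R[n#n].
Qf, Rf = np.linalg.qr(A, mode='complete')
Q, R = np.linalg.qr(A, mode='reduced')
```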

### Schur Decomposition

Every square matrix A is unitarily similar to an upper triangular matrix T with A = U^H T U.

• The main diagonal of T contains the eigenvalues of A repeated according to their algebraic multiplicities.
• The eigenvalues may be chosen to occur in any order along the diagonal of T and for each possible order the matrix U is unique.
• T is diagonal iff A is normal. The sum of the squares of the absolute values of the off-diagonal elements of T is A's departure from normality.
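A sketch with SciPy (the non-normal matrix is an assumed example); note scipy.linalg.schur returns Z with A = Z T Z^H, i.e. U = Z^H in the convention above:

```python
import numpy as np
from scipy.linalg import schur

# A non-normal real matrix with complex eigenvalues (assumed example).
A = np.array([[0.0, 2.0], [-2.0, 1.0]])

# output='complex' forces a genuinely triangular T.
T, Z = schur(A, output='complex')
eigs = np.diag(T)   # eigenvalues of A along the diagonal of T
```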

### Schur Decomposition, Real

Every square real matrix A is orthogonally similar to an upper block triangular matrix T with A = Q^T T Q where each block of T is either a 1#1 matrix or a 2#2 matrix having complex conjugate eigenvalues.
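With SciPy's default real output everything stays real; for the assumed example below, which has a complex-conjugate eigenvalue pair, T is a single 2#2 block:

```python
import numpy as np
from scipy.linalg import schur

# Real matrix with a complex-conjugate eigenvalue pair (assumed example).
A = np.array([[0.0, 2.0], [-2.0, 1.0]])

# output='real' is the default: A = Q T Q^T with Q orthogonal
# and T real block upper triangular.
T, Q = schur(A, output='real')
```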

### Schur Decomposition, Generalized

For A[n#n] and B[n#n], there exist unitary U and V and upper triangular S and T such that A = U^H S V and B = U^H T V.
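SciPy's QZ routine computes this as A = Q S Z^H and B = Q T Z^H, i.e. U = Q^H and V = Z^H in the notation above (the matrices are assumed example data):

```python
import numpy as np
from scipy.linalg import qz

# Example matrix pencil (assumed data).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[2.0, 0.0], [1.0, 3.0]])

# output='complex' makes both S and T triangular.
S, T, Q, Z = qz(A, B, output='complex')
```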

### Schur Decomposition, Generalized Real

If A[n#n] and B[n#n] are real, then there exist orthogonal U and V and upper block triangular S and T such that A = U^T S V and B = U^T T V where each block of S and T is either a 1#1 matrix or else a 2#2 matrix having complex conjugate eigenvalues.

### Singular Value Decomposition (SVD)

Every matrix A[m#n] can be expressed as A = U D V^H where U[m#m] and V[n#n] are unitary and D[m#n] is real, non-negative and diagonal with its diagonal elements arranged in non-increasing order (i.e. d_i,i >= d_j,j for i < j). The singular values of A are the diagonal elements of D (or, to some people, the non-zero diagonal elements of D).

• If A is real then U and V can be chosen to be real orthogonal matrices.
• The matrix D is unique but the matrices U and V are not.
• A = L D M^H is an alternative singular value decomposition of A iff U^H L = DIAG(Q1, Q2, ..., Qk, R) and V^H M = DIAG(Q1, Q2, ..., Qk, S) where Q1, Q2, ..., Qk are unitary matrices whose sizes are given by the multiplicities of the corresponding distinct non-zero singular values and R and S are unitary matrices whose size equals the number of zero singular values. [1.14]
• If the singular values are all distinct and non-zero then L = UQ and M = VQ where Q is a diagonal matrix whose diagonal elements have unit magnitude. In the case of a real SVD of a real matrix A, the diagonal elements of Q equal ±1.
• The non-zero singular values of A are the square roots of the non-zero eigenvalues of A^H A.
• A is non-singular iff all its singular values are > 0.
• The rank of A is the number of non-zero singular values. If the rank of A is r, then the first r columns of U form a basis for the range of A and the last n-r columns of V form a basis for the null space of A.
• [Real, Symmetric, 2#2] A = [a b; b d] = R D R^T where R = [cos(t) -sin(t); sin(t) cos(t)] with t = ½ tan^-1(2b/(a-d)), and D is diagonal with D = DIAG([1 cos(2t) sin(2t); 1 -cos(2t) -sin(2t)] * [(a+d)/2; (a-d)/2; b]).
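The rank, range and null-space bullets above can be sketched with NumPy's SVD (the rank-1 matrix is an assumed example; note numpy returns V^H rather than V, with singular values already in non-increasing order):

```python
import numpy as np

# Rank-1 example matrix: the second column is twice the first (assumed data).
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

# full_matrices=True gives U[m#m] and V[n#n]; s holds the singular values.
U, s, Vh = np.linalg.svd(A, full_matrices=True)

tol = s.max() * max(A.shape) * np.finfo(float).eps
r = int(np.sum(s > tol))            # numerical rank
range_basis = U[:, :r]              # first r columns of U span the range of A
null_basis = Vh[r:, :].conj().T     # last n-r columns of V span the null space
```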