Special Matrices



Antisymmetric

see skew-symmetric.


A is upper bidiagonal if a(i,j)=0 unless i=j or i=j-1.
A is lower bidiagonal if a(i,j)=0 unless i=j or i=j+1.

A bidiagonal matrix is also tridiagonal, triangular and Hessenberg.


A[n#n] is bisymmetric if it is symmetric about both main diagonals, i.e. if A=AT=JAJ where J is the exchange matrix.

WARNING: The term persymmetric is sometimes used instead of bisymmetric. Also bisymmetric is sometimes used to mean centrosymmetric and sometimes to mean symmetric and perskewsymmetric.

Block Diagonal

A is block diagonal if it has the form [A1 0 ... 0; 0 A2 ... 0; ...; 0 0 ... Ak] where A1, A2, ..., Ak are matrices (not necessarily square).


A[m#n] is centrohermitian if it is rotationally hermitian symmetric about its centre, i.e. if AT=JAHJ where J is the exchange matrix.


A[m#n] is centrosymmetric (also called perplectic) if it is rotationally symmetric about its centre, i.e. if A=JAJ where J is the exchange matrix. It is centrohermitian if AT=JAHJ and centroskew-symmetric if  A= -JAJ.


A circulant matrix, A[n#n], is a Toeplitz matrix in which a(i,j) is a function of {(i-j) modulo n}. In other words each column of A is equal to the previous column rotated downwards by one element.

WARNING: The term circular is sometimes used instead of circulant.
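As an illustrative sketch of the rotated-column structure (in NumPy; the helper name `circulant` is ours, and SciPy provides an equivalent `scipy.linalg.circulant`):

```python
import numpy as np

def circulant(c):
    """Circulant matrix whose first column is c: a(i,j) = c[(i-j) mod n]."""
    c = np.asarray(c)
    n = len(c)
    i, j = np.indices((n, n))
    return c[(i - j) % n]

C = circulant([1, 2, 3, 4])
# each column of C is the previous column rotated downwards by one element
```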


A Circular matrix, A[n#n], is one for which AAC = I, where AC denotes the complex conjugate of A.

WARNING: The term circular is sometimes used for a circulant matrix.

Companion Matrix

If p(x) is a polynomial of the form a(0) + a(1)*x + a(2)*x^2 + ... + a(n)*x^n then the polynomial's companion matrix is n#n and equals [0 I; -a(0:n-1)/a(n)] where I is (n-1)#(n-1). For n=1, the companion matrix is [-a(0)/a(1)].

The rows and columns are sometimes given in reverse order [-a(n-1:0)/a(n) ; I 0].
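A sketch of the first convention above in NumPy (the helper name `companion` is ours); the eigenvalues of the companion matrix are the roots of p:

```python
import numpy as np

def companion(a):
    """Companion matrix of p(x) = a[0] + a[1]*x + ... + a[n]*x**n,
    built as [0 I; -a(0:n-1)/a(n)] with I of size (n-1)#(n-1)."""
    a = np.asarray(a, dtype=float)
    n = len(a) - 1
    C = np.zeros((n, n))
    C[:-1, 1:] = np.eye(n - 1)      # upper block [0 I]
    C[-1, :] = -a[:-1] / a[-1]      # bottom row -a(0:n-1)/a(n)
    return C

# p(x) = 2 - 3x + x**2 = (x-1)(x-2), coefficients in ascending order
C = companion([2, -3, 1])
roots = np.sort(np.linalg.eigvals(C))
```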


A matrix is complex if it has complex elements.

Complex to Real Isomorphism

We can associate a complex matrix C[m#n] with a corresponding real  matrix R[2m#2n] by replacing each complex element, z, of C by a 2#2 real matrix [zR -zI; zI zR]=|z|×[cos(t) -sin(t); sin(t) cos(t)] where t=arg(z). We will write C <=> R for this mapping below.

Vector mapping: Under the isomorphism a complex vector maps to a real matrix: z[n] <=> Y[2n#2]. We can also define a simpler mapping, <->, from a vector to a vector: z[n] <-> x[2n], where x is formed by replacing each element of z by the pair of its real and imaginary parts; equivalently x = Y [1; 0].

In the results below, we assume z[n] <-> x[2n], w[n] <-> u[2n] and C <=> R:

To relate the matrix and vector mappings, <-> and <=>, we define the following two block-diagonal matrices: E = I[n#n] ⊗ [0 1; 1 0] and N = I[n#n] ⊗ [1 0; 0 -1]. We now have the following properties (assuming z[n] <-> x[2n] and C <=> R):
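A minimal sketch of the element-wise 2#2 block substitution (in NumPy; the helper name `c2r` is ours). The mapping respects matrix multiplication, which can be checked numerically:

```python
import numpy as np

def c2r(C):
    """Map a complex matrix C[m#n] to a real matrix R[2m#2n] by replacing
    each element z with the 2#2 block [zR -zI; zI zR]."""
    return np.block([[np.array([[z.real, -z.imag],
                                [z.imag,  z.real]]) for z in row]
                     for row in np.atleast_2d(C)])

A = np.array([[1 + 2j]])
B = np.array([[3 - 1j]])
# the mapping is multiplicative: c2r(A @ B) equals c2r(A) @ c2r(B)
```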


A matrix A is convergent if Ak tends to 0 as k tends to infinity.

See also: Stability

Cyclic Permutation Matrix

The n#n cyclic permutation matrix (or cyclic shift matrix), C, is equal to [0T 1; I 0] where 0 is the zero column vector of length n-1 and I is (n-1)#(n-1). Its elements are given by c(i,j) = δ(i, 1+(j mod n)) where δ(i,j) is the Kronecker delta.
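A sketch of this matrix in NumPy (the helper name `cyclic` is ours); multiplying by C shifts a vector cyclically, and C^n = I:

```python
import numpy as np

def cyclic(n):
    """Cyclic permutation matrix [0T 1; I 0]."""
    C = np.zeros((n, n), dtype=int)
    C[0, -1] = 1                          # top-right 1
    C[1:, :-1] = np.eye(n - 1, dtype=int) # identity block below the diagonal
    return C

C = cyclic(4)
x = np.array([1, 2, 3, 4])
y = C @ x   # x shifted cyclically downwards by one position
```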


A matrix, A, is fully decomposable (or reducible) if there exists a permutation matrix P such that PTAP is of the form [B C; 0 D] where B and D are square.
A matrix, A, is partly-decomposable if there exist permutation matrices P and Q such that PTAQ is of the form [B C; 0 D] where B and D are square.
A matrix that is not even partly-decomposable is fully-indecomposable.


A matrix, X[n#n], is defective if it does not have n linearly independent eigenvectors; otherwise it is simple.


An n#n square matrix is derogatory if its minimal polynomial is of degree lower than n.


A is diagonal if a(i,j)=0 unless i=j.

The functions DIAG(x) and diag(X) respectively convert a vector into a diagonal matrix and the diagonal of a matrix into a vector. In the expression below,  •  denotes elementwise multiplication.

Diagonable or Diagonalizable or Simple or Non-Defective

A matrix, X, is diagonable (or, equivalently, simple or diagonalizable  or non-defective) if it is similar to a diagonal matrix otherwise it is defective.

Diagonally Dominant

A square matrix A[n#n] is diagonally dominant if the absolute value of each diagonal element is greater than the sum of the absolute values of the non-diagonal elements in its row. That is, for each i we have |a(i,i)| > sumj != i(|a(i,j)|), or equivalently abs(diag(A)) > ½ ABS(A) 1[n#1].
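The equivalent vectorized test (half the absolute row sums, including the diagonal) can be sketched in NumPy (the helper name is ours):

```python
import numpy as np

def is_diagonally_dominant(A):
    """True iff |a(i,i)| > sum over j != i of |a(i,j)| for every row i,
    tested via abs(diag(A)) > 0.5 * ABS(A) @ ones."""
    d = np.abs(np.diag(A))
    return bool(np.all(d > 0.5 * np.abs(A).sum(axis=1)))

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])
```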

Discrete Fourier Transform

The discrete Fourier transform matrix, F[n#n], has f(p,q) = exp(-2jπ(p-1)(q-1)/n).
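A sketch of this matrix in NumPy (0-based indices, so the (p-1)(q-1) exponents become pq); it matches the unnormalized convention used by `np.fft.fft`:

```python
import numpy as np

def dft_matrix(n):
    """DFT matrix with f(p,q) = exp(-2j*pi*p*q/n) for 0-based p, q."""
    p, q = np.indices((n, n))
    return np.exp(-2j * np.pi * p * q / n)

F = dft_matrix(8)
x = np.arange(8.0)
# F @ x computes the discrete Fourier transform of x
```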


A real non-negative square matrix A is doubly-stochastic if its rows and columns all sum to 1.

See under stochastic for properties.


An essential matrix, E, is the product E=US of a 3#3 orthogonal matrix, U, and a 3#3 skew-symmetric matrix, S = SKEW(s). In 3-D Euclidean space, a translation+rotation transformation is associated with an essential matrix.


The exchange matrix J[n#n] is equal to [e(n) e(n-1) ... e(2) e(1)] where e(i) is the ith column of I. It is equal to I but with the columns in reverse order.
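In NumPy the exchange matrix is simply a flipped identity; pre-multiplying by J reverses rows and post-multiplying reverses columns, and J is its own inverse:

```python
import numpy as np

n = 4
J = np.fliplr(np.eye(n))            # identity with columns reversed
A = np.arange(16.0).reshape(4, 4)
# J @ A reverses the rows of A; A @ J reverses its columns
```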

Givens Reflection

[Real]: A Givens Reflection is an n#n matrix of the form PT[Q 0 ; 0 I]P where P is any permutation matrix and Q is a matrix of the form [cos(x) sin(x); sin(x) -cos(x)].

Givens Rotation

[Real]: A Givens Rotation is an n#n matrix of the form PT[Q 0 ; 0 I]P where P is a permutation matrix and Q is a matrix of the form [cos(x) sin(x); -sin(x) cos(x)].
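A sketch of the P = I special case, where the rotation acts in the (i, j) coordinate plane (the helper name `givens` is ours); choosing x = atan2(v(j), v(i)) zeros element j of a vector v, which is the classical use of Givens rotations:

```python
import numpy as np

def givens(n, i, j, x):
    """n#n identity except rows/columns i and j, which hold
    [cos(x) sin(x); -sin(x) cos(x)]."""
    G = np.eye(n)
    c, s = np.cos(x), np.sin(x)
    G[i, i], G[i, j] = c, s
    G[j, i], G[j, j] = -s, c
    return G

v = np.array([3.0, 0.0, 4.0])
x = np.arctan2(v[2], v[0])   # angle chosen to annihilate v[2]
G = givens(3, 0, 2, x)
w = G @ v
```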

Hadamard [!]

An n#n Hadamard matrix has orthogonal columns whose elements are all equal to +1 or -1.


A real 2n#2n matrix, A, is Hamiltonian if KA is symmetric where K = [0 I; -I 0].

See also: symplectic


A Hankel matrix has constant anti-diagonals. In other words a(i,j) depends only on (i+j).


A square matrix A is Hermitian if A = AH, that is a(i,j) = conj(a(j,i)).

For real matrices, Hermitian and symmetric are equivalent. Except where stated, the following properties apply to real symmetric matrices as well.

See also: Definiteness, Loewner partial order


A Hessenberg matrix is like a triangular matrix except that the elements adjacent to the main diagonal can be non-zero.
A is upper Hessenberg if A(i,j)=0 whenever i>j+1. It is like an upper triangular matrix except for the elements immediately below the main diagonal.
A is lower Hessenberg if a(i,j)=0 whenever i<j-1. It is like a lower triangular matrix except for the elements immediately above the main diagonal.


A Hilbert matrix is a square Hankel matrix with elements a(i,j)=1/(i+j-1).


If we define an equivalence relation in which X ~ Y iff X = cY for some non-zero scalar c, then the equivalence classes are called homogeneous matrices and homogeneous vectors.


A Householder matrix (also called Householder reflection or transformation) is a matrix of the form (I-2vvH) for some vector v with ||v||=1.

Multiplying a vector by a Householder transformation reflects it in the hyperplane that is orthogonal to v.

Householder matrices are important because they can be chosen to annihilate any contiguous block of elements in any chosen vector.
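A sketch of the common special case of that annihilation property: choosing v so that H = I - 2vv^T maps a vector onto a multiple of the first coordinate axis (NumPy assumed; the helper name is ours, and the sign choice avoids cancellation):

```python
import numpy as np

def householder_annihilate(a):
    """Householder matrix H = I - 2 v v^T (||v|| = 1) such that
    H @ a has zeros below its first element."""
    a = np.asarray(a, dtype=float)
    e1 = np.zeros_like(a)
    e1[0] = 1.0
    # sign chosen to match a[0], avoiding loss of precision
    v = a + np.copysign(np.linalg.norm(a), a[0]) * e1
    v /= np.linalg.norm(v)
    return np.eye(len(a)) - 2.0 * np.outer(v, v)

a = np.array([3.0, 4.0, 0.0])
H = householder_annihilate(a)
b = H @ a   # all elements below the first are annihilated
```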


The hypercompanion matrix of the polynomial p(x) = (x-a)^n is an n#n upper bidiagonal matrix, H, that is zero except for the value a along the main diagonal and the value 1 on the diagonal immediately above it. That is, h(i,j) = a if j=i, 1 if j=i+1 and 0 otherwise.

If the real polynomial p(x) = (x^2-ax-b)^n with a^2+4b < 0 (i.e. the quadratic term has no real factors) then its Real hypercompanion matrix is a 2n#2n tridiagonal matrix that is zero except for a at even positions along the main diagonal, b at odd positions along the sub-diagonal and 1 at all positions along the super-diagonal. Thus for odd i, h(i,j) = 1 if j=i+1 and 0 otherwise, while for even i, h(i,j) = 1 if j=i+1, a if j=i and b if j=i-1.

Idempotent [!]

A matrix P is idempotent if P^2 = P. An idempotent matrix that is also hermitian is called a projection matrix.

WARNING: Some people call any idempotent matrix a projection matrix and call it an orthogonal projection matrix if it is also hermitian.


The identity matrix, I, has a(i,i)=1 for all i and a(i,j)=0 for all i != j.


A non-negative matrix T is impotent if min(diag(T^n)) = 0 for all integers n>0 [see potency].


An incidence matrix is one whose elements all equal 1 or 0.


An Integral matrix is one whose elements are all integers.

Involutary (also written Involutory)

An Involutary matrix is one whose square equals the identity.


Irreducible

see under Reducible.


Jacobi

see under Tridiagonal.


A matrix, A, is monotone iff A^-1 exists and is non-negative, i.e. all its entries are >= 0.

In computer science a matrix is monotone if its entries are monotonically non-decreasing as you move away from the main diagonal along either a row or column.

Nilpotent [!]

A matrix A is nilpotent to index k if A^k = 0 but A^(k-1) != 0.


see under positive


A square matrix A is normal if AHA = AAH

Orthogonal [!]

A real square matrix Q is orthogonal if QTQ = I. It is a proper orthogonal matrix if det(Q)=1 and an improper orthogonal matrix if det(Q)=-1.

For real matrices, orthogonal and unitary mean the same thing. Most properties are listed under unitary.

Geometrically: Orthogonal matrices in 2 and 3 dimensions correspond to rotations and reflections.


A square matrix P is a permutation matrix if its columns are a permutation of the columns of I.


A matrix A[n#n] is persymmetric if it is symmetric about its anti-diagonal, i.e. if A=JATJ where J is the exchange matrix. It is perhermitian if A=JAHJ and perskewsymmetric if  A= -JATJ.

WARNING: The term persymmetric is sometimes used for a bisymmetric matrix.

Polynomial Matrix

A polynomial matrix of order p is one whose elements are polynomials in a single variable x. Thus A = A(0) + A(1)x + ... + A(p)x^p where the A(i) are constant matrices and A(p) is not all zero.

See also regular.


A real matrix is positive if all its elements are strictly > 0.
A real matrix is non-negative if all its elements are >= 0.

Positive Definite

see under definiteness


If k is the eigenvalue of a matrix A[n#n] having the largest absolute value, then A is primitive if the absolute values of all other eigenvalues are < |k|.


A projection matrix (or orthogonal projection matrix) is a square matrix that is hermitian and idempotent: i.e. P^H = P^2 = P.

WARNING: Some people call any idempotent matrix a projection matrix and call it an orthogonal projection matrix if it is also hermitian.


Quaternions are a generalization of complex numbers. A quaternion consists of a real component and three independent imaginary components and is written as r+xi+yj+zk where i^2 = j^2 = k^2 = ijk = -1. It is approximately true that whereas the polar decomposition of a complex number has a magnitude and a 2-dimensional rotation, that of a quaternion has a magnitude and a 3-dimensional rotation (see below). Quaternions form a division ring rather than a field because, although every non-zero quaternion has a multiplicative inverse, multiplication is not commutative (e.g. ij = -ji = k). Quaternions are widely used to represent three-dimensional rotations in computer graphics and computer vision as an alternative to orthogonal matrices, with the following advantages: (a) more compact, (b) possible to interpolate, (c) does not suffer from "gimbal lock", (d) easy to correct for drift due to rounding errors.

We can represent a quaternion either as a real 4-vector qR=[r x y z]T or a complex 2-vector qC=[r+jy  x+jz]T. This gives  r+xi+yj+zk = [1 i j k]qR  = [1 i]qC. We can also represent it as a real 4#4 matrix QR=[r -x -y -z; x r -z y; y z r -x; z -y x r] or a complex 2#2 matrix QC=[r+jy -x+jz; x+jz r-jy]. Both the real and the complex quaternion matrices obey the same arithmetic rules as quaternions, i.e. the quaternion matrix representing the result of applying +, -, * and / operations to quaternions is the same as the result of applying the same operations to the corresponding quaternion matrices. Note that qR=QR[1 0 0 0]T and qC=QC[1 0]T; we can also define the inverse functions QR=QUATR(qR) and QC=QUATC(qC). Note that the real and complex representations given above are not the only possible choices.

In the following, PR=QUATR(pR), QR=QUATR(qR), K=DIAG([-1 1 1 1]) and qR=[r x y z]T=[r; w]. PC,pC,QC and qC are the corresponding complex quantities; the subscripts R and C are omitted below for results that apply to both real and complex representations. 
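A sketch of the real 4#4 representation QR described above (in NumPy; the helper name `quat_r` stands in for QUATR). Multiplying the matrices reproduces quaternion multiplication, and qR is recovered as QR [1 0 0 0]^T:

```python
import numpy as np

def quat_r(q):
    """Real 4#4 representation of the quaternion q = (r, x, y, z):
    QR = [r -x -y -z; x r -z y; y z r -x; z -y x r]."""
    r, x, y, z = q
    return np.array([[r, -x, -y, -z],
                     [x,  r, -z,  y],
                     [y,  z,  r, -x],
                     [z, -y,  x,  r]])

p = [1.0, 2.0, 3.0, 4.0]
q = [5.0, 6.0, 7.0, 8.0]
PQ = quat_r(p) @ quat_r(q)                 # represents the product p*q
pq = PQ @ np.array([1.0, 0.0, 0.0, 0.0])   # the product as a 4-vector
```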


A non-zero matrix A is a rank-one matrix iff it can be decomposed as A = xy^T for some vectors x and y.


A matrix A[n#n] is reducible (or fully decomposable) if there exists a permutation matrix P such that PTAP is of the form [B C; 0 D] where B and D are square. As a special case, the 1#1 zero matrix is regarded as reducible. A matrix that is not reducible is irreducible.

WARNING: The term reducible is sometimes used to mean one that has more than one block in its Jordan Normal Form.


A polynomial matrix, A, of order p is regular if det(A) is non-zero.

Rotation Matrix

[Real]: A Rotation matrix, R, is an n#n matrix of the form R = U[Q 0; 0 I]UT where U is any orthogonal matrix and Q is a matrix of the form [cos(x) -sin(x); sin(x) cos(x)]. Multiplying a vector by R rotates it by an angle x in the plane containing u and v, the first two columns of U. The direction of rotation is such that if x=90 degrees, u will be rotated to v.

Shift Matrix

A  shift matrix, or lower shift matrix, Z,  is a matrix with ones below the main diagonal and zeros elsewhere.
ZT has ones above the main diagonal and zeros elsewhere and is an upper shift matrix.


A signature matrix is a diagonal matrix whose diagonal entries are all +1 or -1.


An n#n square matrix is simple (or, equivalently, diagonable or diagonalizable or non-defective) if all its eigenvalues are regular, otherwise it is defective.


A matrix is singular if it has no inverse.


A square matrix K is Skew-Hermitian (or antihermitian) if K = -KH, that is a(i,j) = -conj(a(j,i)).

For real matrices, Skew-Hermitian and skew-symmetric are equivalent. The following properties apply also to real skew-symmetric matrices.


A square matrix K is skew-symmetric (or antisymmetric) if K = -KT, that is a(i,j) = -a(j,i).

For real matrices, skew-symmetric and Skew-Hermitian are equivalent. Most properties are listed under skew-Hermitian .


A matrix is sparse if it has relatively few non-zero elements.


A stable matrix (also called a stability matrix) is one whose eigenvalues all have strictly negative real parts.
A semi-stable matrix is one whose eigenvalues all have non-positive real parts.

See also: Convergent


A real non-negative square matrix A is stochastic if all its rows sum to 1. If all its columns also sum to 1 it is Doubly Stochastic.


A real non-negative square matrix A is sub-stochastic if all its rows sum to <=1.


A is subunitary if ||A A^H x|| = ||A^H x|| for all x. A subunitary matrix is also called a partial isometry.

The following are equivalent:

  1. A is subunitary
  2. A^H A is a projection matrix
  3. A A^H A = A
  4. A^+ = A^H
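These equivalences can be checked numerically on a small example (NumPy assumed; the matrix below, built from a unit column vector, is an illustrative partial isometry, not a construction from the text):

```python
import numpy as np

u = np.array([[0.6], [0.8]])          # unit column vector
A = u @ np.array([[1.0, 0.0]])        # A = u e1^T, a 2#2 partial isometry
P = A.conj().T @ A                    # A^H A, which should be a projection
```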


A square matrix A is symmetric if A = AT, that is a(i,j) = a(j,i).

Most properties of real symmetric matrices are listed under Hermitian .

See also Hankel.


A real matrix, A, is symmetrizable if ATM = MA for some positive definite M.


A matrix, A[2n#2n], is symplectic if AHKA=K where K is the antisymmetric orthogonal matrix [0 I; -I 0].

See also: hamiltonian


A Toeplitz matrix, A, has constant diagonals. In other words a(i,j) depends only on i-j.
We define A = TOE(b[m+n-1])[m#n] to be the m#n matrix with a(i,j) = b(i-j+n). Thus b is the column vector formed by starting at the top right element of A, going backwards along the top row of A and then down the left column of A.
In the topics below, J is the exchange matrix.
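A sketch of the TOE construction (in NumPy with 0-based indexing, so a(i,j) = b(i-j+n) becomes b[i-j+n-1]; the helper name `toe` is ours):

```python
import numpy as np

def toe(b, m, n):
    """A = TOE(b[m+n-1])[m#n]: the m#n Toeplitz matrix with
    a(i,j) = b(i-j+n) in 1-based notation."""
    b = np.asarray(b)
    assert len(b) == m + n - 1
    i, j = np.indices((m, n))
    return b[i - j + n - 1]

A = toe([1, 2, 3, 4, 5], 3, 3)
# b runs backwards along the top row, then down the left column
```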

Some special cases of this are:


A is upper triangular if a(i,j)=0 whenever i>j.
A is lower triangular if a(i,j)=0 whenever i<j.
A is triangular iff it is either upper or lower triangular.
A triangular matrix A is strictly triangular if its diagonal elements all equal 0.
A triangular matrix A is unit triangular if its diagonal elements all equal 1.

Tridiagonal or Jacobi

A is tridiagonal or Jacobi if A(i,j)=0 whenever |i-j|>1. In other words its non-zero elements lie either on or immediately adjacent to the main diagonal.


A complex square matrix A is unitary if AHA = I. A is also sometimes called an isometry.

A real unitary matrix is called orthogonal. The following properties apply to orthogonal matrices as well as to unitary matrices.


A Vandermonde matrix, V[n#n], has the form [1 x x•2 ... x•(n-1)] for some column vector x (where x•2 denotes elementwise squaring). A general element is given by v(i,j) = (x(i))^(j-1). All elements of the first column of the matrix equal 1. Vandermonde matrices arise in connection with fitting polynomials to data.

WARNING: Some authors define a Vandermonde matrix to be either the transpose or the horizontally flipped version of the above definition.
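NumPy's `np.vander` with `increasing=True` matches the definition above, and the polynomial-fitting connection amounts to solving V c = y for the coefficient vector c:

```python
import numpy as np

x = np.array([2.0, 3.0, 5.0])
V = np.vander(x, increasing=True)   # v(i,j) = x_i**(j-1); first column all 1s

# fit p(t) = c0 + c1*t + c2*t**2 through the points (x_i, y_i)
y = np.array([5.0, 10.0, 26.0])     # values of 1 + t**2 at x
c = np.linalg.solve(V, y)
```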

Vectorized Transpose Matrix

The vectorized transpose matrix, TVEC(m,n), is the mn#mn permutation matrix whose (i,j)th element is 1 if j = 1 + m(i-1) - (mn-1)floor((i-1)/n) and 0 otherwise.

For clarity, we write T(m,n) = TVEC(m,n) in this section.
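The defining property of TVEC(m,n) is that it maps vec(A) to vec(A^T) for any A[m#n], where vec() stacks columns. A sketch built directly from that property rather than from the closed-form index formula (NumPy assumed; the helper name `tvec` is ours):

```python
import numpy as np

def tvec(m, n):
    """Permutation matrix with tvec(m, n) @ vec(A) = vec(A.T) for A[m#n],
    where vec() stacks the columns of its argument."""
    T = np.zeros((m * n, m * n), dtype=int)
    for i in range(m):
        for j in range(n):
            # A[i,j] sits at position i + j*m in vec(A)
            # and at position j + i*n in vec(A.T)
            T[j + i * n, i + j * m] = 1
    return T

A = np.arange(6.0).reshape(2, 3)
vec = lambda X: X.reshape(-1, order='F')   # column stacking
```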


The zero matrix, 0, has a(i,j)=0 for all i,j

This page is part of The Matrix Reference Manual. Copyright © 1998-2017 Mike Brookes, Imperial College, London, UK. See the file gfl.html for copying instructions. Please send any comments or suggestions to "mike.brookes" at "imperial.ac.uk".
Updated: $Id: special.html 10106 2017-09-07 06:52:40Z dmb $