
Matrices cheat sheet

Matrices

Addition
X + Y = [z_ij] = [x_ij + y_ij]
Subtraction
X - Y = [z_ij] = [x_ij - y_ij]
Multiplication
X * Y = [z_ij], where z_ij = ∑_k x_ik * y_kj
Constant
c * X = [z_ij] = [c * x_ij]
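A minimal NumPy sketch of these rules (the matrices and the constant below are arbitrary examples, not from the source):

import numpy as np

X = np.array([[1, 2], [3, 4]])
Y = np.array([[5, 6], [7, 8]])
c = 3

print(X + Y)   # addition: z_ij = x_ij + y_ij
print(X - Y)   # subtraction: z_ij = x_ij - y_ij
print(X @ Y)   # matrix product: z_ij = sum over k of x_ik * y_kj
print(c * X)   # scalar multiple: z_ij = c * x_ij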

Transpose & Identity

Transpose
X^T = [z_ij] = [x_ji]
Transpose of transpose
(X^T)^T = X
Transpose of product
(XY)^T = Y^T X^T, which in general ≠ X^T Y^T
Symmetric matrix
X^T = X
Identity matrix I
[z_ii = 1, z_ij = 0 for i ≠ j]
X I = I X = X
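A quick NumPy check of the transpose and identity rules above, using small arbitrary matrices:

import numpy as np

X = np.array([[1, 2], [3, 4]])
Y = np.array([[0, 1], [2, 3]])
I = np.eye(2)                                  # identity: z_ii = 1, z_ij = 0 for i != j

print(np.array_equal(X.T.T, X))                # (X^T)^T = X
print(np.array_equal((X @ Y).T, Y.T @ X.T))    # (XY)^T = Y^T X^T
print(np.allclose(X @ I, X) and np.allclose(I @ X, X))   # X I = I X = X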

Inverse

Inverse
X X^-1 = I = X^-1 X
If X^-1 exists, then X is non-singular (invertible)
Inverse of inverse
(X^-1)^-1 = X
Inverse of product
(XY)^-1 = Y^-1 X^-1, which in general ≠ X^-1 Y^-1
Inverse of transpose
(X^T)^-1 = (X^-1)^T
Determinant
|A| = ∑_{j=1}^{n} (-1)^(1+j) a_1j m_1j
The determinant is computed by expanding along the first row of the matrix: each element a_1j of the first row is multiplied by its minor, with the alternating sign (-1)^(1+j) (i.e. by its cofactor).
The minor m_ij of a_ij is the determinant obtained by deleting the ith row and jth column in which a_ij lies.
Cofactor
A_ij = (-1)^(i+j) m_ij
Adjoint
adj(A) = (cofactor matrix)^T = [A_ij]^T
Inverse
A^-1 = adj(A) / |A|
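A small NumPy sketch of the minor/cofactor/adjoint route to the inverse, checked against np.linalg.inv (the 2×2 matrix is an arbitrary invertible example):

import numpy as np

A = np.array([[4.0, 7.0], [2.0, 6.0]])
n = A.shape[0]

# Cofactors: A_ij = (-1)^(i+j) * m_ij, where the minor m_ij deletes row i and column j
cof = np.zeros_like(A)
for i in range(n):
    for j in range(n):
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)

adjA = cof.T                          # adj(A) = (cofactor matrix)^T
A_inv = adjA / np.linalg.det(A)       # A^-1 = adj(A) / |A|

print(np.allclose(A_inv, np.linalg.inv(A)))    # True
print(np.allclose(A @ A_inv, np.eye(n)))       # A A^-1 = I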
 

Orthogonal

Two n×1 vectors X and Y are orthogonal if X^T Y = 0
X^T X = ||X||^2; the square root of X^T X is the length (norm) of the vector
A vector is normalized if it has unit length, i.e. X^T X = 1
{X_1, X_2, X_3, ..., X_n} are said to be orthonormal if each pair is orthogonal and each vector has unit length
A square matrix X is orthogonal if X^T X = I, i.e. X^T = X^-1
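A brief NumPy illustration of these definitions (the vectors and the rotation matrix are illustrative choices, not from the source):

import numpy as np

x = np.array([1.0, 1.0, 0.0])
y = np.array([1.0, -1.0, 0.0])

print(x @ y)                  # 0, so x and y are orthogonal
print(x @ x)                  # x^T x = ||x||^2; np.sqrt(x @ x) is the norm (length) of x

t = 0.3                       # a rotation matrix is a standard orthogonal matrix
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
print(np.allclose(Q.T @ Q, np.eye(2)))   # Q^T Q = I, so Q^T = Q^-1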

Eigen Values & Eigen Vectors

A is an n×n matrix, X is an n×1 vector, λ is a scalar; then
AX = λX, i.e. (A - λI)X = 0
λ is the eigenvalue and X is the eigenvector (non-zero)
Since X is non-zero, (A - λI) cannot be invertible, so |A - λI| = 0
Determinant of the 2×2 matrix [a b; c d] is ad - bc
If A is symmetric, then its eigenvalues are real and its eigenvectors are orthogonal
Diagonalization: if P is the orthogonal matrix of eigenvectors, then Z = P^T A P is a diagonal matrix with the eigenvalues of A
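A NumPy sketch of the eigendecomposition of a small symmetric matrix (an arbitrary example), checking AX = λX and the diagonalization P^T A P:

import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])               # symmetric

vals, P = np.linalg.eigh(A)                          # real eigenvalues and orthonormal eigenvectors
print(np.allclose(A @ P[:, 0], vals[0] * P[:, 0]))   # AX = λX for the first eigenpair
print(np.allclose(P.T @ P, np.eye(2)))               # eigenvectors are orthonormal
print(np.round(P.T @ A @ P, 10))                     # diagonal matrix of the eigenvalues of A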

Linear Independence

Given a_1 x_1 + a_2 x_2 + ... + a_n x_n = 0 for scalars [a_1, a_2, ..., a_n]:
a. if the only solution is a_i = 0 for all i, then the x_i are linearly independent;
b. if a solution exists with some a_i ≠ 0, then the x_i are linearly dependent.
If a set of vectors is linearly dependent, then one of them can be written as a linear combination of the others.
A set of two vectors is linearly dependent if and only if one of the vectors is a constant multiple of the other.
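A practical numerical test is to stack the vectors as columns and compare the rank with their count; a minimal NumPy sketch with made-up vectors:

import numpy as np

x1 = np.array([1.0, 0.0, 1.0])
x2 = np.array([0.0, 1.0, 1.0])
x3 = x1 + 2 * x2                            # deliberately a combination of the other two

V = np.column_stack([x1, x2, x3])
print(np.linalg.matrix_rank(V))             # 2 < 3 vectors, so {x1, x2, x3} is linearly dependent

W = np.column_stack([x1, x2])
print(np.linalg.matrix_rank(W))             # 2 = number of vectors, so {x1, x2} is independent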

Idempotence

An n×n matrix A is idempotent iff A^2 = A
The identity matrix I is idempotent.
Let X be an n×k matrix of full rank, n ≥ k. Then H = X(X^T X)^-1 X^T exists and is idempotent.
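A NumPy check that the matrix H = X(X^T X)^-1 X^T is idempotent, using a random full-rank X as an illustrative assumption:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))                 # n x k with n >= k; full column rank almost surely

H = X @ np.linalg.inv(X.T @ X) @ X.T        # H = X (X^T X)^-1 X^T
print(np.allclose(H @ H, H))                # H^2 = H, so H is idempotent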
 

Rank

For an n×k matrix X with column vectors [x_1, x_2, ..., x_k], the rank r(X) is the maximum number of linearly independent columns.
If X is an n×k matrix with n ≥ k and r(X) = k, then X is of full rank.
r(X) = r(X^T) = r(X^T X)
If X is k×k, then X is non-singular iff r(X) = k.
If X is n×k, P is n×n and non-singular, and Q is k×k and non-singular, then r(X) = r(PX) = r(XQ).
The rank of a diagonal matrix equals the number of non-zero diagonal entries.
r(XY) ≤ min(r(X), r(Y))
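A short NumPy check of the rank facts above on a random matrix and a diagonal example:

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 3))
r = np.linalg.matrix_rank

print(r(X), r(X.T), r(X.T @ X))             # equal: r(X) = r(X^T) = r(X^T X)

D = np.diag([3.0, 0.0, 7.0])
print(r(D))                                 # 2, the number of non-zero diagonal entries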

Trace

The trace of a square k×k matrix X is the sum of its diagonal entries:
tr(X) = ∑_i x_ii
If c is a scalar, tr(cX) = c * tr(X)
tr(X ± Y) = tr(X) ± tr(Y)
If XY and YX both exist, tr(XY) = tr(YX)
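A quick NumPy verification of these trace identities on arbitrary matrices:

import numpy as np

X = np.array([[1.0, 2.0], [3.0, 4.0]])
Y = np.array([[0.0, 1.0], [5.0, 2.0]])
c = 4.0

print(np.trace(X))                                              # sum of diagonal entries
print(np.isclose(np.trace(c * X), c * np.trace(X)))             # tr(cX) = c tr(X)
print(np.isclose(np.trace(X + Y), np.trace(X) + np.trace(Y)))   # tr(X + Y) = tr(X) + tr(Y)
print(np.isclose(np.trace(X @ Y), np.trace(Y @ X)))             # tr(XY) = tr(YX)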

Quadratic Forms

Let A be a k×k matrix and y a k×1 vector of variables. Then q = y^T A y is called a quadratic form in y, and A is called the matrix of the quadratic form.
q = ∑_i ∑_j a_ij y_i y_j
If y^T A y > 0 for all y ≠ 0, then y^T A y and A are positive definite
If y^T A y ≥ 0 for all y ≠ 0, then y^T A y and A are positive semi-definite
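A NumPy sketch of evaluating a quadratic form, plus a common positive-definiteness check via eigenvalues (see Theorems 7 and 8 below); the matrix is an arbitrary symmetric example:

import numpy as np

A = np.array([[2.0, -1.0], [-1.0, 2.0]])    # symmetric
y = np.array([1.0, 3.0])

q = y @ A @ y                               # y^T A y
double_sum = sum(A[i, j] * y[i] * y[j] for i in range(2) for j in range(2))
print(q, double_sum)                        # same value

print(np.all(np.linalg.eigvalsh(A) > 0))    # all eigenvalues > 0, so A is positive definite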

Matrix Differentiation

Let y = (y_1, y_2, ..., y_k)^T and z = f(y); then ∂z/∂y = [∂z/∂y_1, ∂z/∂y_2, ..., ∂z/∂y_k]^T
z = a^T y: ∂z/∂y = a
z = y^T y: ∂z/∂y = 2y
z = y^T A y: ∂z/∂y = Ay + A^T y; if A is symmetric, then ∂z/∂y = 2Ay
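A finite-difference check of the gradient ∂(y^T A y)/∂y = Ay + A^T y, using a random A and y as illustrative choices:

import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 3))
y = rng.normal(size=3)

grad = A @ y + A.T @ y                      # claimed gradient of z = y^T A y

eps = 1e-6                                  # central differences for each partial ∂z/∂y_i
numeric = np.array([((y + eps * e) @ A @ (y + eps * e)
                     - (y - eps * e) @ A @ (y - eps * e)) / (2 * eps)
                    for e in np.eye(3)])
print(np.allclose(grad, numeric, atol=1e-5))   # True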

Theorems

Theorem 1
Let A be a symmetric k×k matrix. Then an orthogonal matrix P exists such that P^T A P = Λ, a diagonal matrix whose diagonal entries λ_1, λ_2, ..., λ_k are the eigenvalues of A.
Theorem 2
The eigenvalues of idempotent matrices are always either 0 or 1.
Theorem 3
If A is a symmetric and idempotent matrix, r(A) = tr(A).
Theorem 4
Let A_1, A_2, ..., A_m be a collection of symmetric k×k matrices. Then the following are equivalent:
a. There exists an orthogonal matrix P such that P^T A_i P is diagonal for all i = 1, 2, ..., m;
b. A_i A_j = A_j A_i for every pair i, j = 1, 2, ..., m.
Theorem 5
Let A_1, A_2, ..., A_m be a collection of symmetric k×k matrices. Then any two of the following conditions imply the third:
a. All A_i, i = 1, 2, ..., m, are idempotent;
b. ∑ A_i is idempotent;
c. A_i A_j = 0 for i ≠ j.
Theorem 6
Let A_1, A_2, ..., A_m be a collection of symmetric k×k matrices. If the conditions in Theorem 5 hold, then
r(∑ A_i) = ∑ r(A_i)
Theorem 7
A symmetric matrix A is positive definite if and only if its eigenvalues are all (strictly) positive.
Theorem 8
A symmetric matrix A is positive semi-definite if and only if its eigenvalues are all non-negative.
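A brief NumPy check of Theorems 2 and 3 on a symmetric idempotent (projection) matrix built from a random full-rank X, which is an illustrative choice:

import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(6, 2))
A = X @ np.linalg.inv(X.T @ X) @ X.T                  # symmetric and idempotent

print(np.round(np.linalg.eigvalsh(A), 10))            # eigenvalues are all 0 or 1 (Theorem 2)
print(np.linalg.matrix_rank(A), round(np.trace(A)))   # r(A) = tr(A) = 2 (Theorem 3)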
 
