Matrices
Addition |
X + Y = [zij] = [xij + yij] |
Subtraction |
X - Y = [zij] = [xij - yij] |
Multiplication |
X * Y = [zij] = [∑k xik * ykj] |
Constant |
c * X = [zij] = [c * xij] |
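As a quick check of these element-wise and product rules, here is a minimal NumPy sketch; the matrices X, Y and the scalar c are made-up examples:

import numpy as np

X = np.array([[1, 2], [3, 4]])
Y = np.array([[5, 6], [7, 8]])
c = 3

print(X + Y)   # element-wise addition [xij + yij]
print(X - Y)   # element-wise subtraction [xij - yij]
print(X @ Y)   # matrix product, zij = sum over k of xik * ykj
print(c * X)   # scalar multiple [c * xij]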
Transpose & Identity
Transpose |
XT = [zij] = [xji] |
Tr of Tr |
(XT)T = X |
Tr of Mul |
(XY)T = YT XT != XT YT |
Sym Matrix |
XT = X |
Identity Matrix I [zii = 1, zij = 0 for i ≠ j] |
X I = I X = X |
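A small NumPy sketch, with arbitrary example matrices, verifying the transpose and identity rules above:

import numpy as np

X = np.array([[1, 2], [3, 4]])
Y = np.array([[0, 1], [2, 5]])
I = np.eye(2)

print(np.allclose((X.T).T, X))                        # (XT)T = X
print(np.allclose((X @ Y).T, Y.T @ X.T))              # (XY)T = YT XT
print(np.allclose(X @ I, X), np.allclose(I @ X, X))   # X I = I X = X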
Inverse
Inverse |
X X-1 = I = X-1X |
if X-1 exists then X is non singular or invertible |
Inv of Inv |
(X-1)-1 = X |
Inv of Mul |
(XY)-1 = Y-1X-1 != X-1Y-1 |
Inv of Tr |
(XT)-1 = (X-1)T |
Determinant |
|A| = ∑ a1j * A1j, summed over j = 1, ..., n, where A1j is the cofactor of a1j |
The determinant is computed by expanding along the first row of the matrix: each element a1j of the first row is multiplied by its cofactor. |
The minor Mij is the determinant obtained by deleting the i-th row and j-th column in which aij lies; the minor of aij is denoted by Mij. |
Cofactor |
Aij = (-1)i+j * Mij, the signed minor of aij |
Adjoint |
adj(A) = (cofactor matrix)T = (Aij)T |
Inverse |
A-1 = adj(A) / |A| |
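A minimal NumPy sketch of the cofactor/adjoint route to the inverse, using an assumed small invertible matrix A; it illustrates the formulas above rather than an efficient way to invert matrices:

import numpy as np

def minor(A, i, j):
    # determinant of A with row i and column j deleted
    return np.linalg.det(np.delete(np.delete(A, i, axis=0), j, axis=1))

def cofactor_matrix(A):
    n = A.shape[0]
    return np.array([[(-1) ** (i + j) * minor(A, i, j) for j in range(n)]
                     for i in range(n)])

A = np.array([[2.0, 1.0], [5.0, 3.0]])
C = cofactor_matrix(A)
det = sum(A[0, j] * C[0, j] for j in range(A.shape[0]))   # expansion along first row
A_inv = C.T / det                                         # A-1 = adj(A) / |A|

print(np.isclose(det, np.linalg.det(A)))
print(np.allclose(A_inv, np.linalg.inv(A)))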
Orthogonal
Two n x 1 vectors are orthogonal if XT Y = 0 |
XTX = ||X||2; the square root of XTX is the length or norm of the vector. |
A vector is normalized (has unit length) if XTX = 1. |
{X1, X2, X3, ..., Xn} are said to be orthonormal if each pair is orthogonal and each vector has unit length |
A sq matrix is orthogonal if XTX= I or XT=X-1 |
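A short NumPy sketch, using an assumed 2x2 rotation matrix as the example, checking orthogonality of vectors and of a square matrix:

import numpy as np

x = np.array([1.0, 0.0])
y = np.array([0.0, 2.0])
print(np.isclose(x @ y, 0.0))          # XT Y = 0 -> x and y are orthogonal

theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation matrices are orthogonal
print(np.allclose(Q.T @ Q, np.eye(2)))            # XTX = I
print(np.allclose(Q.T, np.linalg.inv(Q)))         # XT = X-1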
Eigen Values & Eigen Vectors
A is nxn matrix, X is nx1 matrix, λ is a scalar, then |
AX = λX, i.e. (A - λI)X = 0 |
λ is the eigen value and X is the eigen vector (non zero) |
Since X is non zero, |A-λI| should be 0 |
For a 2x2 matrix [a b; c d], the determinant is ad - bc |
If A => symmetric, then eigenvalues => real & eigenvectors => orthogonal |
Diagonalization: if A is symmetric and P is the orthogonal matrix of eigenvectors of A, then Z = PTAP is a diagonal matrix containing the eigen values of A |
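A small NumPy sketch, with an assumed symmetric example matrix, of the eigenvalue equation and the diagonalization PTAP:

import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # symmetric example
vals, P = np.linalg.eigh(A)              # eigenvalues and orthogonal eigenvector matrix

# AX = λX for each eigenpair
for lam, x in zip(vals, P.T):
    print(np.allclose(A @ x, lam * x))

# PTAP is diagonal with the eigenvalues of A on the diagonal
print(np.allclose(P.T @ A @ P, np.diag(vals)))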
Linear Independence
Given a1x1 + a2x2 + ... + anxn = 0 for coefficients [a1, a2, ..., an]: |
a. if the equation holds only when all ai are 0, then the xi are linearly independent. |
b. if it holds with some ai != 0, then the xi are linearly dependent. |
If a set of vectors are linearly dependent, then one of them can be written as some combination of others |
A set of two vectors is linearly dependent if and only if one of the vectors is a constant multiple of the other. |
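In practice, linear independence of a set of vectors can be checked by stacking them as columns and comparing the matrix rank to the number of vectors; a minimal NumPy sketch with made-up vectors:

import numpy as np

x1 = np.array([1.0, 0.0, 1.0])
x2 = np.array([0.0, 1.0, 1.0])
x3 = x1 + 2 * x2                      # deliberately a combination of the others

X = np.column_stack([x1, x2, x3])
rank = np.linalg.matrix_rank(X)
print(rank, rank == X.shape[1])       # rank 2 < 3 columns -> linearly dependent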
Idempotence
An nxn matrix A is idempotent iff A2 = A |
The identity matrix I is idempotent. |
Let X be an n×k matrix of full rank, n ≥ k; then H = X(XTX)-1XT exists and is idempotent. |
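A quick NumPy check of the hat-matrix claim; X here is an arbitrary full-rank example:

import numpy as np

X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])                      # n=3, k=2, full rank
H = X @ np.linalg.inv(X.T @ X) @ X.T            # H = X(XTX)-1 XT
print(np.allclose(H @ H, H))                    # idempotent: H2 = H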
Rank
For an nxk matrix X, the column vectors are [x1, x2, ..., xk] and the rank is given by the maximum number of linearly independent column vectors. |
If X is a nxk matrix and r(X) = k, then X is of full rank for n≥k. |
r(X) = r(XT) = r(XTX) |
If X is kxk, then X is non singular iff r(X) = k. |
If X is n×k, P is n×n and non-singular, and Q is k×k and nonsingular, then r(X) =r(PX) =r(XQ). |
The rank of a diagonal matrix is equal to the number of non zero diagonal entries in the matrix. |
r(XY) ≤ min{r(X), r(Y)} |
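A brief NumPy sketch, with assumed example matrices, of the rank identities listed above:

import numpy as np

X = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 1.0]])                       # 3x2 example, rank 2

r = np.linalg.matrix_rank
print(r(X), r(X.T), r(X.T @ X))                  # r(X) = r(XT) = r(XTX)

D = np.diag([3.0, 0.0, 5.0])
print(r(D))                                      # number of non-zero diagonal entries

Y = np.array([[1.0, 0.0], [0.0, 0.0]])
print(r(X @ Y) <= min(r(X), r(Y)))               # r(XY) <= min(r(X), r(Y))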
Trace
The trace of a square k×k matrix X is sum of its diagonal entries - tr(X) = ∑ xii |
If c is a scalar, tr(cX) =c * tr(X) |
tr(X±Y) =tr(X) ± tr(Y). |
If XY and YX both exist, tr(XY) =tr(YX). |
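A short NumPy check, with arbitrary example matrices, of the trace rules:

import numpy as np

X = np.array([[1.0, 2.0], [3.0, 4.0]])
Y = np.array([[0.0, 1.0], [5.0, 2.0]])
c = 3.0

print(np.isclose(np.trace(c * X), c * np.trace(X)))          # tr(cX) = c tr(X)
print(np.isclose(np.trace(X + Y), np.trace(X) + np.trace(Y)))  # tr(X+Y) = tr(X)+tr(Y)
print(np.isclose(np.trace(X @ Y), np.trace(Y @ X)))            # tr(XY) = tr(YX)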
Quadratic Forms
Let A be a k × k matrix and y a k × 1 vector of variables. Then q = yTAy is called a quadratic form in y, and A is called the matrix of the quadratic form |
If yTAy > 0 for all y != 0, yTAy & A are +ve definite |
If yTAy >= 0 for all y, yTAy & A are +ve semidefinite |
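By Theorems 7 and 8 below, definiteness can be checked from the eigenvalues of A; a minimal NumPy sketch with an assumed symmetric example matrix:

import numpy as np

A = np.array([[2.0, -1.0], [-1.0, 2.0]])   # symmetric example
vals = np.linalg.eigvalsh(A)

print(np.all(vals > 0))     # all eigenvalues strictly positive -> positive definite
print(np.all(vals >= 0))    # all eigenvalues non-negative -> positive semidefinite

y = np.array([1.0, 3.0])
print(y @ A @ y)            # the quadratic form q = yT A y (here > 0)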
Matrix Differentiation
y = (y1, y2, ..., yk)T, z = f(y), then ∂z/∂y = [∂z/∂y1 ∂z/∂y2 ... ∂z/∂yk]T |
z=aTy, ∂z/∂y = a |
z=yTy, ∂z/∂y = 2y |
z=yTAy, ∂z/∂y = Ay + ATy; if A is symmetric then ∂z/∂y = 2Ay |
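These gradient identities can be checked numerically; a small sketch (the example A and y are assumptions) comparing the analytic gradient of yTAy to a finite-difference estimate:

import numpy as np

A = np.array([[1.0, 2.0], [0.0, 3.0]])       # not symmetric on purpose
y = np.array([0.5, -1.0])

f = lambda v: v @ A @ v
analytic = A @ y + A.T @ y                   # d(yTAy)/dy = Ay + ATy

eps = 1e-6
numeric = np.array([(f(y + eps * e) - f(y - eps * e)) / (2 * eps)
                    for e in np.eye(2)])
print(np.allclose(analytic, numeric))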
Theorems
Theorem 1 Let A be a symmetric k×k matrix. Then an orthogonal matrix P exists such that PTAP = Λ, where Λ is the diagonal matrix whose diagonal entries λ1, λ2, ..., λk are the eigen values of A |
Theorem 2 The eigenvalues of idempotent matrices are always either 0 or 1. |
Theorem 3 If A is a symmetric and idempotent matrix, r(A) =tr(A) |
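A quick NumPy illustration of Theorems 2 and 3 using the hat matrix H from the Idempotence section (the example X is the same assumed full-rank matrix):

import numpy as np

X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
H = X @ np.linalg.inv(X.T @ X) @ X.T                      # symmetric and idempotent

vals = np.linalg.eigvalsh(H)
print(np.allclose(vals, [0.0, 1.0, 1.0]))                 # eigenvalues are 0 or 1
print(np.isclose(np.linalg.matrix_rank(H), np.trace(H)))  # r(H) = tr(H)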
Theorem 4 Let A1, A2, ..., Am be a collection of symmetric k×k matrices. Then the following are equivalent: a. There exists an orthogonal matrix P such that PTAiP is diagonal for all i = 1, 2, ..., m; b. AiAj = AjAi for every pair i, j = 1, 2, ..., m. |
Theorem 5 Let A1, A2, ..., Am be a collection of symmetric k×k matrices. Then any two of the following conditions imply the third: a. All Ai, i = 1, 2, ..., m, are idempotent; b. ∑ Ai is idempotent; c. AiAj = 0 for i ≠ j. |
Theorem 6 Let A1, A2, ..., Am be a collection of symmetric k×k matrices. If the conditions in Theorem 5 are true, then r(∑Ai) = ∑r(Ai) |
Theorem 7 A symmetric matrix A is positive definite if and only if its eigen values are all (strictly) positive |
Theorem 8 A symmetric matrix A is positive semi-definite if and only if its eigenvalues are all non-negative. |