Matrices
Addition: X + Y = [zij] = [xij + yij]
Subtraction: X − Y = [zij] = [xij − yij]
Multiplication: X * Y = [zij] = [∑k xik * ykj]
Constant: c * X = [zij] = [c * xij]
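A minimal NumPy sketch of the operations above; the matrices are arbitrary examples, and the matrix-product entry is also spelled out as the sum over k.

```python
import numpy as np

X = np.array([[1, 2], [3, 4]])
Y = np.array([[5, 6], [7, 8]])

add = X + Y        # z_ij = x_ij + y_ij
sub = X - Y        # z_ij = x_ij - y_ij
mul = X @ Y        # z_ij = sum_k x_ik * y_kj
scaled = 3 * X     # z_ij = c * x_ij

# One entry of the product written out explicitly:
z00 = sum(X[0, k] * Y[k, 0] for k in range(2))  # 1*5 + 2*7 = 19
```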
Transpose & Identity
Transpose: X^{T} = [zij] = [xji]
Tr of Tr: (X^{T})^{T} = X
Tr of Mul: (XY)^{T} = Y^{T} X^{T} != X^{T} Y^{T} (in general)
Sym Matrix: X^{T} = X
Identity Matrix I: [zii = 1, zij = 0 for i != j]; X I = I X = X
Inverse
Inverse: X X^{-1} = I = X^{-1} X; if X^{-1} exists, X is non-singular (invertible)
Inv of Inv: (X^{-1})^{-1} = X
Inv of Mul: (XY)^{-1} = Y^{-1} X^{-1} != X^{-1} Y^{-1} (in general)
Inv of Tr: (X^{T})^{-1} = (X^{-1})^{T}
Determinant: |A| = ∑_{j=1}^{n} a1j * A1j, computed over the first row of the matrix, where each element of the first row is multiplied by its cofactor
Minor: the minor mij is the determinant obtained by deleting the i^{th} row and j^{th} column in which aij lies
Cofactor: Aij = (-1)^{i+j} mij
Adjoint: adj(A) = (cofactor matrix)^{T} = [Aij]^{T}
Inverse: A^{-1} = adj(A) / |A|
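A sketch of the adjoint route to the inverse, A^{-1} = adj(A) / |A|, for an example 2x2 matrix, checked against NumPy's built-in inverse.

```python
import numpy as np

A = np.array([[4.0, 7.0], [2.0, 6.0]])

def minor(A, i, j):
    """Determinant of A with row i and column j deleted."""
    sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return np.linalg.det(sub)

n = A.shape[0]
# Cofactor A_ij = (-1)^{i+j} * m_ij
cof = np.array([[(-1) ** (i + j) * minor(A, i, j) for j in range(n)]
                for i in range(n)])
adj = cof.T                       # adjoint = transpose of cofactor matrix
A_inv = adj / np.linalg.det(A)    # A^{-1} = adj(A) / |A|
```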
Orthogonal
Two n x 1 vectors X, Y are orthogonal if X^{T} Y = 0
X^{T} X = |X|^{2}; the square root of X^{T} X is the length (norm) of the vector
A vector is orthonormal (a unit vector) if X^{T} X = 1
{X1, X2, X3, ..., Xn} are said to be orthonormal if each pair is orthogonal and each vector has unit length
A sq matrix is orthogonal if X^{T} X = I, i.e. X^{T} = X^{-1}
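A small numerical check of these definitions; the rotation matrix is just one convenient example of an orthogonal matrix.

```python
import numpy as np

x = np.array([1.0, 0.0])
y = np.array([0.0, 2.0])

orthogonal = np.isclose(x @ y, 0.0)   # X^T Y = 0
norm_x = np.sqrt(x @ x)               # length of x = sqrt(X^T X)

# A rotation matrix is orthogonal: Q^T Q = I, hence Q^T = Q^{-1}
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
is_orthogonal = np.allclose(Q.T @ Q, np.eye(2))
```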
Eigen Values & Eigen Vectors
A is an nxn matrix, X is an nx1 vector, λ is a scalar; then AX = λX, i.e. (A − λI)X = 0
λ is the eigen value and X is the eigen vector (non zero)
Since X is non zero, A − λI must be singular, i.e. det(A − λI) = 0
Determinant of [a b; c d] = ad − bc
If A is symmetric, then its eigenvalues are real and its eigenvectors are orthogonal
Diagonalization: if P is an orthogonal matrix (of eigenvectors of A), then Z = P^{T} A P is a diagonal matrix with the eigen values of A
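A sketch of the diagonalization above for an example symmetric matrix, using NumPy's `eigh` (intended for symmetric matrices): P^{T} A P comes out diagonal with the eigenvalues on the diagonal.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])     # symmetric
eigvals, P = np.linalg.eigh(A)             # eigenvalues and orthogonal P

Z = P.T @ A @ P                            # diagonal, entries = eigenvalues

# Each pair (lambda_i, x_i) satisfies A x = lambda x:
ok = all(np.allclose(A @ P[:, i], eigvals[i] * P[:, i]) for i in range(2))
```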
Linear Independence
Given a1 x1 + a2 x2 + ... + an xn = 0:
a. if the only solution is all ai = 0, then the xi are linearly independent.
b. if a solution exists with some ai != 0, then the xi are linearly dependent.
If a set of vectors is linearly dependent, then one of them can be written as some combination of the others.
A set of two vectors is linearly dependent if and only if one of the vectors is a constant multiple of the other.
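One common numerical way to test linear independence (an illustration, not the only method): stack the vectors as columns and compare the matrix rank to the number of columns.

```python
import numpy as np

# Independent columns: rank equals the number of columns.
indep = np.column_stack([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
# Dependent columns: the second is 2 * the first, so rank drops.
dep = np.column_stack([[1.0, 2.0, 3.0], [2.0, 4.0, 6.0]])

indep_ok = np.linalg.matrix_rank(indep) == indep.shape[1]
dep_ok = np.linalg.matrix_rank(dep) == dep.shape[1]
```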
Idempotence
An nxn matrix A is idempotent iff A^{2} = A.
The identity matrix I is idempotent.
Let X be an n×k matrix of full rank, n ≥ k; then H = X(X^{T}X)^{−1}X^{T} exists and is idempotent.
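A sketch of the matrix H = X(X^{T}X)^{−1}X^{T} from the text for an example full-rank X, verifying H @ H = H.

```python
import numpy as np

X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])           # 3x2, full column rank (n >= k)

H = X @ np.linalg.inv(X.T @ X) @ X.T
idempotent = np.allclose(H @ H, H)   # H^2 = H
```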
Rank
For an nxk matrix X with column vectors [x1, x2, ..., xk], the rank r(X) is the maximum number of linearly independent columns.
If X is nxk and r(X) = k, then X is of full rank (n ≥ k).
r(X) = r(X^{T}) = r(X^{T}X)
If X is kxk, then X is non-singular iff r(X) = k.
If X is n×k, P is n×n and non-singular, and Q is k×k and non-singular, then r(X) = r(PX) = r(XQ).
The rank of a diagonal matrix is equal to the number of non-zero diagonal entries in the matrix.
r(XY) ≤ min{r(X), r(Y)}
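A few of the rank facts above checked numerically on example matrices.

```python
import numpy as np

r = np.linalg.matrix_rank
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])                     # 3x2, rank 2

same_rank = r(X) == r(X.T) == r(X.T @ X)       # r(X) = r(X^T) = r(X^T X)

D = np.diag([3.0, 0.0, 5.0])
diag_rank = r(D)                               # non-zero diagonal entries

Y = np.array([[1.0, 1.0], [1.0, 1.0]])         # rank 1
product_bound = r(X @ Y) <= min(r(X), r(Y))    # r(XY) <= min{r(X), r(Y)}
```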
Trace
The trace of a square k×k matrix X is the sum of its diagonal entries: tr(X) = ∑ xii
If c is a scalar, tr(cX) = c * tr(X)
tr(X ± Y) = tr(X) ± tr(Y)
If XY and YX both exist, tr(XY) = tr(YX)
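The trace identities above, verified on example matrices with `np.trace`.

```python
import numpy as np

X = np.array([[1.0, 2.0], [3.0, 4.0]])
Y = np.array([[5.0, 6.0], [7.0, 8.0]])

t = np.trace(X)                                           # 1 + 4 = 5
linear = np.isclose(np.trace(3 * X), 3 * np.trace(X))     # tr(cX) = c tr(X)
additive = np.isclose(np.trace(X + Y), np.trace(X) + np.trace(Y))
cyclic = np.isclose(np.trace(X @ Y), np.trace(Y @ X))     # tr(XY) = tr(YX)
```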
Quadratic Forms
Let A be k × k and y a k × 1 vector containing variables; q = y^{T} A y is called a quadratic form in y, and A is called the matrix of the quadratic form.
q = ∑i ∑j aij yi yj
If y^{T} A y > 0 for all y != 0, then y^{T} A y and A are +ve definite.
If y^{T} A y ≥ 0 for all y, then y^{T} A y and A are +ve semidefinite.
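A sketch evaluating a quadratic form for an example symmetric A, with positive definiteness checked through the eigenvalues (all strictly positive for a symmetric positive definite matrix).

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])     # symmetric
y = np.array([1.0, -1.0])

q = y @ A @ y                              # q = sum_i sum_j a_ij y_i y_j
pos_def = np.all(np.linalg.eigvalsh(A) > 0)
```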
Matrix Differentiation
y = (y1, y2, ..., yk)^{T}, z = f(y); then ∂z/∂y = [∂z/∂y1, ∂z/∂y2, ..., ∂z/∂yk]^{T}
z = a^{T} y: ∂z/∂y = a
z = y^{T} y: ∂z/∂y = 2y
z = y^{T} A y: ∂z/∂y = Ay + A^{T} y; if A is symmetric then ∂z/∂y = 2Ay
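The three gradient formulas above, checked with a small central-difference approximation on example values of a, A, and y.

```python
import numpy as np

def num_grad(f, y, h=1e-6):
    """Numerical gradient of scalar f at y via central differences."""
    g = np.zeros_like(y)
    for i in range(len(y)):
        e = np.zeros_like(y)
        e[i] = h
        g[i] = (f(y + e) - f(y - e)) / (2 * h)
    return g

a = np.array([1.0, 2.0])
A = np.array([[2.0, 1.0], [1.0, 3.0]])     # symmetric
y = np.array([0.5, -1.5])

g1 = num_grad(lambda v: a @ v, y)          # expect a
g2 = num_grad(lambda v: v @ v, y)          # expect 2y
g3 = num_grad(lambda v: v @ A @ v, y)      # expect 2Ay (A symmetric)
```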
Theorems
Theorem 1: Let A be a symmetric k×k matrix. Then an orthogonal matrix P exists such that P^{T}AP = diag(λ1, λ2, ..., λk), where λ1, ..., λk are the eigen values of A.
Theorem 2: The eigenvalues of idempotent matrices are always either 0 or 1.
Theorem 3: If A is a symmetric and idempotent matrix, r(A) = tr(A).
Theorem 4: Let A1, A2, ..., Am be a collection of symmetric k×k matrices. Then the following are equivalent: a. there exists an orthogonal matrix P such that P^{T}Ai P is diagonal for all i = 1, 2, ..., m; b. Ai Aj = Aj Ai for every pair i, j = 1, 2, ..., m.
Theorem 5: Let A1, A2, ..., Am be a collection of symmetric k×k matrices. Then any two of the following conditions imply the third: a. all Ai, i = 1, 2, ..., m, are idempotent; b. ∑ Ai is idempotent; c. Ai Aj = 0 for i != j.
Theorem 6: Let A1, A2, ..., Am be a collection of symmetric k×k matrices. If the conditions in Theorem 5 are true, then r(∑ Ai) = ∑ r(Ai).
Theorem 7: A symmetric matrix A is positive definite if and only if its eigen values are all (strictly) positive.
Theorem 8: A symmetric matrix A is positive semidefinite if and only if its eigenvalues are all nonnegative.
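Theorems 2 and 3 illustrated on the symmetric, idempotent matrix H = X(X^{T}X)^{−1}X^{T} from the Idempotence section: its eigenvalues are all 0 or 1, and its rank equals its trace.

```python
import numpy as np

X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
H = X @ np.linalg.inv(X.T @ X) @ X.T       # symmetric and idempotent

eigvals = np.linalg.eigvalsh(H)
zero_or_one = np.all(np.isclose(eigvals, 0) | np.isclose(eigvals, 1))
rank_eq_trace = np.isclose(np.linalg.matrix_rank(H), np.trace(H))
```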