
Penn State: Math 220 Cheat Sheet by

Cheat sheet for Penn State MATH 220 (Matrices) students

1.1

A Matrix
m x n: m rows, n columns
Coefficient Matrix
Just the left-hand side (the coefficients)
Augmented Matrix
The left-hand side and the right-hand side together
Solving Linear Systems
(1) Augmented Matrix
(2) Row Operations
(3) Solution to Linear System
After row reducing, the RHS column gives the solution
One Solution
Pivot in every column of the coefficient matrix (upper-triangular shape in the augmented matrix)
No Solution
A row that is all zeros on the left but has a nonzero RHS (0 = nonzero number)
Infinitely Many Solutions
A consistent system with a free variable; e.g. the last row (including the RHS) is all zeros
Inconsistent
Has No Solution
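A minimal Python sketch of steps (1)-(3), assuming sympy is available; the example system and its numbers are made up:

    from sympy import Matrix

    # (1) Augmented matrix [A | b] for:  x + 2y = 5,  3x + 4y = 6
    aug = Matrix([[1, 2, 5],
                  [3, 4, 6]])

    # (2) Row operations: rref() does the reduction for us
    reduced, pivot_cols = aug.rref()

    # (3) The RHS column of the reduced form is the solution
    print(reduced)   # Matrix([[1, 0, -4], [0, 1, 9/2]])  ->  x = -4, y = 9/2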

1.1 Example (1)

1.2

Echelon Matrix
(1) Zero Rows at the bottom
(2) Leading Entries are down and to the right
(3) Zeros are below each leading entry
Reduced Echelon Matrix
(1) The leading entry of each nonzero row is 1
(2) Zeros are below AND above each 1
Pivot Position
Location in the matrix that corresponds to a leading 1 in the REF
Pivot Column
Column in Matrix that contains a pivot
To get to EF
work down and to the right (create zeros below each leading entry)
To get to REF
work up and to the left (create zeros above each leading entry and scale it to 1)
Free Variables
Variables that don't correspond to pivot columns
Consistent System
No pivot in the rightmost (augmented) column; if every column of the coefficient matrix also has a pivot, the solution is unique

1.2 Example (1)

1.3

RR2
Set of all vectors with 2 rows

1.3 Example (1)

1.3 Example (2)

1.4

Vector Equation
x1a1 + x2a2 + x3a3 = b
Matrix Equation
Ax=b
If A is an m x n matrix the following are all true or all false
Ax = b has a solution for every b in RRm
Every b in RRm is a lin. combo of columns in A
Columns of A span RRm
Matrix A has a pivot in every row (i.e. no row of zeros)
Anything in Bold means it is a vector.

1.4 Example (1)

1.4 Example (2)

1.5

Homogeneous
Ax = 0
Trivial Solution
x = 0, which always solves Ax = 0; a nontrivial solution exists if at least one column is missing a pivot
Determine if homogenous Linear System has a non trivial solution
(1) Write as Augmented Matrix
(2) Reduce to EF
(3) Determine if there are any free variables (a column w/o a pivot)
(4) If there are any free variables, then a nontrivial solution exists
(5) A nontrivial solution can be found by further reducing to REF and solving for x
If Ax = 0 has one free variable
Then the solution set is a line through the origin
If Ax = 0 has two free variables
Then the solution set is a plane through the origin
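A small sketch of steps (1)-(5) in Python, assuming sympy is available; the coefficient matrix is made up:

    from sympy import Matrix

    A = Matrix([[1, 2, 3],
                [2, 4, 6]])          # made-up coefficient matrix

    reduced, pivot_cols = A.rref()   # (2) reduce; augmenting with 0 changes nothing
    free_vars = A.cols - len(pivot_cols)

    # (3)-(4) any free variable means a nontrivial solution of Ax = 0 exists
    print(free_vars > 0)             # True here
    # (5) a basis for the nontrivial solutions:
    print(A.nullspace())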

1.5 Example (1)

1.5 Example (2)

1.7

Linear Independence
No free variables; no vector in the set is a multiple of another
To check ind/dep
reduce the augmented matrix [v1 v2 ... vn | 0] to EF and see if there are free variables (i.e. every column of the coefficient part must have a pivot to be linearly independent)
To check if multiples
u = c * v
if a value of c works, then u is a multiple of v
therefore linearly dependent
Linearly Dependent
Automatic if there are more columns (vectors) than rows (entries)
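A quick numerical check of the pivot criterion, assuming numpy is available; the vectors are made up:

    import numpy as np

    # Vectors placed as the columns of A
    A = np.array([[1.0, 2.0],
                  [0.0, 1.0],
                  [3.0, 4.0]])

    # Linearly independent exactly when every column has a pivot,
    # i.e. rank(A) equals the number of columns
    print(np.linalg.matrix_rank(A) == A.shape[1])   # True -> independent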

1.7 Example (1)

1.8

Every Matrix Transf­orm­ation is a:
Linear Transf­orm­ation
T(x) =
Ax
If A is an m x n matrix, then the properties are
(1) T(u + v) = T(u) + T(v)
(2) T(cu) = cT(u)
(3) T(0) = 0
(4) T(cu + dv) = cT(u) + dT(v)
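A sketch verifying property (4) numerically for T(x) = Ax, assuming numpy; A, u, v, c, d are made-up values:

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    u, v = np.array([1.0, -1.0]), np.array([0.5, 2.0])
    c, d = 3.0, -2.0

    T = lambda x: A @ x
    # (4) T(cu + dv) = cT(u) + dT(v); properties (1)-(3) are special cases of this
    print(np.allclose(T(c*u + d*v), c*T(u) + d*T(v)))   # True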

1.8 Example (1)

1.8 Example (2)

1.9

RRn --> RRm is said to be 'onto'
For every b in RRm, the equation T(x) = Ax = b has a unique solution or more than one solution
each row has a pivot
RRn --> RRm is said to be one-to-one
For every b in RRm, the equation T(x) = Ax = b has a unique solution or no solution
each column has a pivot (no free variables)
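A sketch of the two pivot checks via rank, assuming numpy; the matrix is made up. Onto needs a pivot in every row, one-to-one needs a pivot in every column:

    import numpy as np

    A = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 3.0]])       # 2 x 3, so T maps RR3 -> RR2
    m, n = A.shape
    rank = np.linalg.matrix_rank(A)

    print("onto:      ", rank == m)   # pivot in every row    -> True here
    print("one-to-one:", rank == n)   # pivot in every column -> False here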

2.1

Addition of Matrices
Can Add matrices if they have same # of rows and columns
(ie A(3x4) and B(3x4) so you can add them)
Multiply by Scalar
Multiply each entry by scalar
Matrix Multiplication (A x B)
Multiply each row of A by (dot with) each column of B; requires the # of columns of A to equal the # of rows of B
Powers of a Matrix
Powers can be computed only if the matrix is square (same # of rows and columns)
Transpose of Matrix
row 1 of A becomes column 1 of A^T
row 2 of A becomes column 2 of A^T
Properties of Transpose
(1) if A is m x n, then A^T is n x m
(2) (A^T)^T = A
(3) (A + B)^T = A^T + B^T
(4) (tA)^T = tA^T
(5) (AB)^T = B^T A^T
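A short numpy sketch of the operations above (the matrices are made up); it also checks transpose property (5):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    B = np.array([[0.0, 1.0],
                  [5.0, 2.0]])

    print(A + B)            # addition: same shape required
    print(3 * A)            # scalar multiple: every entry times 3
    print(A @ B)            # matrix multiplication: rows of A dot columns of B
    print(np.linalg.matrix_power(A, 3))        # A^3, only defined for square A
    print(A.T)              # transpose: rows become columns
    print(np.allclose((A @ B).T, B.T @ A.T))   # property (5): True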

2.2

Singular matrix
A matrix that is NOT invertible
Determinant of a (2 x 2) Matrix
det A = ad - bc
If A is invertible & (nxn)
There will never be no solution or infinitely many solutions to Ax = b
Properties of Invertible Matrices
(A^-1)^-1 = A
(assuming A & B are invertible) (AB)^-1 = B^-1 A^-1
(A^T)^-1 = (A^-1)^T
Finding the Inverse Matrix
[A | I] --> [I | A^-1] Use row operations
STOP if you get a row of zeros on the left side; A cannot be reduced to I and is not invertible
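A sketch of the [A | I] --> [I | A^-1] procedure using sympy's rref (the matrix is made up); the 2 x 2 determinant ad - bc is checked first:

    from sympy import Matrix, eye

    A = Matrix([[2, 1],
                [5, 3]])

    # det of a 2 x 2 is ad - bc
    print(A.det())                 # 2*3 - 1*5 = 1, nonzero so A is invertible

    # Row reduce [A | I]; the right half becomes A^-1
    aug = Matrix.hstack(A, eye(2))
    reduced, _ = aug.rref()
    print(reduced[:, 2:])          # A^-1 = [[3, -1], [-5, 2]]
    # If the left half had produced a row of zeros, A would not be invertible.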

2.2 Example (1)

2.3 Invertible Matrix Theorem

 

2.8

A subset S of RRn is a subspace if S satisfies:
(1) S contains zero vector
(2) If u & v are in S, then u + v is also in S
(3) If r is a real # & u is in S, then ru is also in S
Subspaces of RR3
Any plane that passes through the origin forms a subspace of RR3
Any set defined using nonlinear terms will NOT form a subspace of RR3
Null Space (Nul A)
To determine if u is in Nul(A), check whether: Au = 0
If yes --> then u is in the Nullspace
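A minimal check that a vector is in Nul A, assuming numpy; A and u are made up:

    import numpy as np

    A = np.array([[1.0, 2.0, -1.0],
                  [2.0, 4.0, -2.0]])
    u = np.array([1.0, 0.0, 1.0])

    # u is in Nul(A) exactly when Au = 0
    print(np.allclose(A @ u, 0))   # True -> u is in the null space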

2.8 Example (1)

2.8 Example (2)

2.9

Dimension of a non-zero Subspace
# of vectors in any basis; it is the # of linearly independent vectors
Dimension of a zero Subspace
is Zero
Dimension of a Column Space
# of pivot columns
Dimension of a Null Space
# of free variables in the solution Ax=0
Rank of a Matrix
# of pivot columns
The Rank Theorem
Matrix A has n columns: rank A (# pivots) + dim Nul A (# free var.) = n
dim = dimension; var. = variable
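A sketch verifying the Rank Theorem on a made-up matrix, assuming sympy is available:

    from sympy import Matrix

    A = Matrix([[1, 2, 0, 3],
                [0, 0, 1, 4],
                [1, 2, 1, 7]])        # n = 4 columns

    rank = A.rank()                   # # of pivot columns = dim Col A
    dim_nul = len(A.nullspace())      # # of free variables = dim Nul A
    print(rank, dim_nul, rank + dim_nul == A.cols)   # 2 2 True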

2.9 Reference

3.1

Calculating the Determinant of Matrix A is another way to tell if a linear system of equations has a solution
(1) Det(A) not =0, then Ax=b has a unique solution
(2) Det(A) =0, then Ax=b has no solutions or inf many
If det(A) not = 0
A^-1 exists
If det(A) = 0
A^-1 does NOT exist
Cofactor Expansion
Use row/column w/ most zeros
If Matrix A is triangular (zeros above or below the diagonal)
det(A) is the product of the entries down the main diagonal
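A small sketch of cofactor expansion along the first row, plus the triangular shortcut, assuming numpy; the matrices are made up:

    import numpy as np

    def det_cofactor(M):
        # Cofactor expansion along row 1 (fine for small matrices)
        n = len(M)
        if n == 1:
            return M[0][0]
        total = 0.0
        for j in range(n):
            minor = np.delete(np.delete(M, 0, axis=0), j, axis=1)
            total += (-1) ** j * M[0][j] * det_cofactor(minor)
        return total

    A = np.array([[2.0, 0.0, 1.0],
                  [3.0, 1.0, 0.0],
                  [0.0, 4.0, 5.0]])
    print(det_cofactor(A), np.linalg.det(A))    # should agree (both 22)

    # Triangular matrix: det is just the product of the diagonal entries
    U = np.triu(A)
    print(np.prod(np.diag(U)), np.linalg.det(U))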

3.1 Reference (1)

3.1 Example (1)

3.1 Reference (2)

3.1 Example (2)

3.2

Determinant Property 1
If a multiple of 1 row of A is added to another row to produce Matrix B, then det(B) = det(A)
Determinant Property 2
If 2 rows of A are interchanged to produce B, then det(B) = -det(A)
Determinant Property 3
If one row of A is multiplied by k to produce B, then det(B) = k*det(A)
Assuming both A & B are n x n Matrices
(1) det(A^T) = det(A)
(2) det(AB) = det(A)*det(B)
(3) det(A^-1) = 1/det(A)
(4) det(cA) = c^n det(A)
(5) det(A^r) = (det A)^r
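A quick numerical check of properties (1)-(4), assuming numpy; A, B, and c are made up:

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 5.0]])
    B = np.array([[0.0, 1.0], [4.0, 2.0]])
    c, n = 3.0, 2
    det = np.linalg.det

    print(np.isclose(det(A.T), det(A)))                 # (1)
    print(np.isclose(det(A @ B), det(A) * det(B)))      # (2)
    print(np.isclose(det(np.linalg.inv(A)), 1/det(A)))  # (3)
    print(np.isclose(det(c * A), c**n * det(A)))        # (4)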

3.3 AKA Cramer's Rule

Cramer's Rule
Can be used to find the solution to a linear system of equations Ax=b when A is an invertible square matrix
Def. of Cramer's Rule
Let A be an n x n invertible matrix. For any b in RRn, the unique solution x of Ax=b has entries given by
xi = det(Ai(b)) / det(A),  i = 1, 2, ..., n
Ai(b)
is the matrix A w/ column i replaced w/ vector b
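A minimal sketch of Cramer's rule as defined above, assuming numpy; the example system is made up:

    import numpy as np

    def cramer(A, b):
        # x_i = det(A_i(b)) / det(A), where A_i(b) has column i replaced by b
        d = np.linalg.det(A)
        x = np.empty(len(b))
        for i in range(len(b)):
            Ai = A.copy()
            Ai[:, i] = b
            x[i] = np.linalg.det(Ai) / d
        return x

    A = np.array([[2.0, 1.0], [5.0, 3.0]])
    b = np.array([3.0, 8.0])
    print(cramer(A, b))        # [1. 1.], same answer as np.linalg.solve(A, b)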

3.3 Example (1)

5.1

Au=λu
A is an nxn matrix. A nonzero vector u is an eigenvector of A if there exists a scalar λ such that Au = λu
To determine if λ is an eigenvalue
reduce [(A-λI)|0] to echelon form and see if it has any free variables.
yes -> λ is Eigenvalue
no -> λ is not eigenvalue
To determine if a given vector x is an eigenvector
check whether Ax = λx for some scalar λ
Eigenspace of A =
Nullspace of (A-λI)
Eigenvalues of a triangular Matrix
the entries along the diagonal. *You CANNOT row reduce a matrix to find its eigenvalues
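A sketch of the two checks above, assuming sympy; A, λ, and x are made-up values:

    from sympy import Matrix, eye

    A = Matrix([[4, 1],
                [2, 3]])

    # Is lam an eigenvalue?  Reduce [(A - lam*I) | 0] and look for free variables
    lam = 5
    M = (A - lam * eye(2)).row_join(Matrix([0, 0]))
    reduced, pivot_cols = M.rref()
    print(len(pivot_cols) < 2)        # a free variable exists -> lam is an eigenvalue

    # Is a given vector x an eigenvector?  Check Ax = lam*x
    x = Matrix([1, 1])
    print(A * x == lam * x)           # True here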

5.1 Example (1)

5.1 Example (2)

5.1 Example (3)

5.2

If λ is an eigenvalue of a Matrix A
then (A-λI)x=0 will have a nontrivial solution
A nontrivial solution will exist
if det(A - λI) = 0 (Characteristic Equation)
A is nxn Matrix. A is invertible if and only if
(1) The number 0 is NOT an eigenvalue (λ) of A
(2) The det(A) is not zero
Similar Matrices
If nxn Matrices A and B are similar, then they have the same characteristic polynomial (same λ) with the same multiplicities
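A small numerical sketch of the characteristic-equation idea, assuming numpy; the matrices are made up. np.poly returns the coefficients of the monic characteristic polynomial, whose roots are the eigenvalues:

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    print(np.poly(A))              # [1. -7. 10.]  ->  lam^2 - 7*lam + 10 = 0
    print(np.roots(np.poly(A)))    # eigenvalues 5 and 2

    # Similar matrices B = P A P^-1 share the same characteristic polynomial
    P = np.array([[1.0, 1.0], [0.0, 1.0]])
    B = P @ A @ np.linalg.inv(P)
    print(np.allclose(np.poly(B), np.poly(A)))   # True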

5.2 Example (1)

5.3

A matrix A written in diagonalized (factored) form
A = PDP^-1 (D is diagonal)
Power of Matrix
A^k = PD^kP^-1; the #'s on the diagonal of D get raised to the k
Determining if a Matrix is Diagonalizable
λ of an nxn matrix
n distinct (real) λ, then the matrix is diagonalizable
fewer than n distinct λ: it may or may not be diagonalizable; it is if the # of linearly independent eigenvectors = n
eigenvectors of an nxn matrix
n linearly independent eigenvectors, then diagonalizable
fewer than n linearly independent eigenvectors, then the matrix is NOT diagonalizable
D
matrix w/ the λ down the diagonal
P
columns of P are the n linearly independent eigenvectors (in the same order as the λ in D)
Finding P
solve (A - λI)x = 0 for each λ value: reduce to EF, solve for x, & the solutions are the eigenvectors
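A sketch of building P and D and using them for powers, assuming numpy; the matrix A is made up:

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    lams, vecs = np.linalg.eig(A)    # eigenvalues and eigenvectors
    D = np.diag(lams)                # D: eigenvalues down the diagonal
    P = vecs                         # P: corresponding eigenvectors as columns

    print(np.allclose(A, P @ D @ np.linalg.inv(P)))        # A = P D P^-1
    k = 5
    Ak = P @ np.diag(lams ** k) @ np.linalg.inv(P)         # A^k = P D^k P^-1
    print(np.allclose(Ak, np.linalg.matrix_power(A, k)))   # True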

5.3 Example (1)

5.3 Example (2)

5.3 Example (3)

 

6.1

Length of vector x in RR2
||x|| = sqrt(x1^2 + x2^2)
Length of vector x (in RRn)
||x|| = sqrt(x • x)
The Unit Vector
u = v/||v||
Two vectors u & v in RRn, the distance between u & v
||u - v||
Two vectors u & v are orthogonal if and only if
||u + v||^2 = ||u||^2 + ||v||^2
u • v = 0
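A short numpy sketch of the formulas above; the vectors are made up:

    import numpy as np

    u = np.array([3.0, 4.0])
    v = np.array([-4.0, 3.0])

    print(np.linalg.norm(u))          # ||u|| = sqrt(u . u) = 5
    print(u / np.linalg.norm(u))      # unit vector in the direction of u
    print(np.linalg.norm(u - v))      # distance between u and v
    print(np.dot(u, v) == 0)          # orthogonal: u . v = 0 (True here)
    print(np.isclose(np.linalg.norm(u + v)**2,
                     np.linalg.norm(u)**2 + np.linalg.norm(v)**2))   # Pythagorean check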

6.2

The distance from y to the line through u & the origin
||z|| = ||y - y-hat||, where y-hat is the orthogonal projection of y onto the line
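A sketch of the distance computation, using the orthogonal projection y-hat = ((y•u)/(u•u)) u, assuming numpy; y and u are made up:

    import numpy as np

    y = np.array([3.0, 1.0])
    u = np.array([1.0, 1.0])             # direction of the line through the origin

    y_hat = (np.dot(y, u) / np.dot(u, u)) * u   # projection of y onto the line
    print(np.linalg.norm(y - y_hat))            # distance ||z|| = ||y - y_hat||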

6.2 Example (1)

6.2 Example (2)

6.2 Example (3)

6.2 Example (4)

6.2 Example (5)

6.2 Example (6)

6.2 Example (7)

6.2 Reference (1)

6.2 Reference (2)

6.3 Example (1.1)

6.3 Example (1.2)

6.3 Example (2)

6.3 Example (3)

6.3 Example (4)

6.4

Gram-Schmidt Process Overview
take a given set of vectors & transform them into a set of orthogonal or orthonormal vectors
Given x1 & x2, produce v1 & v2 where the v's are perp. to each other
(1) Let v1=x1
(2) Find v2; v2=x2 - x2hat
x2 hat
(x2•v1)/(v1•v1) * v1
Orthogonal Basis
{v1,v2,...,vn}
Orthonormal Basis
{v1/||v1||, v2/ ||v2||,..., vn/||vn||}
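A minimal two-vector Gram-Schmidt sketch following steps (1)-(2), assuming numpy; x1 and x2 are made up:

    import numpy as np

    x1 = np.array([3.0, 1.0])
    x2 = np.array([2.0, 2.0])

    v1 = x1                                            # (1) keep the first vector
    x2_hat = (np.dot(x2, v1) / np.dot(v1, v1)) * v1    # projection of x2 onto v1
    v2 = x2 - x2_hat                                   # (2) subtract it off

    print(np.isclose(np.dot(v1, v2), 0))               # orthogonal basis {v1, v2}
    print(v1 / np.linalg.norm(v1), v2 / np.linalg.norm(v2))   # orthonormal basis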

6.4 Reference (1)

6.4 Example (1)

7.1

Symmetric Matrix
A square matrix where AT=A
If A is a symmetric Matrix
then eigenvectors associated w/ distinct eigenvalues are orthogonal
If a matrix is symmetric, it has an orthogonal & orthonormal basis of eigenvectors
An orthogonal matrix is a square matrix w/ orthonormal columns
(1) Matrix is square
(2) Columns are orthogonal
(3) Columns are unit vectors
If Matrix P has orthonormal columns
P^T P = I
If P is a nxn orthogonal matrix
P^T = P^-1
A = PDP^T
A must be symmetric; the columns of P must be normalized (orthonormal) eigenvectors
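A sketch of A = PDP^T for a symmetric matrix, using numpy's eigh (which returns orthonormal eigenvectors); the matrix A is made up:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])          # symmetric: A^T = A

    lams, P = np.linalg.eigh(A)         # eigenvalues and orthonormal eigenvectors
    D = np.diag(lams)

    print(np.allclose(P.T @ P, np.eye(2)))     # orthonormal columns: P^T P = I
    print(np.allclose(P.T, np.linalg.inv(P)))  # so P^T = P^-1
    print(np.allclose(A, P @ D @ P.T))         # A = P D P^T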

7.1 Reference (1)

7.1 Example (1)

7.1 Example (2.1)

7.1 Example (2.2)

                           
 
