Basis
A set S is a basis for V if | 1. S spans V | 2. S is LI. |
If S = {v1, ..., vn} is a basis for V, then every vector in V can be written in one and only one way as a linear combo of vectors in S, and every set containing more than n vectors is LD.
Basis Test
If dim(V) = n and S contains exactly n vectors: | 1. If S is LI, then S is a basis for V | 2. If S spans V, then S is a basis for V |
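A quick numeric check (a NumPy sketch with assumed example vectors, not part of the original sheet): n vectors form a basis of R^n exactly when the matrix having them as columns has rank n.

```python
import numpy as np

def is_basis_of_Rn(vectors):
    """Return True if the given vectors form a basis for R^n."""
    A = np.column_stack(vectors)   # vectors as columns
    n = A.shape[0]
    # A basis of R^n needs exactly n vectors that are linearly independent.
    return A.shape[1] == n and np.linalg.matrix_rank(A) == n

print(is_basis_of_Rn([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True
print(is_basis_of_Rn([[1, 2, 3], [2, 4, 6], [0, 0, 1]]))  # False: first two are parallel
```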
Change of Basis
P[x]_B' = [x]_B | [x]_B' = P^-1 [x]_B | [B B'] -> [I P] | [B' B] -> [I P^-1] |
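A minimal NumPy sketch with two assumed bases of R^2: solving BP = B' gives the same transition matrix P as row-reducing [B B'] -> [I P].

```python
import numpy as np

# Assumed example bases, written as the columns of B and Bp (B').
B  = np.array([[1.0, 1.0],
               [0.0, 1.0]])
Bp = np.array([[1.0, 0.0],
               [1.0, 2.0]])

# Transition matrix from B' to B: solve B @ P = B', i.e. P = B^-1 B'.
P = np.linalg.solve(B, Bp)

x_Bp = np.array([3.0, -1.0])   # coordinates of some x relative to B'
x_B  = P @ x_Bp                # coordinates of the same x relative to B

# Both coordinate vectors describe the same vector in standard coordinates.
print(np.allclose(B @ x_B, Bp @ x_Bp))  # True
```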
Cross Product
If u = u1 i + u2 j + u3 k | AND | v = v1 i + v2 j + v3 k | THEN | u x v = (u2 v3 - u3 v2)i - (u1 v3 - u3 v1)j + (u1 v2 - u2 v1)k |
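The component formula next to NumPy's built-in np.cross, with assumed example vectors; both give a vector orthogonal to u and v.

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, 5, 6])

# The component formula above, spelled out.
w = np.array([u[1]*v[2] - u[2]*v[1],
              -(u[0]*v[2] - u[2]*v[0]),
              u[0]*v[1] - u[1]*v[0]])

print(w)                           # [-3  6 -3]
print(np.cross(u, v))              # same result
print(np.dot(w, u), np.dot(w, v))  # 0 0: u x v is orthogonal to both u and v
```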
Definition of a Vector Space
u + v is within V | u+v = v+u | u+(v+w) = (u+v)+w | u+0 = u | u+(-u) = 0 | cu is within V | c(u+v) = cu+cv | (c+d)u = cu+du | c(du) = (cd)u | 1*u = u |
Diagonalizable Matrices
A is diagonalizable when A is similar to a diagonal matrix. That is, A is diagonalizable when there exists an invertible matrix P such that P^-1 AP is a diagonal matrix. |
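A NumPy sketch with an assumed 2 x 2 example: the columns of the matrix P returned by np.linalg.eig are eigenvectors, and P^-1 A P comes out diagonal when A is diagonalizable.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # assumed example with distinct eigenvalues 5 and 2

# Columns of P are eigenvectors of A; since P is invertible here, A is diagonalizable.
eigvals, P = np.linalg.eig(A)
D = np.linalg.inv(P) @ A @ P       # P^-1 A P

print(np.round(D, 10))                   # diagonal matrix of eigenvalues
print(np.allclose(D, np.diag(eigvals)))  # True
```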
Dot Products Etc.
length/norm ||v|| = sqrt(v_1^2 +...+ v_n^2) | ||cv|| = |c| ||v|| | v / ||v|| is the unit vector in the direction of v | distance d(u,v) = ||u-v|| | Dot product u•v = u_1v_1 +...+ u_nv_n | cos(theta) = u•v / (||u|| ||v||) | u & v are orthogonal when u•v = 0 |
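These formulas translate directly to NumPy (example vectors assumed):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

norm_u = np.linalg.norm(u)        # sqrt(1 + 4 + 4) = 3
unit_u = u / norm_u               # unit vector in the direction of u
dist   = np.linalg.norm(u - v)    # distance d(u, v)
dot    = np.dot(u, v)             # 1*3 + 2*0 + 2*4 = 11
theta  = np.arccos(dot / (norm_u * np.linalg.norm(v)))  # angle between u and v

print(norm_u, np.linalg.norm(unit_u), dist, dot, np.degrees(theta))
```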
Eigenvalues and Eigenvectors
The scalar lambda is called an eigenvalue of A when there is a nonzero vector x such that Ax = lambda*x. | Vector x is an eigenvector of A corresponding to lambda. | The set of all eigenvectors of lambda, together with the zero vector, is a subspace of R^n called the eigenspace of lambda. | 1. Find eigenvalues: det(lambda*I - A) = 0 | 2. Find eigenvectors: (lambda*I - A)x = 0 | If A is a triangular matrix, then its eigenvalues are the entries on its main diagonal. |
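A sketch with an assumed upper-triangular example: np.linalg.eig returns the main-diagonal entries as eigenvalues, and each returned eigenvector satisfies Ax = lambda*x.

```python
import numpy as np

A = np.array([[2.0, 5.0, -1.0],
              [0.0, 3.0,  4.0],
              [0.0, 0.0, -7.0]])   # upper triangular

eigvals, eigvecs = np.linalg.eig(A)
print(sorted(eigvals))             # [-7.0, 2.0, 3.0]: the main-diagonal entries

# Each column of eigvecs satisfies A x = lambda x for its eigenvalue.
for lam, x in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ x, lam * x))   # True for every pair
```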
Gram-Schmidt Orthonormalization
1. B = {v1, v2, ..., vn} | 2. B' = {w1, w2, ..., wn}: | w1 = v1 | w2 = v2 - proj_w1 v2 | w3 = v3 - proj_w1 v3 - proj_w2 v3 | ... | wn = vn - proj_w1 vn - ... - proj_w(n-1) vn | 3. B'' = {u1, u2, ..., un}: | ui = wi / ||wi|| | B'' is an orthonormal basis for V | span(B) = span(B'') |
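A minimal Python sketch of the process (assumed example basis; it normalizes each w_i as it is built, which is equivalent to doing steps 2 and 3 separately for LI input):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    ortho = []
    for v in vectors:
        w = np.array(v, dtype=float)
        # Subtract the projection of v onto every previously built u_i.
        for u in ortho:
            w -= np.dot(w, u) * u        # u is unit length, so <v,u>/<u,u> = <v,u>
        ortho.append(w / np.linalg.norm(w))  # normalize: u_i = w_i / ||w_i||
    return np.array(ortho)

B = [[1.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 2.0]]
U = gram_schmidt(B)
print(np.round(U @ U.T, 10))   # identity matrix: the rows are orthonormal
```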
Important Vector Spaces
R^n | C(-inf, +inf) | C[a, b] | P | P_n | M_m,n |
Inner Products
||u|| = sqrt(<u,u>) | d(u,v) = ||u-v|| | cos(theta) = <u,v> / (||u|| ||v||) | u & v are orthogonal when <u,v> = 0 | proj_v u = (<u,v>/<v,v>) v |
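A sketch of proj_v u in R^n, taking the dot product as the inner product (example vectors assumed):

```python
import numpy as np

def proj(u, v):
    """Orthogonal projection of u onto v, using the dot product as <.,.>."""
    return (np.dot(u, v) / np.dot(v, v)) * v

u = np.array([3.0, 1.0])
v = np.array([2.0, 0.0])

p = proj(u, v)
print(p)                    # [3. 0.]
print(np.dot(u - p, v))     # 0.0: the residual u - proj_v u is orthogonal to v
```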
Kernel
For T: V -> W, the set of all vectors v in V that satisfy T(v) = 0 is the kernel of T. ker(T) is a subspace of V.
For T: R^n -> R^m given by T(x) = Ax: ker(T) = solution space of Ax = 0, and Cspace(A) = range(T). |
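One common way to compute a basis of ker(T) numerically is from the SVD; a sketch with an assumed rank-1 matrix:

```python
import numpy as np

def null_space(A, tol=1e-10):
    """Basis (as columns) for {x : Ax = 0}, computed from the SVD."""
    _, s, Vt = np.linalg.svd(A)
    rank = np.sum(s > tol)
    return Vt[rank:].T

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])    # rank 1, so the kernel is 2-dimensional

N = null_space(A)
print(N.shape[1])                  # 2 = nullity(A)
print(np.allclose(A @ N, 0))       # every column of N really solves Ax = 0
```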
Linear Combo
v is a linear combo of u_1, ..., u_n if v = c_1 u_1 + ... + c_n u_n for some scalars c_1, ..., c_n. |
Linear Independence
A set of vectors S = {v1, ..., vk} is LI if c1v1 +...+ ckvk = 0 has only the trivial solution. If there are other solutions, S is LD. S is LD iff at least one of its vectors can be written as a linear combo of the other vectors in S. |
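A rank-based check (assumed example vectors): the homogeneous combination has only the trivial solution exactly when the column rank equals the number of vectors.

```python
import numpy as np

def is_linearly_independent(vectors):
    """c1 v1 + ... + ck vk = 0 has only the trivial solution iff rank equals k."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

print(is_linearly_independent([[1, 0, 0], [0, 1, 0], [1, 1, 1]]))   # True
print(is_linearly_independent([[1, 2], [2, 4]]))                    # False: second = 2 * first
```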
Linear Transformation
V & W are vector spaces. T: V -> W is a linear transformation of V into W if: | 1. T(u+v) = T(u) + T(v) | 2. T(cu) = cT(u) |
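A spot-check sketch (assumed matrix A): every matrix map T(x) = Ax satisfies both conditions.

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  0.0]])
T = lambda x: A @ x

u, v, c = np.array([1.0, 2.0]), np.array([-3.0, 5.0]), 4.0
print(np.allclose(T(u + v), T(u) + T(v)))   # condition 1
print(np.allclose(T(c * u), c * T(u)))      # condition 2
```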
Nonhomogeneous Systems
If xp is a particular solution to Ax = b, then every solution to the system can be written as x = xp + xh, where xh is a solution of the homogeneous system Ax = 0. |
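A small sketch with an assumed consistent singular system: adding any multiple of a homogeneous solution to xp still solves Ax = b.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])       # singular, so the system has infinitely many solutions
b = np.array([3.0, 6.0])

xp = np.array([3.0, 0.0])        # particular solution: A @ xp = b
xh = np.array([-2.0, 1.0])       # homogeneous solution: A @ xh = 0

# Any xp + t * xh also solves Ax = b.
for t in (0.0, 1.0, -2.5):
    print(np.allclose(A @ (xp + t * xh), b))   # True each time
```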
Nullity
Nullspace(A) = {x in R^n : Ax = 0} | Nullity(A) = dim(Nullspace(A)) = n - rank(A) |
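Rank-nullity in NumPy (assumed example matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])   # 3 x 4, third row = row1 + row2

n = A.shape[1]
rank = np.linalg.matrix_rank(A)
nullity = n - rank

print(rank, nullity)    # 2 2  (rank + nullity = n = 4)
```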
Orthogonal Sets
A set S in V is orthogonal when every pair of vectors in S is orthogonal. If, in addition, each vector is a unit vector, then S is orthonormal. |
One-to-One and Onto
T: V -> W is one-to-one iff ker(T) = {0} | T is onto iff rank(T) = dim(W) | If dim(V) = dim(W), then T is one-to-one iff it is onto |
Rank and Nullity of T
nullity(T) = dim(kernel) | rank(T) = dim(range) | rank(T) + nullity(T) = n (for T given by an m x n matrix) | dim(domain) = dim(range) + dim(kernel) |
Rank of a Matrix
Rank(A) = dim(Rspace) = dim(Cspace) |
Similar Matrices
For square matrices A and A' of order n, A' is similar to A when there exists an invertible matrix P such that A' = P^-1 AP. |
Spanning Sets
S = {v1, ..., vk} is a subset of vector space V. S spans V if every vector in V can be written as a linear combo of vectors in S. |
Test for Subspace
A nonempty subset W of V is a subspace of V if, for all u, v in W and scalars c: | 1. u+v is in W | 2. cu is in W |