Basic Equations
Network Flows
1. the flow in an arc is only in one direction
2. flow into a node = flow out of a node
3. flow into the network = flow out of the network
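A minimal sketch of setting the node-balance rules up as a linear system; the network (60 units in at node A, 20 out at B, 40 out at C) and the arcs x1, x2, x3 are made up:

```python
from sympy import symbols, linsolve

x1, x2, x3 = symbols("x1 x2 x3")
eqs = [
    x1 + x2 - 60,   # node A: flow out (x1 + x2) = flow in (60)
    x1 - x3 - 20,   # node B: flow in (x1) = flow out (x3 + 20)
    x2 + x3 - 40,   # node C: flow in (x2 + x3) = flow out (40)
]
print(linsolve(eqs, x1, x2, x3))   # one-parameter family of feasible flows
```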
Balancing Chemical Equations
1. put unknown coefficients x₁, x₂, ... in front of each compound on both sides
2. balance each element by equating atom counts (e.g. carbon: x₁ + 2x₃ = ...), collect the equations into a system, solve
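A worked sketch, assuming the reaction x1·C3H8 + x2·O2 -> x3·CO2 + x4·H2O; each matrix row balances one element, and the balancing coefficients span the null space:

```python
from sympy import Matrix

A = Matrix([
    [3, 0, -1,  0],   # carbon:   3*x1 = x3
    [8, 0,  0, -2],   # hydrogen: 8*x1 = 2*x4
    [0, 2, -2, -1],   # oxygen:   2*x2 = 2*x3 + x4
])
v = A.nullspace()[0]      # one free direction: all balancing coefficients
coeffs = v / min(v)       # rescale so the smallest coefficient is 1
print(coeffs.T)           # [1, 5, 3, 4] -> C3H8 + 5 O2 -> 3 CO2 + 4 H2O
```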
Matrix
augmented matrix: coefficients together with the right-hand side (RHS)
coefficient matrix: coefficients only, no RHS
Vectors, Norm, Dot Product
the magnitude (norm) of a vector v is ||v||; ||v|| ≥ 0
if k > 0, kv has the same direction as v, with magnitude k||v||
if k < 0, kv has the opposite direction to v, with magnitude |k| ||v||
vectors in Rⁿ (n = dimension): v = (v₁, v₂, ..., vₙ)
displacement vector: v = P₁P₂ = OP₂ - OP₁
norm/magnitude of a vector: ||v|| = √(v₁² + v₂² + ... + vₙ²)
||v|| = 0 iff v = 0
||kv|| = |k| ||v||
unit vector u in the same direction as v: u = (1/||v||) v
standard unit vectors: e₁ = (1, 0, ..., 0), ..., eₙ = (0, ..., 0, 1) in Rⁿ
distance: d(u, v) = √((u₁-v₁)² + (u₂-v₂)² + ... + (uₙ-vₙ)²) = ||u - v||
d(u, v) = 0 iff u = v
dot product: u·v = u₁v₁ + u₂v₂ + ... + uₙvₙ = ||u|| ||v|| cos(θ)
u and v are orthogonal if u·v = 0 (cos(θ) = 0)
a set of vectors is an orthogonal set iff vᵢ·vⱼ = 0 for i ≠ j
a set of vectors is an orthonormal set iff vᵢ·vⱼ = 0 for i ≠ j, and ||vᵢ|| = 1 for all i
Cauchy-Schwarz Inequality: (u·v)² ≤ ||u||² ||v||², i.e. |u·v| ≤ ||u|| ||v||
Triangle Inequality: d(u,v) ≤ d(u,w) + d(w,v); ||u + v|| ≤ ||u|| + ||v||
||v₁ + v₂ + ... + vₖ|| ≤ ||v₁|| + ||v₂|| + ... + ||vₖ||
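A quick numeric check of the norm, dot product, and the two inequalities with NumPy (the vectors are arbitrary examples):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

norm_u = np.linalg.norm(u)                           # sqrt(1 + 4 + 4) = 3
unit_u = u / norm_u                                  # unit vector along u
cos_theta = (u @ v) / (norm_u * np.linalg.norm(v))   # u.v = ||u|| ||v|| cos(theta)
print(cos_theta)                                     # 11/15
print(abs(u @ v) <= norm_u * np.linalg.norm(v))      # Cauchy-Schwarz: True
print(np.linalg.norm(u + v) <= norm_u + np.linalg.norm(v))  # triangle: True
```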
Lines and Planes
a vector equation of a line with parameter t: x = x₀ + tv, -∞ < t < +∞
the solution set of a linear equation in 3 variables is a plane
point-normal equation (x is a point on the plane): n·(x - x₀) = 0
A(x - x₀) + B(y - y₀) + C(z - z₀) = 0, where x₀ = (x₀, y₀, z₀) and n = (A, B, C)
general/algebraic equation: Ax + By + Cz = D
two planes are parallel if n₁ = kn₂, orthogonal if n₁·n₂ = 0
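A tiny sketch (point and normal are made up) that expands the point-normal form into the general equation Ax + By + Cz = D:

```python
import numpy as np

n  = np.array([2.0, -1.0, 3.0])    # normal (A, B, C)
x0 = np.array([1.0,  4.0, 0.0])    # a point on the plane

D = n @ x0                         # so the plane is 2x - y + 3z = -2
print(f"{n[0]}x + {n[1]}y + {n[2]}z = {D}")
```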
Matrix Algebra, Identity and Inverse Matrix
(A + B)ᵢⱼ = (A)ᵢⱼ + (B)ᵢⱼ
(A - B)ᵢⱼ = (A)ᵢⱼ - (B)ᵢⱼ
(cA)ᵢⱼ = c(A)ᵢⱼ
(Aᵀ)ᵢⱼ = (A)ⱼᵢ
(AB)ᵢⱼ = aᵢ₁b₁ⱼ + aᵢ₂b₂ⱼ + ... + aᵢₖbₖⱼ
inner product (a number): uᵀv = u·v, u and v the same size
outer product (a matrix): uvᵀ, u and v can be any sizes
(Aᵀ)ᵀ = A
(kA)ᵀ = k(Aᵀ)
(A + B)ᵀ = Aᵀ + Bᵀ
(AB)ᵀ = BᵀAᵀ
tr(Aᵀ) = tr(A)
tr(AB) = tr(BA)
uᵀv = tr(uvᵀ) = tr(vuᵀ)
tr(A) = a₁₁ + a₂₂ + ... + aₙₙ
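A NumPy spot-check of the transpose and trace rules on random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 3))

print(np.allclose((A @ B).T, B.T @ A.T))             # (AB)^T = B^T A^T
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))  # tr(AB) = tr(BA)
```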
the identity matrix Ɪ is a square matrix with 1s along the diagonal
if A is m x n, then AꞮₙ = A and ꞮₘA = A
a square matrix A is invertible (nonsingular) if there is a B with:
AB = Ɪ = BA
B is the inverse of A: B = A⁻¹
if A has no inverse, A is not invertible (singular)
a 2x2 matrix is invertible iff det(A) = ad - bc ≠ 0
if A and B are invertible:
(AB)⁻¹ = B⁻¹A⁻¹
(Aⁿ)⁻¹ = A⁻ⁿ = (A⁻¹)ⁿ
(Aᵀ)⁻¹ = (A⁻¹)ᵀ
(kA)⁻¹ = (1/k)A⁻¹, k ≠ 0
Elementary Matrix and Unifying Theorem
elementary matrices are invertible
A⁻¹ = Eₖ Eₖ₋₁ ... E₂ E₁
[ A | Ɪ ] -> [ Ɪ | A⁻¹ ] (how to find the inverse of A)
Ax = b; x = A⁻¹b
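A minimal Gauss-Jordan sketch of the [ A | Ɪ ] -> [ Ɪ | A⁻¹ ] reduction (assumes A is square and invertible; the 2x2 example is made up):

```python
import numpy as np

def inverse_via_gauss_jordan(A):
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # augmented [A | I]
    for i in range(n):
        p = i + np.argmax(np.abs(M[i:, i]))       # partial pivot
        M[[i, p]] = M[[p, i]]
        M[i] /= M[i, i]                           # scale pivot row to 1
        for r in range(n):
            if r != i:
                M[r] -= M[r, i] * M[i]            # clear the rest of column i
    return M[:, n:]                               # right half is A^-1

A = np.array([[2.0, 1.0], [5.0, 3.0]])            # det = 1
print(inverse_via_gauss_jordan(A))                # [[ 3. -1.] [-5.  2.]]
```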
The Unifying Theorem: the following are equivalent (if one fails, all fail):
- A -> RREF = Ɪ
- A can be expressed as a product of elementary matrices
- A is invertible
- Ax = 0 has only the trivial solution
- Ax = b is consistent for every vector b in Rⁿ
- Ax = b has exactly 1 solution for every b in Rⁿ
- the column and row vectors of A are linearly independent
- det(A) ≠ 0
- λ = 0 is not an eigenvalue of A
- TA is one-to-one and onto
Consistency
EAx = Eb -> Rx = b', where b' = Eb
(Ax = b): [ A | b ] -> [ EA | Eb ] = [ R | b' ] (treat b as unknowns b₁, b₂, ...)
for the system to be consistent, every zero row of R at the bottom must have the corresponding entry of b' equal to zero
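An equivalent machine check (the system is a made-up example): Ax = b is consistent iff rank(A) = rank([ A | b ]):

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0]])        # rank 1 (rows are dependent)
b = np.array([[3.0], [7.0]])                  # 7 != 2*3 -> inconsistent

aug = np.hstack([A, b])
consistent = np.linalg.matrix_rank(A) == np.linalg.matrix_rank(aug)
print(consistent)                             # False
```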
Homogeneous Systems
linear combination of vectors: v = c₁v₁ + c₂v₂ + ... + cₙvₙ (set up a matrix to find the c's)
homogeneous: Ax = 0
non-homogeneous: Ax = b
homogeneous solutions: x = t₁v₁ + t₂v₂ + ... + tₖvₖ
non-homogeneous solutions: x = x₀ + t₁v₁ + t₂v₂ + ... + tₖvₖ
if xₚ is any solution of the non-homogeneous system and xₕ is a solution of the homogeneous system, then every solution has the form x = xₚ + xₕ
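A small SymPy sketch (made-up system) showing the x = xₚ + xₕ structure: linsolve gives a particular solution plus free parameters, nullspace gives the homogeneous directions:

```python
from sympy import Matrix, linsolve, symbols

A = Matrix([[1, 2, 1], [2, 4, 0]])
b = Matrix([4, 6])
x1, x2, x3 = symbols("x1 x2 x3")

print(linsolve((A, b), x1, x2, x3))   # {(3 - 2*x2, x2, 1)}: particular + parameter
print(A.nullspace())                  # [(-2, 1, 0)]: homogeneous direction
```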
Examples of Subspaces
subspace test: if w₁, w₂ are in S, then w₁ + w₂ is in S and kw₁ is in S
- the zero vector 0 by itself is a subspace
- Rⁿ itself is a subspace
- lines and planes through the origin are subspaces
- the set of all vectors b such that Ax = b is consistent is a subspace
- if {v₁, v₂, ..., vₖ} is any set of vectors in Rⁿ, then the set W of all linear combinations of these vectors is a subspace:
W = {c₁v₁ + c₂v₂ + ... + cₖvₖ}, where the c's are real numbers
Span
- the span of a set of vectors {v₁, v₂, ..., vₖ} is the set of all linear combinations of these vectors
span{v₁, v₂, ..., vₖ} = {t₁v₁ + t₂v₂ + ... + tₖvₖ}
if S = {v₁, v₂, ..., vₖ}, then W = span(S) is a subspace
Ax = b is consistent if and only if b is a linear combination of col(A)
Linear Independence
- a set of vectors is linearly independent if c₁v₁ + c₂v₂ + ... + cₙvₙ = 0 has only the solution with all c = 0
- dependent: some solution has not all c = 0
Dependent if:
- one vector is a linear combination of the others
- one vector is a scalar multiple of another
- the set has more than n vectors in Rⁿ
Independent if (two vectors):
- the span of the two vectors forms a plane
- list the vectors as the columns of a matrix and row reduce; if there are many solutions, the set is dependent (see the sketch below)
- after RREF, the columns with leading 1's mark a maximally linearly independent subset (Pivot Theorem)
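A minimal Pivot-Theorem check with SymPy (example vectors; the third column is the sum of the first two):

```python
from sympy import Matrix

M = Matrix([[1, 2, 3],
            [0, 1, 1],
            [1, 3, 4]])          # col3 = col1 + col2 -> dependent
rref, pivots = M.rref()
print(pivots)                    # (0, 1): columns 0 and 1 are independent
print(len(pivots) == M.cols)     # False -> the full set is dependent
```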
Diagonal, Triangular, Symmetric Matrices
diagonal matrix: zeros everywhere off the diagonal
lower triangular: zeros above the diagonal
upper triangular: zeros below the diagonal
symmetric if: Aᵀ = A
skew-symmetric if: Aᵀ = -A
Determinants
expansion along the jth column: det(A) = a₁ⱼC₁ⱼ + a₂ⱼC₂ⱼ + ... + aₙⱼCₙⱼ
expansion along the ith row: det(A) = aᵢ₁Cᵢ₁ + aᵢ₂Cᵢ₂ + ... + aᵢₙCᵢₙ
cofactor: Cᵢⱼ = (-1)ⁱ⁺ʲ Mᵢⱼ
minor: Mᵢⱼ = determinant of the matrix with the ith row and jth column deleted
- pick the row or column with the most zeros to make the calculation easier
det(Aᵀ) = det(A)
det(A⁻¹) = 1/det(A)
det(AB) = det(A)det(B)
det(kA) = kⁿ det(A)
- A is invertible iff det(A) ≠ 0
- the det of a triangular or diagonal matrix is the product of the diagonal entries
det(A) for a 2x2 matrix: ad - bc
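A short recursive sketch of cofactor expansion along the first row (O(n!), so for small matrices only):

```python
def det(A):
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in A[1:]]   # delete row 0, col j
        total += (-1) ** j * A[0][j] * det(minor)        # C_0j = (-1)^(0+j) M_0j
    return total

print(det([[1, 2], [3, 4]]))                   # ad - bc = -2
print(det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))  # diagonal: 2*3*4 = 24
```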
Adjoint and Cramer's Rule
adj(A) = Cᵀ, where C is the matrix of cofactors of A
A⁻¹ = (1/det(A)) adj(A)
adj(A)A = det(A) Ɪ
Cramer's Rule (requires det(A) ≠ 0): x₁ = det(A₁)/det(A), x₂ = det(A₂)/det(A), ..., xₙ = det(Aₙ)/det(A)
Aᵢ is the matrix A with its ith column replaced by b
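A direct Cramer's-rule sketch in NumPy (the 2x2 system is an example; requires det(A) ≠ 0):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([5.0, 10.0])

d = np.linalg.det(A)                      # 2*3 - 1*1 = 5
x = np.empty(len(b))
for i in range(len(b)):
    Ai = A.copy()
    Ai[:, i] = b                          # replace column i with b
    x[i] = np.linalg.det(Ai) / d
print(x)                                  # [1. 3.]
```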
Hyperplane, Area/Volume
a hyperplane in Rⁿ: a₁x₁ + a₂x₂ + ... + aₙxₙ = b
- can also be written as a·x = b
to find a⟂: solve a·x = 0 and take the span of the solutions
if A is a 2x2 matrix: |det(A)| is the area of the parallelogram spanned by its rows
if A is a 3x3 matrix: |det(A)| is the volume of the parallelepiped
- subtract points to get two or three edge vectors, then put them in a matrix to find the area/volume
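A quick area sketch with made-up points: the edge vectors go in the rows, and |det| gives the parallelogram's area:

```python
import numpy as np

P0, P1, P2 = np.array([1.0, 1.0]), np.array([3.0, 2.0]), np.array([2.0, 4.0])
edges = np.array([P1 - P0, P2 - P0])        # 2x2 matrix of edge vectors
print(abs(np.linalg.det(edges)))            # area = |2*3 - 1*1| = 5
```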
Cross Product
u x v = (u₂v₃ - u₃v₂, u₃v₁ - u₁v₃, u₁v₂ - u₂v₁)
u x v = -(v x u)
k(u x v) = (ku) x v = u x (kv)
u x u = 0
parallel vectors have cross product 0
u·(u x v) = 0 and v·(u x v) = 0
u x v is perpendicular to span{u, v}
||u x v|| = ||u|| ||v|| sin(θ), where θ is the angle between the vectors
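A small sketch of the component formula, checked against the perpendicularity and anti-commutativity identities (example vectors):

```python
import numpy as np

def cross(u, v):
    return np.array([u[1]*v[2] - u[2]*v[1],
                     u[2]*v[0] - u[0]*v[2],
                     u[0]*v[1] - u[1]*v[0]])

u, v = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])
w = cross(u, v)
print(w)                                            # [-3.  6. -3.]
print(np.isclose(u @ w, 0), np.isclose(v @ w, 0))   # perpendicular to u and v
print(np.allclose(w, -cross(v, u)))                 # u x v = -(v x u)
```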
Complex Number
complex number: z = a + ib, with i² = -1
(a + ib) + (c + id) = (a + c) + i(b + d)
(a + ib) - (c + id) = (a - c) + i(b - d)
(a + ib)(c + id) = (ac - bd) + i(ad + bc)
compare (a + bx)(c + dx) = (ac + bdx²) + x(ad + bc), then substitute x² = -1
conjugate: z̄ = a - ib
the length (magnitude) of z: |z| = √(z z̄) = √(a² + b²)
z⁻¹ = 1/z = z̄ / |z|²
z₁ / z₂ = z₁z₂⁻¹
polar form (r = |z|): z = |z| (cos(θ) + i sin(θ))
z₁z₂ = |z₁| |z₂| (cos(θ₁ + θ₂) + i sin(θ₁ + θ₂))
z₁/z₂ = (|z₁| / |z₂|) (cos(θ₁ - θ₂) + i sin(θ₁ - θ₂))
zⁿ = rⁿ(cos(nθ) + i sin(nθ)), r = |z|
e^(iθ) = cos(θ) + i sin(θ)
e^(iπ) = -1, i.e. e^(iπ) + 1 = 0
z₁z₂ = r₁r₂ e^(i(θ₁ + θ₂))
zⁿ = rⁿ e^(inθ)
z₁/z₂ = (r₁/r₂) e^(i(θ₁ - θ₂))
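A numeric check of the polar-form rules with Python's cmath (sample values):

```python
import cmath

z1 = complex(1, 1)                 # r = sqrt(2), theta = pi/4
z2 = complex(0, 2)                 # r = 2,       theta = pi/2

r1, t1 = cmath.polar(z1)
r2, t2 = cmath.polar(z2)
print(cmath.isclose(z1 * z2, cmath.rect(r1 * r2, t1 + t2)))  # product rule
print(cmath.isclose(z1 ** 3, cmath.rect(r1 ** 3, 3 * t1)))   # z^n = r^n e^(i n theta)
```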
Eigenvalues and Eigenvectors
Ax = λx
det(λꞮ - A) = (-1)ⁿ det(A - λꞮ)
characteristic polynomial: p_A(λ) = det(λꞮ - A) (either sign convention works)
- solve (λꞮ - A)x = 0 for the eigenvectors
Work Flow:
- form the matrix
- compute p_A(λ) = det(λꞮ - A)
- find the roots of p_A(λ) -> the eigenvalues of A
- plug each root in, then solve (λꞮ - A)x = 0 (see the sketch below)
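The same workflow done numerically with NumPy (example matrix; its eigenvalues are 5 and 2):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
evals, evecs = np.linalg.eig(A)        # roots of det(lambda*I - A), vectors
print(evals)                           # 5 and 2 (order may vary)
for lam, v in zip(evals, evecs.T):     # each COLUMN of evecs is an eigenvector
    print(np.allclose(A @ v, lam * v)) # verifies Ax = lambda*x
```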
Linear Transformation
f: Rⁿ -> Rᵐ (Rⁿ = domain, Rᵐ = co-domain), f(x₁, x₂, ..., xₙ) = (y₁, ..., yₘ)
T: Rⁿ -> Rᵐ is a linear transformation if 1. T(cu) = cT(u) and 2. T(u + v) = T(u) + T(v)
for any linear transformation, T(0) = 0
Rθ = [T(e₁) T(e₂)] = [cosθ -sinθ] [sinθ cosθ]
matrix for rotation
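A minimal rotation sketch: rotating e₁ by 90 degrees lands on e₂:

```python
import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(R @ np.array([1.0, 0.0]))     # ~[0. 1.]: e1 rotates onto e2
```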
reflection across the y-axis: T(x, y) = (-x, y)
reflection across the x-axis: T(x, y) = (x, -y)
reflection across the diagonal y = x: T(x, y) = (y, x)
orthogonal projection onto the x-axis: T(x, y) = (x, 0)
orthogonal projection onto the y-axis: T(x, y) = (0, y)
projection onto the line through 0 with direction v: let u = (1/||v||)v, written vertically with components u₁ and u₂
projection matrix: A = [u₁² u₂u₁] [u₁u₂ u₂²]
contraction (0 ≤ k < 1, shrink) or dilation (k > 1, stretch): [x, y] -> [kx, ky]
compression in the x-direction: [x, y] -> [kx, y]
compression in the y-direction: [x, y] -> [x, ky]
shear in the x-direction: T(x, y) = (x + ky, y), matrix [1 k] [0 1]
shear in the y-direction: T(x, y) = (x, y + kx), matrix [1 0] [k 1]
orthogonal projection on the xy-plane: [x, y, 0]
orthogonal projection on the xz-plane: [x, 0, z]
orthogonal projection on the yz-plane: [0, y, z]
reflection about the xy-plane: [x, y, -z]
reflection about the xz-plane: [x, -y, z]
reflection about the yz-plane: [-x, y, z]
Orthogonal Transformation
an orthogonal transformation is a linear transformation T: Rⁿ -> Rⁿ that preserves lengths: ||T(u)|| = ||u||
||T(u)|| = ||u|| for all u <=> T(x)·T(y) = x·y for all x, y in Rⁿ
an orthogonal matrix is a square matrix A such that Aᵀ = A⁻¹
1. if A is orthogonal, then so are Aᵀ and A⁻¹
2. a product of orthogonal matrices is orthogonal
3. if A is orthogonal, then det(A) = 1 or -1
4. if A is orthogonal, then the rows and the columns of A each form orthonormal sets of vectors
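A quick check of these properties on a rotation matrix (θ chosen arbitrarily):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))        # Q^T = Q^-1
print(np.isclose(abs(np.linalg.det(Q)), 1))   # det = +-1
x = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # length preserved
```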
Kernel, Range, Composition
ker(T) is the set of all vectors x such that T(x) = 0; to find it, RREF the matrix, solve for the free-variable vectors, and write ker(T) = span{v, ...} (see the sketch below)
the solution space of Ax = 0 is the null space: null(A) = ker(T)
the range of T, ran(T), is the set of vectors y such that y = T(x) for some x
ran(T) = col([T]) = span{col₁, col₂, ...}; y is in ran(T) iff Ax = y is consistent
Important facts: 1. T is one-to-one iff ker(T) = {0} 2. Ax = b, if consistent, has a unique solution iff null(A) = {0}; Ax = 0 has only the trivial solution iff null(A) = {0}
Important facts 2: 1. T: Rⁿ -> Rᵐ is onto iff the system T(x) = y has a solution x in Rⁿ for every y in Rᵐ 2. Ax = b is consistent for every b in Rᵐ (A is onto) iff col(A) = Rᵐ
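A minimal kernel/range sketch with SymPy (example matrix with dependent rows):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])      # row 2 = 2 * row 1

print(A.nullspace())         # basis of null(A): two free-variable vectors
print(A.columnspace())       # basis of ran(T): a single column spans it
```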
the composition of T₂ with T₁ is T₂ ◦ T₁
(T₂ ◦ T₁)(x) = T₂(T₁(x)); T₂ ◦ T₁: Rⁿ -> Rᵐ
composition of linear transformations corresponds to matrix multiplication: [T₂ ◦ T₁] = [T₂][T₁]
[T(θ₁+θ₂)] = [Tθ₂][Tθ₁]; order matters in general: rotate then shear ≠ shear then rotate
a linear transformation T: Rⁿ -> Rᵐ has an inverse iff T is one-to-one and onto; T⁻¹: Rᵐ -> Rⁿ, T(x) = y <=> x = T⁻¹(y)
for Rⁿ to Rⁿ: [T⁻¹] = [T]⁻¹; T⁻¹ ◦ T = 1ₙ <=> [T⁻¹][T] = Ɪₙ (1ₙ is the identity transformation, Ɪₙ is the identity matrix)
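A tiny demonstration that composition order matters (a 90-degree rotation and a unit shear as examples):

```python
import numpy as np

R = np.array([[0.0, -1.0], [1.0, 0.0]])   # rotate by 90 degrees
S = np.array([[1.0,  1.0], [0.0, 1.0]])   # shear in the x-direction (k = 1)

print(S @ R)    # [shear after rotate] = [S][R] -> [[1 -1] [1 0]]
print(R @ S)    # [rotate after shear] = [R][S] -> [[0 -1] [1 1]], different
```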
Basis, Dimension, Rank
S is a basis for the subspace V of Rⁿ if: S is linearly independent and span(S) = V
dim(V) = k, where k is the # of basis vectors
basis for row(A): the nonzero rows (with leading 1's) after RREF
basis for col(A): the columns of the original A whose RREF columns have leading 1's
basis for null(A): the free-variable vectors of Ax = 0
basis for null(Aᵀ): transpose, row reduce, and take the free-variable vectors
The Rank Theorem: rank(A) = rank(Aᵀ) for any matrix A
rank(A) = # of leading 1's (pivots) in the RREF
dim(row(A)) = dim(col(A)) = rank(A)
dim(null(A)) = nullity(A)
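A rank/nullity sketch with SymPy (example matrix whose third row is the sum of the first two); it also previews the Dimension Theorem below:

```python
from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [0, 0, 1, 1],
            [1, 2, 1, 2]])           # row 3 = row 1 + row 2

rank = A.rank()
nullity = len(A.nullspace())
print(rank, nullity, rank + nullity == A.cols)   # 2 2 True
```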
Orthogonal Complement, Dimension Theorem
S⟂ = {v ∈ Rⁿ | v·w = 0 for all w ∈ S}
S⟂ is a subspace of Rⁿ; S⟂ = span(S)⟂ = W⟂
row(A)⟂ = null(A)
null(A)⟂ = row(A) ((S⟂)⟂ = S iff S is a subspace)
col(A)⟂ = null(Aᵀ)
null(Aᵀ)⟂ = col(A)
The Dimension Theorem (A is an m x n matrix):
rank(A) + nullity(A) = n (k + (n - k) = n)
if W is a subspace of Rⁿ:
dim(W) + dim(W⟂) = n
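A small check of row(A)⟂ = null(A) on the same example matrix: every null-space vector is orthogonal to every row-space basis vector:

```python
from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [0, 0, 1, 1],
            [1, 2, 1, 2]])

for v in A.nullspace():
    print([row.dot(v) for row in A.rowspace()])   # all zeros
```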