Shreyas’ Notes

Linear Algebra

MATH 355

fall, sophomore year

Linear algebra formalizes the procedure to find all solutions to a linear system of equations when they exist, or to prove that they don’t exist.

Linear Combinations and Span §

Linear Independence §

Linear Transformations §

$T : V \rightarrow W$ is a linear transformation if:

- $T(u + v) = T(u) + T(v)$ for all $u, v \in V$
- $T(cv) = c\,T(v)$ for all $v \in V$ and all scalars $c$

Properties:

- $T(\vec 0) = \vec 0$
- $T(c_1 v_1 + \dots + c_k v_k) = c_1 T(v_1) + \dots + c_k T(v_k)$

Onto and Into §

The image of a linear transformation $T : V \rightarrow W$ is the set $\{T(v) : v \in V\}$. When $T$ maps $\mathbb{R}^n$ to $\mathbb{R}^m$, the image is also the span of the columns of $A$, the standard matrix of $T$.

A linear transformation $T : V \rightarrow W$ is onto if the image of $T$ is $W$.

A linear transformation $T : V \rightarrow W$ is into if for every $u, v \in V$, $T(u) = T(v) \implies u = v$.

| $T$ is onto | $T$ is into |
| --- | --- |
| $A$'s RREF has a pivot in each row | $A$'s RREF has a pivot in each column |
| The columns of $A$ span $\mathbb{R}^m$ | The columns of $A$ are linearly independent |
| $Ax = w$ has a solution for each $w \in \mathbb{R}^m$ | $Ax = 0$ has only the trivial solution $x = 0$ |
| The image of $T$ is $\mathbb{R}^m$ | The kernel of $T$ is $\{0\}$ |
| Existence | Uniqueness |

Matrix Multiplication §

If $S : \mathbb{R}^p \rightarrow \mathbb{R}^n$ and $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ are linear transformations, $T \circ S : \mathbb{R}^p \rightarrow \mathbb{R}^m$ is a linear transformation.

If the $m \times n$ matrix $A$ is the standard matrix for $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ and the $n \times p$ matrix $B$ is the standard matrix for $S : \mathbb{R}^p \rightarrow \mathbb{R}^n$, then $AB$ is an $m \times p$ matrix which is the standard matrix for $T \circ S : \mathbb{R}^p \rightarrow \mathbb{R}^m$.

For an $m \times n$ matrix $A$, $A I_n = A$ and $I_m A = A$.
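
A quick numpy sanity check of both facts (a minimal sketch; the matrices below are arbitrary examples, not from the course):

```python
import numpy as np

# Standard matrices: A for T : R^3 -> R^2, B for S : R^2 -> R^3.
A = np.array([[1., 0., 2.],
              [0., 1., -1.]])
B = np.array([[1., 1.],
              [0., 2.],
              [3., 0.]])

x = np.array([1., 2.])   # an arbitrary vector in R^2

# Applying S then T agrees with multiplying by the single matrix AB.
assert np.allclose(A @ (B @ x), (A @ B) @ x)

# Identity behavior: A I_n = A and I_m A = A.
m, n = A.shape
assert np.allclose(A @ np.eye(n), A)
assert np.allclose(np.eye(m) @ A, A)
```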

Inverses §

Consider $T : V \rightarrow W$ and $S : W \rightarrow V$. $S$ and $T$ are inverses of each other if $S(T(v)) = v$ for all $v \in V$ and $T(S(w)) = w$ for all $w \in W$.

If $S = T^{-1}$ exists, $T$ is invertible.

A linear transformation is invertible iff it is both into and onto.

Consider an $m \times n$ matrix $A$ and an $n \times m$ matrix $B$. $A$ and $B$ are inverses of each other if $AB = I_m$ and $BA = I_n$.

If $B = A^{-1}$ exists, $A$ is invertible.

Transposes and Inverses §

$A = (A^t)^t$

A matrix $A$ is symmetric if $A = A^t$.

If $A$ and $B$ are matrices such that $AB$ is defined, $(AB)^t = B^t A^t$.

If $A$ is invertible, $A^t$ is invertible, and $(A^t)^{-1} = (A^{-1})^t$.
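
Both identities are easy to spot-check numerically (a minimal sketch with arbitrary example matrices):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])            # an invertible 2x2 matrix (det = 1)
B = np.array([[1., 0., 3.],
              [2., 1., 0.]])        # 2x3, so AB is defined

# (AB)^t = B^t A^t
assert np.allclose((A @ B).T, B.T @ A.T)

# (A^t)^{-1} = (A^{-1})^t
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)
```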

Invertible Matrix Theorem §

Consider a linear transformation $T : \mathbb{R}^n \rightarrow \mathbb{R}^n$ with an $n \times n$ standard matrix $A$. The following statements are equivalent:

- $A$ is invertible
- $T$ is invertible
- $T$ is onto
- $T$ is into
- The RREF of $A$ is $I_n$
- The columns of $A$ form a basis of $\mathbb{R}^n$
- $Ax = v$ has a unique solution for every $v \in \mathbb{R}^n$

To calculate the inverse of a matrix, row reduce it with the identity matrix on the right side: $[A \mid I_n]$ row reduces to $[I_n \mid A^{-1}]$.
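
A rough sketch of that procedure in numpy (a hand-rolled Gauss-Jordan reduction of the augmented matrix $[A \mid I]$; the helper name and the $2 \times 2$ example are mine, not from the notes):

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Row reduce [A | I] to [I | A^{-1}] using Gauss-Jordan elimination."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])          # augmented matrix [A | I]
    for col in range(n):
        # pick the largest entry in the column as the pivot (partial pivoting)
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is not invertible")
        M[[col, pivot]] = M[[pivot, col]]  # swap rows
        M[col] /= M[col, col]              # scale the pivot row so the pivot is 1
        for row in range(n):               # clear the rest of the column
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]                        # right half is now A^{-1}

A = np.array([[2., 1.],
              [5., 3.]])
assert np.allclose(inverse_by_row_reduction(A) @ A, np.eye(2))
```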

Elementary Matrices §

Scaling a row §

$$E_1 = \left[ \begin{array}{ccccccc} 1 & & & & & & \\ & \ddots & & & & & \\ & & 1 & & & & \\ & & & c & & & \\ & & & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1 \end{array} \right]$$

Adding cc times a row to another row §

$$E_2 = \left[ \begin{array}{ccccccc} 1 & & & & & & \\ & \ddots & & & & & \\ & & 1 & & & & \\ & & & \ddots & & & \\ & & c & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1 \end{array} \right]$$

Swapping two rows §

$$E_3 = \left[ \begin{array}{ccccccccccc} 1 & & & & & & & & & & \\ & \ddots & & & & & & & & & \\ & & & 0 & & & & 1 & & & \\ & & & & 1 & & & & & & \\ & & & & & \ddots & & & & & \\ & & & & & & 1 & & & & \\ & & & 1 & & & & 0 & & & \\ & & & & & & & & & \ddots & \\ & & & & & & & & & & 1 \end{array} \right]$$
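
A small numpy illustration: multiplying on the left by each elementary matrix performs the corresponding row operation (the $3 \times 2$ matrix and the choices of rows and $c$ are arbitrary examples):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])

# E1: scale row 1 by c = 10
E1 = np.eye(3); E1[1, 1] = 10.0
# E2: add c = 2 times row 0 to row 2
E2 = np.eye(3); E2[2, 0] = 2.0
# E3: swap rows 0 and 2
E3 = np.eye(3)[[2, 1, 0]]

print(E1 @ A)   # row 1 of A multiplied by 10
print(E2 @ A)   # row 2 of A plus 2 * row 0
print(E3 @ A)   # rows 0 and 2 swapped
```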

Dimension §

If $V$ is spanned by a finite list of vectors, the dimension of $V$ is the number of vectors in a basis. If $V$ is not spanned by a finite set, $V$ is infinite-dimensional.

Any two bases of $V$ have the same size.

Any set of more than $n$ vectors in an $n$-dimensional vector space is linearly dependent.

Consider an $n$-dimensional vector space $V$:

- Any linearly independent list of $n$ vectors in $V$ is a basis.
- Any list of $n$ vectors that spans $V$ is a basis.

Matrices Relative to Bases §

Let $T : V \rightarrow W$. $B = \{b_1, \dots, b_n\}$ is a basis for $V$ and $C$ is a basis for $W$. If $A$ is the matrix for $T$ relative to $B$ and $C$, $A [v]_B = [T(v)]_C$ for all $v \in V$.

$A$ has the columns $[T(b_1)]_C, \dots, [T(b_n)]_C$.

$T$ is invertible iff $A$ is invertible.

$A^{-1}$ is the matrix for $T^{-1}$ relative to $C$ and $B$.
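
A sketch of computing such a matrix in numpy, under the assumption that $V = W = \mathbb{R}^2$ and that $T$ is multiplication by a matrix $M$ in standard coordinates (all of the specific matrices here are made-up examples):

```python
import numpy as np

M = np.array([[1., 1.],
              [0., 2.]])      # T(v) = M v in standard coordinates
B = np.array([[1., 1.],       # columns are the basis vectors b1, b2 of the domain
              [0., 1.]])
C = np.array([[2., 0.],       # columns are the basis vectors c1, c2 of the codomain
              [1., 1.]])

# Column i of A is [T(b_i)]_C, i.e. the solution of C x = M b_i.
A = np.linalg.solve(C, M @ B)

# Check A [v]_B = [T(v)]_C for some vector v.
v = np.array([3., -2.])
v_B = np.linalg.solve(B, v)          # coordinates of v relative to B
Tv_C = np.linalg.solve(C, M @ v)     # coordinates of T(v) relative to C
assert np.allclose(A @ v_B, Tv_C)
```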

Similarity and Diagonalization §

A square $n \times n$ matrix $A$ is diagonalizable if it is similar to some diagonal matrix $D$.

If $A$ has $n$ linearly independent eigenvectors $\{v_1, \dots, v_n\}$, $A$ is diagonalizable and $A = P D P^{-1}$, where $P$ has columns $v_1, \dots, v_n$ and $D$ has $\lambda_1, \dots, \lambda_n$ along its diagonal.
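
For instance (a minimal numpy sketch with an arbitrary $2 \times 2$ matrix; `np.linalg.eig` returns the eigenvalues and a matrix whose columns are eigenvectors):

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])            # eigenvalues 5 and 2 (distinct)

eigvals, P = np.linalg.eig(A)       # columns of P are eigenvectors
D = np.diag(eigvals)

# Distinct eigenvalues, so P is invertible and A = P D P^{-1}.
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```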

Eigentheory §

Any list of eigenvectors of a matrix $A$ corresponding to distinct eigenvalues must be linearly independent.

An $n \times n$ matrix has at most $n$ distinct eigenvalues.

If an $n \times n$ matrix has $n$ distinct eigenvalues, it has an eigenbasis and is therefore diagonalizable.

If $A$ is a square matrix and $\lambda$ is a scalar, $\ker(A - \lambda I)$ (the $\lambda$-eigenvectors together with $\vec 0$) is the $\lambda$-eigenspace of $A$.

The characteristic polynomial of $A$ in $\lambda$ is $|A - \lambda I|$. A scalar is an eigenvalue of $A$ iff it is a solution to $|A - \lambda I| = 0$.

If $\lambda$ is an eigenvalue for $A$, the algebraic multiplicity of $\lambda$ is the number of times $\lambda$ appears as a root of the characteristic polynomial (i.e. the exponent of the corresponding linear factor). The geometric multiplicity of $\lambda$ is the dimension of the $\lambda$-eigenspace ($\dim \ker(A - \lambda I)$).
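
A small example where the two multiplicities differ (the matrix is a standard non-diagonalizable example, chosen by me):

```python
import numpy as np

# Characteristic polynomial (2 - lambda)^2: eigenvalue 2 has algebraic multiplicity 2.
A = np.array([[2., 1.],
              [0., 2.]])
lam = 2.0

# Geometric multiplicity = dim ker(A - lambda I), read off from the rank.
geometric = A.shape[0] - np.linalg.matrix_rank(A - lam * np.eye(2))
print(geometric)   # 1 < 2, so this A has no eigenbasis and is not diagonalizable
```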

Complex Numbers §

Let $z = a + bi$. $\Re(z) = a = \frac{z + \overline{z}}{2}$ and $\Im(z) = b = \frac{z - \overline{z}}{2i}$.

Fundamental theorem of algebra: Any non-constant polynomial with real or complex coefficients factors completely into linear factors over $\mathbb{C}$.

If $f$ is a polynomial with real coefficients and $f(z) = 0$, then $f(\overline{z}) = 0$ as well.

Norm: $|z| = \sqrt{a^2 + b^2}$

Argument: $\arg(z) = \tan^{-1} \frac{b}{a}$ (adjusted for the quadrant of $z$)

$|zw| = |z| \cdot |w|$ and $\arg(zw) = \arg(z) + \arg(w)$
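
A quick check of the norm and argument rules with Python's built-in complex numbers (the particular $z$ and $w$ are arbitrary examples):

```python
import cmath

z = 3 + 4j
w = 1 - 2j

# |zw| = |z| * |w|
assert cmath.isclose(abs(z * w), abs(z) * abs(w))

# arg(zw) = arg(z) + arg(w)  (equal up to a multiple of 2*pi in general;
# no wraparound happens for these particular values)
assert cmath.isclose(cmath.phase(z * w), cmath.phase(z) + cmath.phase(w))
```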

Complex Eigenvalues §

If $v$ is an eigenvector of a real matrix with eigenvalue $\lambda$, then $\overline v$ is also an eigenvector, with eigenvalue $\overline \lambda$.

Suppose $A$ is a real $2 \times 2$ matrix. Suppose that $\lambda = a + bi$ is a non-real complex eigenvalue for $A$, with a corresponding eigenvector $v$. Let $x = \Re(v)$ and $y = -\Im(v)$. Then $B = \{x, y\}$ is a basis of $\mathbb{R}^2$, and the matrix for $A$ with respect to $B$ is $\begin{bmatrix} a & -b \\ b & a \end{bmatrix}$.
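
A numerical sketch of this fact (the $2 \times 2$ matrix is an arbitrary example with eigenvalues $2 \pm i$):

```python
import numpy as np

A = np.array([[1., -2.],
              [1.,  3.]])           # non-real eigenvalues 2 +/- i

eigvals, eigvecs = np.linalg.eig(A)
lam, v = eigvals[0], eigvecs[:, 0]  # one complex eigenvalue and its eigenvector
a, b = lam.real, lam.imag

P = np.column_stack([v.real, -v.imag])    # columns x = Re(v), y = -Im(v)
rotation_scaling = np.array([[a, -b],
                             [b,  a]])

# The matrix of A relative to the basis {x, y} is the rotation-scaling matrix.
assert np.allclose(np.linalg.inv(P) @ A @ P, rotation_scaling)
```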

Inner Products §

For two vectors $v = [a_1, \dots, a_n]$ and $w = [b_1, \dots, b_n]$ in $\mathbb{R}^n$, $v \cdot w = \langle v, w \rangle = a_1 b_1 + \dots + a_n b_n$.

Norm/Magnitude: $||v|| = \sqrt{\langle v, v \rangle}$

The distance between two vectors $v$ and $w$ is $||v - w||$.

For two vectors $v$, $w$ in a vector space $V$ that has an inner product, $v$ and $w$ are orthogonal if $\langle v, w \rangle = 0$. Additionally, $v$ and $w$ are orthogonal iff ${||v + w||}^2 = {||v||}^2 + {||w||}^2$.

A list $\{v_1, \dots, v_n\}$ is orthogonal iff $\langle v_i, v_j \rangle = 0$ for $i \neq j$. If the list is orthogonal and $||v_i|| = 1$ for all $i$, it is also orthonormal.

A list of non-zero orthogonal vectors is linearly independent.

Orthogonal Projections §

For $v \in V$, a subspace $W$ of $V$, and an orthonormal basis $\{w_1, \dots, w_k\}$ of $W$, the orthogonal projection of $v$ onto $W$ is $\hat v = \sum_{i = 1}^k \langle v, w_i \rangle w_i$.

  1. $\hat v \in W$
  2. $v - \hat v$ is perpendicular to all vectors in $W$
  3. $\hat v$ is the only vector that satisfies (1) and (2)
  4. $\hat v$ is the closest vector to $v$ in $W$

If $W$ instead has an orthogonal (but not necessarily orthonormal) basis $\{u_1, \dots, u_k\}$, then $\hat v = \sum_{i = 1}^k \frac{\langle v, u_i \rangle}{\langle u_i, u_i \rangle} u_i$.
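
A minimal numpy sketch of the projection formula for an orthonormal basis (the subspace $W \subseteq \mathbb{R}^3$ and the vector $v$ are made-up examples):

```python
import numpy as np

# W = span{w1, w2}, an orthonormal basis of a plane in R^3.
w1 = np.array([1., 0., 0.])
w2 = np.array([0., 1., 1.]) / np.sqrt(2)
v = np.array([3., 4., -2.])

# Projection formula for an orthonormal basis: v_hat = sum <v, w_i> w_i
v_hat = (v @ w1) * w1 + (v @ w2) * w2

# v_hat lies in W, and v - v_hat is orthogonal to every basis vector of W.
assert np.isclose((v - v_hat) @ w1, 0.0)
assert np.isclose((v - v_hat) @ w2, 0.0)
print(v_hat)   # [3. 1. 1.]
```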

For $v, w \in \mathbb{R}^n$, $\langle v, w \rangle = v^t w$.

If $A$ is an $n \times n$ matrix and the columns of $A$ form an orthonormal basis of $\mathbb{R}^n$, $A^t A = I_n$ and $A^t = A^{-1}$.

Orthogonal Decomposition §

$V$ is the direct sum of $U$ and $W$, written $V = U \oplus W$, if $U + W = V$ and $U \cap W = \{0\}$.

Let $W$ be a subspace of $V$. $W^\perp$, the orthogonal complement of $W$, is $\{v \in V : \langle v, w \rangle = 0 \ \forall w \in W\}$. $W^\perp$ is a subspace of $V$. Additionally, $V = W \oplus W^\perp$.

$(W^\perp)^\perp = W$

$(\ker A)^\perp = \operatorname{im} A^t$ and $(\operatorname{im} A)^\perp = \ker A^t$

Least Squares §

Let $A$ be an $m \times n$ matrix and $v \in \mathbb{R}^m$. $A^t A x = A^t v$ is always consistent, and its solutions are the least-squares solutions to $Ax = v$.
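
For example, the normal equations can be solved directly and compared against numpy's least-squares routine (the overdetermined system below is an arbitrary example):

```python
import numpy as np

# An inconsistent system Ax = v (more equations than unknowns).
A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])
v = np.array([6., 0., 0.])

# Solve the normal equations A^t A x = A^t v ...
x_normal = np.linalg.solve(A.T @ A, A.T @ v)

# ... which agrees with numpy's built-in least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, v, rcond=None)
assert np.allclose(x_normal, x_lstsq)
```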

Orthogonal Matrices §

If $A$ is symmetric, eigenvectors corresponding to distinct eigenvalues are orthogonal.

If $A$ is a real, symmetric matrix, $A$ is orthogonally diagonalizable.

For an $n \times n$ matrix $A$, the following are equivalent:

- $A^t A = I_n$ (i.e. $A^{-1} = A^t$)
- The columns of $A$ form an orthonormal basis of $\mathbb{R}^n$
- $Av \cdot Aw = v \cdot w$ for all $v, w \in \mathbb{R}^n$ (multiplication by $A$ preserves dot products)

Spectral Theorem §

$A$ is symmetric if and only if $A$ is orthogonally diagonalizable.

Spectral theorem: For a symmetric matrix $A$:

- All eigenvalues of $A$ are real.
- Eigenvectors corresponding to distinct eigenvalues are orthogonal.
- $A$ has an orthonormal eigenbasis, so $A = Q D Q^t$ for an orthogonal $Q$ and diagonal $D$.
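
A quick numerical illustration of orthogonal diagonalization for a symmetric matrix (`np.linalg.eigh` is numpy's eigensolver for symmetric/Hermitian matrices; the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])            # symmetric

# eigh returns real eigenvalues and an orthogonal matrix Q of eigenvectors.
eigvals, Q = np.linalg.eigh(A)
D = np.diag(eigvals)

assert np.allclose(Q.T @ Q, np.eye(2))   # columns of Q are orthonormal
assert np.allclose(A, Q @ D @ Q.T)       # A = Q D Q^t
```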