Shreyas’ Notes

Linear Algebra

MATH 355

fall, sophomore year

Linear algebra formalizes the procedure to find all solutions to a linear system of equations when they exist, or to prove that they don’t exist.

Linear Combinations and Span §

Linear Independence §

Linear Transformations §

$T : V \rightarrow W$ is a linear transformation if:

- $T(u + v) = T(u) + T(v)$ for all $u, v \in V$
- $T(cv) = c\,T(v)$ for all $v \in V$ and all scalars $c$

Properties:

- $T(0_V) = 0_W$
- $T(c_1 v_1 + \dots + c_n v_n) = c_1 T(v_1) + \dots + c_n T(v_n)$

Onto and Into §

The image of a linear transformation $T : V \rightarrow W$ is the set $\{T(v) : v \in V\}$. The image is also the span of the columns of $A$, the std matrix of $T$.

A linear transformation $T : V \rightarrow W$ is onto if the image of $T$ is $W$.

A linear transformation $T : V \rightarrow W$ is into (one-to-one) if for every $u, v \in V$, $T(u) = T(v) \implies u = v$.

| $T$ is onto | $T$ is into |
| --- | --- |
| $A$'s RREF has a pivot in each row | $A$'s RREF has a pivot in each column |
| The columns of $A$ span $\mathbb{R}^m$ | The columns of $A$ are LI |
| $Ax = w$ has a solution for each $w \in \mathbb{R}^m$ | $Ax = 0$ has only the trivial solution $x = 0$ |
| The image of $T$ is $\mathbb{R}^m$ | The kernel of $T$ is $\{0\}$ |
| Existence | Uniqueness |
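The pivot criteria above can be checked numerically: the rank of $A$ equals the number of pivots in its RREF. A minimal sketch, assuming numpy:

```python
import numpy as np

# rank(A) counts the pivots in A's RREF, so:
#   onto <=> rank(A) == m (a pivot in each row)
#   into <=> rank(A) == n (a pivot in each column)
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])   # a 2x3 std matrix (hypothetical example)
m, n = A.shape
rank = np.linalg.matrix_rank(A)

is_onto = bool(rank == m)         # columns span R^m
is_into = bool(rank == n)         # columns are LI
```

For this wide matrix the rank is 2, so the transformation is onto $\mathbb{R}^2$ but not into: a $2 \times 3$ matrix cannot have a pivot in all three columns.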

Matrix Multiplication §

If $S : \mathbb{R}^p \rightarrow \mathbb{R}^n$ and $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ are linear transformations, $T \circ S : \mathbb{R}^p \rightarrow \mathbb{R}^m$ is a linear transformation.

If the $m \times n$ matrix $A$ is the std matrix for $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ and the $n \times p$ matrix $B$ is the std matrix for $S : \mathbb{R}^p \rightarrow \mathbb{R}^n$, then $AB$ will be an $m \times p$ matrix which is the std matrix for $T \circ S : \mathbb{R}^p \rightarrow \mathbb{R}^m$.

For an $m \times n$ matrix $A$, $A I_n = A$ and $I_m A = A$.
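A quick numerical sketch of both facts, assuming numpy; the matrices and vector here are hypothetical examples with $m = 3$, $n = 2$, $p = 3$:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])        # 3x2: std matrix of T : R^2 -> R^3
B = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])   # 2x3: std matrix of S : R^3 -> R^2
AB = A @ B                        # 3x3: std matrix of T∘S

x = np.array([1.0, 2.0, 3.0])
composed = A @ (B @ x)            # apply S, then T
direct = AB @ x                   # apply T∘S in one step
```

`composed` and `direct` agree, and multiplying by `np.eye(2)` on the right or `np.eye(3)` on the left leaves `A` unchanged.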

Inverses §

Consider $T : V \rightarrow W$ and $S : W \rightarrow V$. $S$ and $T$ are inverses of each other if $S(T(v)) = v$ for all $v \in V$ and $T(S(w)) = w$ for all $w \in W$.

If $S = T^{-1}$ exists, $T$ is invertible.

A linear transformation is invertible iff it is both into and onto.

Consider an $m \times n$ matrix $A$ and an $n \times m$ matrix $B$. $A$ and $B$ are inverses of each other if $AB = I_m$ and $BA = I_n$.

If $B = A^{-1}$ exists, $A$ is invertible.
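A numerical check of the definition, assuming numpy (the matrix is a hypothetical example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # invertible: det = 1
B = np.linalg.inv(A)         # B = A^{-1}

# both products should be the identity: AB = I and BA = I
left = A @ B
right = B @ A
```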

Transposes and Inverses §

$A = {(A^t)}^t$

A matrix $A$ is symmetric if $A = A^t$.

If $A$ and $B$ are matrices such that $AB$ is defined, ${(AB)}^t = B^t A^t$.

If $A$ is invertible, $A^t$ is invertible, and ${(A^t)}^{-1} = {(A^{-1})}^t$.
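The three transpose identities above can be verified on small matrices, assuming numpy (both matrices are hypothetical examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # invertible: det = -2
B = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# (A^t)^t = A
double_transpose = (A.T).T
# (AB)^t = B^t A^t — note the reversed order
product_transpose = (A @ B).T
# (A^t)^{-1} = (A^{-1})^t
inv_of_transpose = np.linalg.inv(A.T)
```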

Invertible Matrix Theorem §

Consider a linear transformation $T : \mathbb{R}^n \rightarrow \mathbb{R}^n$ with an $n \times n$ standard matrix $A$. The following statements are equivalent:

- $A$ is invertible
- $T$ is invertible
- the RREF of $A$ is $I_n$
- the columns of $A$ are LI
- the columns of $A$ span $\mathbb{R}^n$
- $Ax = 0$ has only the trivial solution $x = 0$
- $Ax = b$ has a unique solution for each $b \in \mathbb{R}^n$
- $T$ is into
- $T$ is onto

To calculate the inverse of a matrix, row reduce it with the identity matrix augmented on the right side: $[A \mid I_n] \rightarrow [I_n \mid A^{-1}]$.
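The row-reduction procedure can be sketched directly, assuming numpy; `inverse_by_row_reduction` is a hypothetical helper name, and partial pivoting is added for numerical stability:

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Row reduce [A | I] to [I | A^{-1}] by Gauss-Jordan elimination."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])              # augmented matrix [A | I]
    for col in range(n):
        # choose the largest pivot in this column (partial pivoting)
        pivot = col + int(np.argmax(np.abs(M[col:, col])))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is not invertible")
        M[[col, pivot]] = M[[pivot, col]]      # swap rows
        M[col] /= M[col, col]                  # scale the pivot row to 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]  # clear the rest of the column
    return M[:, n:]                            # right half is A^{-1}

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = inverse_by_row_reduction(A)
```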

Elementary Matrices §

Scaling a row §

$$E_1 = \left[ \begin{array}{ccccccc} 1 & & & & & & \\ & \ddots & & & & & \\ & & 1 & & & & \\ & & & c & & & \\ & & & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1 \end{array} \right]$$

Adding $c$ times a row to another row §

$$E_2 = \left[ \begin{array}{ccccccc} 1 & & & & & & \\ & \ddots & & & & & \\ & & 1 & & & & \\ & & & \ddots & & & \\ & & c & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1 \end{array} \right]$$

Swapping two rows §

$$E_3 = \left[ \begin{array}{ccccccccccc} 1 & & & & & & & & & & \\ & \ddots & & & & & & & & & \\ & & & 0 & & & & 1 & & & \\ & & & & 1 & & & & & & \\ & & & & & \ddots & & & & & \\ & & & & & & 1 & & & & \\ & & & 1 & & & & 0 & & & \\ & & & & & & & & & \ddots & \\ & & & & & & & & & & 1 \end{array} \right]$$
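Each elementary matrix is the identity with one row operation applied, and multiplying on the left performs that operation. A sketch assuming numpy, with hypothetical helper names:

```python
import numpy as np

def scale_row(n, i, c):
    """E @ A multiplies row i of A by c."""
    E = np.eye(n)
    E[i, i] = c
    return E

def add_row(n, i, j, c):
    """E @ A adds c times row j of A to row i."""
    E = np.eye(n)
    E[i, j] = c
    return E

def swap_rows(n, i, j):
    """E @ A swaps rows i and j of A."""
    E = np.eye(n)
    E[[i, j]] = E[[j, i]]
    return E

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
```

For example, `add_row(2, 1, 0, -3.0) @ A` subtracts three times the first row from the second, zeroing the entry below the first pivot.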

Dimension §

If $V$ is spanned by a finite list of vectors, the dimension of $V$ is the number of vectors in a basis. If $V$ is not spanned by a finite set, $V$ is infinite-dimensional.

Any two bases of $V$ have the same size.

Any set of more than $n$ vectors in an $n$-dimensional vector space is linearly dependent.
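For instance, any four vectors in $\mathbb{R}^3$ are dependent: the rank of the matrix with those vectors as columns cannot exceed 3. A sketch assuming numpy, with hypothetical vectors:

```python
import numpy as np

# four vectors in R^3, placed as the columns of a 3x4 matrix
vectors = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [0.0, 0.0, 1.0],
                    [1.0, 2.0, 3.0]]).T

# rank < number of columns means the columns are linearly dependent
rank = np.linalg.matrix_rank(vectors)
```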

Consider an $n$-dimensional vector space $V$:

- Any linearly independent set of $n$ vectors in $V$ is a basis.
- Any set of $n$ vectors that spans $V$ is a basis.

Matrices Relative to Bases §

Let $T : V \rightarrow W$. $B = \{b_1, \dots, b_n\}$ is a basis for $V$ and $C$ is a basis for $W$. If $A$ is the matrix for $T$ relative to $B$ and $C$, $A {[v]}_B = {[T(v)]}_C$ for all $v \in V$.

$A$ has the columns ${[T(b_1)]}_C, \dots, {[T(b_n)]}_C$.
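For $V = W = \mathbb{R}^2$, each column $[T(b_i)]_C$ is found by solving $P_C\,[T(b_i)]_C = T(b_i)$, where $P_B$ and $P_C$ hold the basis vectors as columns. A sketch assuming numpy, with a hypothetical $T$ and hypothetical bases:

```python
import numpy as np

T_std = np.array([[1.0, 1.0],
                  [0.0, 2.0]])   # std matrix of a hypothetical T : R^2 -> R^2
P_B = np.array([[1.0, 1.0],
                [0.0, 1.0]])     # basis B: b1 = (1, 0), b2 = (1, 1)
P_C = np.array([[2.0, 0.0],
                [0.0, 1.0]])     # basis C: c1 = (2, 0), c2 = (0, 1)

# column i of A is [T(b_i)]_C, so A = P_C^{-1} (T_std P_B)
A = np.linalg.solve(P_C, T_std @ P_B)

v = np.array([3.0, 4.0])
v_B = np.linalg.solve(P_B, v)    # [v]_B
Tv_C = np.linalg.solve(P_C, T_std @ v)   # [T(v)]_C
```

The defining identity $A[v]_B = [T(v)]_C$ then holds for this `v` (and any other).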

Similarity and Diagonalization §

An $n \times n$ matrix $A$ is diagonalizable iff $A$ has $n$ linearly independent eigenvectors. If $A = P D P^{-1}$ with $D$ diagonal, the columns of $P$ form an eigenbasis for $A$ and the diagonal entries of $D$ are the corresponding eigenvalues.

Eigentheory §

Any list of eigenvectors of a matrix $A$ corresponding to distinct eigenvalues is linearly independent.

An $n \times n$ matrix has at most $n$ distinct eigenvalues.

If an $n \times n$ matrix has $n$ distinct eigenvalues, it has an eigenbasis and is therefore diagonalizable.
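A numerical sketch of diagonalization, assuming numpy; the matrix below is a hypothetical example with distinct eigenvalues $2$ and $5$:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # trace 7, det 10 => eigenvalues 5 and 2

# np.linalg.eig returns the eigenvalues and a matrix P whose
# columns are the corresponding eigenvectors
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

reconstructed = P @ D @ np.linalg.inv(P)   # should equal A
```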

Complex Eigen §

If $v$ is an eigenvector of a real matrix with eigenvalue $\lambda$, then $\overline v$ is also an eigenvector, with eigenvalue $\overline \lambda$.


Suppose $A$ is a real $2 \times 2$ matrix. Suppose that $\lambda = a - bi$ ($b \neq 0$) is a non-real complex eigenvalue for $A$, with a corresponding eigenvector $v$. Let $x = \Re(v)$ and $y = \Im(v)$. Then $B = \{x, y\}$ is a basis of $\mathbb{R}^2$, and the matrix for $A$ with respect to $B$ is $\begin{bmatrix}a & -b \\ b & a\end{bmatrix}$.
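This rotation-scaling form can be verified numerically, assuming numpy; the matrix below is a hypothetical example with eigenvalues $2 \pm i$:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [1.0,  3.0]])   # trace 4, det 5 => eigenvalues 2 +/- i
eigenvalues, eigenvectors = np.linalg.eig(A)

# pick the eigenvalue lam = a - bi with negative imaginary part
idx = int(np.argmin(eigenvalues.imag))
lam = eigenvalues[idx]
v = eigenvectors[:, idx]
a, b = lam.real, -lam.imag

# change of basis to B = {Re(v), Im(v)}
P = np.column_stack([v.real, v.imag])
matrix_wrt_B = np.linalg.inv(P) @ A @ P

expected = np.array([[a, -b],
                     [b,  a]])
```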