Linear Algebra
MATH 355

 existence
 uniqueness
 coefficient matrix
 augmented matrix
Linear algebra formalizes the procedure to find all solutions to a linear system of equations when they exist, or to prove that they don’t exist.
Linear Combinations and Span §
Linear Independence §
Linear Transformations §
$T : V \rightarrow W$ is a linear transformation if:
 For $u, v \in V$, $T(u + v) = T(u) + T(v)$
 For $v \in V$ and scalar $a$, $T(av) = a \cdot T(v)$
Properties:
 $T(0) = 0$
 Any linear transformation respects linear combinations: $T(a_1 v_1 + \dots + a_k v_k) = a_1 T(v_1) + \dots + a_k T(v_k)$
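Both defining conditions are easy to spot-check numerically for a map given by a matrix. A minimal sketch (the matrix $A$ and the vectors here are arbitrary choices, not from the notes):

```python
# T(v) = Av for a fixed 2x2 matrix A; verify additivity, homogeneity, and T(0) = 0.
A = [[1, 2],
     [3, 4]]

def T(v):
    # matrix-vector product Av
    return [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]

u, v, a = [1, -1], [2, 5], 7

u_plus_v = [u[i] + v[i] for i in range(2)]
assert T(u_plus_v) == [T(u)[i] + T(v)[i] for i in range(2)]  # T(u+v) = T(u)+T(v)
assert T([a * x for x in v]) == [a * y for y in T(v)]        # T(av) = a T(v)
assert T([0, 0]) == [0, 0]                                   # T(0) = 0
```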
Onto and Into §
The image of a linear transformation $T : V \rightarrow W$ is the set $\{T(v) : v \in V\}$. When $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ has std matrix $A$, the image is the span of the columns of $A$.
A linear transformation $T : V \rightarrow W$ is onto if the image of $T$ is $W$.
A linear transformation $T : V \rightarrow W$ is into (one-to-one) if for every $u, v \in V$, $T(u) = T(v) \implies u = v$.
| $T$ is onto | $T$ is into |
| --- | --- |
| $A$'s RREF has a pivot in each row | $A$'s RREF has a pivot in each column |
| The columns of $A$ span $\mathbb{R}^m$ | The columns of $A$ are LI |
| $Ax = w$ has a solution for each $w \in \mathbb{R}^m$ | $Ax = 0$ has only the trivial solution $x = 0$ |
| The image of $T$ is $\mathbb{R}^m$ | The kernel of $T$ is $\{0\}$ |
| Existence | Uniqueness |
Matrix Multiplication §
If $S : \mathbb{R}^p \rightarrow \mathbb{R}^n$ and $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ are linear transformations, $T \circ S : \mathbb{R}^p \rightarrow \mathbb{R}^m$ is a linear transformation.
If the $m \times n$ matrix $A$ is the std matrix for $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ and the $n \times p$ matrix $B$ is the std matrix for $S : \mathbb{R}^p \rightarrow \mathbb{R}^n$, then $AB$ will be an $m \times p$ matrix which is the std matrix for $T \circ S : \mathbb{R}^p \rightarrow \mathbb{R}^m$.
For an $m \times n$ matrix $A$, $A I_n = A$ and $I_m A = A$.
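A small sketch of this correspondence: applying the product matrix $AB$ agrees with applying $B$ and then $A$ (the matrices below are arbitrary examples):

```python
# Composition of linear maps corresponds to matrix multiplication:
# (AB)x == A(Bx) for all x.

def matmul(A, B):
    # (m x n) times (n x p) -> (m x p)
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

A = [[1, 0, 2],
     [0, 1, 1]]        # std matrix of T : R^3 -> R^2
B = [[1, 2],
     [3, 4],
     [5, 6]]           # std matrix of S : R^2 -> R^3

AB = matmul(A, B)      # std matrix of T ∘ S : R^2 -> R^2
x = [1, 1]
assert matvec(AB, x) == matvec(A, matvec(B, x))
```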
Inverses §
Consider $T: V \rightarrow W$ and $S : W \rightarrow V$. $S$ and $T$ are inverses of each other if $S(T(v)) = v \; \forall v \in V$ and $T(S(w)) = w \; \forall w \in W$.
If $S = T^{-1}$ exists, $T$ is invertible.
A linear transformation is invertible iff it is both into and onto.
Consider an $m \times n$ matrix $A$ and an $n \times m$ matrix $B$. $A$ and $B$ are inverses of each other if $AB = I_m$ and $BA = I_n$.
If $B = A^{-1}$ exists, $A$ is invertible.
Transposes and Inverses §
$A = {(A^t)}^t$
A matrix $A$ is symmetric if $A = A^t$
If $A$ and $B$ are matrices such that $AB$ is defined, ${(AB)}^t = B^t A^t$.
If $A$ is invertible, $A^t$ is invertible. ${(A^t)}^{-1} = {(A^{-1})}^t$.
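The product rule for transposes, including the reversed order, can be spot-checked numerically (arbitrary $2 \times 2$ examples):

```python
def transpose(M):
    # swap rows and columns
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]

# (AB)^t = B^t A^t -- note the reversed order on the right
assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
```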
Invertible Matrix Theorem §
Consider a linear transformation $T : \mathbb{R}^n \rightarrow \mathbb{R}^n$ with an $n \times n$ standard matrix $A$. The following statements are equivalent:
 $T$ is into
 $\ker(T) = \{0\}$
 The columns of $A$ are linearly independent
 $Ax = 0$ has only the trivial solution
 $A$'s RREF has a pivot in every column (no free vars)
 $A$'s RREF is the identity matrix
 $A$'s RREF has a pivot in each row
 For every $v \in \mathbb{R}^n$, $Ax = v$ is consistent
 The columns of $A$ span $\mathbb{R}^n$
 $T$ is onto
 $T$ is invertible
 $A$ is invertible
 $A^t$ is invertible
 The rows of $A$ span $\mathbb{R}^n$
 The rows of $A$ are linearly independent
To calculate the inverse of a matrix, row reduce it augmented with the identity matrix on the right side: $[A \mid I_n] \rightarrow [I_n \mid A^{-1}]$.
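A sketch of that procedure in code, using exact rational arithmetic; it assumes the input matrix is invertible (beyond locating a nonzero pivot, no singularity check is made):

```python
from fractions import Fraction

def inverse(A):
    """Invert a square matrix by row reducing [A | I] to [I | A^{-1}]."""
    n = len(A)
    # augmented matrix [A | I], with exact rational arithmetic
    M = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # find a row with a nonzero pivot and swap it into place
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]
        # scale the pivot row so the pivot is 1
        M[col] = [x / M[col][col] for x in M[col]]
        # clear the rest of the column
        for r in range(n):
            if r != col and M[r][col] != 0:
                factor = M[r][col]
                M[r] = [x - factor * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]

A = [[2, 1], [1, 1]]
assert inverse(A) == [[1, -1], [-1, 2]]
```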
Elementary Matrices §
Scaling a row §
$E_1 = \left[ \begin{array}{ccccccc} 1 & & & & & & \\ & \ddots & & & & & \\ & & 1& & & & \\ & & & c & & & \\ & & & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1 \end{array} \right]$
Adding $c$ times a row to another row §
$E_2 = \left[ \begin{array}{ccccccc} 1 & & & & & & \\ & \ddots & & & & & \\ & & 1& & & & \\ & & & \ddots & & & \\ & & c & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1 \end{array} \right]$
Swapping two rows §
$E_3 = \left[ \begin{array}{ccccccccccc} 1 & & & & & & & & & & \\ & \ddots & & & & & & & & & \\ & & & 0 & & & & 1 & & & \\ & & & & 1 & & & & & & \\ & & & & & \ddots & & & & & \\ & & & & & & 1 & & & & \\ & & & 1 & & & & 0 & & & \\ & & & & & & & & & \ddots & \\ & & & & & & & & & & 1 \end{array} \right]$
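Left-multiplying by an elementary matrix performs the corresponding row operation on $A$. A $2 \times 2$ spot check (the entries of $A$ are arbitrary):

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2],
     [3, 4]]

E1 = [[5, 0], [0, 1]]    # scale row 0 by 5
E2 = [[1, 0], [2, 1]]    # add 2 * (row 0) to row 1
E3 = [[0, 1], [1, 0]]    # swap rows 0 and 1

assert matmul(E1, A) == [[5, 10], [3, 4]]
assert matmul(E2, A) == [[1, 2], [5, 8]]
assert matmul(E3, A) == [[3, 4], [1, 2]]
```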
Dimension §
If $V$ is spanned by a finite list of vectors, the dimension of $V$ is the number of vectors in a basis. If $V$ is not spanned by a finite set, $V$ is infinite-dimensional.
Any two bases of $V$ have the same size.
In an $n$-dimensional vector space, any set of more than $n$ vectors is linearly dependent.
Consider an $n$-dimensional vector space $V$:
 if $\{v_1, \cdots, v_n\}$ are LI, they also span $V$ and hence form a basis
 if $\{v_1, \cdots, v_n\}$ span $V$, they are also LI and hence form a basis
Matrices Relative to Bases §
Let $T: V \rightarrow W$, let $B = \{b_1, \dots, b_n\}$ be a basis for $V$, and let $C$ be a basis for $W$. If $A$ is the matrix for $T$ relative to $B$ and $C$, $A {[v]}_B = {[T(v)]}_C$ for all $v \in V$.
$A$ has the columns: ${[T(b_1)]}_C, \dots, {[T(b_n)]}_C$.
$T$ is invertible iff $A$ is invertible.
$A^{-1}$ is the matrix for $T^{-1}$ relative to $C$ and $B$.
Similarity and Diagonalization §
A square $n \times n$ matrix $A$ is diagonalizable if it is similar to some diagonal matrix $D$.
If $A$ has $n$ linearly independent eigenvectors $\{v_1, \dots, v_n\}$, $A$ is diagonalizable and $A = P D P^{-1}$, where $P$ has columns $v_1, \dots, v_n$ and $D$ has $\lambda_1, \dots, \lambda_n$ along its diagonal.
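A concrete check with $A = \begin{bmatrix}2 & 1 \\ 1 & 2\end{bmatrix}$, whose eigenpairs are $\lambda = 3$ with $v = (1, 1)$ and $\lambda = 1$ with $v = (1, -1)$; verifying the equivalent identity $AP = PD$ avoids computing $P^{-1}$:

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[2, 1],
     [1, 2]]
# eigenpairs: lambda = 3 with v = (1, 1), lambda = 1 with v = (1, -1)
P = [[1, 1],
     [1, -1]]          # eigenvectors as columns
D = [[3, 0],
     [0, 1]]
# A = P D P^{-1} is equivalent to A P = P D
assert matmul(A, P) == matmul(P, D)
```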
Eigentheory §
Any list of eigenvectors for a matrix $A$ with distinct eigenvalues must be linearly independent.
An $n \times n$ matrix has at most $n$ distinct eigenvalues.
If an $n \times n$ matrix has $n$ distinct eigenvalues, it has an eigenbasis and is therefore diagonalizable.
If $A$ is a square matrix and $\lambda$ is a scalar, $\ker (A - \lambda I)$ is the $\lambda$-eigenspace of $A$ (its nonzero elements are the eigenvectors for $\lambda$).
The characteristic polynomial of $A$ in $\lambda$ is $\det(A - \lambda I)$. A scalar is an eigenvalue of $A$ iff it is a solution to $\det(A - \lambda I) = 0$.
If $\lambda$ is an eigenvalue for $A$, the algebraic multiplicity of $\lambda$ is the number of times $\lambda$ appears as a root of the characteristic poly (aka the exponent of the corresponding term). The geometric multiplicity of $\lambda$ is the dimension of the $\lambda$-eigenspace ($\dim \ker(A - \lambda I)$).
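For a $2 \times 2$ matrix the characteristic polynomial works out to $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A)$, so the eigenvalues come straight from the quadratic formula. A sketch with an arbitrary example matrix:

```python
import math

A = [[4, 1],
     [2, 3]]
tr = A[0][0] + A[1][1]                        # trace = 7
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # determinant = 10
# roots of lambda^2 - tr*lambda + det = 0 via the quadratic formula
disc = math.sqrt(tr * tr - 4 * det)
eigenvalues = sorted([(tr - disc) / 2, (tr + disc) / 2])
assert eigenvalues == [2.0, 5.0]
```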
Complex Numbers §
Let $z = a + bi$. $\Re(z) = a = \frac{z + \overline{z}}{2}$ and $\Im(z) = b = \frac{z - \overline{z}}{2i}$.
Fundamental theorem of algebra: Any nonconstant polynomial with real or complex coefficients factors completely into linear factors over $\mathbb{C}$.
If $f$ is a polynomial with real coefficients, and $f(z) = 0$, $f(\overline{z})$ is also zero.
Norm: $|z| = \sqrt{a^2 + b^2}$
Argument: $\arg(z) = \tan^{-1} \frac{b}{a}$
$|zw| = |z| \cdot |w|$ and $\arg(zw) = \arg(z) + \arg(w)$
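Python's built-in complex type and the stdlib `cmath` module illustrate these identities directly (the argument identity holds mod $2\pi$; the values chosen here don't wrap):

```python
import cmath
import math

z = 3 + 4j
w = 1 + 1j

assert abs(z) == 5.0                               # |z| = sqrt(3^2 + 4^2)
assert math.isclose(cmath.phase(w), math.pi / 4)   # arg(1 + i) = pi/4
# |zw| = |z||w| and arg(zw) = arg(z) + arg(w)
assert math.isclose(abs(z * w), abs(z) * abs(w))
assert math.isclose(cmath.phase(z * w), cmath.phase(z) + cmath.phase(w))
```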
Complex Eigenvalues §
If $v$ is an eigenvector for a real matrix with eigenvalue $\lambda$, then $\overline v$ is also an eigenvector with eigenvalue $\overline \lambda$.
Suppose $A$ is a real $2 \times 2$ matrix. Suppose that $\lambda = a + bi$ is a nonreal complex eigenvalue for $A$, with a corresponding eigenvector $v$. Let $x = \Re(v)$ and $y = \Im(v)$. Then $B = \{x, y\}$ is a basis of $\mathbb{R}^2$, and the matrix for $A$ with respect to $B$ is $\begin{bmatrix}a & b \\ -b & a\end{bmatrix}$.
Inner Products §
For two vectors $v = [a_1, \dots, a_n]$ and $w = [b_1, \dots, b_n]$ in $\mathbb{R}^n$, $v \cdot w = \langle v, w \rangle = a_1 b_1 + \dots + a_n b_n$.
 $\langle v, w \rangle = \langle w, v \rangle$
 $\langle av, w \rangle = a \cdot \langle v, w \rangle$
 $\langle v, v \rangle \geq 0$
 $\langle u + v, w \rangle = \langle u, w \rangle + \langle v, w \rangle$
Norm/Magnitude: $\|v\| = \sqrt{\langle v, v \rangle}$
The distance between two vectors $v$ and $w$ is $\|v - w\|$.
For two vectors $v$, $w$ in a vector space $V$ that has an inner product, $v$ and $w$ are orthogonal if $\langle v, w \rangle = 0$. Additionally, $v$ and $w$ are orthogonal iff $\|v + w\|^2 = \|v\|^2 + \|w\|^2$.
A list $\{v_1, \dots, v_n\}$ is orthogonal iff $\langle v_i, v_j \rangle = 0$ for $i \neq j$. If the list is orthogonal and $\|v_i\| = 1$ for all $i$, it is also orthonormal.
A list of nonzero orthogonal vectors is linearly independent.
Orthogonal Projections §
For $v \in V$, a subspace $W$ of $V$, and an orthonormal basis $\{w_1, \dots, w_k\}$ of $W$, the orthogonal projection of $v$ onto $W$ is $\hat v = \sum_{i = 1}^k \langle v, w_i \rangle w_i$.
 $\hat v \in W$
 $v - \hat v$ is perpendicular to all vectors in $W$
 $\hat v$ is the only vector that satisfies (1) and (2)
 $\hat v$ is the closest vector to $v$ in $W$
If $W$ has an orthogonal basis $\{u_1, \dots, u_k\}$, $\hat v = \sum_{i = 1}^k \frac{\langle v, u_i \rangle}{\langle u_i, u_i \rangle} u_i$.
$\langle v, w \rangle = v^t w$
If $A$ is an $n \times n$ matrix and the columns of $A$ form an ONB of $\mathbb{R}^n$, $A^t A = I_n$ and $A^t = A^{-1}$.
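A numeric sketch of the projection formula, projecting onto the plane $x = y$ in $\mathbb{R}^3$ using the orthonormal basis $\{\frac{1}{\sqrt 2}(1, 1, 0), (0, 0, 1)\}$ (an arbitrary choice of subspace and basis):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# orthonormal basis of the plane W = {(x, y, z) : x = y} in R^3
s = 1 / math.sqrt(2)
w1 = (s, s, 0.0)
w2 = (0.0, 0.0, 1.0)

v = (1.0, 3.0, 5.0)

# projection: v_hat = sum of <v, w_i> w_i
coeffs = [dot(v, w) for w in (w1, w2)]
v_hat = tuple(sum(c * w[i] for c, w in zip(coeffs, (w1, w2))) for i in range(3))

# v - v_hat is perpendicular to every basis vector of W
residual = tuple(a - b for a, b in zip(v, v_hat))
assert all(math.isclose(dot(residual, w), 0.0, abs_tol=1e-12) for w in (w1, w2))
assert all(math.isclose(a, b) for a, b in zip(v_hat, (2.0, 2.0, 5.0)))
```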
Orthogonal Decomposition §
$V$ is the direct sum of $U$ and $W$, written $V = U \oplus W$, if $U + W = V$ and $U \cap W = \{0\}$.
Let $W$ be a subspace of $V$. $W^\perp$, the orthogonal complement of $W$, is $\{v \in V : \langle v, w \rangle = 0 \; \forall w \in W\}$. $W^\perp$ is a subspace of $V$. Additionally, $V = W \oplus W^\perp$.
${(W^\perp)}^\perp = W$
${(\ker A)}^\perp = \operatorname{im}(A^t)$ and ${(\operatorname{im} A)}^\perp = \ker(A^t)$
Least Squares §
Let $A$ be an $m \times n$ matrix and $v \in \mathbb{R}^m$. $A^t A x = A^t v$ is consistent, and its solutions are the least-squares solutions to $Ax = v$.
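A sketch of the normal equations on a tiny least-squares problem: fitting a line $y = b + mt$ through the points $(0,0)$, $(1,1)$, $(2,1)$, which admit no exact linear fit (exact arithmetic via `fractions`, and the $2 \times 2$ system solved by Cramer's rule):

```python
from fractions import Fraction as F

def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# fit y = b + mt to the points (0,0), (1,1), (2,1): A [b, m]^t = v
A = [[F(1), F(0)],
     [F(1), F(1)],
     [F(1), F(2)]]
v = [[F(0)], [F(1)], [F(1)]]

AtA = matmul(transpose(A), A)      # [[3, 3], [3, 5]]
Atv = matmul(transpose(A), v)      # [[2], [3]]

# solve the 2x2 normal equations A^t A x = A^t v by Cramer's rule
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
b = (Atv[0][0] * AtA[1][1] - AtA[0][1] * Atv[1][0]) / det
m = (AtA[0][0] * Atv[1][0] - Atv[0][0] * AtA[1][0]) / det
assert (b, m) == (F(1, 6), F(1, 2))   # best-fit line: y = 1/6 + t/2
```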
Orthogonal Matrices §
If $A$ is symmetric, eigenvectors with distinct eigenvalues are orthogonal.
If $A$ is a real, symmetric matrix, $A$ is orthogonally diagonalizable.
For an $n \times n$ matrix $A$, the following are equivalent:
 $A$ is an orthogonal matrix
 The columns form an orthonormal basis for $\mathbb{R}^n$
 $A^t = A^{-1}$
 $A$ preserves inner products: $\langle Av, Aw \rangle = \langle v, w \rangle$
 $A$ preserves norms: $\|Av\| = \|v\|$
Spectral Theorem §
$A$ is symmetric if and only if $A$ is orthogonally diagonalizable.
Spectral theorem: For a symmetric matrix $A$:
 All roots of $\det(A - \lambda I)$ are real
 The geometric multiplicity of each eigenvalue equals its algebraic multiplicity
 Eigenspaces with distinct eigenvalues are mutually orthogonal
 $A$ is orthogonally diagonalizable
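A numeric check of the conclusion for the symmetric matrix $A = \begin{bmatrix}2 & 1 \\ 1 & 2\end{bmatrix}$: its unit eigenvectors $\frac{1}{\sqrt 2}(1, 1)$ and $\frac{1}{\sqrt 2}(1, -1)$ form an orthogonal matrix $Q$ with $Q^t A Q$ diagonal:

```python
import math

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(M):
    return [list(col) for col in zip(*M)]

def close(M, N):
    # entrywise float comparison
    return all(math.isclose(a, b, abs_tol=1e-12)
               for ra, rb in zip(M, N) for a, b in zip(ra, rb))

A = [[2.0, 1.0],
     [1.0, 2.0]]                       # symmetric
s = 1 / math.sqrt(2)
Q = [[s, s],
     [s, -s]]                          # orthonormal eigenvectors as columns

assert close(matmul(transpose(Q), Q), [[1.0, 0.0], [0.0, 1.0]])            # Q is orthogonal
assert close(matmul(transpose(Q), matmul(A, Q)), [[3.0, 0.0], [0.0, 1.0]])  # Q^t A Q = D
```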