Shreyas’ Notes

# MATH 355

• existence

• uniqueness

• coefficient matrix

• augmented matrix

Linear algebra formalizes the procedure to find all solutions to a linear system of equations when they exist, or to prove that they don’t exist.
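As a concrete illustration, here is a NumPy sketch with a made-up $2 \times 2$ system (the numbers are placeholders, not from the notes):

```python
import numpy as np

# Hypothetical system:  x + 2y = 5,  3x + 4y = 6
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])             # coefficient matrix
b = np.array([5.0, 6.0])

augmented = np.hstack([A, b[:, None]])  # augmented matrix [A | b]

# A is invertible (det = -2), so a solution exists and is unique
x = np.linalg.solve(A, b)               # x = [-4.0, 4.5]
```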

### Linear Transformations §

$T : V \rightarrow W$ is a linear transformation if:

• For $u, v \in V$, $T(u + v) = T(u) + T(v)$
• For $v \in V$ and scalar $a$, $T(av) = a \cdot T(v)$

Properties:

• $T(0) = 0$
• any linear transformation respects linear combinations: $T(a_1 v_1 + \cdots + a_k v_k) = a_1 T(v_1) + \cdots + a_k T(v_k)$
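Both defining properties (and $T(0) = 0$) can be sanity-checked numerically for a map $v \mapsto Av$; the matrix below is an arbitrary placeholder:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))        # std matrix of some T : R^2 -> R^3
T = lambda v: A @ v

u = rng.standard_normal(2)
v = rng.standard_normal(2)
a = 2.5

assert np.allclose(T(u + v), T(u) + T(v))        # additivity
assert np.allclose(T(a * v), a * T(v))           # homogeneity
assert np.allclose(T(np.zeros(2)), np.zeros(3))  # T(0) = 0
```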

### Onto and Into §

The image of a linear transformation $T : V \rightarrow W$ is the set $\{T(v) : v \in V\}$. The image is also the span of the columns of $A$, the std matrix of $T$.

A linear transformation $T : V \rightarrow W$ is onto if the image of $T$ is $W$.

A linear transformation $T : V \rightarrow W$ is into if for every $u, v \in V$, $T(u) = T(v) \implies u = v$.

| $T$ is onto | $T$ is into |
| --- | --- |
| $A$'s RREF has a pivot in each row | $A$'s RREF has a pivot in each column |
| The columns of $A$ span $\mathbb{R}^m$ | The columns of $A$ are LI |
| $Ax = w$ has a solution for each $w \in \mathbb{R}^m$ | $Ax = 0$ has only the trivial solution $x = 0$ |
| The image of $T$ is $\mathbb{R}^m$ | The kernel of $T$ is $\{0\}$ |
| Existence | Uniqueness |
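Both columns of the table reduce to counting pivots, i.e. to the rank of $A$. A sketch of the check in NumPy (the matrix is a made-up example):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])      # std matrix of some T : R^2 -> R^3
m, n = A.shape
r = np.linalg.matrix_rank(A)    # number of pivots in A's RREF

onto = (r == m)   # pivot in every row
into = (r == n)   # pivot in every column
# Here r = 2 = n < m = 3: T is into but not onto.
```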

### Matrix Multiplication §

If $S : \mathbb{R}^p \rightarrow \mathbb{R}^n$ and $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ are linear transformations, $T \circ S : \mathbb{R}^p \rightarrow \mathbb{R}^m$ is a linear transformation.

If the $m \times n$ matrix $A$ is the std matrix for $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ and the $n \times p$ matrix $B$ is the std matrix for $S : \mathbb{R}^p \rightarrow \mathbb{R}^n$, then $AB$ is an $m \times p$ matrix, which is the std matrix for $T \circ S : \mathbb{R}^p \rightarrow \mathbb{R}^m$.

For an $m \times n$ matrix $A$, $A I_n = A$ and $I_m A = A$.
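A quick numerical check that the composition's std matrix is the product (the dimensions and matrices below are placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 4))   # std matrix of S : R^4 -> R^3
A = rng.standard_normal((2, 3))   # std matrix of T : R^3 -> R^2

x = rng.standard_normal(4)
# Applying S then T agrees with the single 2 x 4 matrix AB:
assert np.allclose(A @ (B @ x), (A @ B) @ x)

# Identity laws for the 2 x 3 matrix A:
assert np.allclose(A @ np.eye(3), A)
assert np.allclose(np.eye(2) @ A, A)
```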

### Inverses §

Consider $T: V \rightarrow W$ and $S : W \rightarrow V$. $S$ and $T$ are inverses of each other if $S(T(v)) = v$ for all $v \in V$ and $T(S(w)) = w$ for all $w \in W$.

If $S = T^{-1}$ exists, $T$ is invertible.

A linear transformation is invertible iff it is both into and onto.

Consider an $m \times n$ matrix $A$ and an $n \times m$ matrix $B$. $A$ and $B$ are inverses of each other if $AB = I_m$ and $BA = I_n$.

If $B = A^{-1}$ exists, $A$ is invertible.
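For a small invertible matrix this is easy to verify numerically (an arbitrary example with determinant 1):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])   # det(A) = 1, so A is invertible
B = np.linalg.inv(A)         # B = [[3, -1], [-5, 2]]

assert np.allclose(A @ B, np.eye(2))  # AB = I
assert np.allclose(B @ A, np.eye(2))  # BA = I
```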

### Transposes and Inverses §

$A = {(A^t)}^t$

A matrix $A$ is symmetric if $A = A^t$

If $A$ and $B$ are matrices such that $AB$ is defined, ${(AB)}^t = B^t A^t$.

If $A$ is invertible, $A^t$ is invertible. ${(A^t)}^{-1} = {(A^{-1})}^t$.
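These transpose identities can be spot-checked on random matrices (a random square matrix is invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

assert np.allclose(A.T.T, A)                                # (A^t)^t = A
assert np.allclose((A @ B).T, B.T @ A.T)                    # (AB)^t = B^t A^t
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)  # (A^t)^-1 = (A^-1)^t
```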

#### Invertible Matrix Theorem §

Consider a linear transformation $T : \mathbb{R}^n \rightarrow \mathbb{R}^n$ with a $n \times n$ standard matrix $A$. The following statements are equivalent:

• $T$ is into
• $\ker(T) = \{0\}$
• The columns of $A$ are linearly independent
• $Ax = 0$ has only the trivial solution
• $A$'s RREF has a pivot in every column (no free vars)
• $A$'s RREF is the identity matrix
• $A$'s RREF has a pivot in each row
• For every $v \in \mathbb{R}^n$, $Ax = v$ is consistent
• The columns of $A$ span $\mathbb{R}^n$
• $T$ is onto
• $T$ is invertible
• $A$ is invertible
• $A^t$ is invertible
• The rows of $A$ span $\mathbb{R}^n$
• The rows of $A$ are linearly independent

To calculate the inverse of a matrix $A$, row reduce the augmented matrix $[A \mid I]$; if $A$ is invertible, the result is $[I \mid A^{-1}]$.
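The row-reduction recipe can be sketched directly (a minimal Gauss-Jordan implementation, not optimized; it assumes $A$ is invertible):

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Row reduce [A | I] to [I | A^{-1}] via Gauss-Jordan with partial pivoting."""
    n = len(A)
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        # choose the largest entry in this column as the pivot, for stability
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]      # swap rows
        M[col] /= M[col, col]                  # scale the pivot row so the pivot is 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]  # clear the rest of the column
    return M[:, n:]                             # right half is A^{-1}

A = np.array([[2.0, 1.0], [5.0, 3.0]])
assert np.allclose(inverse_by_row_reduction(A) @ A, np.eye(2))
```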

### Elementary Matrices §

#### Scaling a row §

$E_1 = \left[ \begin{array}{ccccccc} 1 & & & & & & \\ & \ddots & & & & & \\ & & 1& & & & \\ & & & c & & & \\ & & & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1 \end{array} \right]$

#### Adding $c$ times a row to another row §

$E_2 = \left[ \begin{array}{ccccccc} 1 & & & & & & \\ & \ddots & & & & & \\ & & 1& & & & \\ & & & \ddots & & & \\ & & c & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1 \end{array} \right]$

#### Swapping two rows §

$E_3 = \left[ \begin{array}{ccccccccccc} 1 & & & & & & & & & & \\ & \ddots & & & & & & & & & \\ & & & 0 & & & & 1 & & & \\ & & & & 1 & & & & & & \\ & & & & & \ddots & & & & & \\ & & & & & & 1 & & & & \\ & & & 1 & & & & 0 & & & \\ & & & & & & & & & \ddots & \\ & & & & & & & & & & 1 \end{array} \right]$
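Each elementary matrix is the identity with one such edit applied, and left-multiplying by it performs the corresponding row operation. A NumPy sketch on a made-up $4 \times 4$ example:

```python
import numpy as np

A = np.arange(16.0).reshape(4, 4)

# E1: scale row 2 by c = 3
E1 = np.eye(4); E1[2, 2] = 3.0
B1 = A.copy(); B1[2] *= 3.0
assert np.allclose(E1 @ A, B1)

# E2: add c = 2 times row 0 to row 3
E2 = np.eye(4); E2[3, 0] = 2.0
B2 = A.copy(); B2[3] += 2.0 * A[0]
assert np.allclose(E2 @ A, B2)

# E3: swap rows 1 and 2 (permute the identity's rows the same way)
E3 = np.eye(4)[[0, 2, 1, 3]]
B3 = A.copy(); B3[[1, 2]] = B3[[2, 1]]
assert np.allclose(E3 @ A, B3)
```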

### Dimension §

If $V$ is spanned by a finite list of vectors, the dimension of $V$ is the number of vectors in a basis of $V$. If $V$ is not spanned by a finite set, $V$ is infinite-dimensional.

Any two bases of $V$ have the same size.

In an $n$-dimensional vector space, any set of more than $n$ vectors is linearly dependent.

Consider an $n$-dimensional vector space $V$:

• if $\{v_1, \cdots, v_n\}$ are LI, they also span $V$ and hence form a basis
• if $\{v_1, \cdots, v_n\}$ span $V$, they are also LI and hence form a basis
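So for $n$ vectors in an $n$-dimensional space, one rank computation settles both conditions at once (the vectors below are arbitrary):

```python
import numpy as np

# Three vectors in R^3, placed as the columns of M
M = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 0.0]])
rank = np.linalg.matrix_rank(M)   # rank 3 => LI => they also span => basis of R^3
```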