Shreyas’ Notes

# MATH 355

• existence

• uniqueness

• coefficient matrix

• augmented matrix

Linear algebra formalizes the procedure to find all solutions to a linear system of equations when they exist, or to prove that they don’t exist.

### Linear Transformations §

$T : V \rightarrow W$ is a linear transformation if:

• For $u, v \in V$, $T(u + v) = T(u) + T(v)$
• For $v \in V$ and scalar $a$, $T(av) = a \cdot T(v)$

Properties:

• $T(0) = 0$
• any linear transformation respects linear combinations
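The two defining properties can be sanity-checked numerically. A minimal sketch using NumPy, where $T$ is taken to be multiplication by an arbitrarily chosen matrix $A$ (any linear map $\mathbb{R}^n \rightarrow \mathbb{R}^m$ has this form):

```python
import numpy as np

# A hypothetical linear map T(v) = Av, with A chosen arbitrarily.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
T = lambda v: A @ v

u = np.array([1.0, -1.0])
v = np.array([2.0, 5.0])
a = 3.0

assert np.allclose(T(u + v), T(u) + T(v))      # additivity
assert np.allclose(T(a * v), a * T(v))         # homogeneity
assert np.allclose(T(np.zeros(2)), np.zeros(2))  # T(0) = 0 follows
```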

### Onto and Into §

The image of a linear transformation $T : V \rightarrow W$ is the set $\{T(v) : v \in V\}$. When $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ has std matrix $A$, the image is also the span of the columns of $A$.

A linear transformation $T : V \rightarrow W$ is onto if the image of $T$ is $W$.

A linear transformation $T : V \rightarrow W$ is into if for every $u, v \in V$, $T(u) = T(v) \implies u = v$.

| $T$ is onto | $T$ is into |
| --- | --- |
| $A$'s RREF has a pivot in each row | $A$'s RREF has a pivot in each column |
| The columns of $A$ span $\mathbb{R}^m$ | The columns of $A$ are LI |
| $Ax = w$ has a solution for each $w \in \mathbb{R}^m$ | $Ax = 0$ has only the trivial solution $x = 0$ |
| The image of $T$ is $\mathbb{R}^m$ | The kernel of $T$ is $\{0\}$ |
| Existence | Uniqueness |
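Since the rank of $A$ equals the number of pivots in its RREF, both conditions can be tested numerically. A sketch using NumPy's `matrix_rank` on an arbitrarily chosen wide matrix (full row rank $\Leftrightarrow$ onto; full column rank $\Leftrightarrow$ into):

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])   # 2x3: T maps R^3 -> R^2
m, n = A.shape
rank = np.linalg.matrix_rank(A)

onto = (rank == m)   # pivot in each row
into = (rank == n)   # pivot in each column

# A wide matrix (n > m) can be onto but never into.
assert onto and not into
```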

### Matrix Multiplication §

If $S : \mathbb{R}^p \rightarrow \mathbb{R}^n$ and $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ are linear transformations, $T \circ S : \mathbb{R}^p \rightarrow \mathbb{R}^m$ is a linear transformation.

If the $m \times n$ matrix $A$ is the std matrix for $T : \mathbb{R}^n \rightarrow \mathbb{R}^m$ and the $n \times p$ matrix $B$ is the std matrix for $S : \mathbb{R}^p \rightarrow \mathbb{R}^n$, then $AB$ is an $m \times p$ matrix which is the std matrix for $T \circ S : \mathbb{R}^p \rightarrow \mathbb{R}^m$.

For an $m \times n$ matrix $A$, $A I_n = A$ and $I_m A = A$.
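The composition fact can be checked directly: applying $S$ then $T$ to a vector matches multiplying by $AB$. A sketch with arbitrarily chosen $A$ ($3 \times 2$) and $B$ ($2 \times 3$):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])   # std matrix of T: R^2 -> R^3
B = np.array([[1.0, 0.0, 1.0], [2.0, 1.0, 0.0]])     # std matrix of S: R^3 -> R^2

x = np.array([1.0, 2.0, 3.0])
# T(S(x)) equals (AB)x, and AB is 3x3 (m x p).
assert np.allclose(A @ (B @ x), (A @ B) @ x)
assert (A @ B).shape == (3, 3)

# Identity behavior: A I_n = A and I_m A = A.
assert np.allclose(A @ np.eye(2), A)
assert np.allclose(np.eye(3) @ A, A)
```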

### Inverses §

Consider $T : V \rightarrow W$ and $S : W \rightarrow V$. $S$ and $T$ are inverses of each other if $S(T(v)) = v$ for all $v \in V$ and $T(S(w)) = w$ for all $w \in W$.

If $S = T^{-1}$ exists, $T$ is invertible.

A linear transformation is invertible iff it is both into and onto.

Consider an $m \times n$ matrix $A$ and an $n \times m$ matrix $B$. $A$ and $B$ are inverses of each other if $AB = I_m$ and $BA = I_n$.

If $B = A^{-1}$ exists, $A$ is invertible.
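Both products must be checked against the definition. A quick numerical sketch with an arbitrarily chosen invertible $2 \times 2$ matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])   # det = 1, so invertible
B = np.linalg.inv(A)

# Both orders give the identity, as the definition requires.
assert np.allclose(A @ B, np.eye(2))
assert np.allclose(B @ A, np.eye(2))
```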

### Transposes and Inverses §

$A = {(A^t)}^t$

A matrix $A$ is symmetric if $A = A^t$

If $A$ and $B$ are matrices such that $AB$ is defined, ${(AB)}^t = B^t A^t$.

If $A$ is invertible, $A^t$ is invertible. ${(A^t)}^{-1} = {(A^{-1})}^t$.
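Both transpose identities are easy to verify numerically; note in particular that the order reverses in $(AB)^t = B^t A^t$. A sketch with arbitrarily chosen matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 1.0]])

assert np.allclose(A, (A.T).T)                     # A = (A^t)^t
assert np.allclose((A @ B).T, B.T @ A.T)           # (AB)^t = B^t A^t
assert np.allclose(np.linalg.inv(A.T),
                   np.linalg.inv(A).T)             # (A^t)^{-1} = (A^{-1})^t
```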

#### Invertible Matrix Theorem §

Consider a linear transformation $T : \mathbb{R}^n \rightarrow \mathbb{R}^n$ with an $n \times n$ standard matrix $A$. The following statements are equivalent:

• $T$ is into
• $\ker(T) = \{0\}$
• The columns of $A$ are linearly independent
• $Ax = 0$ has only the trivial solution
• $A$'s RREF has a pivot in every column (no free vars)
• $A$'s RREF is the identity matrix
• $A$'s RREF has a pivot in each row
• For every $v \in \mathbb{R}^n$, $Ax = v$ is consistent
• The columns of $A$ span $\mathbb{R}^n$
• $T$ is onto
• $T$ is invertible
• $A$ is invertible
• $A^t$ is invertible
• The rows of $A$ span $\mathbb{R}^n$
• The rows of $A$ are linearly independent

To calculate the inverse of a matrix, row reduce it with the identity matrix adjoined on the right side: $[A \mid I] \rightarrow [I \mid A^{-1}]$.
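The $[A \mid I] \rightarrow [I \mid A^{-1}]$ procedure can be written out directly. A sketch of Gauss–Jordan elimination (the function name is mine, and the code assumes $A$ is invertible):

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Row reduce [A | I] to [I | A^{-1}] by Gauss-Jordan elimination."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        # Swap in the row with the largest pivot (partial pivoting).
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        # Scale the pivot row so the pivot entry is 1.
        M[col] /= M[col, col]
        # Clear the rest of the column.
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]
    return M[:, n:]   # right half is now A^{-1}

A = np.array([[2.0, 1.0], [5.0, 3.0]])
assert np.allclose(inverse_by_row_reduction(A), np.linalg.inv(A))
```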

### Elementary Matrices §

#### Scaling a row §

$E_1 = \left[ \begin{array}{ccccccc} 1 & & & & & & \\ & \ddots & & & & & \\ & & 1& & & & \\ & & & c & & & \\ & & & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1 \end{array} \right]$

#### Adding $c$ times a row to another row §

$E_2 = \left[ \begin{array}{ccccccc} 1 & & & & & & \\ & \ddots & & & & & \\ & & 1& & & & \\ & & & \ddots & & & \\ & & c & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1 \end{array} \right]$

#### Swapping two rows §

$E_3 = \left[ \begin{array}{ccccccccccc} 1 & & & & & & & & & & \\ & \ddots & & & & & & & & & \\ & & & 0 & & & & 1 & & & \\ & & & & 1 & & & & & & \\ & & & & & \ddots & & & & & \\ & & & & & & 1 & & & & \\ & & & 1 & & & & 0 & & & \\ & & & & & & & & & \ddots & \\ & & & & & & & & & & 1 \end{array} \right]$
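Each elementary matrix is the identity with one row operation applied, and left-multiplying by it performs that operation. A small sketch on an arbitrary $3 \times 2$ matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])

# E1: scale row 1 by c = 10.
E1 = np.eye(3); E1[1, 1] = 10.0
# E2: add 2 times row 0 to row 2.
E2 = np.eye(3); E2[2, 0] = 2.0
# E3: swap rows 0 and 2.
E3 = np.eye(3)[[2, 1, 0]]

assert np.allclose(E1 @ A, [[1, 2], [30, 40], [5, 6]])
assert np.allclose(E2 @ A, [[1, 2], [3, 4], [7, 10]])
assert np.allclose(E3 @ A, [[5, 6], [3, 4], [1, 2]])
```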

### Dimension §

If $V$ is spanned by a finite list of vectors, the dimension of $V$ is the number of vectors in a basis. If $V$ is not spanned by a finite set, $V$ is infinite-dimensional.

Any two bases of $V$ have the same size.

In an $n$-dimensional vector space, any set of more than $n$ vectors is linearly dependent.

Consider an $n$-dimensional vector space $V$:

• if $\{v_1, \cdots, v_n\}$ are LI, they also span $V$ and hence form a basis
• if $\{v_1, \cdots, v_n\}$ span $V$, they are also LI and hence form a basis
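Both bullets reduce to a rank computation: $n$ vectors in an $n$-dimensional space form a basis iff the matrix with those vectors as columns has rank $n$. A sketch in $\mathbb{R}^3$ with arbitrarily chosen vectors:

```python
import numpy as np

# Columns are three vectors in R^3; rank 3 means LI, hence a basis.
vectors = np.column_stack([[1.0, 0.0, 1.0],
                           [0.0, 1.0, 1.0],
                           [1.0, 1.0, 0.0]])
assert np.linalg.matrix_rank(vectors) == 3

# Any 4 vectors in R^3 are linearly dependent: rank is at most 3.
four = np.hstack([vectors, np.array([[1.0], [2.0], [3.0]])])
assert np.linalg.matrix_rank(four) < 4
```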

## Matrices Relative to Bases §

Let $T : V \rightarrow W$, let $B = \{b_1, \dots, b_n\}$ be a basis for $V$, and let $C$ be a basis for $W$. If $A$ is the matrix for $T$ relative to $B$ and $C$, then $A {[v]}_B = {[T(v)]}_C$ for all $v \in V$.

$A$ has the columns: ${[T(b_1)]}_C, \dots, {[T(b_n)]}_C$.
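Each column $[T(b_j)]_C$ is found by solving a linear system in the $C$ basis. A sketch where $T$ is multiplication by an arbitrarily chosen matrix $M$ in standard coordinates, with made-up bases $B$ and $C$ for $\mathbb{R}^2$:

```python
import numpy as np

M = np.array([[1.0, 2.0], [0.0, 3.0]])            # T(v) = Mv in std coordinates
B = [np.array([1.0, 0.0]), np.array([1.0, 1.0])]  # basis for the domain
C = np.column_stack([[2.0, 0.0], [0.0, 1.0]])     # codomain basis, as columns

# Column j of A is [T(b_j)]_C, i.e. the solution of C @ coords = T(b_j).
A = np.column_stack([np.linalg.solve(C, M @ b) for b in B])

# Check A [v]_B = [T(v)]_C for v = 2 b_1 + 5 b_2.
v_B = np.array([2.0, 5.0])
v = 2 * B[0] + 5 * B[1]
assert np.allclose(A @ v_B, np.linalg.solve(C, M @ v))
```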

## Similarity and Diagonalization §

An $n \times n$ matrix $A$ is diagonalizable iff $A$ has $n$ linearly independent eigenvectors. If $A = P D P^{-1}$ with $D$ diagonal, the columns of $P$ are $n$ linearly independent eigenvectors of $A$, and the diagonal entries of $D$ are the corresponding eigenvalues.

## Eigentheory §

Eigenvectors of a matrix $A$ corresponding to distinct eigenvalues are linearly independent.

An $n \times n$ matrix has at most $n$ distinct eigenvalues.

If an $n \times n$ matrix has $n$ distinct eigenvalues, it has an eigenbasis and is therefore diagonalizable.
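A sketch of the distinct-eigenvalues case, using an arbitrarily chosen matrix with eigenvalues $5$ and $2$: `np.linalg.eig` returns the eigenvalues and an eigenvector matrix $P$, and $A = P D P^{-1}$ confirms diagonalizability.

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # char. poly x^2 - 7x + 10 = (x-5)(x-2)
eigvals, P = np.linalg.eig(A)            # columns of P are eigenvectors

# Two distinct eigenvalues, so the eigenvectors form an eigenbasis.
assert len(set(np.round(eigvals, 8))) == 2

D = np.diag(eigvals)
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```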

## Complex Eigen §

If $v$ is an eigenvector for a real matrix with eigenvalue $\lambda$, then $\overline v$ is also an eigenvector with eigenvalue $\overline \lambda$.


Suppose $A$ is a real $2 \times 2$ matrix. Suppose that $\lambda = a - bi$ (with $b \neq 0$) is a non-real complex eigenvalue for $A$, with a corresponding eigenvector $v$. Let $x = \Re(v)$ and $y = \Im(v)$. Then $B = \{x, y\}$ is a basis of $\mathbb{R}^2$, and the matrix for $A$ with respect to $B$ is $\begin{bmatrix}a & -b \\ b & a\end{bmatrix}$.
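A numerical sketch of the rotation-scaling form, using an arbitrarily chosen real matrix with eigenvalues $2 \pm i$. Note the sign convention: taking the eigenvalue $\lambda = a - bi$ (negative imaginary part) gives the matrix $\begin{bmatrix}a & -b \\ b & a\end{bmatrix}$ relative to $\{x, y\}$.

```python
import numpy as np

# An arbitrary real 2x2 matrix with non-real eigenvalues 2 +/- i.
A = np.array([[1.0, -2.0], [1.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(A)

# Pick lambda = a - bi, the eigenvalue with negative imaginary part.
idx = int(np.argmin(eigvals.imag))
lam, v = eigvals[idx], eigvecs[:, idx]
a, b = lam.real, -lam.imag            # so lam = a - bi with b > 0

# Non-real eigenvalues of a real matrix come in conjugate pairs.
assert np.allclose(sorted(eigvals.imag), [-b, b])

x, y = v.real, v.imag
Pmat = np.column_stack([x, y])
# Matrix of A relative to B = {x, y}: the rotation-scaling form.
assert np.allclose(np.linalg.inv(Pmat) @ A @ Pmat, [[a, -b], [b, a]])
```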