Linear algebra formalizes the procedure to find all solutions to a linear system of equations when they exist, or to prove that they don’t exist.
Linear Combinations and Span §
A linear combination of vectors $\vec{v}_1, \dots, \vec{v}_k$ is any vector of the form $c_1\vec{v}_1 + \cdots + c_k\vec{v}_k$ for scalars $c_1, \dots, c_k$. The span of a list of vectors is the set of all their linear combinations.
Linear Independence §
A list $\vec{v}_1, \dots, \vec{v}_k$ is linearly independent (LI) if $c_1\vec{v}_1 + \cdots + c_k\vec{v}_k = \vec{0}$ only when $c_1 = \cdots = c_k = 0$; otherwise it is linearly dependent.
Linear Transformations §
$T: \mathbb{R}^n \to \mathbb{R}^m$ is a linear transformation if:
- For $\vec{u}, \vec{v} \in \mathbb{R}^n$, $T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v})$
- For $\vec{u} \in \mathbb{R}^n$ and scalar $c$, $T(c\vec{u}) = cT(\vec{u})$
- Consequently, any linear transformation respects linear combinations: $T(c_1\vec{v}_1 + \cdots + c_k\vec{v}_k) = c_1T(\vec{v}_1) + \cdots + c_kT(\vec{v}_k)$
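Any matrix map $\vec{x} \mapsto A\vec{x}$ satisfies these conditions, which is easy to sanity-check numerically. A minimal NumPy sketch (the matrix $A$ here is an arbitrary made-up example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0],
              [4.0, -1.0]])   # std matrix of a map R^2 -> R^3

def T(x):
    return A @ x

u = np.array([1.0, -2.0])
v = np.array([0.5, 4.0])
c = 3.0

assert np.allclose(T(u + v), T(u) + T(v))          # additivity
assert np.allclose(T(c * u), c * T(u))             # homogeneity
assert np.allclose(T(2*u - 5*v), 2*T(u) - 5*T(v))  # respects linear combinations
```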
Onto and Into §
The image of a linear transformation $T: \mathbb{R}^n \to \mathbb{R}^m$ is the set $\{T(\vec{x}) : \vec{x} \in \mathbb{R}^n\}$. The image is also the span of the columns of $A$, the std matrix of $T$.
A linear transformation $T$ is onto if the image of $T$ is $\mathbb{R}^m$.
A linear transformation $T$ is into (one-to-one) if for every $\vec{b} \in \mathbb{R}^m$, $T(\vec{x}) = \vec{b}$ has at most one solution.
| $T$ is onto | $T$ is into |
| --- | --- |
| $A$'s RREF has a pivot in each row | $A$'s RREF has a pivot in each column |
| The columns of $A$ span $\mathbb{R}^m$ | The columns of $A$ are LI |
| $A\vec{x} = \vec{b}$ has a solution for each $\vec{b} \in \mathbb{R}^m$ | $A\vec{x} = \vec{0}$ has only the trivial solution |
| The image of $T$ is $\mathbb{R}^m$ | The kernel of $A$ is $\{\vec{0}\}$ |
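These pivot conditions can be checked numerically, since the rank of $A$ equals the number of pivots in its RREF. A small NumPy sketch (the matrix is a made-up example):

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])   # std matrix of a map R^3 -> R^2

m, n = A.shape
rank = np.linalg.matrix_rank(A)   # number of pivots in A's RREF

print("onto:", rank == m)   # pivot in each row: True here
print("into:", rank == n)   # pivot in each column: False here
```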
Matrix Multiplication §
If $T: \mathbb{R}^n \to \mathbb{R}^m$ and $S: \mathbb{R}^m \to \mathbb{R}^p$ are linear transformations, $S \circ T$ is a linear transformation.
If the $m \times n$ matrix $A$ is the std matrix for $T$ and the $p \times m$ matrix $B$ is the std matrix for $S$, then $BA$ will be a $p \times n$ matrix which is the std matrix for $S \circ T$.
For an $m \times n$ matrix $A$, $I_m A = A$ and $A I_n = A$.
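A quick numerical check that composition corresponds to matrix multiplication (a sketch with arbitrary example matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])        # 3x2: std matrix of T: R^2 -> R^3
B = np.array([[1.0, 0.0, -1.0],
              [2.0, 1.0,  0.0]])  # 2x3: std matrix of S: R^3 -> R^2

x = np.array([7.0, -2.0])
assert np.allclose(B @ (A @ x), (B @ A) @ x)   # S(T(x)) == (BA) x
print((B @ A).shape)                           # (2, 2): a p x n matrix
```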
Consider $T: \mathbb{R}^n \to \mathbb{R}^n$ and $S: \mathbb{R}^n \to \mathbb{R}^n$. $T$ and $S$ are inverses of each other if $S \circ T = \operatorname{id}$ and $T \circ S = \operatorname{id}$.
If such an $S$ exists, $T$ is invertible and we write $S = T^{-1}$.
A linear transformation is invertible iff it is both into and onto.
Consider an $n \times n$ matrix $A$ and an $n \times n$ matrix $B$. $A$ and $B$ are inverses of each other if $AB = I_n$ and $BA = I_n$.
If such a $B$ exists, $A$ is invertible and we write $B = A^{-1}$.
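A sketch verifying both defining products with NumPy's built-in inverse (the example matrix is made up):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
B = np.linalg.inv(A)

assert np.allclose(A @ B, np.eye(2))   # AB = I
assert np.allclose(B @ A, np.eye(2))   # BA = I
```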
Transposes and Inverses §
A matrix $A$ is symmetric if $A^T = A$.
If $A$ and $B$ are matrices such that $AB$ is defined, $(AB)^T = B^T A^T$.
If $A$ is invertible, $A^T$ is invertible, and $(A^T)^{-1} = (A^{-1})^T$.
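Both identities are easy to spot-check on random matrices (a minimal sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

assert np.allclose((A @ B).T, B.T @ A.T)   # (AB)^T = B^T A^T

C = rng.standard_normal((3, 3))            # square; invertible with probability 1
assert np.allclose(np.linalg.inv(C.T), np.linalg.inv(C).T)   # (C^T)^{-1} = (C^{-1})^T
```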
Invertible Matrix Theorem §
Consider a linear transformation $T: \mathbb{R}^n \to \mathbb{R}^n$ with a standard matrix $A$. The following statements are equivalent:
- $T$ is into
- The columns of $A$ are linearly independent
- $A\vec{x} = \vec{0}$ has only the trivial solution
- $A$'s RREF has a pivot in every column (no free vars)
- $A$'s RREF is the identity matrix $I_n$
- $A$'s RREF has a pivot in each row
- For every $\vec{b} \in \mathbb{R}^n$, $A\vec{x} = \vec{b}$ is consistent
- The columns of $A$ span $\mathbb{R}^n$
- $T$ is onto
- $T$ is invertible
- $A$ is invertible
- $A^T$ is invertible
- The rows of $A$ span $\mathbb{R}^n$
- The rows of $A$ are linearly independent
To calculate the inverse of a matrix $A$, row reduce the augmented matrix $[A \mid I]$; if $A$ is invertible, the result is $[I \mid A^{-1}]$.
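The same row-reduction procedure can be reproduced with SymPy's `rref` (a sketch; the matrix is a made-up example):

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [5, 3]])
augmented = A.row_join(eye(2))   # [A | I]
R, _ = augmented.rref()          # row reduces to [I | A^{-1}]
A_inv = R[:, 2:]

assert A_inv == A.inv()
print(A_inv)                     # Matrix([[3, -1], [-5, 2]])
```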
Elementary Matrices §
An elementary matrix is the result of applying a single row operation to the identity matrix; left-multiplying a matrix by it applies that row operation (as in the sketch below). The three types:
Scaling a row §
E.g. $\begin{pmatrix} 1 & 0 \\ 0 & c \end{pmatrix}$ scales row 2 by $c$.
Adding $c$ times a row to another row §
E.g. $\begin{pmatrix} 1 & 0 \\ c & 1 \end{pmatrix}$ adds $c$ times row 1 to row 2.
Swapping two rows §
E.g. $\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$ swaps rows 1 and 2.
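A sketch showing each elementary matrix acting by left multiplication (the example matrix is made up):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

E_scale = np.array([[1.0, 0.0],
                    [0.0, 5.0]])   # scales row 2 by 5
E_add   = np.array([[ 1.0, 0.0],
                    [-3.0, 1.0]])  # adds -3 times row 1 to row 2
E_swap  = np.array([[0.0, 1.0],
                    [1.0, 0.0]])   # swaps rows 1 and 2

print(E_scale @ A)   # [[ 1.  2.] [15. 20.]]
print(E_add @ A)     # [[ 1.  2.] [ 0. -2.]]
print(E_swap @ A)    # [[3. 4.] [1. 2.]]
```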
Bases and Dimension §
If $V$ is spanned by a finite list of vectors, the dimension of $V$ is the number of vectors in a basis. If $V$ is not spanned by a finite set, $V$ is infinite-dimensional.
Any two bases of $V$ have the same size.
In an $n$-dimensional vector space, any set of more than $n$ vectors is linearly dependent.
Consider an $n$-dimensional vector space $V$:
- if $n$ vectors are LI, they also span $V$ and hence form a basis
- if $n$ vectors span $V$, they are also LI and hence form a basis
Matrices Relative to Bases §
Let $T: V \to W$ be a linear transformation. $\mathcal{B} = (\vec{b}_1, \dots, \vec{b}_n)$ is a basis for $V$ and $\mathcal{C}$ is a basis for $W$. If $M$ is the matrix for $T$ relative to $\mathcal{B}$ and $\mathcal{C}$, then $[T(\vec{v})]_{\mathcal{C}} = M[\vec{v}]_{\mathcal{B}}$ for all $\vec{v} \in V$.
$M$ has the columns: $[T(\vec{b}_1)]_{\mathcal{C}}, \dots, [T(\vec{b}_n)]_{\mathcal{C}}$.
$T$ is invertible iff $M$ is invertible.
$M^{-1}$ is the matrix for $T^{-1}$ relative to $\mathcal{C}$ and $\mathcal{B}$.
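In the special case $V = W = \mathbb{R}^n$ with $\mathcal{C} = \mathcal{B}$: if $P$ is the matrix whose columns are the basis vectors of $\mathcal{B}$, then $M = P^{-1}AP$, where $A$ is the std matrix of $T$. A NumPy sketch (matrix and basis are made-up examples):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # std matrix of T: R^2 -> R^2
P = np.array([[1.0,  1.0],
              [1.0, -2.0]])  # columns: a basis B for R^2

M = np.linalg.inv(P) @ A @ P        # matrix of T relative to B (with C = B)

v = np.array([3.0, 5.0])
v_B = np.linalg.solve(P, v)         # [v]_B = P^{-1} v
Tv_B = np.linalg.solve(P, A @ v)    # [T(v)]_B
assert np.allclose(Tv_B, M @ v_B)   # [T(v)]_B = M [v]_B
```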
Similarity and Diagonalization §
A square matrix $A$ is diagonalizable if it is similar to some diagonal matrix $D$, i.e. $A = PDP^{-1}$ for some invertible matrix $P$.
If an $n \times n$ matrix $A$ has $n$ linearly independent eigenvectors $\vec{v}_1, \dots, \vec{v}_n$, $A$ is diagonalizable and $A = PDP^{-1}$, where $P$ has columns $\vec{v}_1, \dots, \vec{v}_n$ and $D$ has the corresponding eigenvalues $\lambda_1, \dots, \lambda_n$ along its diagonal.
Any list of eigenvectors corresponding to distinct eigenvalues of a matrix must be linearly independent.
An $n \times n$ matrix has at most $n$ distinct eigenvalues.
If an $n \times n$ matrix has $n$ distinct eigenvalues, it has an eigenbasis and is therefore diagonalizable.
If $A$ is a square matrix and $\lambda$ is a scalar, the set of eigenvectors of $A$ with eigenvalue $\lambda$ (plus $\vec{0}$) is the $\lambda$-eigenspace of $A$; it equals $\ker(A - \lambda I)$.
The characteristic polynomial of $A$ in $\lambda$ is $\det(A - \lambda I)$. A scalar $\lambda_0$ is an eigenvalue of $A$ iff it is a root of the characteristic polynomial.
If $\lambda_0$ is an eigenvalue for $A$, the algebraic multiplicity of $\lambda_0$ is the number of times $\lambda_0$ appears as a root of the characteristic poly (aka the exponent of the corresponding linear factor). The geometric multiplicity of $\lambda_0$ is the dimension of the $\lambda_0$-eigenspace ($\dim \ker(A - \lambda_0 I)$).
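A minimal NumPy sketch of the diagonalization $A = PDP^{-1}$ (the matrix is a made-up example with eigenvalues 5 and 2):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors
D = np.diag(eigvals)            # eigenvalues along the diagonal

assert np.allclose(A, P @ D @ np.linalg.inv(P))
print(eigvals)                  # 5 and 2 (order may vary)
```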
Complex Numbers §
Let $z = a + bi$. Then $\bar{z} = a - bi$ and $|z| = \sqrt{a^2 + b^2}$.
Fundamental theorem of algebra: any non-constant polynomial with real or complex coefficients factors completely into linear factors over $\mathbb{C}$.
If $p$ is a polynomial with real coefficients and $p(z) = 0$, then $p(\bar{z})$ is also zero.
Complex Eigenvalues §
If $\vec{v}$ is an eigenvector for a real matrix $A$ with eigenvalue $\lambda$, then $\bar{\vec{v}}$ is also an eigenvector, with eigenvalue $\bar{\lambda}$.
Suppose $A$ is a real $2 \times 2$ matrix. Suppose that $\lambda = a - bi$ (with $b \neq 0$) is a non-real complex eigenvalue for $A$, with a corresponding eigenvector $\vec{v}$. Let $\vec{u} = \operatorname{Re}(\vec{v})$ and $\vec{w} = \operatorname{Im}(\vec{v})$. Then $(\vec{u}, \vec{w})$ is a basis of $\mathbb{R}^2$, and the matrix for $\vec{x} \mapsto A\vec{x}$ with respect to this basis is $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$.
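A NumPy sketch verifying $A = PCP^{-1}$ with $P = [\operatorname{Re}\vec{v} \;\; \operatorname{Im}\vec{v}]$ and $C = \begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ (the matrix is a made-up example with eigenvalues $0.5 \pm i$):

```python
import numpy as np

A = np.array([[0.5, -1.0],
              [1.0,  0.5]])   # real 2x2 with non-real eigenvalues 0.5 +/- i

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmin(eigvals.imag)            # pick lambda = a - bi (negative imag part)
lam, v = eigvals[i], eigvecs[:, i]
a, b = lam.real, -lam.imag

P = np.column_stack([v.real, v.imag])  # basis (Re v, Im v)
C = np.array([[a, -b],
              [b,  a]])
assert np.allclose(A, P @ C @ np.linalg.inv(P))
```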
Inner Products §
For two vectors $\vec{u}$ and $\vec{v}$ in $\mathbb{R}^n$, $\vec{u} \cdot \vec{v} = u_1 v_1 + \cdots + u_n v_n = \vec{u}^T \vec{v}$. The norm of $\vec{u}$ is $\|\vec{u}\| = \sqrt{\vec{u} \cdot \vec{u}}$.
The distance between two vectors $\vec{u}$ and $\vec{v}$ is $\|\vec{u} - \vec{v}\|$.
For two vectors $\vec{u}$, $\vec{v}$ in a vector space that has an inner product, $\vec{u}$ and $\vec{v}$ are orthogonal if $\langle \vec{u}, \vec{v} \rangle = 0$. Additionally, $\vec{u}$ and $\vec{v}$ are orthogonal iff $\|\vec{u} + \vec{v}\|^2 = \|\vec{u}\|^2 + \|\vec{v}\|^2$.
A list $\vec{v}_1, \dots, \vec{v}_k$ is orthogonal iff $\langle \vec{v}_i, \vec{v}_j \rangle = 0$ for $i \neq j$. If the list is orthogonal and $\|\vec{v}_i\| = 1$ for all $i$, it is also orthonormal.
A list of non-zero orthogonal vectors is linearly independent.
Orthogonal Projections §
For a $\vec{v} \in \mathbb{R}^n$; $W$, a subspace of $\mathbb{R}^n$; and $\vec{u}_1, \dots, \vec{u}_k$, an orthonormal basis of $W$: $\operatorname{proj}_W \vec{v} = (\vec{v} \cdot \vec{u}_1)\vec{u}_1 + \cdots + (\vec{v} \cdot \vec{u}_k)\vec{u}_k$.
1. $\operatorname{proj}_W \vec{v}$ is in $W$
2. $\vec{v} - \operatorname{proj}_W \vec{v}$ is perpendicular to all vectors in $W$
3. $\operatorname{proj}_W \vec{v}$ is the only vector that satisfies (1) and (2)
4. $\operatorname{proj}_W \vec{v}$ is the closest vector to $\vec{v}$ in $W$
If $W$ has an orthogonal (not necessarily orthonormal) basis $\vec{w}_1, \dots, \vec{w}_k$, $\operatorname{proj}_W \vec{v} = \frac{\vec{v} \cdot \vec{w}_1}{\vec{w}_1 \cdot \vec{w}_1}\vec{w}_1 + \cdots + \frac{\vec{v} \cdot \vec{w}_k}{\vec{w}_k \cdot \vec{w}_k}\vec{w}_k$.
If $U$ is an $n \times k$ matrix and the columns of $U$ form an ONB of $W$, then $\operatorname{proj}_W \vec{v} = U U^T \vec{v}$ and $U^T U = I_k$.
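Both projection formulas, the perpendicularity property, and $U^T U = I_k$ in a NumPy sketch (the ONB below is a made-up example):

```python
import numpy as np

u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0]) / np.sqrt(2)
U = np.column_stack([u1, u2])          # columns: an ONB of a plane W in R^3

v = np.array([3.0, 4.0, -2.0])
proj = U @ U.T @ v                     # proj_W(v) = U U^T v
same = (v @ u1) * u1 + (v @ u2) * u2   # coordinate formula gives the same vector
assert np.allclose(proj, same)

assert np.allclose(U.T @ (v - proj), 0.0)   # v - proj_W(v) is perpendicular to W
assert np.allclose(U.T @ U, np.eye(2))      # U^T U = I_2
```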
Orthogonal Decomposition §
$V$ is the direct sum of $U$ and $W$, written $V = U \oplus W$, if $U + W = V$ and $U \cap W = \{\vec{0}\}$. Equivalently, every $\vec{v} \in V$ can be written uniquely as $\vec{u} + \vec{w}$ with $\vec{u} \in U$ and $\vec{w} \in W$.
Let $W$ be a subspace of $\mathbb{R}^n$. $W^\perp$, the orthogonal complement of $W$, is $\{\vec{v} \in \mathbb{R}^n : \vec{v} \cdot \vec{w} = 0 \text{ for all } \vec{w} \in W\}$. $W^\perp$ is a subspace of $\mathbb{R}^n$. Additionally, $\mathbb{R}^n = W \oplus W^\perp$.
Least Squares §
Let $A$ be an $m \times n$ matrix and $\vec{b} \in \mathbb{R}^m$. The normal equation $A^T A \vec{x} = A^T \vec{b}$ is consistent, and its solutions are the least-squares solutions to $A\vec{x} = \vec{b}$.
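Solving the normal equations directly and comparing against NumPy's least-squares solver (a sketch with made-up data):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])   # Ax = b has no exact solution

x_normal = np.linalg.solve(A.T @ A, A.T @ b)      # normal equations
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)   # built-in least squares
assert np.allclose(x_normal, x_lstsq)
print(x_normal)
```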
Orthogonal Matrices §
If $A$ is symmetric, eigenvectors corresponding to distinct eigenvalues are orthogonal.
If $A$ is a real, symmetric matrix, $A$ is orthogonally diagonalizable: $A = QDQ^T$ for an orthogonal matrix $Q$ and a diagonal matrix $D$.
For an $n \times n$ matrix $Q$, the following are equivalent:
- $Q$ is an orthogonal matrix ($Q^T Q = I_n$, i.e. $Q^{-1} = Q^T$)
- The columns of $Q$ form an orthonormal basis for $\mathbb{R}^n$
- $Q$ preserves inner products: $(Q\vec{u}) \cdot (Q\vec{v}) = \vec{u} \cdot \vec{v}$
- $Q$ preserves norms: $\|Q\vec{v}\| = \|\vec{v}\|$
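All of these properties checked on a rotation matrix (a minimal sketch; the angle and vectors are arbitrary):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation matrices are orthogonal

assert np.allclose(Q.T @ Q, np.eye(2))            # orthonormal columns

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])
assert np.isclose((Q @ u) @ (Q @ v), u @ v)                  # preserves inner products
assert np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v))  # preserves norms
```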
Spectral Theorem §
$A$ is symmetric if and only if $A$ is orthogonally diagonalizable.
Spectral theorem: for a symmetric $n \times n$ matrix $A$:
- All $n$ roots of the characteristic polynomial $\det(A - \lambda I)$ are real (counted with multiplicity)
- The geometric multiplicity of each eigenvalue equals its algebraic multiplicity
- Eigenspaces for distinct eigenvalues are mutually orthogonal
- $A$ is orthogonally diagonalizable: $A = QDQ^T$
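NumPy's `eigh` is built for symmetric matrices and returns real eigenvalues with orthonormal eigenvectors, so it orthogonally diagonalizes directly (a sketch with a made-up symmetric matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # symmetric

eigvals, Q = np.linalg.eigh(A)    # real eigenvalues, orthonormal eigenvectors
D = np.diag(eigvals)

assert np.allclose(Q.T @ Q, np.eye(3))   # Q is orthogonal
assert np.allclose(A, Q @ D @ Q.T)       # A = Q D Q^T
```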