How to Find a Basis for a Subspace
This document is a list of some material in linear algebra that you should be familiar with. Throughout, we will take $A$ to be the $3 \times 4$ matrix

$$A = \begin{bmatrix} 1 & 2 & 3 & 4 \\ -2 & 3 & -1 & 5 \\ 3 & -1 & 4 & -1 \end{bmatrix}$$
I assume you are familiar with matrix and vector addition and multiplication.
- All vectors will be column vectors.
- Given a vector $v$, if we say that $v \neq 0$, we mean that $v$ has at least one nonzero component.
- The *transpose* of a vector or matrix is denoted by a superscript $T$. For example,

$$A^T = \begin{bmatrix} 1 & -2 & 3 \\ 2 & 3 & -1 \\ 3 & -1 & 4 \\ 4 & 5 & -1 \end{bmatrix}$$
 
- The *inner product* or *dot product* of two vectors $u$ and $v$ in $\mathbb{R}^n$ can be written $u^T v$; this denotes $\sum_{i=1}^n u_i v_i$. If $u^T v = 0$ then $u$ and $v$ are *orthogonal*.
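For a concrete check, here is a tiny SymPy sketch (not part of the original handout; the vectors are chosen arbitrarily for illustration):

```python
import sympy as sp

u = sp.Matrix([1, -2, 1])
v = sp.Matrix([2, 1, 0])

# u^T v = sum over i of u_i * v_i
print((u.T * v)[0])  # 1*2 + (-2)*1 + 1*0 = 0, so u and v are orthogonal
```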
- The null space of A is the set of all solutions x to the matrix-vector equation Ax=0.
- To solve a system of equations $Ax = b$, use Gaussian elimination. For example, if $b = [4,\; -3,\; 7]^T$, then we solve $Ax = b$ as follows. We set up the augmented matrix and row reduce (or pivot) to upper triangular form:

$$\left[\begin{array}{rrrr|r} 1 & 2 & 3 & 4 & 4 \\ -2 & 3 & -1 & 5 & -3 \\ 3 & -1 & 4 & -1 & 7 \end{array}\right] \;\rightarrow\; \left[\begin{array}{rrrr|r} 1 & 2 & 3 & 4 & 4 \\ 0 & 7 & 5 & 13 & 5 \\ 0 & -7 & -5 & -13 & -5 \end{array}\right] \;\rightarrow\; \left[\begin{array}{rrrr|r} 1 & 2 & 3 & 4 & 4 \\ 0 & 7 & 5 & 13 & 5 \\ 0 & 0 & 0 & 0 & 0 \end{array}\right]$$

Thus, the solutions are all vectors $x$ of the form

$$x = \begin{bmatrix} 1 \\ 0 \\ 1 \\ 0 \end{bmatrix} + s \begin{bmatrix} 11 \\ 5 \\ -7 \\ 0 \end{bmatrix} + t \begin{bmatrix} 2 \\ 13 \\ 0 \\ -7 \end{bmatrix}$$

for any numbers $s$ and $t$.
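The same computation can be reproduced symbolically; here is a minimal sketch using SymPy (my own code, not part of the original handout):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3, 4],
               [-2, 3, -1, 5],
               [3, -1, 4, -1]])
b = sp.Matrix([4, -3, 7])

# Solve Ax = b; the solution set is parametrized by the free variables.
x1, x2, x3, x4 = sp.symbols('x1 x2 x3 x4')
print(sp.linsolve((A, b), x1, x2, x3, x4))

# A basis for the null space of A; these vectors are scalar multiples
# of the direction vectors that s and t multiply above.
print(A.nullspace())
```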
- The *span* of a set of vectors is the set of all linear combinations of the vectors. For example, if $v^1 = [11,\; 5,\; -7,\; 0]^T$ and $v^2 = [2,\; 13,\; 0,\; -7]^T$, then the span of $v^1$ and $v^2$ is the set of all vectors of the form $s v^1 + t v^2$ for some scalars $s$ and $t$.
- The span of a set of vectors in $\mathbb{R}^n$ gives a *subspace* of $\mathbb{R}^n$. Any nontrivial subspace can be written as the span of any one of uncountably many sets of vectors.
- A set of vectors $v^1, \ldots, v^k$ is *linearly independent* if the only solution to the vector equation $\lambda_1 v^1 + \cdots + \lambda_k v^k = 0$ is $\lambda_i = 0$ for all $i$. If a set of vectors is not linearly independent, then it is *linearly dependent*. For example, the rows of $A$ are *not* linearly independent, since

$$- \begin{bmatrix} 1 \\ 2 \\ 3 \\ 4 \end{bmatrix} + \begin{bmatrix} -2 \\ 3 \\ -1 \\ 5 \end{bmatrix} + \begin{bmatrix} 3 \\ -1 \\ 4 \\ -1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \\ 0 \end{bmatrix}.$$

To determine whether a set of vectors is linearly independent, write the vectors as columns of a matrix $C$, say, and solve $Cx = 0$. If there are any nontrivial solutions then the vectors are linearly dependent; otherwise, they are linearly independent.
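For instance, the dependence among the rows of $A$ can be found exactly this way; a small SymPy sketch (variable names are mine):

```python
import sympy as sp

# The rows of A, written as the columns of C.
C = sp.Matrix([[1, -2, 3],
               [2, 3, -1],
               [3, -1, 4],
               [4, 5, -1]])

# Nontrivial solutions of Cx = 0, if any, certify linear dependence.
print(C.nullspace())  # [Matrix([[-1], [1], [1]])], matching the relation above
```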
- If a linearly independent set of vectors spans a subspace then the vectors form a *basis* for that subspace. For example, $v^1$ and $v^2$ form a basis for the null space of $A$. Given a subspace $S$, every basis of $S$ contains the same number of vectors; this number is the *dimension* of the subspace. To find a basis for the span of a set of vectors, write the vectors as rows of a matrix and then row reduce the matrix; the nonzero rows of the reduced matrix form a basis for the span (see the sketch below).
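As an illustration, a SymPy sketch of this recipe applied to the rows of $A$ (my own example):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3, 4],
               [-2, 3, -1, 5],
               [3, -1, 4, -1]])

# Row reduce; the nonzero rows of the result form a basis for the row space.
reduced, pivot_columns = A.rref()
print(reduced)             # two nonzero rows, so the row space has dimension 2
print(len(pivot_columns))  # 2: the rank of A
```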
- The span of the rows of a matrix is called the *row space* of the matrix. The dimension of the row space is the *rank* of the matrix.
- The span of the columns of a matrix is called the *range* or the *column space* of the matrix. The row space and the column space always have the same dimension.
- If $M$ is an $m \times n$ matrix then the null space and the row space of $M$ are subspaces of $\mathbb{R}^n$, and the range of $M$ is a subspace of $\mathbb{R}^m$.
- If $u$ is in the row space of a matrix $M$ and $v$ is in the null space of $M$ then the vectors are orthogonal. The dimension of the null space of a matrix is the *nullity* of the matrix. If $M$ has $n$ columns then $\operatorname{rank}(M) + \operatorname{nullity}(M) = n$. Any basis for the row space together with any basis for the null space gives a basis for $\mathbb{R}^n$.
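These facts are easy to verify for the matrix $A$; a short SymPy sketch of mine, for illustration:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3, 4],
               [-2, 3, -1, 5],
               [3, -1, 4, -1]])

row_basis = A.rowspace()    # basis for the row space (1 x 4 rows)
null_basis = A.nullspace()  # basis for the null space (4 x 1 columns)

# rank(A) + nullity(A) = n = 4, the number of columns of A.
print(len(row_basis) + len(null_basis))

# Each row-space basis vector is orthogonal to each null-space basis vector.
print([(u * v)[0] for u in row_basis for v in null_basis])  # [0, 0, 0, 0]
```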
- If $M$ is a square matrix, $\lambda$ is a scalar, and $x$ is a vector satisfying $Mx = \lambda x$, then $x$ is an *eigenvector* of $M$ with corresponding *eigenvalue* $\lambda$. For example, the vector $x = [1,\; 2]^T$ is an eigenvector of the matrix

$$M = \begin{bmatrix} 3 & 2 \\ 2 & 6 \end{bmatrix}$$

with eigenvalue $\lambda = 7$, since $Mx = [7,\; 14]^T = 7x$.
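A quick SymPy check of this example (a sketch of my own):

```python
import sympy as sp

M = sp.Matrix([[3, 2],
               [2, 6]])
x = sp.Matrix([1, 2])

print(M * x)          # Matrix([[7], [14]]), which is 7x: eigenvalue 7
print(M.eigenvals())  # eigenvalues 7 and 2, each with multiplicity 1
```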
- The eigenvalues of a symmetric matrix are always real. A nonsymmetric matrix may have complex eigenvalues.
- Given a symmetric matrix $M$, the following are equivalent:
  1. All the eigenvalues of $M$ are positive.
  2. $x^T M x > 0$ for any $x \neq 0$.
  3. $M$ is *positive definite*.
 
- Given a symmetric matrix $M$, the following are equivalent:
  1. All the eigenvalues of $M$ are nonnegative.
  2. $x^T M x \geq 0$ for any $x$.
  3. $M$ is *positive semidefinite*.
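In practice one can test both conditions from the eigenvalues; a minimal SymPy sketch, reusing the example matrix $M$ above:

```python
import sympy as sp

M = sp.Matrix([[3, 2],
               [2, 6]])

eigenvalues = list(M.eigenvals())  # the eigenvalues 7 and 2 for this M
print(all(ev > 0 for ev in eigenvalues))   # True: M is positive definite
print(all(ev >= 0 for ev in eigenvalues))  # True: M is positive semidefinite

# SymPy also provides these tests directly.
print(M.is_positive_definite)      # True
print(M.is_positive_semidefinite)  # True
```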
 
John E. Mitchell
2004-08-31
Source: https://www.rpi.edu/~mitchj/handouts/linalg/