
How to Find a Basis for a Subspace

This document reviews some material in linear algebra that you should be familiar with. Throughout, we will take A to be the $3 \times 4$ matrix

\begin{displaymath}
A = \left[ \begin{array}{rrrr}
1 & 2 & 3 & 4 \\
-2 & 3 & -1 & 5 \\
3 & -1 & 4 & -1
\end{array} \right]
\end{displaymath}


I assume you are familiar with matrix and vector addition and multiplication.
  • All vectors will be column vectors.
  • Given a vector v, if we say that $v\neq 0$, we mean that v has at least one nonzero component.
  • The transpose of a vector or matrix is denoted by a superscript $T$. For example,

    \begin{displaymath}
    A^T = \left[ \begin{array}{rrr}
    1 & -2 & 3 \\
    2 & 3 & -1 \\
    3 & -1 & 4 \\
    4 & 5 & -1
    \end{array} \right]
    \end{displaymath}


  • The inner product or dot product of two vectors u and v in $\mathbb{R}^n$ can be written $u^Tv$; this denotes $\sum_{i=1}^n u_iv_i$. If $u^Tv=0$ then u and v are orthogonal.
  • The null space of A is the set of all solutions x to the matrix-vector equation Ax=0.
  • To solve a system of equations Ax=b, use Gaussian elimination. For example, if $b=[4,\: -3,\: 7]^T$, then we solve Ax=b by setting up the augmented matrix and row reducing (pivoting) to upper triangular form:

    \begin{displaymath}
    \left[ \begin{array}{rrrr|r}
    1 & 2 & 3 & 4 & 4 \\
    -2 & 3 & -1 & 5 & -3 \\
    3 & -1 & 4 & -1 & 7
    \end{array} \right]
    \longrightarrow
    \left[ \begin{array}{rrrr|r}
    1 & 2 & 3 & 4 & 4 \\
    0 & 7 & 5 & 13 & 5 \\
    0 & 0 & 0 & 0 & 0
    \end{array} \right]
    \end{displaymath}


    Thus, the solutions are all vectors x of the form

    \begin{displaymath}
    x = \left[ \begin{array}{r}1\\ 0\\ 1\\ 0\end{array} \right]
    + s \left[ \begin{array}{r}11\\ 5\\ -7\\ 0\end{array} \right]
    + t \left[ \begin{array}{r}2\\ 13\\ 0\\ -7\end{array} \right]
    \end{displaymath}


    for any numbers s and t.
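
    As a quick numerical check (a sketch, not part of the original handout), the following Python/numpy snippet verifies that vectors of this form satisfy Ax=b; it uses the vectors $v^1$ and $v^2$ defined in the next bullet:

    \begin{verbatim}
    import numpy as np

    A = np.array([[1.0, 2.0, 3.0, 4.0],
                  [-2.0, 3.0, -1.0, 5.0],
                  [3.0, -1.0, 4.0, -1.0]])
    b = np.array([4.0, -3.0, 7.0])

    x_p = np.array([1.0, 0.0, 1.0, 0.0])    # particular solution
    v1 = np.array([11.0, 5.0, -7.0, 0.0])   # null-space direction
    v2 = np.array([2.0, 13.0, 0.0, -7.0])   # null-space direction

    s, t = 2.0, -3.0                        # any scalars work
    x = x_p + s * v1 + t * v2
    print(np.allclose(A @ x, b))            # True
    \end{verbatim}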
  • The span of a set of vectors is the set of all linear combinations of the vectors. For example, if $v^1=[11,\: 5,\: -7,\: 0]^T$ and $v^2=[2,\: 13,\: 0,\: -7]^T$ then the span of $v^1$ and $v^2$ is the set of all vectors of the form $sv^1+tv^2$ for some scalars s and t.
  • The span of a set of vectors in $\mathbb{R}^n$ gives a subspace of $\mathbb{R}^n$. Any nontrivial subspace can be written as the span of any one of uncountably many sets of vectors.
  • A set of vectors $\{v^1,\ldots,v^m\}$ is linearly independent if the only solution to the vector equation $\lambda_1v^1+\cdots+\lambda_mv^m=0$ is $\lambda_i=0$ for all i. If a set of vectors is not linearly independent, then it is linearly dependent. For example, the rows of A are not linearly independent, since

    \begin{displaymath}
    - \left[ \begin{array}{r}1\\ 2\\ 3\\ 4\end{array} \right]
    + \left[ \begin{array}{r}-2\\ 3\\ -1\\ 5\end{array} \right]
    + \left[ \begin{array}{r}3\\ -1\\ 4\\ -1\end{array} \right]
    = \left[ \begin{array}{r}0\\ 0\\ 0\\ 0\end{array} \right].
    \end{displaymath}


    To determine whether a set of vectors is linearly independent, write the vectors as columns of a matrix C, say, and solve Cx=0. If there are any nontrivial solutions then the vectors are linearly dependent; otherwise, they are linearly independent.
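
    A minimal numerical version of this test, sketched in Python (numpy is my tool choice, not the handout's): rather than solving Cx=0 directly, we use the equivalent condition that the columns of C are independent exactly when rank(C) equals the number of columns.

    \begin{verbatim}
    import numpy as np

    # The rows of A, written as the columns of C (so C = A^T).
    C = np.array([[1.0, 2.0, 3.0, 4.0],
                  [-2.0, 3.0, -1.0, 5.0],
                  [3.0, -1.0, 4.0, -1.0]]).T

    # Cx = 0 has only the trivial solution iff rank(C) equals
    # the number of columns of C.
    print(np.linalg.matrix_rank(C) == C.shape[1])   # False: rows dependent
    \end{verbatim}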
  • If a linearly independent set of vectors spans a subspace then the vectors form a basis for that subspace. For example, $v^1$ and $v^2$ form a basis for the null space of A. Given a subspace S, every basis of S contains the same number of vectors; this number is the dimension of the subspace. To find a basis for the span of a set of vectors, write the vectors as rows of a matrix and then row reduce the matrix, as in the sketch below.
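
    One possible sketch of this procedure in Python, using sympy's rref (an assumption of mine that sympy is available; any row-reduction routine works): the nonzero rows of the reduced matrix form a basis for the row space of A.

    \begin{verbatim}
    from sympy import Matrix

    A = Matrix([[1, 2, 3, 4],
                [-2, 3, -1, 5],
                [3, -1, 4, -1]])

    R, pivot_cols = A.rref()     # reduced row echelon form
    basis = [R.row(i) for i in range(len(pivot_cols))]  # nonzero rows of R
    print(basis)                 # two rows: the row space has dimension 2
    \end{verbatim}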
  • The span of the rows of a matrix is called the row space of the matrix. The dimension of the row space is the rank of the matrix.
  • The span of the columns of a matrix is called the range or the column space of the matrix. The row space and the column space always have the same dimension.
  • If M is an $m \times n$ matrix then the null space and the row space of M are subspaces of $\mathbb{R}^n$ and the range of M is a subspace of $\mathbb{R}^m$.
  • If u is in the row space of a matrix M and v is in the null space of M then the vectors are orthogonal. The dimension of the null space of a matrix is the nullity of the matrix. If M has n columns then rank(M)+nullity(M)=n. Any basis for the row space together with any basis for the null space gives a basis for $\mathbb{R}^n$.
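
    A short numerical illustration of these facts with the matrix A above (a sketch assuming scipy is available; its null_space helper returns an orthonormal basis for the null space):

    \begin{verbatim}
    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 2.0, 3.0, 4.0],
                  [-2.0, 3.0, -1.0, 5.0],
                  [3.0, -1.0, 4.0, -1.0]])

    rank = np.linalg.matrix_rank(A)   # 2
    N = null_space(A)                 # one column per basis vector
    print(rank + N.shape[1])          # 4 = number of columns of A
    print(np.allclose(A @ N, 0))      # True: rows orthogonal to null space
    \end{verbatim}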
  • If M is a square matrix, $\lambda$ is a scalar, and x is a nonzero vector satisfying $Mx=\lambda x$, then x is an eigenvector of M with corresponding eigenvalue $\lambda$. For example, the vector $x=[1,\: 2]^T$ is an eigenvector of the matrix

    \begin{displaymath}  M = \left[ \begin{array}{rr}3&2\\ 2&6\end{array}\right]  \end{displaymath}


    with eigenvalue $\lambda=7$.
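
    This is easy to verify numerically; a minimal sketch with numpy (eigvalsh is appropriate here because M is symmetric):

    \begin{verbatim}
    import numpy as np

    M = np.array([[3.0, 2.0], [2.0, 6.0]])
    x = np.array([1.0, 2.0])

    print(np.allclose(M @ x, 7 * x))   # True: Mx = 7x
    print(np.linalg.eigvalsh(M))       # [2. 7.] -- both eigenvalues real
    \end{verbatim}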
  • The eigenvalues of a symmetric matrix are always real. A nonsymmetric matrix may have complex eigenvalues.
  • Given a symmetric matrix M, the following are equivalent:
    1. All the eigenvalues of M are positive.
    2. $x^TMx>0$ for any $x \neq 0$.
    3. M is positive definite.
  • Given a symmetric matrix M, the following are equivalent:
    1. All the eigenvalues of M are nonnegative.
    2. $x^TMx \geq 0$ for any x.
    3. M is positive semidefinite.
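
    Both tests are straightforward to code. A minimal sketch, assuming numpy, that checks the eigenvalue criteria above:

    \begin{verbatim}
    import numpy as np

    def is_positive_definite(M):
        # Criterion 1 above: all eigenvalues positive.
        return bool(np.all(np.linalg.eigvalsh(M) > 0))

    def is_positive_semidefinite(M):
        # All eigenvalues nonnegative.
        return bool(np.all(np.linalg.eigvalsh(M) >= 0))

    M = np.array([[3.0, 2.0], [2.0, 6.0]])
    print(is_positive_definite(M))   # True: eigenvalues are 2 and 7
    \end{verbatim}

    An alternative check for positive definiteness is to attempt a Cholesky factorization (np.linalg.cholesky), which succeeds exactly when the symmetric matrix is positive definite.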

 

John E. Mitchell
2004-08-31


Source: https://www.rpi.edu/~mitchj/handouts/linalg/