Chapter 4

Definitions

Vector Space
A vector space is a non-empty set $V$ of objects, called vectors, on which two operations are defined, addition and scalar multiplication, such that for all $\vec{u},\vec{v},\vec{w}\in V$ and all scalars $c,d\in\mathbb{R}$:

1) $\vec{u}+\vec{v} \in V$
2) $\vec{u}+\vec{v}=\vec{v}+\vec{u}$
3) $\vec{u}+(\vec{v}+\vec{w})=(\vec{u}+\vec{v})+\vec{w}$
4) there exists $\vec{0}\in V$ such that $\vec{0}+\vec{u}=\vec{u}$
5) for each $\vec{u}\in V$ there exists $-\vec{u}\in V$ such that $\vec{u}+(-\vec{u})=\vec{0}$
6) $c\vec{u}\in V$
7) $c(\vec{u}+\vec{v})=c\vec{u}+c\vec{v}$
8) $(c+d)\vec{u}=c\vec{u}+d\vec{u}$
9) $(cd)\vec{u} = c(d\vec{u})$
10) $1\cdot\vec{u}=\vec{u}$
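
As a minimal sketch (assuming Python with numpy is available; the vectors and scalars are randomly generated examples), the algebraic axioms can be spot-checked numerically for $\mathbb{R}^3$:

```python
# Spot-checking a few vector space axioms for R^3 with random data.
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))   # three random vectors in R^3
c, d = rng.standard_normal(2)           # two random scalars

assert np.allclose(u + v, v + u)                # axiom 2: commutativity
assert np.allclose(u + (v + w), (u + v) + w)    # axiom 3: associativity
assert np.allclose(c * (u + v), c * u + c * v)  # axiom 7: distributivity
assert np.allclose((c + d) * u, c * u + d * u)  # axiom 8
assert np.allclose((c * d) * u, c * (d * u))    # axiom 9
assert np.allclose(1 * u, u)                    # axiom 10
print("sampled axioms hold for R^3")
```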

Basis
Let $H$ be a subspace of a vector space $V$. An indexed set of vectors $\beta = \{\vec{b_1}, \vec{b_2},...,\vec{b_p}\}$, each in $V$, is a basis for $H$ if:
1. $\beta$ is linearly independent.
2. $H = \mathrm{Span}\{\vec{b_1}, \vec{b_2},...,\vec{b_p}\}$
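
As a small computational sketch (assuming Python with sympy; the vectors are a made-up example), a set of $n$ vectors in $\mathbb{R}^n$ is a basis exactly when the matrix having them as columns has rank $n$:

```python
# n vectors in R^n form a basis exactly when the matrix with those
# vectors as columns has rank n (independent and spanning).
from sympy import Matrix

B = Matrix([[1, 0, 1],
            [0, 1, 1],
            [0, 0, 1]])        # candidate basis vectors as columns
print(B.rank() == B.cols)      # True, so the columns are a basis for R^3
```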

Row-Space
Each row of a matrix $A$ can be identified with a vector in $\mathbb{R}^{n}$. The set of all linear combinations of the row vectors is the row space, row(A).

Rank
The rank of a matrix is the dimension of the column space of the matrix.
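
A short sketch tying the last two definitions together (assuming Python with sympy; the matrix is a made-up example): the nonzero rows of an echelon form give a basis for row(A), and the rank equals the number of pivots.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])
R, pivots = A.rref()           # reduced echelon form, pivot column indices
print(R)                       # nonzero rows of R form a basis for row(A)
print(A.rank(), len(pivots))   # rank = number of pivot positions (2 here)
```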

Change of Basis
Let $B$ and $C$ be bases for a vector space $V$. Given a vector $\vec{x}$ described using the basis $B$, we can describe the vector in terms of the basis $C$.
When changing bases the following formula can be used: $P_{C \leftarrow B} [\vec{x}]_B = [\vec{x}]_C$. For bases of $\mathbb{R}^n$, the matrix $P_{C \leftarrow B}$ can be found by row reducing the augmented matrix $[\,C \;\; B\,]$ to $[\,I \;\; P_{C \leftarrow B}\,]$, where the columns of $C$ and $B$ are the respective basis vectors.
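
As a numerical sketch of this formula (assuming Python with numpy; the bases are made-up examples of $\mathbb{R}^2$), store the basis vectors as matrix columns; then $P_{C \leftarrow B}$ solves $C\,P = B$:

```python
import numpy as np

# Basis vectors stored as matrix columns (hypothetical bases of R^2).
B = np.column_stack(([1.0, 0.0], [1.0, 1.0]))
C = np.column_stack(([1.0, 1.0], [0.0, 2.0]))

P = np.linalg.solve(C, B)      # P_{C<-B}: solves C @ P = B
x_B = np.array([2.0, 3.0])     # [x]_B, coordinates relative to B
x_C = P @ x_B                  # [x]_C = P_{C<-B} [x]_B

assert np.allclose(B @ x_B, C @ x_C)   # both describe the same vector x
print(x_C)                             # [ 5. -1.]
```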

Theorems

Theorem 1
If $\vec{v}_{1},...,\vec{v}_p$ are in a vector space $V$, then Span {$\vec{v}_{1},...,\vec{v}_p$} is a subspace of $V$.

Theorem 2
The null space of an $m \times n$ matrix $A$ is a subspace of $\mathbb{R}^n$. Equivalently, the set of all solutions to a system $A\vec{x} = \vec{0}$ of $m$ homogeneous linear equations in $n$ unknowns is a subspace of $\mathbb{R}^n$.
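
A quick check of this theorem (assuming Python with sympy; the matrix is a made-up example): sympy's nullspace() returns basis vectors for Nul $A$, each living in $\mathbb{R}^n$ and satisfying $A\vec{v}=\vec{0}$.

```python
from sympy import Matrix, zeros

A = Matrix([[1, 2, 3],
            [2, 4, 6]])          # 2 x 3, so Nul A is a subspace of R^3
for v in A.nullspace():          # basis vectors for the null space
    assert A * v == zeros(2, 1)  # each solves A x = 0
    print(v.T)
```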

Theorem 3
The column space of an $m \times n$ matrix $A$ is a subspace of $\mathbb{R}^m$.

Theorem 4
An indexed set $\{\vec{v}_{1},...,\vec{v}_p\}$ of two or more vectors, with $\vec{v}_1 \neq \vec{0}$, is linearly dependent if and only if some $\vec{v}_j$ (with $j > 1$) is a linear combination of the preceding vectors, $\vec{v}_1,...,\vec{v}_{j-1}$.

Theorem 5
Let $S = \{ \vec{v}_1,...,\vec{v}_{p} \}$ be a set in $V$, and let $H = \mathrm{Span} \ \{ \vec{v}_1,...,\vec{v}_{p} \}$.

  1. If one of the vectors in $S$ — say, $\vec{v}_k$ — is a linear combination of the remaining vectors in $S$, then the set formed from $S$ by removing $\vec{v}_k$ still spans $H$.
  2. If $H \neq \{ \vec{0} \}$, some subset of $S$ is a basis for $H$.

Theorem 6
The pivot columns of a matrix $A$ form a basis for $\mathrm{Col}\,A$.
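
A computational sketch of Theorem 6 (assuming Python with sympy; the matrix is a made-up example). Note the basis columns are taken from the original matrix $A$, not from its echelon form:

```python
from sympy import Matrix

A = Matrix([[1, 2, 0],
            [2, 4, 1],
            [3, 6, 1]])
_, pivots = A.rref()                 # pivot column indices, (0, 2) here
basis = [A.col(j) for j in pivots]   # pivot columns of A: a basis for Col A
print(pivots, basis)
```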

Theorem 7
Let $\beta=\left\{\vec{b_1},\vec{b_2},...,\vec{b_n}\right\}$ be a basis for a vector space $V$. Then for each $\vec{x}\in V$ there exists a unique set of scalars $c_1,c_2,...,c_n$ such that $c_1\vec{b_1}+c_2\vec{b_2}+...+c_n\vec{b_n}=\vec{x}$.

Theorem 8
Let $\beta=\left\{\vec{b_1},\vec{b_2},...,\vec{b_n}\right\}$ be a basis for a vector space $V$. Then the coordinate mapping $\vec{x}\mapsto [\vec{x}]_{\beta}$ is a one-to-one linear transformation from $V$ onto $\mathbb{R}^{n}$.
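
A small sketch of the coordinate mapping (assuming Python with sympy; the basis is a made-up example): solving $B\vec{c}=\vec{x}$ yields the unique coordinate vector $[\vec{x}]_{\beta}$ promised by Theorem 7.

```python
from sympy import Matrix

B = Matrix([[1, 1],
            [0, 2]])     # basis vectors of R^2 as columns
x = Matrix([3, 4])
x_B = B.solve(x)         # the coordinate vector [x]_B; unique by Theorem 7
assert B * x_B == x
print(x_B.T)             # Matrix([[1, 2]]), i.e. x = 1*b1 + 2*b2
```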

Theorem 9
If a vector space $V$ has a basis $\beta=\left\{\vec{b_1}, \vec{b_2}, ..., \vec{b_n}\right\}$, then any set in $V$ containing more than $n$ vectors is linearly dependent.

Theorem 10
If a vector space has a basis of $n$ vectors, then every basis of the vector space has $n$ vectors.

Theorem 11
Let $H$ be a subspace of a finite-dimensional vector space $V$. Any linearly independent set in $H$ can be expanded, if necessary, to a basis for $H$. Also, $H$ is finite-dimensional and $\dim H \leq \dim V$.

Theorem 12
Let $V$ be a $p$-dimensional vector space, $p \geq 1$. Any linearly independent set of exactly $p$ elements in $V$ is automatically a basis for $V$. Any set of exactly $p$ elements that spans $V$ is automatically a basis for $V$.

Theorem 13
If two matrices $A$ and $B$ are row equivalent, then their row spaces are the same. If $B$ is in echelon form, the nonzero rows of $B$ form a basis for the row space of $A$ as well as for that of $B$.

Theorem 14
The dimensions of the column space and the row space of an $m \times n$ matrix $A$ are equal. This common dimension, the rank of $A$, also equals the number of pivot positions in $A$ and satisfies the equation

\begin{align} \mathrm{rank}(A) + \dim(\mathrm{Nul}\,A) = n \end{align}
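
A quick numerical check of the Rank Theorem (assuming Python with sympy; the matrix is a made-up example):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3, 4],
            [2, 4, 6, 8],
            [0, 1, 1, 1]])       # a 3 x 4 matrix, so n = 4
rank = A.rank()                  # dimension of Col A (and of Row A)
nullity = len(A.nullspace())     # dimension of Nul A
assert rank + nullity == A.cols  # rank(A) + dim(Nul A) = n
print(rank, nullity)             # 2 2
```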

Homework Problems

4.4
27. Use coordinate vectors to test the linear independence of the set of polynomials $\{1+2t^3,\ 2+t-3t^2,\ -t+2t^2-t^3\}$.
First, write each polynomial as its coordinate vector relative to the standard basis $\{1, t, t^2, t^3\}$ and place the results in a matrix: $\begin{bmatrix}1&2&0\\0&1&-1\\0&-3&2\\2&0&-1\end{bmatrix}$. The three columns represent the three polynomials, and the rows correspond to the coefficients of $1, t, t^2, t^3$. The set is linearly independent if and only if $\begin{bmatrix}1&2&0\\0&1&-1\\0&-3&2\\2&0&-1\end{bmatrix}$ $\begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix} = \vec{0}$ has only the trivial solution. The row-reduced form of the matrix is $\begin{bmatrix}1&0&0\\0&1&0\\0&0&1\\0&0&0\end{bmatrix}$; since there is a pivot in every column, the system has no free variables, and the set is linearly independent.
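
The row reduction above can be double-checked in a few lines (assuming Python with sympy):

```python
from sympy import Matrix

# Coordinate vectors of the three polynomials relative to {1, t, t^2, t^3}.
A = Matrix([[1, 2, 0],
            [0, 1, -1],
            [0, -3, 2],
            [2, 0, -1]])
R, pivots = A.rref()
print(R)         # identity block above a zero row
print(pivots)    # (0, 1, 2): a pivot in every column, so independent
```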

4.6
25. A scientist solves a nonhomogeneous system of ten linear equations in twelve unknowns and finds that three of the unknowns are free variables. Can the scientist be certain that, if the right sides of the equations are changed, the nonhomogeneous system will have a solution?
No. With twelve unknowns and three free variables, the coefficient matrix has $12 - 3 = 9$ pivots, but there are ten equations (rows). With only nine pivots in ten rows, the columns of the coefficient matrix do not span $\mathbb{R}^{10}$, so a solution cannot be guaranteed for every right-hand side.
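
A sketch of the same reasoning in code (assuming Python with numpy; the matrix is randomly generated to have rank 9, standing in for the scientist's hypothetical system):

```python
import numpy as np

rng = np.random.default_rng(1)
# A 10 x 12 matrix built to have rank 9, i.e. 9 pivots and 3 free variables.
A = rng.standard_normal((10, 9)) @ rng.standard_normal((9, 12))
rank = np.linalg.matrix_rank(A)
print(rank)        # 9
print(rank < 10)   # True: the columns cannot span R^10, so some b fail
```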
