## Chapter 1

### Definitions

A **linear equation** in the variables $x_1, x_2, ... , x_n$ is an equation that can be written in the form

$$a_1 x_1 + a_2 x_2 + \cdots + a_n x_n = b,$$

where $b$ and the coefficients $a_1, ... , a_n$ are real numbers.

A **system of linear equations** is a collection of linear equations involving the same variables.

A **solution** of the system is a list of numbers that makes each equation true.

The set of all solutions is the **solution set**.

Two systems are **equivalent** if they have the same solution set.

#### Linear Combination

Let $\bar{v}_1 , \bar{v}_2, ... , \bar{v}_p$ be vectors in $\mathbb{R}^n$.

Let $c_1, c_2, ..., c_p$ be scalars.

Then

$$c_1 \bar{v}_1 + c_2 \bar{v}_2 + \cdots + c_p \bar{v}_p$$

is a **linear combination** of $\bar{v}_1 , \bar{v}_2, ... , \bar{v}_p$ with weights $c_1, c_2, ..., c_p$.
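
As a quick illustration (hypothetical numbers of my own, sketched in Python), a linear combination is computed component-wise:

```python
# Hypothetical vectors in R^3 and weights; compute c1*v1 + c2*v2.
v1 = [1, 0, 2]
v2 = [2, 1, 1]
c1, c2 = 3, -2

# Component-wise: each entry is c1*(entry of v1) + c2*(entry of v2).
combo = [c1 * a + c2 * b for a, b in zip(v1, v2)]
print(combo)  # [-1, -2, 4]
```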

#### Span

Let $\bar{v}_1 , \bar{v}_2, ... , \bar{v}_p \in \mathbb{R}^n$. The **span** of $\bar{v}_1 , \bar{v}_2, ... , \bar{v}_p$ is the set of all linear combinations of $\bar{v}_1 , \bar{v}_2, ... , \bar{v}_p$.

#### Linear Independence

The vectors $\bar{v}_1,...,\bar{v}_p$ in $\mathbb{R}^n$ are **linearly independent** if $x_1 \bar{v}_1 + ... + x_p \bar{v}_p = 0$ has only the trivial solution.
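
One practical test for independence: row-reduce the matrix whose columns are the vectors and count pivots; the vectors are independent exactly when the rank equals the number of vectors. A minimal sketch in Python with hypothetical vectors, using `fractions.Fraction` to avoid round-off:

```python
from fractions import Fraction

def rank(rows):
    """Row-reduce a matrix (given as a list of rows) and count pivot rows."""
    m = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Hypothetical 3x3 matrix whose columns are v1, v2, v3.
# Independent <=> rank == number of columns (row rank equals column rank).
A = [[1, 2, 0],
     [0, 1, 1],
     [1, 0, 1]]
print(rank(A))  # 3 -> the three columns are linearly independent
```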

### Theorems

#### Theorem 1

Each matrix is row equivalent to one and only one reduced echelon matrix.

#### Theorem 2

A linear system is consistent if and only if the rightmost column of the augmented matrix is not a pivot column. Furthermore, if a linear system is consistent, then the solution set contains either a unique solution, when there are no free variables, or infinitely many solutions, when there is at least one free variable.
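
This test can be automated: row-reduce the augmented matrix and look for a row of the form $[0 \ \cdots \ 0 \mid c]$ with $c \neq 0$, which would put a pivot in the rightmost column. A minimal sketch in Python with hypothetical systems:

```python
from fractions import Fraction

def is_consistent(aug):
    """Theorem 2 check on an augmented matrix [A | b]: the system is
    inconsistent exactly when some reduced row reads [0 ... 0 | c], c != 0."""
    m = [[Fraction(x) for x in r] for r in aug]
    r = 0
    for c in range(len(m[0]) - 1):  # never pivot on the augmented column
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return all(not (all(v == 0 for v in row[:-1]) and row[-1] != 0)
               for row in m)

print(is_consistent([[1, 2, 3], [2, 4, 6]]))  # True  (second row is redundant)
print(is_consistent([[1, 2, 3], [2, 4, 7]]))  # False (elimination yields 0 = 1)
```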

#### Theorem 3

If $A$ is an $m \times n$ matrix with columns $\bar{a}_1, \bar{a}_2, ... ,\bar{a}_n$ and if $\bar{b}$ is in $\mathbb{R}^m$, the matrix equation

$$A\vec{x} = \vec{b}$$

has the same solution set as the vector equation

$$x_1 \bar{a}_1 + x_2 \bar{a}_2 + \cdots + x_n \bar{a}_n = \vec{b},$$

which in turn has the same solution set as the system with augmented matrix

$$[\bar{a}_1 \ \bar{a}_2 \ \cdots \ \bar{a}_n \ \bar{b}].$$

#### Theorem 4

Let $A$ be an $m \times n$ matrix. Then the following statements are logically equivalent. That is, for a particular $A$, either they are all true statements or they are all false.

- For each $\vec{b}$ in $\mathbb{R}^m$, the equation $A\vec{x} = \vec{b}$ has a solution.
- Each $\vec{b}$ in $\mathbb{R}^m$ is a linear combination of the columns of $A$.
- The columns of $A$ span $\mathbb{R}^m$.
- $A$ has a pivot position in every row.
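
The last condition is easy to test mechanically: $A$ has a pivot position in every row exactly when its rank equals $m$. A minimal Python sketch with hypothetical matrices, using exact arithmetic:

```python
from fractions import Fraction

def row_rank(rows):
    """Count pivot rows after Gaussian elimination (exact arithmetic)."""
    m = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Pivot in every row <=> rank == number of rows (m).
A = [[1, 0, 2],
     [0, 1, 1]]   # 2x3, rank 2 -> columns span R^2
B = [[1, 2],
     [2, 4],
     [0, 1]]      # 3x2, rank 2 < 3 -> columns do NOT span R^3
print(row_rank(A) == len(A))  # True
print(row_rank(B) == len(B))  # False
```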

#### Theorem 5

If $A$ is an $m \times n$ matrix, $\vec{u}$ and $\vec{v}$ are vectors in $\mathbb{R}^n$, and $c$ is a scalar, then:

- $A(\vec{u}+\vec{v}) = A\vec{u}+A\vec{v}$ ;
- $A(c\vec{u}) = c(A\vec{u})$.
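
Both properties can be spot-checked numerically for any particular matrix. A minimal Python sketch with made-up $A$, $\vec{u}$, $\vec{v}$, and $c$:

```python
def matvec(A, x):
    """Compute the product A x, with A given as a list of rows."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

# Hypothetical 2x3 matrix and vectors in R^3.
A = [[1, 2, 0],
     [3, -1, 4]]
u = [1, 0, 2]
v = [0, 1, -1]
c = 5

lhs1 = matvec(A, [a + b for a, b in zip(u, v)])             # A(u + v)
rhs1 = [a + b for a, b in zip(matvec(A, u), matvec(A, v))]  # Au + Av
lhs2 = matvec(A, [c * a for a in u])                        # A(cu)
rhs2 = [c * a for a in matvec(A, u)]                        # c(Au)
print(lhs1 == rhs1, lhs2 == rhs2)  # True True
```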

#### Theorem 6

Suppose the equation $A\vec{x} = \vec{b}$ is consistent for some given $\vec{b}$ and let $\vec{p}$ be a solution. Then the solution set of $A\vec{x} = \vec{b}$ is the set of all vectors of the form $\vec{w} = \vec{p}+\vec{v}_h$, where $\vec{v}_h$ is any solution of the homogeneous equation $A\vec{x} = 0$.
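
A hypothetical one-equation example in Python: shifting a particular solution $\vec{p}$ by any homogeneous solution $\vec{v}_h$ again solves $A\vec{x} = \vec{b}$:

```python
def matvec(A, x):
    """Compute the product A x, with A given as a list of rows."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

# Hypothetical system: the single equation x1 - 2*x2 = 3.
A = [[1, -2]]
b = [3]
p = [3, 0]                       # one particular solution
for t in range(-3, 4):           # v_h = t*[2, 1] solves A x = 0
    vh = [2 * t, 1 * t]
    w = [a + c for a, c in zip(p, vh)]
    assert matvec(A, w) == b     # every p + v_h solves A x = b
print("all shifted solutions check out")
```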

#### Theorem 7

An indexed set $S = \{\vec{v}_1, ..., \vec{v}_p\}$ of two or more vectors is linearly dependent if and only if at least one of the vectors in $S$ is a linear combination of the others. In fact, if $S$ is linearly dependent and $\vec{v}_1\not= \vec{0}$, then some $\vec{v}_j$ (with $j > 1$) is a linear combination of the preceding vectors, $\vec{v}_1, ..., \vec{v}_{j-1}$.

#### Theorem 8

If a set contains more vectors than there are entries in each vector, then the set is linearly dependent. That is, any set $\{\vec{v}_1, ...,\vec{v}_p\}$ in $\mathbb{R}^n$ is linearly dependent if $p > n$.

#### Theorem 9

If a set $S = \{\vec{v}_1, ...,\vec{v}_p\}$ in $\mathbb{R}^n$ contains the zero vector, then the set is linearly dependent.

**Proof:**

Let $S$ be a set of vectors such that $\vec{0}$ is an element of $S$:

$$S = \{\vec{v}_1, \vec{v}_2, ..., \vec{v}_m, \vec{0}\}.$$

Form the matrix whose columns are the vectors of $S$:

$$A = [\vec{v}_1 \ \vec{v}_2 \ \cdots \ \vec{v}_m \ \vec{0}].$$

Consider $A\vec{x}=\vec{0}$. The column $\vec{0}$ cannot contain a pivot position, so its corresponding variable is free. Consequently, $A\vec{x}=\vec{0}$ has a nontrivial solution, the columns of $A$ are linearly dependent, and therefore $S$ is linearly dependent. $\blacksquare$

#### Linear Transformation

A transformation (or mapping) $T$ is **linear** if:

(i) $T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v})$ for all $\vec{u}, \vec{v}$ in the domain of $T$;

(ii) $T(c\vec{u}) = cT(\vec{u})$ for all scalars $c$ and all $\vec{u}$ in the domain of $T$.

If $T$ is a linear transformation, then

$T(\vec{0})=\vec{0}$

and

$T(c\vec{u}+d\vec{v})=cT(\vec{u})+dT(\vec{v})$

for all vectors $\vec{u}$, $\vec{v}$ in the domain of $T$ and all scalars $c$, $d$.

#### Parametric Vector Form

Parametric vector form is the explicit description of a plane as the set spanned by $\vec{u}$ and $\vec{v}$:

$$\vec{x} = s\vec{u} + t\vec{v} \quad (s, t \in \mathbb{R}).$$
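
Sampling a few $(s, t)$ pairs makes the description concrete. A minimal Python sketch with made-up $\vec{u}$ and $\vec{v}$:

```python
# Hypothetical spanning vectors of a plane in R^3.
u = [1, 0, 1]
v = [0, 1, 2]

# Each choice of (s, t) gives one point s*u + t*v of the plane.
points = [[s * a + t * b for a, b in zip(u, v)]
          for s in range(2) for t in range(2)]
print(points)  # [[0, 0, 0], [0, 1, 2], [1, 0, 1], [1, 1, 3]]
```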

#### Homework Problems

**Section 1.4**

31) Let $A$ be a $3 \times 2$ matrix. Explain why the equation $A \vec{x} = \vec{b}$ cannot be consistent for all $\vec{b}$ in $\mathbb{R}^3$.

The matrix $A$ has more rows than columns, so it is impossible for each row to contain a pivot position. By Theorem 4, this means that $A \vec{x} = \vec{b}$ cannot be consistent for all $\vec{b}$ in $\mathbb{R}^3$ (i.e., there is not a solution of $A \vec{x} = \vec{b}$ for *every* $\vec{b}$ in $\mathbb{R}^3$).

**Section 1.6**

6) We are given $Al_2O_3+C=Al + CO_2$. From here, we assign the vector of atom counts $\vec{X}=\begin{bmatrix}Al\\O\\C\end{bmatrix}$ to the two sides of the equation to get $\begin{bmatrix}2\\3\\0\end{bmatrix}X_1+\begin{bmatrix}0\\0\\1\end{bmatrix}X_2=\begin{bmatrix}1\\0\\0\end{bmatrix}X_3+\begin{bmatrix}0\\2\\1\end{bmatrix}X_4$.

We move the $X$'s to one side to get: $\begin{bmatrix}2\\3\\0\end{bmatrix}X_1+\begin{bmatrix}0\\0\\1\end{bmatrix}X_2+\begin{bmatrix}-1\\0\\0\end{bmatrix}X_3+\begin{bmatrix}0\\-2\\-1\end{bmatrix}X_4=\begin{bmatrix}0\\0\\0\end{bmatrix}$

This gives us the augmented matrix:

$\begin{bmatrix}2&0&{-1}&0&0\\3&0&0&{-2}&0\\0&1&0&{-1}&0\end{bmatrix}$ which row reduces to $\begin{bmatrix}2&0&{-1}&0&0\\0&1&0&{-1}&0\\0&0&{\frac{3}{2}}&{-2}&0\end{bmatrix}$

From here, we know that

$X_1=\frac{1}{2}X_3$

$X_2=X_4$

$X_3=\frac{4}{3}X_4$

$X_4=X_4$

We assign:

$X_1=2$

$X_2=3$

$X_3=4$

$X_4=3$

Therefore, the balanced equation is:

$2Al_2O_3+3C=4Al + 3CO_2$.
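
The balance can be double-checked by multiplying the coefficient matrix of the homogeneous system above by the solution vector; the result should be the zero vector. A quick Python sketch:

```python
# Rows count Al, O, C atoms (products entered with negative sign).
M = [[2, 0, -1,  0],   # Al
     [3, 0,  0, -2],   # O
     [0, 1,  0, -1]]   # C
x = [2, 3, 4, 3]       # (X1, X2, X3, X4) from above

# Each residual entry is one atom balance; all should be zero.
residual = [sum(a * b for a, b in zip(row, x)) for row in M]
print(residual)  # [0, 0, 0]
```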

**Section 1.8**

25) Given $\vec{v} \neq 0$ and $\vec{p}$ in $\mathbb{R}^n$, the line through $\vec{p}$ in direction $\vec{v}$ has the parametric equation $\vec{x}=\vec{p}+t\vec{v}$. Show that a linear transformation $T: \mathbb{R}^n \rightarrow \mathbb{R}^n$ maps this line onto another line or a single point.

First, apply the transformation to the equation, yielding $T(\vec{x})=T(\vec{p}+t\vec{v})$. Since the transformation is linear, the properties from the definition apply and the following manipulations take place:

$T(\vec{x})=T(\vec{p}+t\vec{v})$

$=T(\vec{p})+T(t\vec{v})$

$=T(\vec{p})+tT(\vec{v})$

Notice this equation describes a single point if $T(\vec{v}) = \vec{0}$, since then every $t$ gives the same point $T(\vec{p})$, and a parametric line through $T(\vec{p})$ in direction $T(\vec{v})$ otherwise. $\blacksquare$
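
The degenerate case is worth seeing concretely: when $T(\vec{v}) = \vec{0}$, the whole line collapses to the single point $T(\vec{p})$. A minimal Python sketch with a hypothetical rank-one matrix:

```python
def matvec(A, x):
    """Compute the product A x, with A given as a list of rows."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

# Hypothetical rank-one matrix; v = [2, -1] lies in its null space.
A = [[1, 2],
     [2, 4]]
p, v = [1, 1], [2, -1]

# Image of several points p + t*v of the line under x -> A x.
images = {tuple(matvec(A, [pi + t * vi for pi, vi in zip(p, v)]))
          for t in range(-5, 6)}
print(images)  # {(3, 6)} -- the whole line maps to one point
```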