Section 2.1: Matrix Operations

Definition of multiplication:

If $A$ is an $m \times n$ matrix and $B$ is an $n \times p$ matrix with columns $\vec{b_1},\vec{b_2},\dots,\vec{b_p}$, then $AB$ is the matrix

$AB=A[\vec{b_1},\vec{b_2},\dots,\vec{b_p}]=[A\vec{b_1},A\vec{b_2},\dots,A\vec{b_p}]$

Example 1:

Let $A=\begin{bmatrix}1&2\\0&-1\end{bmatrix}$ and $B=\begin{bmatrix}2&0&1\\1&-1&2\end{bmatrix}$

$AB=\begin{bmatrix}1&2\\0&-1\end{bmatrix}\begin{bmatrix}2&0&1\\1&-1&2\end{bmatrix}=\begin{bmatrix}A\vec{b_1}&A\vec{b_2}&A\vec{b_3}\end{bmatrix}$

$A\vec{b_1}=\begin{bmatrix}1&2\\0&-1\end{bmatrix}\begin{bmatrix}2\\1\end{bmatrix}=\begin{bmatrix}4\\-1\end{bmatrix}$

$A\vec{b_2}=\begin{bmatrix}1&2\\0&-1\end{bmatrix}\begin{bmatrix}0\\-1\end{bmatrix}=\begin{bmatrix}-2\\1\end{bmatrix}$

$A\vec{b_3}=\begin{bmatrix}1&2\\0&-1\end{bmatrix}\begin{bmatrix}1\\2\end{bmatrix}=\begin{bmatrix}5\\-2\end{bmatrix}$

$AB=\begin{bmatrix}4&-2&5\\-1&1&-2\end{bmatrix}$
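The column-by-column definition can be checked numerically. A quick sketch using NumPy (NumPy assumed available; not part of the original notes):

```python
import numpy as np

# Matrices from Example 1
A = np.array([[1, 2], [0, -1]])
B = np.array([[2, 0, 1], [1, -1, 2]])

# Per the definition, each column of AB is A applied to the
# corresponding column of B
AB = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

assert np.array_equal(AB, A @ B)  # agrees with NumPy's built-in product
print(AB)
```

Both approaches give $AB=\begin{bmatrix}4&-2&5\\-1&1&-2\end{bmatrix}$.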

##### Theorem 1

Let $A$, $B$, and $C$ be matrices of the same size, and let $r$ and $s$ be scalars.

| Addition | Scalar Multiplication |
|---|---|
| $A + B = B + A$ | $r(A + B) = rA + rB$ |
| $(A + B) + C = A + (B + C)$ | $(r + s)A = rA + sA$ |
| $A + 0 = A$ | $r(sA) = (rs)A$ |

##### Theorem 2

Let $A$ be an $m \times n$ matrix, and let $B$ and $C$ have sizes for which the indicated sums and products are defined.

| Property | Name |
|---|---|
| $A(BC) = (AB)C$ | Associative Law |
| $A(B + C) = AB + AC$ | Left Distributive Law |
| $(B + C)A = BA + CA$ | Right Distributive Law |
| $r(AB) = (rA)B = A(rB)$ | for any scalar $r$ |
| $I_mA = A = AI_n$ | Identity Matrix |
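Each of these laws can be spot-checked on random matrices. A small NumPy sketch (square matrices chosen so every product is defined; not a proof, just a sanity check):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-5, 6, (2, 2))
B = rng.integers(-5, 6, (2, 2))
C = rng.integers(-5, 6, (2, 2))
r = 3
I = np.eye(2, dtype=int)

assert np.array_equal(A @ (B @ C), (A @ B) @ C)      # Associative Law
assert np.array_equal(A @ (B + C), A @ B + A @ C)    # Left Distributive Law
assert np.array_equal((B + C) @ A, B @ A + C @ A)    # Right Distributive Law
assert np.array_equal(r * (A @ B), (r * A) @ B)      # scalar law, first form
assert np.array_equal(r * (A @ B), A @ (r * B))      # scalar law, second form
assert np.array_equal(I @ A, A) and np.array_equal(A @ I, A)  # identity
```

Note that $AB = BA$ is deliberately absent from the theorem: matrix multiplication is not commutative in general.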

##### Theorem 3

Let A and B denote matrices whose sizes are appropriate for the following sums and products.

a. $(A^T)^T = A$

b. $(A+B)^T = A^T+B^T$

c. For any scalar r, $(rA)^T = rA^T$

d. $(AB)^T = B^TA^T$
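The transpose properties, and especially the order reversal in (d), can be verified on small matrices. A NumPy sketch (random integer matrices; NumPy assumed available):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.integers(-5, 6, (2, 3))
B2 = rng.integers(-5, 6, (2, 3))   # same size as A, for property (b)
C = rng.integers(-5, 6, (3, 2))    # sized so A @ C is defined
r = 4

assert np.array_equal(A.T.T, A)                 # (a) (A^T)^T = A
assert np.array_equal((A + B2).T, A.T + B2.T)   # (b) (A+B)^T = A^T + B^T
assert np.array_equal((r * A).T, r * A.T)       # (c) (rA)^T = r A^T
assert np.array_equal((A @ C).T, C.T @ A.T)     # (d) order reverses!
```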

Section 2.2: The Inverse of a Matrix

##### Definitions:

Invertible (also known as nonsingular): a matrix for which a multiplicative inverse exists.

Not invertible (also known as singular): a matrix for which no multiplicative inverse exists.

*When differentiating between the two, remember Dr. V's mnemonic for singular and nonsingular: nonsingular matrices have a buddy (an inverse), and singular matrices do not.

##### Theorem 4

Let $A=\begin{bmatrix}a&b\\c&d\end{bmatrix}$. If $ad-bc\not=0$, then $A$ is invertible and

$A^{-1} =\frac{1}{ad-bc} \begin{bmatrix}d&-b\\-c&a\end{bmatrix}$

If $ad-bc = 0$, then $A$ is not invertible.
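The $2 \times 2$ formula translates directly into code. A sketch (the helper name `inv2x2` is my own, not from the notes):

```python
import numpy as np

def inv2x2(M):
    """Inverse of a 2x2 matrix via the Theorem 4 formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular (ad - bc = 0)")
    return (1 / det) * np.array([[d, -b], [-c, a]])

A = np.array([[1, 2], [0, -1]])        # ad - bc = -1, so A is invertible
assert np.allclose(A @ inv2x2(A), np.eye(2))
assert np.allclose(inv2x2(A) @ A, np.eye(2))
```

The quantity $ad - bc$ is the determinant of $A$; checking it before dividing is exactly the theorem's invertibility test.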

##### Theorem 5

Let $A$ be an invertible $n \times n$ matrix. Then for each $\overrightarrow{b}$ in $\mathbb{R}^{n}$, the equation $A\overrightarrow{x} = \overrightarrow{b}$ has the unique solution $\overrightarrow{x} = A^{-1}\overrightarrow{b}$.
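A small NumPy sketch of Theorem 5, using an invertible matrix of my own choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])   # det = 5, so A is invertible
b = np.array([3.0, 5.0])

x = np.linalg.inv(A) @ b                  # x = A^{-1} b, as in Theorem 5
assert np.allclose(A @ x, b)              # x really solves Ax = b
```

In numerical practice one calls `np.linalg.solve(A, b)` rather than forming $A^{-1}$ explicitly, but the theorem is what guarantees a unique solution exists at all.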

##### Theorem 6

- If $A$ is an invertible matrix, then $A^{-1}$ is invertible and

$(A^{-1})^{-1} = A$

- If $A$ and $B$ are $n \times n$ invertible matrices, then so is $AB$, and the inverse of $AB$ is the product of the inverses of $A$ and $B$ in reverse order.

That is, $(AB)^{-1} = B^{-1}A^{-1}$

- If $A$ is an invertible matrix, then so is $A^{T}$, and the inverse of $A^{T}$ is the transpose of $A^{-1}$. That is,

$(A^{T})^{-1} = (A^{-1})^{T}$
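All three parts of Theorem 6 can be spot-checked numerically. A NumPy sketch with two invertible matrices of my own choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])    # det = 5
B = np.array([[1.0, 2.0], [0.0, -1.0]])   # det = -1
Ainv, Binv = np.linalg.inv(A), np.linalg.inv(B)

assert np.allclose(np.linalg.inv(Ainv), A)             # (A^{-1})^{-1} = A
assert np.allclose(np.linalg.inv(A @ B), Binv @ Ainv)  # (AB)^{-1} = B^{-1} A^{-1}
assert np.allclose(np.linalg.inv(A.T), Ainv.T)         # (A^T)^{-1} = (A^{-1})^T
```

Note the reverse order in the second assertion, mirroring the transpose rule $(AB)^T = B^T A^T$.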

##### Theorem 7

An $n \times n$ matrix $A$ is invertible if and only if $A$ is row equivalent to $I_n$, and in this case, any sequence of elementary row operations that reduces $A$ to $I_n$ also transforms $I_n$ into $A^{-1}$.
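Theorem 7 is also an algorithm: row-reduce the augmented matrix $[A \mid I]$ until the left half is $I$, and the right half is then $A^{-1}$. A sketch of that procedure (the function name and the partial-pivoting detail are my additions, not from the notes):

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Row-reduce [A | I] to [I | A^{-1}], as in Theorem 7."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # augmented matrix [A | I]
    for i in range(n):
        # swap in a row with a nonzero pivot (partial pivoting)
        p = i + np.argmax(np.abs(M[i:, i]))
        if np.isclose(M[p, i], 0):
            raise ValueError("A is not invertible")
        M[[i, p]] = M[[p, i]]
        M[i] /= M[i, i]                # scale the pivot row
        for j in range(n):             # zero out the rest of column i
            if j != i:
                M[j] -= M[j, i] * M[i]
    return M[:, n:]                    # right half is now A^{-1}

A = np.array([[1.0, 2.0], [0.0, -1.0]])
assert np.allclose(inverse_by_row_reduction(A) @ A, np.eye(2))
```

Every step is an elementary row operation applied to the whole augmented matrix, which is exactly why the right half accumulates $A^{-1}$.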

##### Theorem 8

Let $A$ be a square $n \times n$ matrix. Then the following statements are equivalent. That is, for a given $A$, the statements are either all true or all false.

a. $A$ is an invertible matrix.

b. $A$ is row equivalent to the $n \times n$ identity matrix.

c. $A$ has $n$ pivot positions.

d. The equation $Ax = 0$ has only the trivial solution.

e. The columns of $A$ form a linearly independent set.

f. The linear transformation $x \mapsto Ax$ is one-to-one.

g. The equation $Ax = b$ has at least one solution for each $b$ in $\mathbb{R}^n$.

h. The columns of $A$ span $\mathbb{R}^n$.

i. The linear transformation $x \mapsto Ax$ maps $\mathbb{R}^n$ onto $\mathbb{R}^n$.

j. There is an $n \times n$ matrix $C$ such that $CA = I$.

k. There is an $n \times n$ matrix $D$ such that $AD = I$.

l. $A^T$ is an invertible matrix.
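The "all true or all false" character of the theorem can be illustrated numerically: for an invertible matrix several statements hold at once, and for a singular matrix they fail together. A sketch using matrices of my own choosing:

```python
import numpy as np

A = np.array([[1.0, 2.0], [0.0, -1.0]])   # invertible
n = A.shape[0]

assert np.linalg.matrix_rank(A) == n                 # (c) n pivot positions
assert np.allclose(np.linalg.inv(A) @ A, np.eye(n))  # (j) some C with CA = I
assert np.linalg.matrix_rank(A.T) == n               # (l) A^T also invertible

S = np.array([[1.0, 2.0], [2.0, 4.0]])    # singular: second column = 2x first
assert np.linalg.matrix_rank(S) < 2       # fewer than n pivots...
x = np.array([2.0, -1.0])
assert np.allclose(S @ x, 0)              # ...and Ax = 0 has a nontrivial solution
```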

###### Theorem 8: Explained

| Equivalencies | Theorem, Definition, etc. |
|---|---|
| a $\Leftrightarrow$ l | Section 2, Theorem 6.c (p. 105) |
| d $\Leftrightarrow$ e | Fact of Linear Independence (p. 57) |
| g $\Leftrightarrow$ h | Section 1, Theorem 4 (p. 37) |


##### Theorem 9

Let $T : \mathbb{R}^n \to \mathbb{R}^n$ be a linear transformation and let $A$ be the standard matrix for $T$. Then $T$ is invertible if and only if $A$ is an invertible matrix.

In that case, the linear transformation $S$ given by $S(x) = A^{-1}x$ is the unique function satisfying equations (1) and (2).

#### Homework Problems

**Section 2.1**

23) If $CA=I_n$, show that $A\vec{x} = \vec{0}$ has only the trivial solution. Why must $A$ not have more columns than rows?

Notice that

$\vec{x} = (I_n)\vec{x} = CA(\vec{x}) = C(A\vec{x})=C(\vec{0})=\vec{0}$.

This implies that only the trivial solution exists.

If $A$ had more columns than rows, its columns would be linearly dependent, so $A\vec{x} = \vec{0}$ would have nontrivial solutions. This would contradict our previous logic.

**Section 2.2**

17) If $A$, $B$, and $C$ are invertible, show that $ABC$ is invertible.

To show $ABC$ is invertible, we must exhibit a matrix $D$ such that $(ABC)D = D(ABC) = I$, where $I$ is the identity matrix.

Consider $D=C^{-1}B^{-1}A^{-1}$. Then $(ABC)(C^{-1}B^{-1}A^{-1}) = I$ and $(C^{-1}B^{-1}A^{-1})(ABC) = I$, since each adjacent pair of factors cancels. Additionally, notice by Theorem 6, $C^{-1}B^{-1}A^{-1}=(ABC)^{-1}$.

Thus, $ABC$ is indeed invertible.
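The reverse-order inverse can be checked numerically. A NumPy sketch with three invertible matrices of my own choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])   # det = 5
B = np.array([[1.0, 2.0], [0.0, -1.0]])  # det = -1
C = np.array([[3.0, 1.0], [1.0, 1.0]])   # det = 2

# The candidate inverse from the proof, in reverse order
D = np.linalg.inv(C) @ np.linalg.inv(B) @ np.linalg.inv(A)

assert np.allclose((A @ B @ C) @ D, np.eye(2))
assert np.allclose(D @ (A @ B @ C), np.eye(2))
```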