The Linear Algebra Notes


3. Multiplication and Inverse Matrices

We use $a_{ij}$ to denote the element of $A$ in row $i$, column $j$.

When are A and B allowed to be multiplied?

$$A_{\color{blue}{m}\times\color{red}{k}}B_{\color{red}{k}\times\color{green}{n}}=C_{\color{blue}{m}\times\color{green}{n}}$$That means the left matrix must have k columns and the right matrix must have k rows.

If the left matrix has m rows and the right matrix has n columns, the result will have m rows and n columns.
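As a quick check of the shape rule (a minimal NumPy sketch, not from the lecture; the sizes 2, 3 and 4 are arbitrary):

```python
import numpy as np

A = np.random.rand(2, 3)   # m=2 rows, k=3 columns
B = np.random.rand(3, 4)   # k=3 rows, n=4 columns

C = A @ B                  # allowed: A has 3 columns and B has 3 rows
print(C.shape)             # (2, 4) -> m rows and n columns

try:
    B @ A                  # B has 4 columns but A has only 2 rows
except ValueError as err:
    print("not allowed:", err)
```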

5 ways to multiply matrices

1. Dot product of vector [$row_i$] and vector [$column_j$]

$$\begin{bmatrix}& & & \vdots \\a_{i1} & a_{i2} & \cdots & a_{ik} \\& & & \vdots\end{bmatrix}\begin{bmatrix}& b_{1j} & \\& b_{2j} & \\& \vdots & \\\cdots & b_{kj} & \cdots\end{bmatrix}=\begin{bmatrix}& \vdots & & \\\cdots & c_{ij} & \cdots & \\& \vdots & &\end{bmatrix}$$$$c_{ij}=\begin{bmatrix}a_{i1} & a_{i2} & \cdots & a_{ik}\end{bmatrix} \cdot\begin{bmatrix}b_{1j}\\b_{2j}\\\vdots\\b_{kj}\end{bmatrix}=a_{i1}b_{1j}+a_{i2}b_{2j}+\cdots+a_{ik}b_{kj}=\sum^{k}_{p=1}a_{ip}b_{pj}$$
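A minimal Python sketch of this rule (an explicit loop, written here only for illustration):

```python
import numpy as np

def matmul_dot(A, B):
    """c_ij is the dot product of row i of A and column j of B."""
    m, k = A.shape
    k2, n = B.shape
    assert k == k2, "A must have as many columns as B has rows"
    C = np.zeros((m, n))
    for i in range(m):
        for j in range(n):
            C[i, j] = A[i, :] @ B[:, j]   # sum over p of a_ip * b_pj
    return C
```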

2. Each column of C is a linear combination of the columns of A

$$\begin{bmatrix}A_{col_1} & A_{col_2} & \cdots & A_{col_k}\end{bmatrix}\begin{bmatrix}\cdots & b_{1j} & \cdots \\\cdots & b_{2j} & \cdots \\\cdots & \vdots & \cdots \\\cdots & b_{kj} & \cdots\end{bmatrix}=\begin{bmatrix}\cdots & (b_{1j}A_{col_1} + b_{2j}A_{col_2} + \cdots + b_{kj}A_{col_k}) & \cdots\end{bmatrix}$$
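The same product computed column by column (again just an illustrative sketch):

```python
import numpy as np

def matmul_by_columns(A, B):
    """Column j of C is a linear combination of the columns of A,
    with coefficients taken from column j of B."""
    m, k = A.shape
    _, n = B.shape
    C = np.zeros((m, n))
    for j in range(n):
        for p in range(k):
            C[:, j] += B[p, j] * A[:, p]
    return C
```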

3. Each row of C is a linear combination of the rows of B

$$\begin{bmatrix}\vdots & \vdots & \vdots & \vdots \\a_{i1} & a_{i2} & \cdots & a_{ik} \\\vdots & \vdots & \vdots & \vdots\end{bmatrix}\begin{bmatrix}B_{row_1} \\ B_{row_2} \\ \vdots \\ B_{row_k}\end{bmatrix}=\begin{bmatrix}\vdots \\(a_{i1}B_{row_1} + a_{i2}B_{row_2} + \cdots + a_{ik}B_{row_k}) \\\vdots \\\end{bmatrix}$$
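And row by row (a sketch mirroring the previous one):

```python
import numpy as np

def matmul_by_rows(A, B):
    """Row i of C is a linear combination of the rows of B,
    with coefficients taken from row i of A."""
    m, k = A.shape
    _, n = B.shape
    C = np.zeros((m, n))
    for i in range(m):
        for p in range(k):
            C[i, :] += A[i, p] * B[p, :]
    return C
```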

4. Multiply columns of A by rows of B

$$\begin{bmatrix}A_{col_1} & A_{col_2} & \cdots & A_{col_k}\end{bmatrix}\begin{bmatrix}B_{row_1} \\ B_{row_2} \\ \vdots \\ B_{row_k}\end{bmatrix}=A_{col_1}B_{row_1}+A_{col_2}B_{row_2}+\cdots+A_{col_k}B_{row_k}$$
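In code, this is a sum of k rank-one matrices, one outer product per column/row pair (a small sketch using NumPy's `np.outer`):

```python
import numpy as np

def matmul_outer(A, B):
    """C is the sum of k rank-one matrices: (column p of A) times (row p of B)."""
    m, k = A.shape
    _, n = B.shape
    C = np.zeros((m, n))
    for p in range(k):
        C += np.outer(A[:, p], B[p, :])
    return C
```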

5. Block Multiplication

$$\begin{bmatrix}\begin{array}{c | c}A_1 & A_2 \\ \hline{} A_3 & A_4\end{array}\end{bmatrix}\begin{bmatrix}\begin{array}{c | c}B_1 & B_2 \\ \hline{} B_3 & B_4\end{array}\end{bmatrix}=\begin{bmatrix}\begin{array}{c | c}A_1B_1+A_2B_3 & A_1B_2+A_2B_4 \\ \hline{}A_3B_1+A_4B_3 & A_3B_2+A_4B_4\end{array}\end{bmatrix}$$
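A quick numerical check of the block formula (the 4x4 size and 2x2 blocks are an arbitrary choice for illustration):

```python
import numpy as np

A = np.random.rand(4, 4)
B = np.random.rand(4, 4)
A1, A2, A3, A4 = A[:2, :2], A[:2, 2:], A[2:, :2], A[2:, 2:]
B1, B2, B3, B4 = B[:2, :2], B[:2, 2:], B[2:, :2], B[2:, 2:]

C = np.block([[A1 @ B1 + A2 @ B3, A1 @ B2 + A2 @ B4],
              [A3 @ B1 + A4 @ B3, A3 @ B2 + A4 @ B4]])
print(np.allclose(C, A @ B))   # True: block-wise multiplication matches A @ B
```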

What does matrix multiplication mean?

In the last lecture we learned that matrix multiplication composes two transformations into one. If the rotation matrix $A=\begin{bmatrix}0&-1\\1&0\end{bmatrix}$ and the shear matrix $B=\begin{bmatrix}1&1\\0&1\end{bmatrix}$, then the composition matrix $C=BA=\begin{bmatrix}1&-1\\1&0\end{bmatrix}$. See what happens in this video:

Python code of this video
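A minimal NumPy sketch of the same composition (not the animation itself; the vector $(1, 0)$ is an arbitrary test point):

```python
import numpy as np

A = np.array([[0, -1],
              [1,  0]])          # rotation by 90 degrees
B = np.array([[1, 1],
              [0, 1]])           # shear
C = B @ A                        # composition: rotate first, then shear

v = np.array([1, 0])
print(C)                         # [[ 1 -1]
                                 #  [ 1  0]]
print(np.array_equal(C @ v, B @ (A @ v)))   # True: one matrix does both steps
```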

What if $D=AB$?

Python code of this video
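The same kind of sketch for the reversed order (shear first, then rotate):

```python
import numpy as np

A = np.array([[0, -1], [1, 0]])  # rotation
B = np.array([[1, 1], [0, 1]])   # shear

D = A @ B                        # shear first, then rotation
print(D)                         # [[ 0 -1]
                                 #  [ 1  1]]
print(np.array_equal(A @ B, B @ A))   # False
```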

We can see that $AB \neq BA$ in this example.

Inverse (Square Matrices)

If matrix $A$ is invertible, then there is another matrix, called the inverse of $A$, which multiplies $A$ to produce $I$ (the identity).
For a square matrix, if a left inverse exists, it is also a right inverse: the left inverse of a square matrix equals the right inverse.$$ A^{-1}A = AA^{-1} = I $$For rectangular matrices a left inverse cannot equal a right inverse, because the two products $A^{-1}A$ and $AA^{-1}$ have different sizes.

When does the inverse exist?

Not all square matrices have inverses. The matrices which have inverses are called invertible or non-singular matrices.
Singular matrices have no inverses: their determinant is 0, and we can find a non-zero vector $x$ with $Ax=0$.

For example, if $A=\begin{bmatrix}1 & 3 \\ 2 & 6\end{bmatrix}$, both columns of $A$ lie on the same line. So every linear combination of the columns is on that line, and it is impossible to get $\begin{bmatrix}1 \\ 0\end{bmatrix}$ or $\begin{bmatrix}0 \\ 1\end{bmatrix}$. That means no matrix can be multiplied by $A$ to get $I$.
If $x=\begin{bmatrix}3 \\ -1\end{bmatrix}$, then $Ax=\begin{bmatrix}1 & 3 \\ 2 & 6\end{bmatrix}\begin{bmatrix}3 \\ -1\end{bmatrix}=0$.
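This can also be checked numerically (a small sketch, not part of the lecture):

```python
import numpy as np

A = np.array([[1, 3],
              [2, 6]])
x = np.array([3, -1])

print(np.linalg.det(A))          # 0.0 (up to rounding): A is singular
print(A @ x)                     # [0 0]: a non-zero x with Ax = 0

try:
    np.linalg.inv(A)
except np.linalg.LinAlgError:
    print("A has no inverse")
```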

Proof: Suppose that we can find a non-zero vector $x$ with $Ax=0$ and that $A$ is invertible. Then $A^{-1}Ax=Ix=x=0$, which contradicts $x \neq 0$. So the assumption is false: such an $A$ cannot be invertible.

How to get $A^{-1}$

Take $A=\begin{bmatrix}1 & 3 \\ 2 & 7\end{bmatrix}$ for example, and suppose $A^{-1}=\begin{bmatrix}a & c \\ b & d\end{bmatrix}$. Then $\begin{bmatrix}1 & 3 \\ 2 & 7\end{bmatrix}\begin{bmatrix}a & c \\ b & d\end{bmatrix}=\begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix}$. We can get a system of equations, one for each column of $A^{-1}$:$\begin{cases}\begin{bmatrix}1 & 3 \\ 2 & 7\end{bmatrix}\begin{bmatrix}a \\ b\end{bmatrix}=\begin{bmatrix}1 \\ 0\end{bmatrix} \\\begin{bmatrix}1 & 3 \\ 2 & 7\end{bmatrix}\begin{bmatrix}c \\ d\end{bmatrix}=\begin{bmatrix}0 \\ 1\end{bmatrix}\end{cases}$

Gauss-Jordan (solve 2 equations at once):
Stick $I$ to the right of $A$, making an augmented matrix:$[A|I]=\begin{bmatrix}\begin{array}{cc | cc}1 & 3 & 1 & 0 \\2 & 7 & 0 & 1\end{array}\end{bmatrix}$.
Then transform it into $[I|E]$ by elimination.$$\begin{bmatrix}\begin{array}{cc | cc}1 & 3 & 1 & 0 \\2 & 7 & 0 & 1\end{array}\end{bmatrix}\xrightarrow[E_{21}]{row_2 \ - \ 2row_1}\begin{bmatrix}\begin{array}{cc | cc}1 & 3 & 1 & 0 \\0 & 1 & -2 & 1\end{array}\end{bmatrix}\xrightarrow[E_{12}]{row_1 \ - \ 3row_2}\begin{bmatrix}\begin{array}{cc | cc}1 & 0 & 7 & -3 \\0 & 1 & -2 & 1\end{array}\end{bmatrix}$$Look at the left side. The combined operation $E=E_{12}E_{21}$ makes $A$ become $I$, so $EA=I$. On the right side, $I$ becomes $E$ because $EI=E$. Then $E=A^{-1}$.
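A short Python sketch of the same Gauss-Jordan steps (written only for this 2x2 example):

```python
import numpy as np

A = np.array([[1., 3.],
              [2., 7.]])
aug = np.hstack([A, np.eye(2)])      # the augmented matrix [A | I]

aug[1] -= 2 * aug[0]                 # row_2 - 2*row_1  (E_21)
aug[0] -= 3 * aug[1]                 # row_1 - 3*row_2  (E_12)

A_inv = aug[:, 2:]                   # left half is now I, right half is A^{-1}
print(A_inv)                         # [[ 7. -3.]
                                     #  [-2.  1.]]
print(np.allclose(A @ A_inv, np.eye(2)))   # True
```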

Notice

This is an outdated version of the linear algebra notes.

Please check the new version here: https://blog.lyzen.cn/2023/07/10/The-Linear-Algebra-Notes/
