Introduction to Linear Algebra - Chapter 2 - Solving Linear Equations

[TOC]
Notes on Chapter 2 of *Introduction to Linear Algebra*: Solving Linear Equations.

Three ways to understand a system of linear equations

  1. Linear combination of the columns
    The column picture of $Ax=\boldsymbol{b}$: a combination of the $n$ columns of $A$ produces the vector $\boldsymbol{b}$.
  2. Inner products of the rows with $x$
    The row picture of $Ax=\boldsymbol{b}$: $m$ equations from $m$ rows give $m$ planes meeting at $x$.
    $(\textbf{row } 1) \cdot x = b_1,\ldots,(\textbf{row } m) \cdot x = b_m$
  3. The matrix form: $Ax=\boldsymbol{b}$

ROWS: The row picture shows two lines meeting at a single point (the solution).

COLUMNS: The column picture combines the column vectors on the left side to produce the vector $\boldsymbol{b}$ on the right side.
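As a quick numeric check, here is a minimal NumPy sketch (the $2 \times 2$ system is my own example, not from the book) showing that the row picture and the column picture describe the same solution:

```python
import numpy as np

# Example system (not from the book):  x - 2y = 1,  3x + 2y = 11
A = np.array([[1.0, -2.0],
              [3.0,  2.0]])
b = np.array([1.0, 11.0])

x = np.linalg.solve(A, b)           # the point where the two lines meet (row picture)

# Row picture: each row of A dotted with x gives the corresponding b_i
print(A @ x)                        # [ 1. 11.]  -> matches b

# Column picture: the same x gives the combination of columns that produces b
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(np.allclose(combo, b))        # True
```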

2.2 The Idea of Elimination

Elimination produces an upper triangular system $U$.
The pivots are on the diagonal of the triangle after elimination. A pivot must not be 0: otherwise the system either has no solution (parallel lines/planes) or has infinitely many solutions (when the right side also becomes 0).

Temporary failure (zero in pivot). A row exchange produces two pivots.

Singular equations have no solution or infinitely many solutions. Pivots must be nonzero because we have to divide by them.

A linear system ($A \boldsymbol{x} = \boldsymbol{b}$) becomes upper triangular ($U \boldsymbol{x} = \boldsymbol{c}$) after elimination.
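A minimal sketch of forward elimination, assuming nonzero pivots and no row exchanges (the matrix is my own example); it reduces $A\boldsymbol{x}=\boldsymbol{b}$ to $U\boldsymbol{x}=\boldsymbol{c}$:

```python
import numpy as np

def forward_eliminate(A, b):
    """Reduce A x = b to U x = c, assuming nonzero pivots (no row exchanges)."""
    U = A.astype(float).copy()
    c = b.astype(float).copy()
    n = len(b)
    for j in range(n - 1):                 # pivot column j
        for i in range(j + 1, n):          # rows below the pivot
            m = U[i, j] / U[j, j]          # multiplier l_ij
            U[i, j:] -= m * U[j, j:]       # subtract l_ij * (pivot row)
            c[i]     -= m * c[j]
    return U, c

A = np.array([[2.0, 4.0, -2.0],
              [4.0, 9.0, -3.0],
              [-2.0, -3.0, 7.0]])
b = np.array([2.0, 8.0, 10.0])
U, c = forward_eliminate(A, b)
print(U)                         # upper triangular
print(np.linalg.solve(U, c))     # same solution as np.linalg.solve(A, b)
```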

2.3 Elimination Using Matrices

This section carries on the elimination idea of the previous section and realizes the same process through matrix multiplication, using the elimination matrix $E_{ij}$ and the row exchange matrix $P_{ij}$, both obtained by modifying the identity matrix. The elimination process then motivates the rules of matrix multiplication, such as the dot product of a row vector with a column vector, the distributive law of matrix multiplication, and so on.

Section summary

  1. The first step multiplies the equations $A \boldsymbol{x}=\boldsymbol{b}$ by a matrix $E_{21}$ to produce $E_{21} A \boldsymbol{x}=E_{21} \boldsymbol{b}$.
  2. That matrix $E_{21} A$ has a zero in row 2, column 1 because $x_{1}$ is eliminated from equation $2 .$
  3. $E_{21}$ is the identity matrix (diagonal of 1’s) minus the multiplier $a_{21} / a_{11}$ in row 2, column $1 .$
  4. Matrix-matrix multiplication is $n$ matrix-vector multiplications: $\boldsymbol{E} \boldsymbol{A}=\left[\boldsymbol{E} \boldsymbol{a}_{\mathbf{1}} \quad \cdots \boldsymbol{E} \boldsymbol{a}_{\boldsymbol{n}}\right]$
  5. We must also multiply $E \boldsymbol{b} !$ So $E$ is multiplying the **augmented matrix** $[A \: \boldsymbol{b}]=\left[\boldsymbol{a}_{1} \ldots \boldsymbol{a}_{n} \; \boldsymbol{b}\right]$.
  6. Elimination multiplies $A \boldsymbol{x}=\boldsymbol{b}$ by $E_{21}, E_{31}, \ldots, E_{n 1}$, then $E_{32}, E_{42}, \ldots, E_{n 2}$, and onward.
  7. The row exchange matrix is not $E_{i j}$ but $P_{i j} .$ To find $P_{i j}$, exchange rows $i$ and $j$ of $I$.

The identity matrix has $1$ ‘s on the diagonal and otherwise $0$’s. Then $I \boldsymbol{b} = \boldsymbol{b}$ for all $\boldsymbol{b}$.
The elementary matrix or elimination matrix $E_{ij}$ has the extra nonzero entry $-l$
in the $i, j$ position. Then $E_{ij}$ subtracts a multiple $l$ of row $j$ from row $i$.

Start with $A$. Apply $E’s$ to produce zeros below the pivots (the first $E$ is $E_{21}$). End with a triangular $U$.

Row Exchange Matrix $P_{ij}$ is the identity matrix with rows $i$ and $j$ reversed.
When this “permutation matrix” $P_{ij}$ multiplies a matrix, it exchanges rows $i$ and $j$.

The Augmented Matrix: the $A$ in $A \boldsymbol{x}=\boldsymbol{b}$ is augmented to $[A \; \boldsymbol{b}]$.
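A small sketch (my own numbers) that builds $E_{21}$ and a row exchange matrix from the identity and lets them act on the augmented matrix $[A \; \boldsymbol{b}]$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [6.0, 8.0]])
b = np.array([[4.0], [20.0]])

# E_21: identity minus the multiplier a21/a11 in row 2, column 1
E21 = np.eye(2)
E21[1, 0] = -A[1, 0] / A[0, 0]          # -6/2 = -3

aug = np.hstack([A, b])                  # augmented matrix [A b]
print(E21 @ aug)                         # row 2 now starts with 0

# P_12: identity with rows 1 and 2 exchanged; it swaps rows of whatever it multiplies
P12 = np.eye(2)[[1, 0]]
print(P12 @ A)
```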

2.4 Rules for Matrix Operations

Section summary

  1. Matrices $A$ with $n$ columns multiply matrices $B$ with $n$ rows : $A_{\boldsymbol{m} \times \boldsymbol{n}} B_{\boldsymbol{n} \times \boldsymbol{p}}=C_{\boldsymbol{m} \times \boldsymbol{p}}$.
  2. Each entry in $A B=C$ is a dot product: $C_{i j}=($ row $i$ of $A) \cdot($ column $j$ of $B)$
  3. This rule is chosen so that $A B$ times $C$ equals $A$ times $B C .$ And $(A B) x=A(B x)$.
  4. More ways to compute $A B:$ ( $A$ times columns of $B$ ) (rows of $A$ times $B$ ) (columns times rows).
  5. It is not usually true that $A B=B A .$ In most cases $A$ doesn’t commute with $B$.
  6. Matrices can be multiplied by blocks: $A=\left[A_{1} : A_{2}\right]$ times $B=\left[\begin{array}{l}B_{1} \\ B_{2}\end{array}\right]$ is $A_{1} B_{1}+A_{2} B_{2}$.

Four ways to understand matrix multiplication (a code sketch follows the list)

  1. Dot products of rows with columns
    The entry in row $i$ and column $j$ of $AB$ is (row $i$ of $A$) · (column $j$ of $B$).

  2. column picture of matrix multiplication:
    Matrix $A$ times every column of $B$:
    $$
    A\left[b_{1} \cdots b_{p}\right]=\left[A b_{1} \cdots A b_{p}\right]
    $$

  3. row picture of matrix multiplication:
    $$
    [\text { row } i \text { of } A]\left[\begin{array}{lll}
    1 & 2 & 3 \\
    4 & 5 & 6 \\
    7 & 8 & 9
    \end{array}\right]=[\text { row } i \text { of } A B]
    $$

  4. Columns multiply rows
    $$
    \left[\begin{array}{ccc}
    \text{col } 1 & \text{col } 2 & \text{col } 3 \\
    \cdot & \cdot & \cdot \\
    \cdot & \cdot & \cdot
    \end{array}\right]\left[\begin{array}{cc}
    \text{row } 1 & \cdots \\
    \text{row } 2 & \cdots \\
    \text{row } 3 & \cdots
    \end{array}\right]=(\operatorname{col} 1)(\text{row } 1)+(\operatorname{col} 2)(\text{row } 2)+(\operatorname{col} 3)(\text{row } 3)
    $$
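A short sketch verifying all four views on random matrices (shapes chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
C = A @ B

# 1. dot products: C[i, j] = (row i of A) . (column j of B)
assert np.isclose(C[1, 2], A[1, :] @ B[:, 2])

# 2. column picture: column j of AB is A times column j of B
assert np.allclose(C[:, 2], A @ B[:, 2])

# 3. row picture: row i of AB is (row i of A) times B
assert np.allclose(C[1, :], A[1, :] @ B)

# 4. columns times rows: AB = sum over k of (column k of A)(row k of B)
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
assert np.allclose(C, outer_sum)
```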

$$
A B=\left[\begin{array}{ll}
a & b \\
c & d
\end{array}\right]\left[\begin{array}{ll}
E & F \\
G & H
\end{array}\right]=\left[\begin{array}{ll}
a E+b G & a F+b H \\
c E+d G & c F+d H
\end{array}\right]
$$

The Laws for Matrix Operations

$$
\begin{aligned}
A+B &=B+A & & \text { (commutative law) } \\
c(A+B) &=c A+c B & & \text { (distributive law) } \\
A+(B+C) &=(A+B)+C & & \text { (associative law) } \\
A B &\ne B A & & \text { (the commutative “law” is usually broken) } \\
A(B+C) &=A B+A C & & \text { (distributive law from the left) } \\
(A+B) C &=A C+B C & & \text { (distributive law from the right) } \\
A(B C) &=(A B) C & & \text { (associative law for } A B C \text { ) (parentheses not needed) }
\end{aligned}
$$

$A^{p}=A A A \cdots A(p$ factors $) \quad\left(A^{p}\right)\left(A^{q}\right)=A^{p+q} \quad\left(A^{p}\right)^{q}=A^{p q}$

Block multiplication: If blocks of $A$ can multiply blocks of $B$, then block multiplication of $A B$ is allowed. Cuts between columns of $A$ match cuts between rows of $B$.

$$
\left[\begin{array}{ll}
A_{11} & A_{12} \\
A_{21} & A_{22}
\end{array}\right]\left[\begin{array}{l}
B_{11} \\
B_{21}
\end{array}\right]=\left[\begin{array}{l}
A_{11} B_{11}+A_{12} B_{21} \\
A_{21} B_{11}+A_{22} B_{21}
\end{array}\right]
$$

Block multiplication explains the "columns multiply rows" viewpoint above.

Schur complement
Elimination carried out by block operations (matrix products and matrix inverses).

$$
\begin{aligned}
&\text { Block } \\
&\text { elimination }
\end{aligned}\left[\begin{array}{c|c}
I & \mathbf{0} \\
\hline-C A^{-1} & I
\end{array}\right]\left[\begin{array}{c|c}
A & B \\
\hline C & D
\end{array}\right]=\left[\begin{array}{c|c}
A & B \\
\hline \mathbf{0} & \boldsymbol{D}-\boldsymbol{C A}^{-1} \boldsymbol{B}
\end{array}\right]
$$
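A numeric sketch of block elimination with random blocks (sizes are my own choice): the lower-left block becomes zero and the lower-right block becomes the Schur complement $D - CA^{-1}B$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)   # keep A safely invertible
B = rng.standard_normal((n, m))
C = rng.standard_normal((m, n))
D = rng.standard_normal((m, m))

M = np.block([[A, B],
              [C, D]])
E = np.block([[np.eye(n),               np.zeros((n, m))],
              [-C @ np.linalg.inv(A),   np.eye(m)]])

result = E @ M
schur = D - C @ np.linalg.inv(A) @ B
print(np.allclose(result[n:, :n], 0))        # True: lower-left block eliminated
print(np.allclose(result[n:, n:], schur))    # True: Schur complement appears
```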

2.5 Inverse Matrices

Section summary

  1. If the square matrix $A$ has an inverse, then both $A^{-1} A=I$ and $A A^{-1}=I$.
  2. The algorithm to test invertibility is elimination: $A$ must have $n$ (nonzero) pivots.
  3. The algebra test for invertibility is the determinant of $A: \operatorname{det} A$ must not be zero.
  4. The equation that tests for invertibility is $A \boldsymbol{x}=\mathbf{0}: \boldsymbol{x}=\mathbf{0}$ must be the only solution.
  5. If $A$ and $B$ (same size) are invertible then so is $A B:(A B)^{-1}=B^{-1} A^{-1}$.
  6. $A A^{-1}=I$ is $n$ equations for $n$ columns of $A^{-1}$. Gauss-Jordan eliminates $[A : I]$ to $\left[I : A^{-1}\right]$.
  7. The last page of the book gives 14 equivalent conditions for a square $A$ to be invertible.

Assume $A$ is a square matrix.

  • Note 1: The inverse exists if and only if elimination produces $n$ pivots (row exchanges are allowed). Elimination solves $A \boldsymbol{x} = \boldsymbol{b}$ without explicitly using the matrix $A^{-1}$.
  • Note 2: The matrix $A$ cannot have two different inverses; the left-inverse equals the right-inverse.
  • Note 3: If $A$ is invertible, the one and only solution to $Ax = \boldsymbol{b}$ is $\boldsymbol{x} = A^{-1} \boldsymbol{b}$.
  • Note 4 (Important): Suppose there is a nonzero vector $\boldsymbol{x}$ such that $A \boldsymbol{x} = 0$. Then $A$ cannot have an inverse: no matrix can bring $0$ back to $\boldsymbol{x}$. If $A$ is invertible, then $A \boldsymbol{x} = 0$ can only have the zero solution $\boldsymbol{x} = A^{-1} 0 = 0$.
  • Note 5: A $2 \times 2$ matrix is invertible if and only if $ad - bc$ is not zero. A matrix is invertible if its determinant is not zero.
  • Note 6: A diagonal matrix has an inverse provided no diagonal entries are zero:

$$
\text { If } A=\left[\begin{array}{lll}
d_{1} & & \\
& \ddots & \\
& & d_{n}
\end{array}\right] \text { then } A^{-1}=\left[\begin{array}{ccc}
1 / d_{1} & & \\
& \ddots & \\
& & 1 / d_{n}
\end{array}\right]
$$

For the product $AB$ (in contrast, $A + B$ need not be invertible even when $A$ and $B$ are): if $A$ and $B$ are invertible then so is $AB$. The inverse of a product $AB$ is

$$
(AB)^{-1}=B^{-1}A^{-1}
$$

Understanding the reversed order $B^{-1}A^{-1}$ through row elimination:

$E=\left[\begin{array}{rrr}1 & 0 & 0 \\ -5 & 1 & 0 \\ 0 & 0 & 1\end{array}\right] \quad$ and $\quad E^{-1}=\left[\begin{array}{lll}1 & 0 & 0 \\ 5 & 1 & 0 \\ 0 & 0 & 1\end{array}\right]$

$F=\left[\begin{array}{rrr}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & -4 & 1\end{array}\right] \quad$ and $\quad F^{-1}=\left[\begin{array}{lll}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 4 & 1\end{array}\right]$

$F E=\left[\begin{array}{rrr}1 & 0 & 0 \\ -5 & 1 & 0 \\ 20 & -4 & 1\end{array}\right]$ is inverted by $E^{-1} F^{-1}=\left[\begin{array}{lll}1 & 0 & 0 \\ 5 & 1 & 0 \\ 0 & 4 & 1\end{array}\right]$​

The entry $20$ appears in $FE$ because $E$ first adds $-5$ times row 1 to row 2, and $F$ then adds $-4$ times the (already changed) row 2 to row 3, so the $-5$ is carried along and the composite effect is $(-4)(-5)=20$. For $E^{-1}F^{-1}$, however, $F^{-1}$ acts first: it adds $4$ times row 2 to row 3 before row 1 has been added to row 2, so row 3 never feels the effect of row 1 and no such entry appears.

Calculating $A^{-1}$ by Gauss-Jordan Elimination

Computing the inverse is not the best way to solve a linear system: some matrices have no inverse, and even when the inverse exists, computing it is inefficient (see Chapter 11). Still, in some situations the inverse itself is needed.

$$ A A^{-1}=A\left[\begin{array}{lll} \boldsymbol{x}_{1} & \boldsymbol{x}_{2} & \boldsymbol{x}_{3} \end{array}\right]=\left[\begin{array}{lll} \boldsymbol{e}_{1} & \boldsymbol{e}_{2} & \boldsymbol{e}_{3} \end{array}\right]=I $$

The inverse is obtained column by column: the $i$-th column $\boldsymbol{x}_i$ is chosen so that $A\boldsymbol{x}_i$ is the column vector with $1$ in position $i$ and $0$ elsewhere.

For example, the first column satisfies $A \boldsymbol{x}_1 = \boldsymbol{e}_1 = (1, 0, 0)$.

The Gauss-Jordan elimination process is worked out step by step below on a $3 \times 3$ matrix $K$.

First build the augmented matrix $[K \; I]$, then reduce it to upper triangular form.

$$ \begin{aligned} {\left[\begin{array}{llll} K & \boldsymbol{e}_{1} & \boldsymbol{e}_{2} & \boldsymbol{e}_{3} \end{array}\right] } &=\left[\begin{array}{rrrrrr} \mathbf{2} & -\mathbf{1} & \mathbf{0} & 1 & 0 & 0 \\\\ -\mathbf{1} & \mathbf{2} & -\mathbf{1} & 0 & 1 & 0 \\\\ \mathbf{0} & -\mathbf{1} & \mathbf{2} & 0 & 0 & 1 \end{array}\right] \text { Start Gauss-Jordan on } K \\\\ & \rightarrow\left[\begin{array}{rrrrrr} 2 & -1 & 0 & 1 & 0 & 0 \\\\ \mathbf{0} & \frac{3}{2} & -\mathbf{1} & \frac{1}{2} & \mathbf{1} & \mathbf{0} \\\\ 0 & -1 & 2 & 0 & 0 & 1 \end{array}\right] \quad\left(\frac{1}{2} \text { row } 1+\text { row } \mathbf{2}\right)\\\\ & \rightarrow\left[\begin{array}{rrrrrr} 2 & -1 & 0 & 1 & 0 & 0 \\\\ 0 & \frac{3}{2} & -1 & \frac{1}{2} & 1 & 0 \\\\ \mathbf{0} & \mathbf{0} & \frac{4}{3} & \frac{1}{3} & \frac{2}{3} & \mathbf{1} \end{array}\right] \quad \left(\frac{2}{3} \text { row } 2 + \text { row } \mathbf{3}\right) \end{aligned} $$

Reaching the upper triangular matrix $U$ at this point is Gauss's contribution; Gauss would stop here and back-substitute. Jordan continues the elimination upward, as follows:

$$
\begin{aligned}
&\left(\begin{array}{l}
\text { Zero above } \\
\text { third pivot }
\end{array}\right) \rightarrow\left[\begin{array}{rrrrrr}
2 & -1 & 0 & 1 & 0 & 0 \\
0 & \frac{3}{2} & 0 & \frac{3}{4} & \frac{3}{2} & \frac{3}{4} \\
0 & 0 & \frac{4}{3} & \frac{1}{3} & \frac{2}{3} & 1
\end{array}\right] \quad\left(\frac{3}{4} \text { row } 3+\text { row } 2\right) \\
&\left(\begin{array}{l}
\text { Zero above } \\
\text { second pivot }
\end{array}\right) \rightarrow\left[\begin{array}{cccccc}
2 & 0 & 0 & \frac{3}{2} & 1 & \frac{1}{2} \\
0 & \frac{3}{2} & 0 & \frac{3}{4} & \frac{3}{2} & \frac{3}{4} \\
0 & 0 & \frac{4}{3} & \frac{1}{3} & \frac{2}{3} & 1
\end{array}\right] \quad\left(\frac{2}{3} \text { row } 2+\text { row } 1\right)
\end{aligned}
$$

Finally divide each row by its pivot, so the left half becomes the identity matrix:
$$
\begin{aligned}
&(\text { divide by } 2) \\
&(\text { divide by } \frac{3}{2}) \\
&(\text { divide by } \frac{4}{3})
\end{aligned}\left[\begin{array}{cccccc}
1 & 0 & 0 & \frac{3}{4} & \frac{1}{2} & \frac{1}{4} \\
0 & 1 & 0 & \frac{1}{2} & 1 & \frac{1}{2} \\
0 & 0 & 1 & \frac{1}{4} & \frac{1}{2} & \frac{3}{4}
\end{array}\right]=\left[\begin{array}{llll}
I & x_{1} & x_{2} & x_{3}
\end{array}\right]=\left[\begin{array}{ll}
I & \boldsymbol{K}^{-1}
\end{array}\right] \text {. }
$$
To summarize:

Gauss-Jordan Multiply $[ A ; I]$ by $ A^{- 1}$ to get $[I ; A^{- 1}]$.
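A sketch of the Gauss-Jordan procedure on $[K \; I]$, assuming nonzero pivots and no row exchanges; it reproduces the $K^{-1}$ computed above:

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce [A I] to [I A^{-1}]; assumes nonzero pivots (no row exchanges)."""
    n = A.shape[0]
    aug = np.hstack([A.astype(float), np.eye(n)])
    for j in range(n):
        for i in range(n):
            if i != j:                       # eliminate below and above pivot j
                aug[i] -= (aug[i, j] / aug[j, j]) * aug[j]
        aug[j] /= aug[j, j]                  # divide the pivot row by the pivot
    return aug[:, n:]

K = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])
print(gauss_jordan_inverse(K))               # (1/4) * [[3,2,1],[2,4,2],[1,2,3]]
```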

Several properties of the inverse can be observed here:

  1. Symmetry is preserved: $K$ is symmetric across its main diagonal, and then $K^{-1}$ is also symmetric.
  2. Even when $K$ is sparse, its inverse is usually dense, so computing it is inefficient; except for small matrices we do not solve equations by computing the inverse. $K$ is tridiagonal (only three nonzero diagonals), but $K^{-1}$ is a dense matrix with no zeros. That is another reason we don't often compute inverse matrices. The inverse of a band matrix is generally a dense matrix.
  3. The determinant and the inverse (here $\det K = 4$):
    $K^{-1}$ involves division by the determinant of $K$

$$
K^{-1}=\frac{1}{4}\left[\begin{array}{lll}
3 & 2 & 1 \\
2 & 4 & 2 \\
1 & 2 & 3
\end{array}\right]
$$

This is why an invertible matrix cannot have a zero determinant: we need to divide.

Computational cost:

For an $n \times n$ matrix: computing $A^{-1}$ costs about $n^3$ multiplications, while solving directly via the augmented matrix $[A \; \boldsymbol{b}]$ costs about $n^3/3$.

Singular versus Invertible

Proposition:
$A^{-1}$ exists exactly when $A$ has a full set of $n$ pivots.

Proof via Gauss-Jordan elimination:

  1. Right inverse
    With $n$ pivots, elimination can solve every equation $A\boldsymbol{x}_i=\boldsymbol{e}_i$. The $\boldsymbol{x}_i$ form the columns of $A^{-1}$, giving $A A^{-1} = I$, so a right inverse exists.
  2. Left inverse
    The left inverse $C$ can be written as the product below:

$$
C A=\left(D^{-1} \cdots E \cdots P \cdots E\right) A=I
$$

where $D$ is the diagonal matrix that divides out the pivots, the $E$'s are elimination matrices, and the $P$'s are row exchange matrices; hence a left inverse exists.

Both the left and the right inverse exist, which proves the proposition.

Some further properties of inverses:

  • A triangular matrix is invertible if and only if no diagonal entries are zero.

  • If $L$ is lower triangular with 1’s on the diagonal, so is $L^{-1}$ .

Recognizing an Invertible Matrix

Diagonally dominant matrices are invertible. Each $a_{ii}$ on the diagonal is larger than
the total sum along the rest of row $i$. On every row,
$$
\left|a_{i i}\right|>\sum_{j \neq i}\left|a_{i j}\right| \text { means that }\left|a_{i i}\right|>\left|a_{i 1}\right|+\cdots\left(\text { skip }\left|a_{i i}\right|\right) \cdots+\left|a_{i n}\right|
$$

Examples. $A$ is diagonally dominant ($3> 2$), thus invertible. $B$ is not (but still invertible). $C$ is singular.
$$
A=\left[\begin{array}{lll}
3 & 1 & 1 \\
1 & 3 & 1 \\
1 & 1 & 3
\end{array}\right] \quad B=\left[\begin{array}{lll}
2 & 1 & 1 \\
1 & 2 & 1 \\
1 & 1 & 3
\end{array}\right] \quad C=\left[\begin{array}{lll}
1 & 1 & 1 \\
1 & 1 & 1 \\
1 & 1 & 3
\end{array}\right]
$$
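A small sketch that checks the strict diagonal dominance condition for the three example matrices:

```python
import numpy as np

def is_diagonally_dominant(M):
    """Strictly diagonally dominant: |a_ii| > sum of |a_ij| over j != i, for every row."""
    M = np.asarray(M, dtype=float)
    diag = np.abs(np.diag(M))
    off = np.abs(M).sum(axis=1) - diag
    return bool(np.all(diag > off))

A = [[3, 1, 1], [1, 3, 1], [1, 1, 3]]
B = [[2, 1, 1], [1, 2, 1], [1, 1, 3]]
C = [[1, 1, 1], [1, 1, 1], [1, 1, 3]]
print(is_diagonally_dominant(A))  # True  -> invertible
print(is_diagonally_dominant(B))  # False -> test is inconclusive (B is still invertible)
print(is_diagonally_dominant(C))  # False (and C is in fact singular)
```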

2.6 Elimination = Factorization: $A = LU$

Section summary

  1. Each elimination step $E_{i j}$ is inverted by $L_{i j} .$ Off the main diagonal change $-\ell_{i j}$ to $+\ell_{i j}$.
  2. The whole forward elimination process (with no row exchanges) is inverted by $\boldsymbol{L}$ :

$$
\boldsymbol{L}=\left(L_{21} L_{31} \ldots L_{n 1}\right)\left(L_{32} \ldots L_{n 2}\right)\left(L_{43} \ldots L_{n 3}\right) \ldots\left(L_{n n-1}\right)
$$

  3. That product matrix $L$ is still lower triangular. Every multiplier $\ell_{i j}$ is in row $i$, column $j$.
  4. The original $A$ is recovered from $U$ by $A=L U=$ (lower triangular)(upper triangular).
  5. Elimination on $A \boldsymbol{x}=\boldsymbol{b}$ reaches $U \boldsymbol{x}=\boldsymbol{c} .$ Then back-substitution solves $U \boldsymbol{x}=\boldsymbol{c} .$
  6. Solving a triangular system takes $n^{2} / 2$ multiply-subtracts. Elimination to find $U$ takes $n^{3} / 3$.

Many key ideas of linear algebra are really factorizations of a matrix.

The LU factorization (no row exchanges in this section; row exchanges are handled in the next section):

$$
\left(E_{32} E_{31} E_{21}\right) A=U \quad \text { becomes } \quad A=\left(E_{21}^{-1} E_{31}^{-1} E_{32}^{-1}\right) U \quad \text { which is } \quad A=L U
$$

The connection between the LU factorization and elimination:
$l_{ij}$ records the elimination step that adds $-l_{ij}$ times row $j$ to row $i$ (i.e. subtracts $l_{ij}$ times row $j$). The $(i,j)$ entry of $L$, below the diagonal, is exactly this multiplier $l_{ij}$; the diagonal entries of $L$ are all 1.
Example:

$$
A=\left[\begin{array}{lll}
2 & 1 & 0 \\
1 & 2 & 1 \\
0 & 1 & 2
\end{array}\right]=\left[\begin{array}{lll}
1 & 0 & 0 \\
\frac{1}{2} & 1 & 0 \\
0 & \frac{2}{3} & 1
\end{array}\right]\left[\begin{array}{lll}
2 & 1 & 0 \\
0 & \frac{3}{2} & 1 \\
0 & 0 & \frac{4}{3}
\end{array}\right]=L U
$$
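A sketch of the LU factorization without row exchanges (assuming nonzero pivots); it recovers the $L$ and $U$ of the example above:

```python
import numpy as np

def lu_no_exchanges(A):
    """A = L U with unit diagonal in L; assumes no row exchanges are needed."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for j in range(n - 1):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]      # multiplier l_ij
            U[i, j:] -= L[i, j] * U[j, j:]   # subtract l_ij times the pivot row
    return L, U

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
L, U = lu_no_exchanges(A)
print(L)                         # [[1,0,0],[1/2,1,0],[0,2/3,1]]
print(U)                         # [[2,1,0],[0,3/2,1],[0,0,4/3]]
print(np.allclose(L @ U, A))     # True
```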

Note that the diagonal entries of $U$ are not 1. To obtain a form symmetric with $L$, split $U$ further:

$$
\text { Split } U \text { into }\left[\begin{array}{llll}
d_{1} & & & \\
& d_{2} & & \\
& & \ddots & \\
& & & d_{n}
\end{array}\right]\left[\begin{array}{cccc}
1 & u_{12} / d_{1} & u_{13} / d_{1} & \cdot \\
& 1 & u_{23} / d_{2} & \cdot \\
& & \ddots & \vdots \\
& & & 1
\end{array}\right]
$$

$A = LU \rightarrow A = LDU$
$D$ is the diagonal matrix of pivots (zero everywhere off the diagonal).

Useful shortcuts for the LU factorization:

  • When a row of $A$ starts with zeros, so does that row of $L$.
  • When a column of $A$ starts with zeros, so does that column of $U$.

One Square System = Two Triangular Systems

Solving a system with the LU factorization takes two steps:

1 Factor (into $L$ and $U$, by elimination on the left side matrix $A$).

2 Solve (forward elimination on $\boldsymbol{b}$ using $L$, then back substitution for $\boldsymbol{x}$ using $U$).

Step 2 in detail:
Solve $L\boldsymbol{c} = \boldsymbol{b}$ and then solve $U\boldsymbol{x} = \boldsymbol{c}$
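A sketch of the two triangular solves (the right-hand side is my own example), reusing the $L$ and $U$ from the factorization above:

```python
import numpy as np

def forward_substitution(L, b):
    """Solve L c = b with L lower triangular."""
    n = len(b)
    c = np.zeros(n)
    for i in range(n):
        c[i] = (b[i] - L[i, :i] @ c[:i]) / L[i, i]
    return c

def back_substitution(U, c):
    """Solve U x = c with U upper triangular."""
    n = len(c)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (c[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

L = np.array([[1.0, 0.0, 0.0], [0.5, 1.0, 0.0], [0.0, 2/3, 1.0]])
U = np.array([[2.0, 1.0, 0.0], [0.0, 1.5, 1.0], [0.0, 0.0, 4/3]])
b = np.array([1.0, 2.0, 3.0])              # an arbitrary right-hand side

c = forward_substitution(L, b)
x = back_substitution(U, c)
print(np.allclose(L @ U @ x, b))           # True: x solves A x = b with A = L U
```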

Cost analysis:
Step 1 (Factor): elimination on $A$ requires about $\frac{n^3}{3}$ multiplications and $\frac{n^3}{3}$ subtractions.
Step 2 (Solve): each right side needs $n^2$ multiplications and $n^2$ subtractions.

For sparse matrices the computation can be much faster.

2.7 Transposes and Permutations

Section summary

  1. The transposes of $A \boldsymbol{x}$ and $A B$ and $A^{-1}$ are $\boldsymbol{x}^{\mathrm{T}} A^{\mathrm{T}}$ and $B^{\mathrm{T}} A^{\mathrm{T}}$ and $\left(A^{\mathrm{T}}\right)^{-1}$.
  2. The dot product (inner product) is $\boldsymbol{x} \cdot \boldsymbol{y}=\boldsymbol{x}^{\mathrm{T}} \boldsymbol{y}$. This is $(1 \times n)(n \times 1)=(1 \times 1)$.
    The outer product is $\boldsymbol{x} \boldsymbol{y}^{\mathbf{T}}=$ column times row $=(n \times 1)(1 \times n)=n \times n$ matrix.
  3. The idea behind $A^{\mathrm{T}}$ is that $A \boldsymbol{x} \cdot \boldsymbol{y}$ equals $\boldsymbol{x} \cdot A^{\mathrm{T}} \boldsymbol{y}$ because $(A \boldsymbol{x})^{\mathrm{T}} \boldsymbol{y}=\boldsymbol{x}^{\mathrm{T}} A^{\mathrm{T}} \boldsymbol{y}=\boldsymbol{x}^{\mathrm{T}}\left(A^{\mathrm{T}} \boldsymbol{y}\right) .$
  4. A symmetric matrix has $\boldsymbol{S}^{\mathbf{T}}=\boldsymbol{S}$ (and the product $A^{\mathrm{T}} A$ is always symmetric).
  5. An orthogonal matrix has $Q^{\mathrm{T}}=Q^{-1}$. The columns of $Q$ are orthogonal unit vectors.
  6. A permutation matrix $P$ has the same rows as $I$ (in any order). There are $\boldsymbol{n} !$ different orders.
  7. Then $P \boldsymbol{x}$ puts the components $x_{1}, x_{2}, \ldots, x_{n}$ in that new order. And $P^{\mathrm{T}}$ equals $P^{-1}$.

The transpose

The transpose of a lower triangular matrix is upper triangular. (But the inverse is still lower triangular).

Rules for transposes:

$$
\begin{array}{lcc}
\text { Sum } & \text { The transpose of } A+B \text { is } A^{\mathrm{T}}+B^{\mathrm{T}} \text {. } \\
\text { Product } & \text { The transpose of } A B \text { is }(A B)^{\mathrm{T}}=B^{\mathrm{T}} A^{\mathrm{T}} \text {. } \\
\text { Inverse } & \text { The transpose of } A^{-1} \text { is }\left(A^{-1}\right)^{\mathrm{T}}=\left(A^{\mathrm{T}}\right)^{-1} \text {. }
\end{array}
$$

For the LU factorization:

$$
\text { If } A=L D U \text { then } A^{\mathrm{T}}=U^{\mathrm{T}} D^{\mathrm{T}} L^{\mathrm{T}} . \quad \text { The pivot matrix has } D=D^{\mathrm{T}}
$$

The Meaning of Inner Products

$\mathrm{T}$ is inside: the dot product or inner product is $\boldsymbol{x}^{\mathrm{T}} \boldsymbol{y} \quad(1 \times n)(n \times 1)$

$\mathrm{T}$ is outside: the rank one product or outer product is $\boldsymbol{x} \boldsymbol{y}^{\mathrm{T}} \quad(n \times 1)(1 \times n)$

An inner product is a scalar; an outer product is a matrix.

Examples of inner products in applications:

From mechanics $\quad$ Work $=$ (Movements) $($ Forces $)=\boldsymbol{x}^{\mathrm{T}} \boldsymbol{f}$
From circuits $\quad$ Heat loss $=($ Voltage drops $)($ Currents $)=\boldsymbol{e}^{\mathrm{T}} \boldsymbol{y}$
From economics $\quad$ Income $=($ Quantities $)($ Prices $)=\boldsymbol{q}^{\mathrm{T}} \boldsymbol{p}$

Defining the transpose via inner products:
$A^T$ is the matrix that makes these two inner products equal for every $\boldsymbol{x}$ and $\boldsymbol{y}$:
$$
(A \boldsymbol{x})^{\mathrm{T}} \boldsymbol{y}=\boldsymbol{x}^{\mathrm{T}}\left(A^{\mathrm{T}} \boldsymbol{y}\right) \quad \text { Inner product of } A \boldsymbol{x} \text { with } \boldsymbol{y}=\text { Inner product of } \boldsymbol{x} \text { with } A^{\mathrm{T}} \boldsymbol{y}
$$
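A quick numeric sketch of this defining property, together with the inner/outer product shapes:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4))
x = rng.standard_normal(4)
y = rng.standard_normal(3)

# Defining property of the transpose: (A x) . y == x . (A^T y)
print(np.isclose((A @ x) @ y, x @ (A.T @ y)))   # True

# Inner product x^T y is a scalar; outer product y y^T is a matrix
print(y @ y)                    # a scalar
print(np.outer(y, y).shape)     # (3, 3)
```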

Symmetric matrices:
A symmetric matrix has $S^{\mathrm T} = S$.

Both $A^{\mathrm T} A$ and $A A^{\mathrm T}$ are symmetric matrices.

Symmetry speeds up elimination: if $S$ is symmetric, then $U = L^{\mathrm T}$ in its $LDU$ factorization, giving $S = LDL^{\mathrm T}$.

Permutation matrices

A permutation matrix $P$ has the rows of the identity $I$ in any order. There are $n!$ such matrices $P$. $P^{-1}$ is again a permutation matrix, and $P^{-1}=P^{\mathrm T}$.
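A sketch that builds a permutation matrix as rows of $I$ in a chosen order and checks $P^{-1} = P^{\mathrm T}$:

```python
import numpy as np

order = [2, 0, 1]                 # an arbitrary ordering of the rows of I
P = np.eye(3)[order]              # permutation matrix: rows of I in that order

x = np.array([10.0, 20.0, 30.0])
print(P @ x)                      # components of x in the new order: [30, 10, 20]

print(np.allclose(P.T @ P, np.eye(3)))           # True, so P^{-1} = P^T
print(np.allclose(np.linalg.inv(P), P.T))        # True
```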

The $PA = LU$ Factorization with Row Exchanges

Row exchanges in the LU factorization can be performed at two different stages, leading to two different forms (a code sketch follows the list):

  1. The row exchanges can be done in advance. Their product $P$ puts the rows of $A$ in the right order, so that no exchanges are needed for $PA$. Then $PA = L U$.
  2. If we hold row exchanges until after elimination, the pivot rows are in a strange order. $P_1$ puts them in the correct triangular order in $U_1$. Then $A = L_1 P_1 U_1$.
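A sketch using SciPy's `scipy.linalg.lu` (note SciPy's convention returns $P, L, U$ with $A = PLU$, so the book's $PA = LU$ corresponds to $P^{\mathrm T} A = LU$ here):

```python
import numpy as np
from scipy.linalg import lu

# A matrix that needs a row exchange (zero in the first pivot position)
A = np.array([[0.0, 2.0],
              [3.0, 4.0]])

P, L, U = lu(A)                       # SciPy convention: A = P @ L @ U
print(np.allclose(A, P @ L @ U))      # True
print(np.allclose(P.T @ A, L @ U))    # True: (P^T) A = L U, the book's P A = L U
print(L)                              # lower triangular with 1's on the diagonal
print(U)                              # upper triangular
```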
