Matrix Analysis and Applications, Chapter 5

5.1

Posted by 夜雨声烦 on November 6, 2022

Chapter 5 Norms, Inner Products and Orthogonality

5.1 Vector Norms

Euclidean Vector Norm

For a vector $x_{n \times 1}$, the Euclidean norm of $x$ is defined to be:

  • $ \lVert x \rVert = \left( \sum_{i=1}^n x_i^2 \right)^{\frac{1}{2}} = \sqrt{x^T x} \quad \text{whenever} \ \ x \in \mathfrak{R}^{n \times 1} $
  • $ \lVert x \rVert = \left( \sum_{i=1}^n \lvert x_i \rvert^2 \right)^{\frac{1}{2}} = \sqrt{x^* x} \quad \text{whenever} \ \ x \in \mathfrak{C}^{n \times 1} $

Several points to note:

  1. Recall that if $z = a + \mathrm{i}b$, then $\bar{z} = a - \mathrm{i}b$, and the magnitude of $z$ is $\lvert z \rvert = \sqrt{z\bar{z}} = \sqrt{a^2 + b^2}$.
  2. The definition of the Euclidean norm guarantees that for all scalars $\alpha$:
\[\lVert x \rVert \geq 0, \qquad \lVert x \rVert = 0 \Longleftrightarrow x = 0, \qquad \lVert \alpha x \rVert = \lvert \alpha \rvert \lVert x \rVert \tag{5.1.1}\]
  3. Given a vector $x \neq 0$, we normalize $x$ by setting $u = \frac{x}{\lVert x \rVert}$ to obtain a vector that points in the same direction as $x$. From (5.1.1) it is easy to see that
\[\lVert u \rVert = \left\lVert \frac{x}{\lVert x \rVert} \right\rVert = \frac{1}{\lVert x \rVert} \lVert x \rVert = 1 \tag{5.1.2}\]
  4. For vectors in $\mathfrak{R}^n$ and $\mathfrak{C}^n$, the distance between $u$ and $v$ is naturally defined to be $\lVert u - v \rVert$.
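The points above are easy to verify numerically. Here is a minimal NumPy sketch (not part of the original text) computing the Euclidean norm from the definition $\sqrt{x^T x}$, normalizing a vector as in (5.1.2), and measuring the distance between two vectors:

```python
import numpy as np

# Euclidean norm via the definition ||x|| = sqrt(x^T x)
x = np.array([3.0, 4.0])
norm_x = np.sqrt(x @ x)            # 5.0, same as np.linalg.norm(x)

# Normalization (5.1.2): u = x / ||x|| is a unit vector
u = x / norm_x
print(np.sqrt(u @ u))              # 1.0 (up to floating-point rounding)

# Distance between two vectors is the norm of their difference
v = np.array([0.0, 0.0])
print(np.sqrt((x - v) @ (x - v)))  # 5.0
```

For complex vectors the same pattern works with `np.vdot(x, x)` in place of `x @ x`, since `np.vdot` conjugates its first argument, giving $\sqrt{x^* x}$.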

Standard Inner Product

The scalar terms defined by

\[x^T y = \sum_{i=1}^n x_i y_i \in \mathfrak{R} \quad \text{and} \quad x^* y = \sum_{i=1}^n \bar{x_i} y_i \in \mathfrak{C}\]

are called the standard inner products for $\mathfrak{R}^n$ and $\mathfrak{C}^n$, respectively.
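A quick NumPy illustration of both inner products (my own example, not from the text). Note that the complex inner product conjugates the first argument, which is exactly what `np.vdot` does:

```python
import numpy as np

# Real standard inner product x^T y
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
print(x @ y)                # 32.0

# Complex standard inner product x^* y: conjugate the first argument.
# np.vdot conjugates its first argument automatically.
xc = np.array([1 + 1j, 2 - 1j])
yc = np.array([3 + 0j, 1 + 2j])
print(np.vdot(xc, yc))      # (3+2j)

# Consistency with the norm: x^* x = ||x||^2 is always real and nonnegative
print(np.vdot(xc, xc).real) # 7.0
```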

Cauchy-Bunyakovskii-Schwarz (CBS) Inequality

The CBS inequality is one of the most important inequalities in mathematics. It relates inner product to norm.

\[\lvert x^* y \rvert \leq \lVert x \rVert \lVert y \rVert \quad \text{for all} \; x, y \in \mathfrak{C}^{n \times 1} \tag{5.1.3}\]

Equality holds if and only if $y = \alpha x$ for $\alpha = \frac{x^* y}{x^* x}$.

One reason the CBS inequality is important is that it helps establish that the geometry of higher-dimensional spaces is consistent with the geometry of the visual spaces $\mathfrak{R}^2$ and $\mathfrak{R}^3$.

In $\mathfrak{R}^2$ and $\mathfrak{R}^3$, the length of any one side of a triangle does not exceed the sum of the lengths of the other two sides. This observation is known as the triangle inequality. In higher-dimensional spaces it is not obvious whether the triangle inequality remains valid, and the CBS inequality is precisely what is required to prove that, in this respect, the geometry of higher dimensions is no different from that of the visual spaces.
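A numerical sanity check of (5.1.3) and its equality condition (my own sketch, using random complex vectors):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
y = rng.standard_normal(5) + 1j * rng.standard_normal(5)

# CBS: |x^* y| <= ||x|| ||y||
lhs = abs(np.vdot(x, y))                      # |x^* y|
rhs = np.linalg.norm(x) * np.linalg.norm(y)
print(lhs <= rhs)                             # True

# Equality case: y = alpha * x with alpha = x^* y / x^* x
alpha = np.vdot(x, y) / np.vdot(x, x)
y_eq = alpha * x
print(np.isclose(abs(np.vdot(x, y_eq)),
                 np.linalg.norm(x) * np.linalg.norm(y_eq)))  # True
```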

Triangle Inequality

\[\lVert x + y \rVert \leq \lVert x \rVert + \lVert y \rVert \quad \text{for every} \; x, y \in \mathfrak{C}^n\]

The triangle inequality can be extended to any number of vectors in the sense that $\lVert \sum_i x_i \rVert \leq \sum_i \lVert x_i \rVert $.

Furthermore, it follows as a corollary that for real or complex numbers, $\lvert \sum_i \alpha_i \rvert \leq \sum_i \lvert \alpha_i \rvert $ (the triangle inequality for scalars).
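Both the two-vector form and the extension to several vectors can be checked numerically; here is a small sketch with random real vectors (not part of the original text):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4)
y = rng.standard_normal(4)

# ||x + y|| <= ||x|| + ||y||
print(np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y))  # True

# Extension: ||sum_i x_i|| <= sum_i ||x_i|| for a stack of row vectors
vs = rng.standard_normal((5, 4))
print(np.linalg.norm(vs.sum(axis=0)) <= np.linalg.norm(vs, axis=1).sum())  # True
```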

p-Norms

For $p \geq 1$, the *p-norm* of $x \in \mathfrak{C}^n$ is defined as

\[\lVert x \rVert_p = \left(\sum_{i=1}^n \lvert x_i \rvert^p \right)^{\frac 1p}.\]
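A direct implementation of this definition (my own sketch; `np.linalg.norm(x, ord=p)` computes the same thing). The $p = 2$ case recovers the Euclidean norm, and as $p$ grows the p-norm approaches $\max_i \lvert x_i \rvert$:

```python
import numpy as np

def p_norm(x, p):
    """p-norm of x for p >= 1: (sum |x_i|^p)^(1/p)."""
    return (np.abs(x) ** p).sum() ** (1.0 / p)

x = np.array([3.0, -4.0])
print(p_norm(x, 1))    # 7.0  (1-norm: sum of absolute values)
print(p_norm(x, 2))    # 5.0  (Euclidean norm)
print(p_norm(x, 100))  # close to max|x_i| = 4 for large p
```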