
University of Tokyo, Graduate School of Frontier Sciences, Department of Computational Biology and Medical Sciences, August 2016 Exam, Problem 8

Author

zephyr

Description

Answer the following questions about linear algebra.

(1) Denote by \(\mathbf{o}\) the zero vector. Let \(\mathbf{a}\) denote a two-dimensional vector that is not \(\mathbf{o}\). \(T_{\mathbf{a}}(\mathbf{x})\) is the orthogonal projection of a point \(\mathbf{x}\) onto \(\mathbf{a}\). Prove the following propositions.

  • (1.1) \(T_{\mathbf{a}}(T_{\mathbf{a}}(\mathbf{x})) = T_{\mathbf{a}}(\mathbf{x})\) for any two-dimensional point \(\mathbf{x}\).

  • (1.2) \(T_{\mathbf{b}}(T_{\mathbf{a}}(\mathbf{x})) = \mathbf{o}\) for any non-zero two-dimensional vector \(\mathbf{b}\) orthogonal to \(\mathbf{a}\).

(2) Assume that a real symmetric matrix \(\mathbf{P}\) satisfies \(\mathbf{P}^2 = \mathbf{P}\). Prove that the eigenvalues of \(\mathbf{P}\) are either 0 or 1.

(3) Denote by \(\mathbf{a_1}, \mathbf{a_2}\) the column vectors forming a basis of a two-dimensional subspace of three-dimensional space. Express the projection matrix onto the subspace in terms of \(\mathbf{A} = [\mathbf{a_1}, \mathbf{a_2}]\).



Kai

(1)

(1.1)

The orthogonal projection of \(\mathbf{x}\) onto \(\mathbf{a}\) is given by:

\[ T_{\mathbf{a}}(\mathbf{x}) = \frac{\mathbf{a} \cdot \mathbf{x}}{\mathbf{a} \cdot \mathbf{a}} \mathbf{a} \]

To prove the proposition, we apply \(T_{\mathbf{a}}\) again to \(T_{\mathbf{a}}(\mathbf{x})\):

\[ T_{\mathbf{a}}(T_{\mathbf{a}}(\mathbf{x})) = T_{\mathbf{a}}\left( \frac{\mathbf{a} \cdot \mathbf{x}}{\mathbf{a} \cdot \mathbf{a}} \mathbf{a} \right) \]

Using the definition of orthogonal projection:

\[ T_{\mathbf{a}}\left( \frac{\mathbf{a} \cdot \mathbf{x}}{\mathbf{a} \cdot \mathbf{a}} \mathbf{a} \right) = \frac{\mathbf{a} \cdot \left( \frac{\mathbf{a} \cdot \mathbf{x}}{\mathbf{a} \cdot \mathbf{a}} \mathbf{a} \right)}{\mathbf{a} \cdot \mathbf{a}} \mathbf{a} \]

Simplifying the dot products:

\[ = \frac{\frac{(\mathbf{a} \cdot \mathbf{x})(\mathbf{a} \cdot \mathbf{a})}{(\mathbf{a} \cdot \mathbf{a})}}{\mathbf{a} \cdot \mathbf{a}} \mathbf{a} = \frac{\mathbf{a} \cdot \mathbf{x}}{\mathbf{a} \cdot \mathbf{a}} \mathbf{a} \]

Thus:

\[ T_{\mathbf{a}}(T_{\mathbf{a}}(\mathbf{x})) = T_{\mathbf{a}}(\mathbf{x}) \]
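As an informal numerical check (not part of the formal proof), the idempotence can be verified with NumPy; the specific vectors below are arbitrary illustrative choices:

```python
import numpy as np

def proj(a, x):
    """Orthogonal projection of x onto the line spanned by a."""
    return (a @ x) / (a @ a) * a

a = np.array([3.0, 1.0])   # arbitrary non-zero direction
x = np.array([2.0, 5.0])   # arbitrary point

once = proj(a, x)
twice = proj(a, once)
print(np.allclose(once, twice))  # True: projecting a second time changes nothing
```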

(1.2)

Given \(\mathbf{b} \cdot \mathbf{a} = 0\), we must show \(T_{\mathbf{b}}(T_{\mathbf{a}}(\mathbf{x})) = \mathbf{o}\). Substituting the expression for \(T_{\mathbf{a}}(\mathbf{x})\):

\[ T_{\mathbf{b}}(T_{\mathbf{a}}(\mathbf{x})) = T_{\mathbf{b}}\left( \frac{\mathbf{a} \cdot \mathbf{x}}{\mathbf{a} \cdot \mathbf{a}} \mathbf{a} \right) \]

Using the definition of orthogonal projection:

\[ T_{\mathbf{b}}\left( \frac{\mathbf{a} \cdot \mathbf{x}}{\mathbf{a} \cdot \mathbf{a}} \mathbf{a} \right) = \frac{\mathbf{b} \cdot \left( \frac{\mathbf{a} \cdot \mathbf{x}}{\mathbf{a} \cdot \mathbf{a}} \mathbf{a} \right)}{\mathbf{b} \cdot \mathbf{b}} \mathbf{b} \]

Since \(\mathbf{b} \cdot \mathbf{a} = 0\):

\[ = \frac{\frac{(\mathbf{a} \cdot \mathbf{x})(\mathbf{b} \cdot \mathbf{a})}{(\mathbf{a} \cdot \mathbf{a})}}{\mathbf{b} \cdot \mathbf{b}} \mathbf{b} = \frac{(\mathbf{a} \cdot \mathbf{x}) \cdot 0}{(\mathbf{a} \cdot \mathbf{a})(\mathbf{b} \cdot \mathbf{b})} \mathbf{b} = \mathbf{o} \]

Thus:

\[ T_{\mathbf{b}}(T_{\mathbf{a}}(\mathbf{x})) = \mathbf{o} \]
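Again as an informal check, choosing \(\mathbf{b}\) orthogonal to \(\mathbf{a}\) (here obtained by rotating \(\mathbf{a}\) by 90 degrees) yields the zero vector, as the proof predicts:

```python
import numpy as np

def proj(a, x):
    """Orthogonal projection of x onto the line spanned by a."""
    return (a @ x) / (a @ a) * a

a = np.array([3.0, 1.0])
b = np.array([-1.0, 3.0])  # orthogonal to a: b @ a == 0
x = np.array([2.0, 5.0])

result = proj(b, proj(a, x))
print(np.allclose(result, np.zeros(2)))  # True: the composition is the zero vector
```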

(2)


Since \(\mathbf{P}\) is real symmetric, it is diagonalizable and all of its eigenvalues are real. Let \(\mathbf{v}\) be an eigenvector of \(\mathbf{P}\) with eigenvalue \(\lambda\):

\[ \mathbf{P}\mathbf{v} = \lambda \mathbf{v} \]

Applying \(\mathbf{P}\) again:

\[ \mathbf{P}^2 \mathbf{v} = \mathbf{P} (\mathbf{P} \mathbf{v}) = \mathbf{P} (\lambda \mathbf{v}) = \lambda \mathbf{P} \mathbf{v} = \lambda (\lambda \mathbf{v}) = \lambda^2 \mathbf{v} \]

Since \(\mathbf{P}^2 = \mathbf{P}\), we have:

\[ \mathbf{P}^2 \mathbf{v} = \mathbf{P} \mathbf{v} = \lambda \mathbf{v} \]

Thus:

\[ \lambda^2 \mathbf{v} = \lambda \mathbf{v} \]

Since \(\mathbf{v}\) is a non-zero vector, we can conclude:

\[ \lambda^2 = \lambda \]

Thus, the eigenvalues \(\lambda\) must satisfy:

\[ \lambda (\lambda - 1) = 0 \]

Therefore:

\[ \lambda = 0 \quad \text{or} \quad \lambda = 1 \]
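To illustrate the result, one can build a symmetric matrix satisfying \(\mathbf{P}^2 = \mathbf{P}\) (here projection onto a line, a construction anticipating part (3)) and inspect its eigenvalues; the direction vector is an arbitrary choice:

```python
import numpy as np

a = np.array([[1.0], [2.0], [2.0]])          # arbitrary non-zero column vector
P = a @ a.T / (a.T @ a)                      # symmetric and satisfies P @ P = P
print(np.allclose(P @ P, P))                 # True: idempotent
print(np.round(np.linalg.eigvalsh(P), 10))   # eigenvalues: [0. 0. 1.]
```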

(3)

Given the matrix \(\mathbf{A}\) formed by the two column vectors \(\mathbf{a_1}\) and \(\mathbf{a_2}\), which form a basis of a two-dimensional subspace of three-dimensional space, we want to find the projection matrix \(\mathbf{P}\) that projects any vector in \(\mathbb{R}^3\) onto this subspace.

Matrix \(\mathbf{A}\) is:

\[ \mathbf{A} = [\mathbf{a_1}, \mathbf{a_2}] \]

where \(\mathbf{A}\) is a \(3 \times 2\) matrix.

Derivation of the Projection Matrix

1. Projection of a Vector

The projection of a vector \(\mathbf{x}\) onto the subspace spanned by the columns of \(\mathbf{A}\) can be expressed as a linear combination of the columns of \(\mathbf{A}\):

\[ \mathbf{P}\mathbf{x} = c_1 \mathbf{a_1} + c_2 \mathbf{a_2} \]

In matrix form, we write:

\[ \mathbf{P}\mathbf{x} = \mathbf{A}\mathbf{c} \]

where \(\mathbf{c}\) is a column vector of coefficients:

\[ \mathbf{c} = \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} \]

2. Finding the Coefficients

To determine the coefficients \(\mathbf{c}\), we use the property that the projection minimizes the distance from \(\mathbf{x}\) to the subspace, which holds exactly when the residual \(\mathbf{x} - \mathbf{A}\mathbf{c}\) is orthogonal to every column of \(\mathbf{A}\):

\[ \mathbf{A}^T (\mathbf{x} - \mathbf{A}\mathbf{c}) = \mathbf{0} \]

Expanding yields the normal equations:

\[ \mathbf{A}^T \mathbf{x} = \mathbf{A}^T \mathbf{A} \mathbf{c} \]

Since \(\mathbf{a_1}\) and \(\mathbf{a_2}\) are linearly independent (they form a basis), \(\mathbf{A}^T \mathbf{A}\) is invertible, and we can solve for \(\mathbf{c}\):

\[ \mathbf{c} = (\mathbf{A}^T \mathbf{A})^{-1} \mathbf{A}^T \mathbf{x} \]
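In practice the coefficients come from solving the normal equations directly rather than forming an explicit inverse. A minimal sketch, with arbitrary example data:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])       # columns a1, a2 span a plane in R^3
x = np.array([1.0, 2.0, 3.0])    # arbitrary vector to project

# Solve (A^T A) c = A^T x instead of inverting A^T A explicitly
c = np.linalg.solve(A.T @ A, A.T @ x)
print(c)                          # coefficients of the projection in the basis a1, a2
```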
3. Constructing the Projection Matrix

Substituting \(\mathbf{c}\) back into the projection formula, we have:

\[ \mathbf{P}\mathbf{x} = \mathbf{A} (\mathbf{A}^T \mathbf{A})^{-1} \mathbf{A}^T \mathbf{x} \]

Since this holds for any vector \(\mathbf{x}\), the projection matrix \(\mathbf{P}\) can be identified as:

\[ \mathbf{P} = \mathbf{A} (\mathbf{A}^T \mathbf{A})^{-1} \mathbf{A}^T \]
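Putting the pieces together, the formula can be checked numerically: the resulting \(\mathbf{P}\) is symmetric, idempotent, and leaves vectors already in the subspace unchanged. The sketch below reuses the illustrative basis from the previous example:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])                    # basis vectors a1, a2 as columns

P = A @ np.linalg.inv(A.T @ A) @ A.T          # P = A (A^T A)^{-1} A^T

print(np.allclose(P, P.T))                    # True: P is symmetric
print(np.allclose(P @ P, P))                  # True: P is idempotent
print(np.allclose(P @ A, A))                  # True: P fixes vectors in the subspace
```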

Knowledge

Symmetric matrices; eigenvalues and eigenvectors; projection matrices

Key Vocabulary

  • Orthogonal projection 正交投影
  • Symmetric matrix 对称矩阵
  • Eigenvalue 特征值
  • Column vector 列向量
  • Subspace 子空间
  • Projection matrix 投影矩阵
