
東京大学 新領域創成科学研究科 メディカル情報生命専攻, August 2014, Problem 11

Author

zephyr

Description

For an arbitrary random variable \(X\) that takes values in non-negative integers, we define the probability generating function \(\varphi_X(s)\) of \(X\) as \(\varphi_X(s) = E\{s^X\} = \sum_{k=0}^{\infty} s^k \Pr\{X = k\}\). Here, \(E\{A\}\) denotes the expected value of \(A\), \(\Pr\{X = k\}\) denotes the probability that \(X\) assumes value \(k\), and \(s\) represents a real number.

(1) Show the following equalities (a), (b).

  • (a) \(\varphi_X(1) = 1\)
  • (b) \(\frac{d \varphi_X}{ds}(1) = E\{X\}\)

In the following, \(N\) and \(U_i\) \((i = 1, 2, \ldots)\) are mutually independent, identically distributed random variables that take values in non-negative integers.

(2) For a positive integer \(n\), we define a random variable \(Y_n = Y_n(U_1, U_2, \ldots) = \sum_{i=1}^n U_i\). Show \(\varphi_{Y_n}(s) = \varphi_N(s)^n\).

(3) We define a random variable \(W = W(N, Y_1, Y_2, \ldots) = \sum_{n=1}^{\infty} Y_n I(N = n)\). Here,

\[ I(N = n) = \begin{cases} 1 & \text{if } N = n \\ 0 & \text{if } N \neq n \end{cases} \]

Show \(\varphi_W(s) = \varphi_N(\varphi_N(s))\).

(Hint: \(\Pr\{W = k\} = \sum_{n=1}^{\infty} \Pr\{Y_n = k\} \Pr\{N = n\}\) for a positive integer \(k\))

(4) For \(\Pr\{N = k\} = q^k (1 - q)\), \((0 < q < 1)\), calculate \(E\{N\}\) and \(E\{W\}\).



Kai

(1)

(a)

\[ \begin{aligned} \varphi_X(1) &= E\{1^X\} = \sum_{k=0}^{\infty} 1^k \Pr\{X = k\} \\ &= \sum_{k=0}^{\infty} \Pr\{X = k\} \\ &= 1 \end{aligned} \]

The last step follows from the fact that the sum of probabilities over all possible outcomes is 1.
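
As a numerical illustration (not part of the exam solution), the sketch below evaluates a truncated version of \(\varphi_X(1)\) for a Poisson-distributed \(X\); the rate \(\lambda = 3\) and the truncation point \(K\) are arbitrary choices made only for this check.

```python
from math import exp, factorial

# Sanity check of (a): phi_X(1) = sum_k Pr{X = k} = 1.
# Illustrative distribution: X ~ Poisson(lam); the series is truncated at K terms,
# which is harmless here because the tail probability beyond K is negligible.
lam, K = 3.0, 60

def pmf(k):
    """Pr{X = k} for Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

def phi(s):
    """Truncated probability generating function phi_X(s) = E{s^X}."""
    return sum(s**k * pmf(k) for k in range(K))

print(phi(1.0))  # ~1.0, up to truncation error
```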

(b)

\[ \begin{aligned} \frac{d \varphi_X}{ds}(s) &= \frac{d}{ds} E\{s^X\} = \frac{d}{ds} \sum_{k=0}^{\infty} s^k \Pr\{X = k\} \\ &= \sum_{k=0}^{\infty} k s^{k-1} \Pr\{X = k\} \end{aligned} \]

Evaluating at \(s = 1\):

\[ \begin{aligned} \frac{d \varphi_X}{ds}(1) &= \sum_{k=0}^{\infty} k \cdot 1^{k-1} \Pr\{X = k\} \\ &= \sum_{k=0}^{\infty} k \Pr\{X = k\} \\ &= E\{X\} \end{aligned} \]
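
The same illustrative Poisson example (again an assumption made only for this check) confirms (b): the term-by-term derivative of the truncated series at \(s = 1\) is close to \(E\{X\} = \lambda\).

```python
from math import exp, factorial

# Check of (b): d(phi_X)/ds at s = 1 equals E{X}; same Poisson(lam) example, truncated at K.
lam, K = 3.0, 60

def pmf(k):
    return exp(-lam) * lam**k / factorial(k)

def phi_prime(s):
    """Term-by-term derivative: sum_k k * s^(k-1) * Pr{X = k}."""
    return sum(k * s**(k - 1) * pmf(k) for k in range(1, K))

print(phi_prime(1.0))  # ~3.0 = E{X} for Poisson(lam = 3)
```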

(2)

We need to show \(\varphi_{Y_n}(s) = \varphi_N(s)^n\) where \(Y_n = \sum_{i=1}^n U_i\).

\[ \begin{aligned} \varphi_{Y_n}(s) &= E\{s^{Y_n}\} = E\{s^{\sum_{i=1}^n U_i}\} \\ &= E\{s^{U_1} \cdot s^{U_2} \cdots s^{U_n}\} \\ &= E\{s^{U_1}\} \cdot E\{s^{U_2}\} \cdots E\{s^{U_n}\} \quad \text{(by independence of the } U_i\text{)} \\ &= (E\{s^{N}\})^n \quad \text{(since the } U_i \text{ and } N \text{ are identically distributed)} \\ &= \varphi_N(s)^n \end{aligned} \]
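
A Monte Carlo sketch of this identity, assuming for concreteness that the common distribution of \(N\) and the \(U_i\) is the geometric law of part (4) with an arbitrarily chosen \(q\); the empirical \(E\{s^{Y_n}\}\) should approach \(\varphi_N(s)^n\).

```python
import numpy as np

rng = np.random.default_rng(0)
q, n, s, trials = 0.4, 5, 0.7, 200_000

# U_i i.i.d. with Pr{U = k} = q^k (1 - q), k = 0, 1, 2, ...
# (numpy's geometric counts trials until the first success, so subtract 1 to start at 0).
U = rng.geometric(1 - q, size=(trials, n)) - 1
Y_n = U.sum(axis=1)

phi_N = lambda t: (1 - q) / (1 - q * t)  # pgf of the geometric law above

print(np.mean(s ** Y_n))  # empirical phi_{Y_n}(s)
print(phi_N(s) ** n)      # phi_N(s)^n, the right-hand side of part (2)
```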

(3)

We need to show \(\varphi_W(s) = \varphi_N(\varphi_N(s))\) where \(W = \sum_{n=1}^{\infty} Y_n I(N = n)\).

We use the hint and the definition of the probability generating function. Since the event \(\{N = 0\}\) also contributes to \(\Pr\{W = 0\}\), set \(Y_0 := 0\) (the empty sum), so that \(\varphi_{Y_0}(s) = 1 = \varphi_N(s)^0\) and the hint extends to \(\Pr\{W = k\} = \sum_{n=0}^{\infty} \Pr\{Y_n = k\} \Pr\{N = n\}\) for every non-negative integer \(k\). Then

\[ \begin{aligned} \varphi_W(s) &= E\{s^W\} = \sum_{k=0}^{\infty} s^k \Pr\{W = k\} \\ &= \sum_{k=0}^{\infty} s^k \sum_{n=0}^{\infty} \Pr\{Y_n = k\} \Pr\{N = n\} \\ &= \sum_{n=0}^{\infty} \Pr\{N = n\} \sum_{k=0}^{\infty} s^k \Pr\{Y_n = k\} \\ &= \sum_{n=0}^{\infty} \Pr\{N = n\} \varphi_{Y_n}(s) \\ &= \sum_{n=0}^{\infty} \Pr\{N = n\} \varphi_N(s)^n \quad \text{(from part (2))} \\ &= \varphi_N(\varphi_N(s)) \quad \text{(definition of } \varphi_N \text{ evaluated at } \varphi_N(s)) \end{aligned} \]
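
A Monte Carlo sketch of the compound formula under the same geometric assumption on \(N\) and the \(U_i\): draw \(N\), sum \(N\) independent copies of \(U\), and compare the empirical \(E\{s^W\}\) with \(\varphi_N(\varphi_N(s))\).

```python
import numpy as np

rng = np.random.default_rng(1)
q, s, trials = 0.4, 0.7, 200_000

phi_N = lambda t: (1 - q) / (1 - q * t)  # pgf of Pr{N = k} = q^k (1 - q)

def draw_W():
    """One sample of W = U_1 + ... + U_N, with N and the U_i i.i.d. on {0, 1, 2, ...}."""
    n = rng.geometric(1 - q) - 1
    return 0 if n == 0 else (rng.geometric(1 - q, size=n) - 1).sum()

W = np.array([draw_W() for _ in range(trials)])

print(np.mean(s ** W))   # empirical phi_W(s)
print(phi_N(phi_N(s)))   # compound formula of part (3)
```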

(4)

Given \(\Pr\{N = k\} = q^k (1 - q)\), \((0 < q < 1)\), we need to calculate \(E\{N\}\) and \(E\{W\}\).

First, let's calculate \(E\{N\}\):

\[ \begin{aligned} E\{N\} &= \sum_{k=0}^{\infty} k \Pr\{N = k\} = \sum_{k=0}^{\infty} k q^k (1 - q) \\ &= (1 - q) \sum_{k=0}^{\infty} k q^k = (1 - q) \frac{q}{(1-q)^2} = \frac{q}{1-q} \end{aligned} \]
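
The geometric-series step can be checked numerically; the truncation point \(K\) below is an arbitrary choice for this illustration.

```python
# Numeric check of E{N} = q/(1 - q) for Pr{N = k} = q^k (1 - q); K truncates the series.
q, K = 0.4, 200
print((1 - q) * sum(k * q**k for k in range(K)))  # truncated series
print(q / (1 - q))                                # closed form, = 2/3 for q = 0.4
```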

Now, for \(E\{W\}\), we can use the results from parts 1(a), 1(b), and 3 together with the chain rule:

\[ \begin{aligned} E\{W\} &= \frac{d \varphi_W}{ds}(1) = \left. \frac{d}{ds} \varphi_N(\varphi_N(s)) \right|_{s=1} \\ &= \varphi_N'(\varphi_N(1)) \cdot \varphi_N'(1) \\ &= \varphi_N'(1) \cdot \varphi_N'(1) \quad \text{(using part 1(a): } \varphi_N(1) = 1) \\ &= E\{N\}^2 \quad \text{(using part 1(b))} \\ &= \left( \frac{q}{1-q} \right)^2 \end{aligned} \]

Therefore, \(E\{W\} = \left( \frac{q}{1-q} \right)^2\).
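
A simulation sketch of this result, reusing the geometric sampling assumption from the earlier checks: the sample mean of \(W\) should be close to \(\left( \frac{q}{1-q} \right)^2\).

```python
import numpy as np

rng = np.random.default_rng(2)
q, trials = 0.4, 500_000

# Simulate W = U_1 + ... + U_N with N and the U_i i.i.d., Pr{. = k} = q^k (1 - q).
N = rng.geometric(1 - q, size=trials) - 1
W = np.array([(rng.geometric(1 - q, size=n) - 1).sum() if n > 0 else 0 for n in N])

print(W.mean())            # empirical E{W}
print((q / (1 - q)) ** 2)  # (q/(1-q))^2 = 4/9 for q = 0.4
```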

Knowledge

Probability theory · Probability generating function · Conditional expectation · Law of total expectation · Compound distribution

Key Difficulties and Approach

  1. Understand the definition and basic properties of the probability generating function
  2. Use independence to derive the probability generating function of a sum
  3. Use conditional expectation and the law of total expectation to derive the probability generating function of the compound random variable
  4. Apply the properties of the probability generating function to compute the expectation for a concrete distribution

Problem-Solving Techniques and Useful Facts

  1. Basic properties of the probability generating function:
     • \(\varphi_X(1) = 1\)
     • \(\frac{d \varphi_X}{ds}(1) = E\{X\}\)
     • \(\varphi_{X+Y}(s) = \varphi_X(s) \cdot \varphi_Y(s)\) (for independent \(X\) and \(Y\))
  2. Probability generating function of the geometric distribution: if \(X \sim Geo(p)\), then \(\varphi_X(s) = \frac{p}{1-(1-p)s}\) (see the sketch after this list)
  3. Law of total expectation: \(E\{Y\} = E\{E\{Y|X\}\}\)
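
A minimal check of the geometric pgf formula listed above (an illustration, not from the original notes): direct summation of the pmf agrees with \(\frac{p}{1-(1-p)s}\), and with \(p = 1 - q\) this is exactly \(\varphi_N\) from part (4).

```python
# Check the geometric pgf: if Pr{X = k} = (1 - p)^k * p for k = 0, 1, 2, ...,
# then phi_X(s) = p / (1 - (1 - p) * s).  K truncates the series.
p, s, K = 0.6, 0.7, 200
series = sum(s**k * (1 - p)**k * p for k in range(K))
closed = p / (1 - (1 - p) * s)
print(series, closed)  # both ~0.8333; with p = 1 - q = 0.6 this is phi_N(s) for q = 0.4
```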

Key Vocabulary

  • Probability generating function: 概率生成函数
  • Independent and identically distributed (i.i.d.): 独立同分布
  • Compound distribution: 复合分布
  • Conditional expectation: 条件期望
  • Law of total expectation: 全期望公式
  • Geometric distribution: 几何分布