Kyoto University, Graduate School of Informatics, Department of Communications and Computer Engineering — August 2024, Specialized Fundamentals A [A-3]
Author
SUN, 祭音Myyura (assisted by ChatGPT 5.4 Thinking)
Description
Answer all the following questions.
(1)
Consider a weather forecast that predicts the weather with two values: sunny and rainy. Let $X$ denote the actual weather and $Y$ the forecast; their joint probability distribution is given in Table 1.
Table 1: Joint probability distribution of $X$ and $Y$
| Actual weather \ Forecast | Sunny | Rainy |
|---|---|---|
| Sunny | 0.45 | 0.20 |
| Rainy | 0.15 | 0.20 |
Table 2: Logarithm table
| Expression | Value |
|---|---|
| $\log_2 3$ | 1.58 |
| $\log_2 5$ | 2.32 |
| $\log_2 7$ | 2.81 |
| $\log_2 11$ | 3.46 |
| $\log_2 13$ | 3.70 |
| $\log_2 17$ | 4.09 |
| $\log_2 19$ | 4.25 |
(a) Find the entropy $H(X)$.
(b) Find the conditional entropy $H(X \mid Y)$.
(c) Find the joint entropy $H(X, Y)$.
(d) Find the mutual information $I(X; Y)$.
(e) Explain whether the above weather forecast is useful compared to the case of always predicting sunny, by using the forecast accuracy and the mutual information.
(2)
Answer the following questions on additive binary communication channels. The binary entropy function $\mathcal{H}(p) = -p \log_2 p - (1-p)\log_2(1-p)$ may be used.
(a) Find the transition probability matrix of the Markov information source represented by the state transition diagram, using the probabilities given there.
[State transition diagram: four transitions, each labelled with an output bit and its probability; the labels are not recoverable in this copy.]
(b) Find the stationary distribution of the Markov information source in Question (a).
(c) Find the bit error rate of the additive binary communication channel whose error source $E$ is the Markov information source in Question (a).
(d) Find the communication channel capacity of the additive binary communication channel in Question (c).
Solution
(1)
Given the joint distribution in Table 1, the marginals are
$$P(X=\text{sunny}) = 0.45 + 0.20 = 0.65, \quad P(X=\text{rainy}) = 0.15 + 0.20 = 0.35,$$
$$P(Y=\text{sunny}) = 0.45 + 0.15 = 0.60, \quad P(Y=\text{rainy}) = 0.20 + 0.20 = 0.40.$$
(a)
Using $0.65 = 13/20$ and $0.35 = 7/20$ with Table 2 (note $\log_2 20 = 2 + \log_2 5 = 4.32$):
$$H(X) = -0.65 \log_2 0.65 - 0.35 \log_2 0.35 = 0.65 \times 0.62 + 0.35 \times 1.51 \approx 0.93 \text{ bit.}$$
(b)
Conditioning on the forecast, $P(X=\text{sunny} \mid Y=\text{sunny}) = 0.45/0.60 = 3/4$ and $P(X=\text{sunny} \mid Y=\text{rainy}) = 0.20/0.40 = 1/2$, so
$$H(X \mid Y) = 0.60\,\mathcal{H}(3/4) + 0.40\,\mathcal{H}(1/2) = 0.60 \times 0.815 + 0.40 \times 1 \approx 0.89 \text{ bit,}$$
where $\mathcal{H}(3/4) = 2 - \tfrac{3}{4}\log_2 3 \approx 0.815$.
(c)
$$H(X, Y) = 0.45 \times 1.16 + 0.20 \times 2.32 + 0.15 \times 2.74 + 0.20 \times 2.32 \approx 1.86 \text{ bit,}$$
using $-\log_2 0.45 = \log_2 20 - 2\log_2 3 = 1.16$, $-\log_2 0.20 = \log_2 5 = 2.32$, and $-\log_2 0.15 = \log_2 20 - \log_2 3 = 2.74$. As a check, $H(X, Y) = H(Y) + H(X \mid Y) \approx 0.97 + 0.89 = 1.86$, where $H(Y) = 0.60 \times 0.74 + 0.40 \times 1.32 \approx 0.97$ bit.
(d)
$$I(X; Y) = H(X) - H(X \mid Y) \approx 0.93 - 0.89 = 0.04 \text{ bit.}$$
(e)
- Forecast accuracy: $P(X = Y) = 0.45 + 0.20 = 0.65$ (65%)
- Always-sunny accuracy: $P(X = \text{sunny}) = 0.65$ (65%)
Therefore, the given forecast does not improve the accuracy compared with always predicting sunny.
On the other hand, the mutual information is $I(X; Y) \approx 0.04$ bit,
which is positive. This means that the forecast still contains some information about the actual weather. However, the amount of information is very small.
Hence, this forecast is not useful in terms of prediction accuracy, because it is no better than always predicting sunny, and it is only weakly informative in terms of mutual information.
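As a numerical sanity check, the quantities in (a)–(e) can be recomputed from Table 1 with exact logarithms; this is a sketch, and the values differ from the table-rounded answers above only in the second or third decimal:

```python
from math import log2

# Joint distribution from Table 1: P(X, Y), X = actual weather, Y = forecast
joint = {
    ("sunny", "sunny"): 0.45, ("sunny", "rainy"): 0.20,
    ("rainy", "sunny"): 0.15, ("rainy", "rainy"): 0.20,
}

def entropy(ps):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * log2(p) for p in ps if p > 0)

p_x = {"sunny": 0.65, "rainy": 0.35}   # marginal of the actual weather
p_y = {"sunny": 0.60, "rainy": 0.40}   # marginal of the forecast

H_X = entropy(p_x.values())            # (a): ~0.934 bit
H_Y = entropy(p_y.values())            # ~0.971 bit
H_XY = entropy(joint.values())         # (c): ~1.858 bit
H_X_given_Y = H_XY - H_Y               # (b): ~0.887 bit
I_XY = H_X + H_Y - H_XY                # (d): ~0.047 bit, small but positive

# (e): the two prediction accuracies coincide at 0.65
acc_forecast = joint[("sunny", "sunny")] + joint[("rainy", "rainy")]
acc_always_sunny = p_x["sunny"]
```

With exact logarithms the mutual information comes out closer to 0.047 bit; the 0.04 above is what the rounded values of Table 2 yield.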
(2)
(a)
The numerical labels of the diagram are not recoverable here, so write $a = P(s_0 \to s_1)$ and $b = P(s_1 \to s_0)$ for the transition probabilities of the (assumed two-state) source. The transition probability matrix is
$$T = \begin{pmatrix} 1-a & a \\ b & 1-b \end{pmatrix}.$$
(b)
Solving $\pi T = \pi$ with $\pi_0 + \pi_1 = 1$ gives the stationary distribution
$$\pi = (\pi_0, \pi_1) = \left( \frac{b}{a+b}, \frac{a}{a+b} \right).$$
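The stationary distribution of such a two-state chain can be verified numerically. The transition probabilities `a` and `b` below are hypothetical stand-ins, since the diagram's actual labels are not recoverable from this copy:

```python
# Hypothetical values for a = P(s0 -> s1) and b = P(s1 -> s0); the real
# labels of the state transition diagram are not recoverable here.
a, b = 0.1, 0.3

# Transition probability matrix of the two-state chain
T = [[1 - a, a],
     [b, 1 - b]]

# Closed-form stationary distribution pi = (b, a) / (a + b)
pi = (b / (a + b), a / (a + b))          # ≈ (0.75, 0.25)

# Verify pi T = pi (left eigenvector of T with eigenvalue 1)
pi_T = (pi[0] * T[0][0] + pi[1] * T[1][0],
        pi[0] * T[0][1] + pi[1] * T[1][1])
assert all(abs(u - v) < 1e-12 for u, v in zip(pi, pi_T))
```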
(c)
For the additive channel the received bit is $Y = X \oplus E$, so a bit error occurs exactly when the error source emits $E = 1$. Assuming state $s_i$ emits bit $i$, the bit error rate is the stationary probability of the 1-emitting state:
$$p_e = P(E = 1) = \pi_1 = \frac{a}{a+b}, \quad \text{with } a = P(s_0 \to s_1),\ b = P(s_1 \to s_0).$$
(d)
For an additive binary channel with stationary error source $E$, the capacity per symbol is
$$C = 1 - H(E),$$
where $H(E)$ is the entropy rate of the error source. For a stationary Markov source this is
$$H(E) = \pi_0 \mathcal{H}(a) + \pi_1 \mathcal{H}(b),$$
with stationary probabilities $\pi_0 = b/(a+b)$ and $\pi_1 = a/(a+b)$, so
$$C = 1 - \frac{b\,\mathcal{H}(a) + a\,\mathcal{H}(b)}{a+b}.$$
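Putting (b)–(d) together numerically, again with hypothetical `a` and `b` (the exam's actual probabilities are not in this copy):

```python
from math import log2

def bin_entropy(p):
    """Binary entropy function H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Hypothetical transition probabilities: a = P(s0 -> s1), b = P(s1 -> s0)
a, b = 0.1, 0.3
pi0, pi1 = b / (a + b), a / (a + b)    # stationary distribution from (b)

# (c) Assuming state s_i emits bit i, errors occur while the source is in s1
ber = pi1                              # bit error rate ≈ 0.25

# (d) Capacity of the additive channel: C = 1 - (entropy rate of E)
H_rate = pi0 * bin_entropy(a) + pi1 * bin_entropy(b)
C = 1 - H_rate                         # ≈ 0.428 bit/symbol
```

Because the error source is Markov rather than memoryless, the entropy rate $H(E)$ is strictly below $\mathcal{H}(p_e)$ here, so the capacity exceeds that of a memoryless BSC with the same bit error rate.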