
Kyoto University, Graduate School of Informatics, Department of Communications and Computer Engineering, August 2024: Specialized Fundamentals A [A-3]

Author

SUN, 祭音Myyura (assisted by ChatGPT 5.4 Thinking)

Description

Answer all the following questions.

(1)

Consider a weather forecast which predicts the weather with two values: sunny and rainy. Let $X$ and $Y$ denote random variables which represent the actual weather and the forecasted weather, respectively. The joint probability distribution of $X$ and $Y$, $P(X, Y)$, is given as shown in Table 1. Answer the following questions. Use Table 2 to calculate the logarithms.

Table 1: Joint probability distribution of $X$ and $Y$

| Actual weather \ Forecasted weather | Sunny | Rainy |
|---|---|---|
| Sunny | 0.45 | 0.20 |
| Rainy | 0.15 | 0.20 |

Table 2: Logarithm table

| Expression | Value |
|---|---|
| $\log_2 3$ | 1.58 |
| $\log_2 5$ | 2.32 |
| $\log_2 7$ | 2.81 |
| $\log_2 11$ | 3.46 |
| $\log_2 13$ | 3.70 |
| $\log_2 17$ | 4.09 |
| $\log_2 19$ | 4.25 |

(a) Find the entropy of $X$, $H(X)$, and the entropy of $Y$, $H(Y)$.

(b) Find the conditional entropy $H(X \mid Y)$.

(c) Find the joint entropy $H(X, Y)$.

(d) Find the mutual information of $X$ and $Y$, $I(X; Y)$.

(e) Explain whether the above weather forecast is useful compared to the case of always predicting sunny, by using the forecast accuracy and the mutual information.

(2)

Answer the following questions on additive binary communication channels. The entropy function $H(x) = -x \log_2 x - (1-x) \log_2 (1-x)$ may be used.

(a) Find the transition probability matrix of the Markov information source represented by the state transition diagram using probabilities $p$, $q$ and states $s_0$, $s_1$ below. This state transition diagram shows, for example, that the probability of transition from state $s_0$ to state $s_1$ is $p$ and that 1 is output at that time.

The state transition diagram corresponds to:

  • $s_0 \to s_0$: output 0, probability $1 - p$
  • $s_0 \to s_1$: output 1, probability $p$
  • $s_1 \to s_1$: output 1, probability $1 - q$
  • $s_1 \to s_0$: output 0, probability $q$

(b) Find the stationary distribution of the Markov information source in Question (a).

(c) Find the bit error rate of the additive binary communication channel whose error source is the Markov information source in Question (a).

(d) Find the communication channel capacity of the additive binary communication channel in Question (c).

Solution

(1)

Given the joint distribution in Table 1, the marginals are:

$$P(X = \text{sunny}) = 0.45 + 0.20 = 0.65, \qquad P(X = \text{rainy}) = 0.15 + 0.20 = 0.35$$

$$P(Y = \text{sunny}) = 0.45 + 0.15 = 0.60, \qquad P(Y = \text{rainy}) = 0.20 + 0.20 = 0.40$$

(a)

Since $0.65 = 13/20$ and $0.35 = 7/20$, with $\log_2 20 = 2 + \log_2 5 = 4.32$:

$$H(X) = -0.65 \log_2 0.65 - 0.35 \log_2 0.35 = 0.65 (4.32 - 3.70) + 0.35 (4.32 - 2.81) = 0.65 \times 0.62 + 0.35 \times 1.51 \approx 0.93 \text{ bits}$$

Similarly, since $0.6 = 3/5$ and $0.4 = 2/5$:

$$H(Y) = -0.6 \log_2 0.6 - 0.4 \log_2 0.4 = 0.6 (2.32 - 1.58) + 0.4 (2.32 - 1) = 0.6 \times 0.74 + 0.4 \times 1.32 \approx 0.97 \text{ bits}$$

(b)

The conditional distributions of $X$ given $Y$ are $P(X = \text{sunny} \mid Y = \text{sunny}) = 0.45 / 0.60 = 0.75$ and $P(X = \text{sunny} \mid Y = \text{rainy}) = 0.20 / 0.40 = 0.5$, so

$$H(X \mid Y = \text{sunny}) = 0.75 \log_2 \tfrac{4}{3} + 0.25 \log_2 4 = 0.75 (2 - 1.58) + 0.5 \approx 0.82$$

$$H(X \mid Y = \text{rainy}) = 1$$

$$H(X \mid Y) = 0.6 \times 0.82 + 0.4 \times 1 \approx 0.89 \text{ bits}$$

(c)

By the chain rule:

$$H(X, Y) = H(Y) + H(X \mid Y) \approx 0.97 + 0.89 = 1.86 \text{ bits}$$

(d)

$$I(X; Y) = H(X) - H(X \mid Y) \approx 0.93 - 0.89 = 0.04 \text{ bits}$$
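As a cross-check on the table-based arithmetic in (a)–(d) (not part of the original exam), the five quantities can be recomputed with exact logarithms; small discrepancies against the two-decimal log table are expected:

```python
import math

# Joint distribution from Table 1, keyed by (actual, forecast)
P = {("sunny", "sunny"): 0.45, ("sunny", "rainy"): 0.20,
     ("rainy", "sunny"): 0.15, ("rainy", "rainy"): 0.20}

def H(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginals of X (actual) and Y (forecast)
px = {x: sum(v for (a, _), v in P.items() if a == x) for x in ("sunny", "rainy")}
py = {y: sum(v for (_, f), v in P.items() if f == y) for y in ("sunny", "rainy")}

HX = H(px.values())        # ≈ 0.93 bits
HY = H(py.values())        # ≈ 0.97 bits
HXY = H(P.values())        # ≈ 1.86 bits
HX_given_Y = HXY - HY      # chain rule, ≈ 0.89 bits
MI = HX + HY - HXY         # mutual information, ≈ 0.05 bits
print(HX, HY, HXY, HX_given_Y, MI)
```

With exact logarithms the mutual information comes out near 0.047 bits; the 0.04 obtained above reflects the rounding in Table 2.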

(e)

Forecast accuracy: $P(X = Y) = 0.45 + 0.20 = 0.65$

Always-sunny accuracy: $P(X = \text{sunny}) = 0.65$

  • Forecast accuracy: 65%
  • Always-sunny accuracy: 65%

Therefore, the given forecast does not improve the accuracy compared with always predicting sunny.

On the other hand, the mutual information is $I(X; Y) \approx 0.04$ bits, which is positive. This means that the forecast still carries some information about the actual weather; however, the amount of information is very small.

Hence, this forecast is not useful in terms of prediction accuracy, because it is no better than always predicting sunny, and it is only weakly informative in terms of mutual information.

(2)

(a)

Ordering the states as $(s_0, s_1)$, the transition probability matrix is

$$T = \begin{pmatrix} 1-p & p \\ q & 1-q \end{pmatrix}$$

(b)

The stationary distribution $\pi = (\pi_0, \pi_1)$ satisfies $\pi T = \pi$ and $\pi_0 + \pi_1 = 1$, which gives

$$\pi_0 = \frac{q}{p+q}, \qquad \pi_1 = \frac{p}{p+q}$$
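The closed-form stationary distribution $\pi = \bigl(q/(p+q),\, p/(p+q)\bigr)$ of the two-state chain with transition matrix $\begin{pmatrix} 1-p & p \\ q & 1-q \end{pmatrix}$ can be sanity-checked numerically; the values of `p` and `q` below are arbitrary examples, not given in the problem:

```python
import numpy as np

p, q = 0.1, 0.3                 # arbitrary example transition probabilities
T = np.array([[1 - p, p],
              [q, 1 - q]])      # rows: current state s0, s1

# Find the stationary distribution by iterating the chain to convergence
pi = np.array([0.5, 0.5])
for _ in range(1000):
    pi = pi @ T

closed_form = np.array([q / (p + q), p / (p + q)])
print(pi, closed_form)          # both ≈ [0.75, 0.25]
```

The iterate converges geometrically at rate $|1 - p - q|$, so 1000 steps is far more than enough here.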

(c)

For the additive channel, the error bit $E$ is the output of the Markov noise source: $E = 1$ exactly when the source moves into state $s_1$. In steady state:

$$P(E = 1) = \pi_0 p + \pi_1 (1 - q) = \frac{qp + p(1 - q)}{p + q} = \frac{p}{p + q}$$

so the bit error rate is $p/(p+q)$.
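A short Monte Carlo sketch (with arbitrary example values of `p` and `q`, not from the problem) confirms that the fraction of 1s emitted by this error source approaches $p/(p+q)$:

```python
import random

p, q = 0.1, 0.3          # arbitrary example transition probabilities
random.seed(0)

state = 0                # start in s0; a 1 is output whenever we land in s1
ones = 0
N = 200_000
for _ in range(N):
    if state == 0:
        state = 1 if random.random() < p else 0
    else:
        state = 0 if random.random() < q else 1
    ones += state        # the emitted bit equals the destination state

print(ones / N)          # ≈ p / (p + q) = 0.25
```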

(d)

For an additive binary channel with a stationary ergodic noise source, the capacity is

$$C = 1 - \mathcal{H}(E)$$

where $\mathcal{H}(E)$ is the entropy rate of the Markov noise source.

With stationary probabilities $\pi_0 = q/(p+q)$, $\pi_1 = p/(p+q)$, the entropy rate is

$$\mathcal{H}(E) = \pi_0 H(p) + \pi_1 H(q) = \frac{q H(p) + p H(q)}{p + q}$$

so

$$C = 1 - \frac{q H(p) + p H(q)}{p + q}$$

where $H(\cdot)$ is the binary entropy function.
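The capacity formula $C = 1 - \bigl(q H(p) + p H(q)\bigr)/(p+q)$ can be evaluated for concrete numbers; `p` and `q` below are arbitrary examples, not values from the problem:

```python
import math

def h(x):
    """Binary entropy function H(x) in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

p, q = 0.1, 0.3   # arbitrary example transition probabilities
entropy_rate = (q * h(p) + p * h(q)) / (p + q)
C = 1 - entropy_rate
print(C)          # ≈ 0.43 bits per channel use
```

Note that this exceeds the capacity $1 - h(0.25) \approx 0.19$ of a memoryless BSC with the same bit error rate, since the memory in the noise can in principle be exploited.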