
Questions 5, 6, and 7


Information Theory (Final Test) Question Sheet
2021. 7. 21
Zhenni PAN    Student ID:    Name:
(Please round to 3 decimal places during the calculations if necessary.)

PART I: Multiple-choice questions.

Q1. Which of the following statements is always true?
(A) If H(X|Y) = H(X) − H(Y), then X and Y are independent.
(B) If H(X|Y) = 0, then X and Y are independent.
(C) If the mutual information I(X; Y) is zero, then X and Y are independent.
(D) If H(X, Y) = 0, then X and Y are independent.

Q2. Which of the following sets of codewords could be the Huffman code for some 4-symbol source alphabet?
(A) 0, 10, 110, 111
(B) 01, 10, 00, 111
(C) 1, 01, 10, 001
(D) 0, 110, 111, 101

Q3. Consider the code {0, 01}. Which of the following statements is true?
(A) It is instantaneous.
(B) It is uniquely decodable.
(C) It is singular.
(D) None of the above.

Q4. The received vector of a (7, 4) Hamming code is r = 1011010. Which of the following statements is true?
(A) The transmitted message is m = 1011.
(B) The syndrome is s = 011.
(C) The 3rd bit in the received vector is incorrect.
(D) None of the above.

PART II: Calculation questions.

Q5. Let the random variable X take one of five possible symbols. Consider two probability distributions p(x) and q(x) over these symbols, and two possible coding schemes C1(x) and C2(x) for this random variable:
[the probability/codeword table is not reproduced in the source]
(a) Calculate H(p), H(q), D(p||q) and D(q||p).
(b) The last two columns above represent codes for the random variable. Is code C1 optimal for p(x)? Is code C2 optimal for q(x)?
(c) Assume that we use code C2 when the distribution is p. What is the average length of the codewords? By how much does it exceed the entropy H(p)? Relate your answer to D(p||q).
(d) Assume that we use code C1 when the distribution is q. What is the average length of the codewords? By how much does it exceed the entropy H(q)? Relate your answer to D(q||p).

Q6. Arithmetic codes: Given X = {C, E, !} with the probabilities (0.4, 0.5, 0.1), where "!" is a termination character. Decode: "01010101".

Q7. Noisy typewriter channel: Consider a 10-key typewriter with keys labelled 0 to 9.
(a) If pushing a key results in printing the associated letter, what is the channel capacity C in bits?
(b) If pushing a key results in printing that letter or the next with equal probability (thus 0 → 0 or 1, ..., 9 → 9 or 0), what is the channel capacity?
(c) What optimal inputs achieve the capacity with zero probability of error for the channel in (b)? What is the distribution over those optimal inputs?
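For Q4, the syndrome can be checked mechanically. The sketch below assumes one common (7, 4) Hamming convention, in which column i of the parity-check matrix H is the binary representation of position i (parity bits at positions 1, 2, 4); the exam's own generator and parity-check matrices may differ, so this is illustrative only.

```python
# Hedged sketch for Q4: syndrome of r = 1011010 under the assumed
# convention that column i of H is the binary expansion of i.
r = [1, 0, 1, 1, 0, 1, 0]  # received vector, leftmost bit = position 1

# Parity-check matrix H: 3 rows, column i (1-based) = binary of i, MSB first.
H = [[(i >> b) & 1 for i in range(1, 8)] for b in (2, 1, 0)]

# Syndrome s = H r^T over GF(2); s == [0, 0, 0] means a valid codeword.
s = [sum(h * x for h, x in zip(row, r)) % 2 for row in H]
print(s)  # → [0, 0, 0]
```

Under this particular convention r is a valid codeword, so no error is detected; a different (equally standard) choice of H can yield syndrome 011 instead, so the multiple-choice answer should be checked against the matrices defined in the course.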
Answered Same Day: Jul 21, 2021

Answer To: Information Theory (Final Test) Question Sheet

Swapnil answered on Jul 21 2021
150 Votes
6
    Given X = {C, E, !} with probabilities 0.4, 0.5, 0.1 respectively, where "!" is the
termination character. Decode: "01010101".
    That is, p(C) = 0.4, p(E) = 0.5, p(!) = 0.1.
    Assign cumulative intervals in the listed order: C → [0, 0.4), E → [0.4, 0.9), ! → [0.9, 1.0).
    Read the bit string as a binary fraction: 0.01010101₂ = 1/4 + 1/16 + 1/64 + 1/256 ≈ 0.332.
    Step 1: 0.332 ∈ [0, 0.4) → C; rescale: (0.332 − 0) / 0.4 = 0.830.
    Step 2: 0.830 ∈ [0.4, 0.9) → E; rescale: (0.830 − 0.4) / 0.5 = 0.860.
    Step 3: 0.860 ∈ [0.4, 0.9) → E; rescale: (0.860 − 0.4) / 0.5 = 0.920.
    Step 4: 0.920 ∈ [0.9, 1.0) → !, the termination character, so decoding stops.
    Decoded message: CEE!
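The interval-scaling decode of "01010101" can be sketched in a few lines of Python. It assumes, as above, that intervals are assigned in the order the symbols are listed: C → [0, 0.4), E → [0.4, 0.9), ! → [0.9, 1.0).

```python
# Hedged sketch: arithmetic decoding for X = {C, E, !}, p = (0.4, 0.5, 0.1),
# assuming intervals are assigned in the listed symbol order.
def decode(bits, symbols, probs, stop="!"):
    # Interpret the bit string as a binary fraction in [0, 1).
    value = sum(int(b) / 2 ** (i + 1) for i, b in enumerate(bits))
    # Lower edges of the cumulative intervals.
    lows = [sum(probs[:i]) for i in range(len(probs))]
    out = []
    while True:
        # Find the symbol whose interval contains the current value.
        for sym, low, p in zip(symbols, lows, probs):
            if low <= value < low + p:
                out.append(sym)
                value = (value - low) / p  # rescale into [0, 1)
                break
        if out[-1] == stop:
            return "".join(out)

print(decode("01010101", ["C", "E", "!"], [0.4, 0.5, 0.1]))  # → CEE!
```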
    7
    (a) If the typewriter prints out whatever key is struck, the channel is noiseless: each of
the 10 inputs is received exactly, so C = log2 10 ≈ 3.322 bits per channel use.
    (b) Each key prints its own digit or the next (mod 10) with probability 1/2, so
H(Y|X) = 1 bit for every input, and C = max I(X; Y) = log2 10 − 1 = log2 5 ≈ 2.322 bits per
channel use.
    (c) Use only the five keys {0, 2, 4, 6, 8}, each with probability 1/5. Their output sets
{0, 1}, {2, 3}, {4, 5}, {6, 7}, {8, 9} are disjoint, so every output determines the input with
zero probability of error, and five equiprobable inputs carry log2 5 bits, achieving the
capacity from (b).
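The three capacities asked for in Q7 can be checked numerically; a minimal sketch:

```python
# Hedged sketch for Q7: capacities of the 10-key typewriter channel.
from math import log2

# (a) Noiseless channel: 10 perfectly distinguishable inputs.
c_noiseless = log2(10)   # ≈ 3.322 bits per channel use

# (b) Each key prints itself or the next digit with probability 1/2,
# so H(Y|X) = 1 bit and C = log2(10) - 1 = log2(5).
c_noisy = log2(10) - 1   # ≈ 2.322 bits per channel use

# (c) Restricting to keys {0, 2, 4, 6, 8}, used uniformly, gives disjoint
# output pairs, so log2(5) bits are achieved with zero error.
c_zero_error = log2(5)

print(round(c_noiseless, 3), round(c_noisy, 3), round(c_zero_error, 3))
# → 3.322 2.322 2.322
```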

