Riddle: I measure the uncertainty of a source, ranging from zero when the outcome is certain to log2 n when all n symbols are equiprobable. What am I?
Card: 3 / 20
True or False: If all emitted symbols from a source are equiprobable, then the entropy is H(X) = 0.
Card: 5 / 20
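A minimal sketch illustrating the entropy behaviour behind the two cards above (the `entropy` helper and the example distributions are mine, not part of the cards): a certain source has zero entropy, while n equiprobable symbols give the maximum value log2 n.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A certain source (one symbol with probability 1): H(X) = 0 bits.
print(entropy([1.0, 0.0, 0.0, 0.0]))        # 0.0

# Four equiprobable symbols: H(X) = log2(4) = 2 bits, the maximum for n = 4.
print(entropy([0.25, 0.25, 0.25, 0.25]))    # 2.0
```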
Which of the following coding methods is known for its efficiency in generating binary codes? A) Huffman coding B) Shannon-Fano coding C) Both A and B D) Neither A nor B
Card: 9 / 20
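For context on the coding card above, here is a small illustrative Huffman coding sketch (the symbol probabilities and the `huffman_code` function are assumptions of this example, not from the cards). It repeatedly merges the two least probable subtrees, which is how Huffman coding produces an optimal binary prefix code.

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a binary Huffman code for a {symbol: probability} mapping."""
    tie = count()  # tie-breaker so the heap never has to compare dicts
    heap = [(p, next(tie), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # least probable subtree
        p2, _, c2 = heapq.heappop(heap)   # second least probable subtree
        merged = {s: "0" + w for s, w in c1.items()}        # prepend 0 on one branch
        merged.update({s: "1" + w for s, w in c2.items()})  # prepend 1 on the other
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]

print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}))
# {'a': '0', 'b': '10', 'd': '110', 'c': '111'} -- likelier symbols get shorter codewords
```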
True or False: The channel capacity is defined as the minimum of mutual information.
Card: 11 / 20
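As a hedged illustration of the definition this card is testing (channel capacity is the maximum of the mutual information I(X; Y) over input distributions), here is a small brute-force sketch for a binary symmetric channel; the crossover probability 0.1 and the function names are example choices of mine.

```python
import math

def binary_entropy(p):
    """H(p) in bits for a Bernoulli(p) variable."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(p_x, p_err):
    """I(X; Y) in bits for a binary symmetric channel with input P(X=1) = p_x."""
    p_y = p_x * (1 - p_err) + (1 - p_x) * p_err   # output distribution P(Y=1)
    return binary_entropy(p_y) - binary_entropy(p_err)   # I(X;Y) = H(Y) - H(Y|X)

# Capacity = max over input distributions of I(X; Y); for the BSC the maximum
# is reached at the uniform input, giving C = 1 - H(p_err).
p_err = 0.1
capacity = max(mutual_information(p / 1000, p_err) for p in range(1001))
print(round(capacity, 4))                      # ~0.531
print(round(1 - binary_entropy(p_err), 4))     # 0.531, the closed-form value
```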
What is the formula for the information rate of a source with entropy H and message generation rate r?
Card: 13 / 20
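For the card above, the standard relation is R = r × H, with r in messages per second and H in bits per message giving R in bits per second. A tiny numeric check, using example values of my own:

```python
# Information rate R = r * H: (messages/second) * (bits/message) = bits/second.
r = 2000      # assumed example: 2000 messages emitted per second
H = 1.75      # assumed example: source entropy of 1.75 bits per message
R = r * H
print(R, "bits per second")   # 3500.0 bits per second
```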
Fill in the blank: The process of representing data generated by a discrete source is called ______.
Card: 15 / 20
If a discrete memoryless channel has n inputs and m outputs, the joint entropy is calculated using which of the following?
Card: 19 / 20
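The answer options for this card are not shown here, but as a closing illustration the joint entropy follows the standard definition H(X, Y) = -Σ_i Σ_j p(x_i, y_j) log2 p(x_i, y_j) over the n inputs and m outputs. A minimal sketch with a made-up 2x2 joint distribution:

```python
import math

def joint_entropy(p_xy):
    """H(X, Y) = -sum over i, j of p(x_i, y_j) * log2 p(x_i, y_j), in bits."""
    return -sum(p * math.log2(p) for row in p_xy for p in row if p > 0)

# Illustrative joint distribution over 2 inputs (rows) and 2 outputs (columns).
p_xy = [[0.4, 0.1],
        [0.1, 0.4]]
print(round(joint_entropy(p_xy), 4))   # ~1.7219 bits
```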