
Information Theory MCQs

1. What does Information Theory primarily study?

a) The transmission of data through networks
b) The uncertainty and information content in signals
c) The encryption of sensitive information
d) The compression of large datasets

Answer: b) The uncertainty and information content in signals
Explanation: Information Theory deals with quantifying the uncertainty in signals and understanding the information content they carry.

2. What is entropy in Information Theory?

a) The measure of randomness and uncertainty in a source
b) The rate of data transmission
c) The degree of compression achieved
d) The encryption key used

Answer: a) The measure of randomness and uncertainty in a source
Explanation: Entropy quantifies the uncertainty or randomness of a source, indicating how much information is present in the source.

3. Which source exhibits higher entropy?

a) A source with predictable outcomes
b) A source with unpredictable outcomes
c) A source with fewer symbols
d) A source with more symbols

Answer: b) A source with unpredictable outcomes
Explanation: Higher entropy implies greater unpredictability, meaning a source with unpredictable outcomes has higher entropy.
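A short Python sketch (with made-up probability values, chosen only for illustration) comparing the entropy of a highly predictable source with that of an unpredictable one:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over symbols with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A predictable source: one outcome dominates.
predictable = [0.97, 0.01, 0.01, 0.01]
# An unpredictable source: all outcomes equally likely.
unpredictable = [0.25, 0.25, 0.25, 0.25]

print(entropy(predictable))    # ~0.24 bits per symbol
print(entropy(unpredictable))  # 2.0 bits per symbol (the maximum for 4 symbols)
```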

4. What is the maximum entropy of a binary memoryless source?

a) 0
b) 1
c) log(2)
d) 2

Answer: c) log(2)
Explanation: The entropy of a binary memoryless source is H(p) = -p log(p) - (1-p) log(1-p), where p is the probability of one of the two symbols. It reaches its maximum value of log(2) (equal to 1 bit when logarithms are taken to base 2) for p = 1/2, i.e. when both symbols are equally likely.
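A minimal sketch of the binary entropy function from the explanation; it shows H(p) rising toward its peak at p = 1/2, where it equals 1 bit, or log(2) ≈ 0.693 when natural logarithms are used:

```python
import math

def binary_entropy(p, base=2):
    """H(p) = -p*log(p) - (1-p)*log(1-p), with the 0*log(0) terms taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    log = lambda x: math.log(x, base)
    return -p * log(p) - (1 - p) * log(1 - p)

for p in (0.1, 0.3, 0.5):
    print(p, binary_entropy(p))          # rises toward 1 bit at p = 0.5

print(binary_entropy(0.5, base=math.e))  # log(2) ~ 0.693 nats
```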

5. How is entropy extended to a discrete memoryless source?

a) By summing up the probabilities of all symbols
b) By averaging the entropy of each symbol
c) By applying a compression algorithm
d) By converting symbols into binary format

Answer: b) By averaging the entropy of each symbol
Explanation: For a discrete memoryless source, the entropy is the probability-weighted average of the information content of each symbol: H = -Σ p(i) log p(i), summed over all symbols i.
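A sketch (with hypothetical symbol probabilities) showing that the entropy of a discrete memoryless source is the probability-weighted average of the information carried by each symbol:

```python
import math

# Hypothetical discrete memoryless source with four symbols.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Information carried by each symbol: -log2(p).
info = {s: -math.log2(p) for s, p in probs.items()}

# Entropy = average per-symbol information, weighted by probability.
H = sum(p * info[s] for s, p in probs.items())
print(info)  # {'a': 1.0, 'b': 2.0, 'c': 3.0, 'd': 3.0}
print(H)     # 1.75 bits per symbol
```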

6. What is self-information?

a) The information content of a message relative to the entire message space
b) The information content of a message relative to itself
c) The redundancy in a message
d) The encryption key used for the message

Answer: b) The information content of a message relative to itself
Explanation: Self-information measures the amount of information contained in a single message, determined by that message's own probability: I(x) = -log p(x). The less probable the message, the more information it conveys.
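A minimal sketch of self-information for two hypothetical message probabilities; the rarer message carries more information:

```python
import math

def self_information(p):
    """Self-information in bits of a message with probability p."""
    return -math.log2(p)

print(self_information(0.5))    # 1 bit   (a likely message)
print(self_information(0.001))  # ~9.97 bits (a rare, more informative message)
```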

7. What does mutual information quantify?

a) The information shared between two messages
b) The difference between self-information and mutual information
c) The entropy of a message
d) The compression ratio

Answer: a) The information shared between two messages
Explanation: Mutual information quantifies the amount of information that two random variables share.

8. How is mutual information calculated?

a) By subtracting self-information from entropy
b) By adding self-information and entropy
c) By subtracting entropy from self-information
d) By subtracting the joint entropy from the sum of individual entropies

Answer: d) By subtracting the joint entropy from the sum of individual entropies
Explanation: Mutual information is calculated by subtracting the joint entropy of two random variables from the sum of their individual entropies.
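A sketch computing I(X; Y) = H(X) + H(Y) - H(X, Y) from a small, made-up joint distribution:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y.
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

Hx = entropy(px.values())
Hy = entropy(py.values())
Hxy = entropy(joint.values())

I = Hx + Hy - Hxy
print(I)  # ~0.278 bits shared between X and Y
```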

9. What does the average information content of symbols represent?

a) The total information content of all symbols
b) The information content per symbol on average
c) The maximum entropy of the source
d) The minimum entropy of the source

Answer: b) The information content per symbol on average
Explanation: The average information content of symbols represents the average amount of information carried by each symbol in the source.

10. What property does mutual information satisfy?

a) Symmetry
b) Asymmetry
c) Linearity
d) Non-linearity

Answer: a) Symmetry
Explanation: Mutual information is symmetric, meaning the mutual information between two variables X and Y is the same as the mutual information between Y and X.
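Reusing the joint-distribution idea from question 8, a quick check (with assumed probabilities) that mutual information is symmetric: swapping the roles of X and Y leaves the result unchanged:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) from a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
swapped = {(y, x): p for (x, y), p in joint.items()}

print(mutual_information(joint))    # I(X; Y)
print(mutual_information(swapped))  # I(Y; X) -- identical value
```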
