Question
Consider a discrete memoryless source with source alphabet S = {s0, s1, s2} with probabilities:
The entropy of the source is
Answer (Detailed Solution Below)
Detailed Solution
Concept:
The entropy of a probability distribution is the average amount of information obtained per symbol when drawing from that distribution.
It is calculated as:
H = -Σ pi log2(pi) bits/symbol
where pi is the probability of occurrence of the i-th symbol and the sum runs over all symbols of the source.
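As a quick numerical check, the formula can be evaluated directly. The sketch below is a minimal Python illustration; since the question's probability table is not reproduced above, the values P(s0) = 1/2, P(s1) = 1/4, P(s2) = 1/4 are assumed purely for demonstration.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed probabilities for illustration only; the question's actual
# values are not shown in the text above.
assumed_probs = [0.5, 0.25, 0.25]
print(f"H(S) = {entropy(assumed_probs):.3f} bits/symbol")  # 1.500 bits/symbol
```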
Calculation:
The entropy of the source is:
H(S) = -[P(s0) log2 P(s0) + P(s1) log2 P(s1) + P(s2) log2 P(s2)]
Substituting the given probabilities into this expression gives the entropy of the source.
Note: The entropy is maximum when all the symbols in the distribution occur with equal probability; for M equiprobable symbols, Hmax = log2 M, so a three-symbol source can have at most log2 3 ≈ 1.585 bits/symbol.
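This note can also be verified numerically: the uniform distribution over three symbols gives log2 3, and any unequal assignment gives a smaller value. The sketch below uses the same entropy formula; the unequal probabilities chosen are an arbitrary assumption for illustration.

```python
import math

def entropy(probs):
    """Shannon entropy in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [1/3, 1/3, 1/3]   # equal probabilities
skewed = [0.7, 0.2, 0.1]    # an arbitrary unequal assignment (assumption)

print(entropy(uniform))     # ~1.585 = log2(3), the maximum for 3 symbols
print(entropy(skewed))      # ~1.157, strictly less than log2(3)
```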