Consider a discrete memoryless source with alphabet S = {s0, s1, s2, s3, s4, ...} and respective probabilities of occurrence P = {1/2, 1/4, 1/8, 1/16, 1/32, ...}. The entropy of the source (in bits) is _______.

This question was previously asked in
GATE EC 2016 Official Paper: Shift 1

Answer (Detailed Solution Below): 2


Detailed Solution


Concept: The entropy of a source is the average information per emitted symbol.

Calculation: The entropy of the source is:

H(s) = −Σᵢ pᵢ log₂ pᵢ = Σᵢ pᵢ log₂(1/pᵢ)

H(s) = (1/2)×1 + (1/4)×2 + (1/8)×3 + (1/16)×4 + ...

H(s) = (1/2)×[1 + 2×(1/2) + 3×(1/2)² + 4×(1/2)³ + ...]

Now, recall the binomial expansion, valid for |x| < 1:

(1 − x)⁻² = 1 + 2x + 3x² + 4x³ + ...

Comparing this with the bracketed series

1 + 2×(1/2) + 3×(1/2)² + 4×(1/2)³ + ...

we see that x = 1/2.

Thus,

1 + 2×(1/2) + 3×(1/2)² + 4×(1/2)³ + ... = (1 − 1/2)⁻² = (1/2)⁻² = 2² = 4
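As a quick numerical check (a minimal Python sketch, not part of the original solution), the partial sums of 1 + 2x + 3x² + ... at x = 1/2 do converge to the closed-form value 4:

```python
# Partial sum of the series 1 + 2x + 3x^2 + 4x^3 + ... at x = 1/2.
# The closed form (1 - x)^(-2) predicts a sum of 4; truncating at 60
# terms leaves a tail that is negligible at double precision.
x = 0.5
series_sum = sum((k + 1) * x ** k for k in range(60))
print(series_sum)  # ≈ 4
```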

Now,

H(s) = 0.5 × 4 = 2 bits/symbol
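The whole calculation can also be verified directly by summing pᵢ log₂(1/pᵢ) over the source probabilities; a short Python sketch (truncating the infinite sum, which converges rapidly):

```python
import math

# Probabilities of the source: p_k = (1/2)^(k+1) for k = 0, 1, 2, ...
# Truncate the infinite sum at 50 terms; the remaining tail is negligible.
probs = [(1 / 2) ** (k + 1) for k in range(50)]

# Entropy H(s) = sum of p * log2(1/p), in bits per symbol
entropy = sum(p * math.log2(1 / p) for p in probs)

print(round(entropy, 6))  # ≈ 2.0 bits/symbol
```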

Last updated on Jan 8, 2025

