Concept drill

Entropy

GATE Electronics & Communication · Information Theory · 1989-2025

17 PYQs · 47% keyed · 0 elite explanations · 12 years appeared

Study anchor

Source-book anchor pending for this concept.


PYQs in this concept

2025 Q54

The random variable X takes values in {-1,0,1} with probabilities P(X = -1) = P(X = 1) = α and P(X = 0) = 1 − 2α, where 0 < α < 1/2. Let g(α) denote the entropy of X (in bits), par...

medium · answer key · basic explanation
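For the stated distribution, the entropy follows directly from the definition: g(α) = −2α log₂ α − (1 − 2α) log₂(1 − 2α). A minimal sketch (the name `g` is taken from the question; the rest of the truncated prompt is not assumed):

```python
import math

def g(alpha: float) -> float:
    """Entropy (bits) of X with P(X=-1) = P(X=1) = alpha, P(X=0) = 1 - 2*alpha."""
    p0 = 1 - 2 * alpha
    return -2 * alpha * math.log2(alpha) - p0 * math.log2(p0)

# The uniform case alpha = 1/3 maximises g, at log2(3) ≈ 1.585 bits.
print(round(g(1 / 3), 3))   # 1.585
```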
2024 Q34

A source transmits symbols from an alphabet of size 16. The value of maximum achievable entropy (in bits) is _________.

easy · basic explanation
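The bound behind this question: the entropy of an M-symbol source is maximised by the uniform distribution, at log₂ M bits, so a 16-symbol alphabet gives 4 bits. A one-line check:

```python
import math

M = 16                    # alphabet size from the question
H_max = math.log2(M)      # uniform distribution maximises entropy
print(H_max)              # 4.0
```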
2023 Q65

The frequency of occurrence of 8 symbols (a-h) is shown in the table below. A symbol is chosen and it is determined by asking a series of "yes/no" questions which are assumed to be...

medium · basic explanation
2022 PYQ

Let H(X) denote the entropy of a discrete random variable X taking K possible distinct real values. Which of the following statements is/are necessarily true?

medium · answer key · basic explanation
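One bound typically tested by statements like these is H(X) ≤ log₂ K for a K-valued random variable. A numerical spot-check over random K-point distributions (a sketch, not the official answer key):

```python
import math
import random

def entropy_bits(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

random.seed(0)
K = 5
for _ in range(1000):
    w = [random.random() for _ in range(K)]
    total = sum(w)
    p = [wi / total for wi in w]          # random pmf on K points
    assert entropy_bits(p) <= math.log2(K) + 1e-9
```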
2020 PYQ

A binary random variable $X$ takes the value +2 or -2. The probability $P(X=+2)=\alpha$. The value of $\alpha$ (rounded off to one decimal place), for which the entropy of $X$ is...

easy · basic explanation
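The binary entropy function −α log₂ α − (1 − α) log₂(1 − α) peaks at α = 0.5, where it equals 1 bit. A coarse numerical sweep confirms this:

```python
import math

def h(a: float) -> float:
    """Binary entropy in bits for P(X=+2) = a."""
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

alphas = [i / 100 for i in range(1, 100)]   # grid over (0, 1)
best = max(alphas, key=h)
print(best, round(h(best), 3))              # 0.5 1.0
```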
2016 PYQ

A discrete memoryless source has an alphabet $$({a_1},\,{a_2},\,{a_3},\,{a_4})\,$$ with corresponding probabilities$$\left( {{1 \over 2}\,\,,{1 \over 4},\,{1 \over 8},\,\,{1 \over...

easy
2016 PYQ

An analog baseband signal, band limited to 100 Hz, is sampled at the Nyquist rate. The samples are quantized into four message symbols that occur independently with probabilities $...

medium
2016 PYQ

Consider a discrete memoryless source with alphabet $$S = \left\{ {{s_0},\,{s_1},\,{s_2},\,{s_3},\,{s_4},\, \ldots } \right\}$$ and respective probabilities of occurrence $$P = \left\...

medium
2014 PYQ

A fair coin is tossed repeatedly until a 'Head' appears for the first time. Let L be the number of tosses to get this first 'Head'. The entropy H (L) in bits is _______________.

medium
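For a fair coin, P(L = k) = (1/2)ᵏ, so −log₂ P(L = k) = k and the entropy reduces to Σ k·2⁻ᵏ = 2 bits. A truncated-sum check:

```python
# P(L = k) = (1/2)**k for k = 1, 2, ...; H(L) = sum of k * 2**-k = 2 bits.
H = sum(k * 2 ** -k for k in range(1, 200))   # tail beyond k=200 is negligible
print(round(H, 6))   # 2.0
```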
2012 PYQ

A source alphabet consists of N symbols with the probability of the first two symbols being the same. A source encoder increases the probability of the first symbol by a small amou...

medium · answer key
2008 PYQ

A memoryless source emits n symbols, each with probability p. The entropy of the source as a function of n

easy · answer key
2006 PYQ

A source generates three symbols with probabilities 0.25, 0.25, 0.50 at a rate of 3000 symbols per second. Assuming independent generation of symbols, the most efficient source enc...

easy · answer key
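With the probabilities and symbol rate fully stated here, the source entropy and the minimum output bit rate of an efficient encoder can be checked directly (entropy 1.5 bits/symbol × 3000 symbols/s):

```python
import math

p = [0.25, 0.25, 0.50]
H = -sum(pi * math.log2(pi) for pi in p)   # 1.5 bits per symbol
rate = 3000                                # symbols per second
print(H, H * rate)                         # 1.5 4500.0
```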
1990 PYQ

An image uses $$512\, \times \,512$$ picture elements. Each of the picture elements can take any of the 8 distinguishable intensity levels. The maximum entropy in the above image w...

easy · answer key
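With 8 equiprobable intensity levels, the maximum entropy is log₂ 8 = 3 bits per pixel, giving 512 × 512 × 3 bits for the whole image:

```python
import math

pixels = 512 * 512
levels = 8
bits_per_pixel = math.log2(levels)   # 3.0 bits when levels are equiprobable
print(pixels * bits_per_pixel)       # 786432.0
```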
1989 PYQ

A source produces 4 symbols with probabilities $${1 \over 2},\,{1 \over 4},\,{1 \over 8},$$ and $${1 \over 8}$$. For this source, a practical coding scheme has an average codeword...

easy · answer key
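The stated probabilities give entropy 1.75 bits, which lower-bounds the average codeword length of any uniquely decodable code, and a prefix code with lengths 1, 2, 3, 3 meets the bound exactly. The question's trailing value is truncated, so this only checks the bound, not the official answer:

```python
import math

p = [1 / 2, 1 / 4, 1 / 8, 1 / 8]
H = -sum(pi * math.log2(pi) for pi in p)           # entropy = 1.75 bits
lengths = [1, 2, 3, 3]                             # e.g. codewords 0, 10, 110, 111
L_avg = sum(pi * li for pi, li in zip(p, lengths)) # average codeword length
print(H, L_avg)                                    # 1.75 1.75
```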