
Information Theory

GATE Electronics & Communication · 31 questions across 17 years (1989-2025) · 43% recurrence rate

Recurrence sparkline (1989–2025)

Difficulty mix

Easy 48%
Med 52%

Question types

NAT 18
MCQ 11
MSQ 2

All 31 questions on Information Theory

2025 PYQ

Consider an additive white Gaussian noise (AWGN) channel with bandwidth $W$ and noise power spectral density $\frac{N_o}{2}$. Let $P_{a v}$ denote the average transmit power constraint. Which one of the following plots i...

Med📊
2025 PYQ

$X$ and $Y$ are Bernoulli random variables taking values in $\{0,1\}$. The joint probability mass function of the random variables is given by: $$ \begin{aligned} & P(X=0, Y=0)=0.06 \\ & P(X=0, Y=1)=0.14 \\ & P(X=1, Y=0)...

Easy
2025 PYQ

The random variable $X$ takes values in $\{-1,0,1\}$ with probabilities $P(X=-1)=P(X=1)=\alpha$ and $P(X=0)=1-2\alpha$, where $0<\alpha<\frac{1}{2}$. Let $g(\alpha)$ denote the entropy of $X$ (in bits), parameterized by $\alpha$. Whic...

Med
2025 PYQ

The generator matrix of a $(6,3)$ binary linear block code is given by $$ G=\left[\begin{array}{llllll} 1 & 0 & 0 & 1 & 0 & 1 \\ 0 & 1 & 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1 & 1 & 0 \end{array}\right] $$ The minimum Hamming di...

Med
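For a linear block code, the minimum Hamming distance equals the minimum weight over the nonzero codewords, so it can be checked by brute force over the $2^3$ messages. A quick numeric sketch (not part of the original question):

```python
from itertools import product

# Generator matrix of the (6,3) binary linear block code from the question
G = [
    [1, 0, 0, 1, 0, 1],
    [0, 1, 0, 0, 1, 1],
    [0, 0, 1, 1, 1, 0],
]

# Enumerate all 2^3 message vectors, form each codeword mod 2, and take
# the minimum weight over nonzero codewords (= d_min for a linear code).
d_min = min(
    sum(sum(m[i] * G[i][j] for i in range(3)) % 2 for j in range(6))
    for m in product([0, 1], repeat=3)
    if any(m)
)
print(d_min)  # 3
```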
2024 PYQ

A source transmits symbols from an alphabet of size 16. The maximum achievable entropy (in bits) is _______ .

Easy
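Entropy over a fixed alphabet is maximized by the uniform distribution, giving $H_{\max} = \log_2(\text{alphabet size})$. A one-line check:

```python
import math

# Entropy is maximized by a uniform distribution over the 16 symbols:
# H_max = log2(16) = 4 bits
alphabet_size = 16
h_max = math.log2(alphabet_size)
print(h_max)  # 4.0
```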
2023 Q65

The frequency of occurrence of 8 symbols (a-h) is shown in the table below. A symbol is chosen and it is determined by asking a series of "yes/no" questions which are assumed to be truthfully answered. The average number...

Med
2022 PYQ

Consider communication over a memoryless binary symmetric channel using a (7, 4) Hamming code. Each transmitted bit is received correctly with probability $(1-\varepsilon)$, and flipped with probability $\varepsilon$. For each...

Med
2022 PYQ

Let H(X) denote the entropy of a discrete random variable X taking K possible distinct real values. Which of the following statements is/are necessarily true?

Med
2021 PYQ

In a high school having equal number of boy students and girl students, 75% of the students study Science and the remaining 25% students study Commerce. Commerce students are two times more likely to be a boy than are Sc...

Med
2020 PYQ

A binary random variable $X$ takes the value +2 or -2 . The probability $P(X=+2)=\alpha$. The value of $\alpha$ (rounded off to one decimal place), for which the entropy of $X$ is maximum, is $\_\_\_\_$ .

Easy
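The entropy of a binary random variable is the binary entropy function $H(\alpha) = -\alpha\log_2\alpha - (1-\alpha)\log_2(1-\alpha)$, which peaks at $\alpha = 0.5$. A grid-scan sketch confirming this:

```python
import math

def binary_entropy(a):
    # H(a) = -a*log2(a) - (1-a)*log2(1-a), with H(0) = H(1) = 0
    if a in (0.0, 1.0):
        return 0.0
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

# Scan alpha on a grid; the maximum (1 bit) occurs at alpha = 0.5
alphas = [i / 100 for i in range(101)]
best = max(alphas, key=binary_entropy)
print(best)  # 0.5
```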
2017 PYQ

Let $$\left( {{X_1},\,{X_2}} \right)$$ be independent random variables, $${X_1}$$ has mean 0 and variance 1, while $${X_2}$$ has mean 1 and variance 4. The mutual information I $$\left( {{X_1},\,{X_2}} \right)$$ between...

Easy
2017 PYQ

Which one of the following graphs shows the Shannon capacity (channel capacity) in bits of a memoryless binary symmetric channel with crossover probability P?

Easy📊
2016 PYQ

A discrete memoryless source has an alphabet $$({a_1},\,{a_2},\,{a_3},\,{a_4})\,$$ with corresponding probabilities$$\left( {{1 \over 2}\,\,,{1 \over 4},\,{1 \over 8},\,\,{1 \over 8}\,} \right)$$. The minimum required av...

Easy
2016 PYQ

A voice-grade AWGN (additive white Gaussian noise) telephone channel has a bandwidth of 4.0 kHz and two-sided noise power spectral density $${\eta \over 2} = 2.5\, \times \,{10^{ - 5}}$$ Watt per Hz. If information at th...

Med
2016 PYQ

An analog baseband signal, band limited to 100 Hz, is sampled at the Nyquist rate. The samples are quantized into four message symbols that occur independently with probabilities $${p_1}$$ = $${p_4}$$ = 0.125 and $${p_2}...

Med
2016 PYQ

A binary communication system makes use of the symbols "zero" and "one". There are channel errors. Consider the following events: $${x_0}$$: a "zero" is transmitted; $${x_1}$$: a "one" is transmitted; $${y_0}$$: a "...

Easy
2016 PYQ

Consider a discrete memoryless source with alphabet $$S = \left\{ {{s_0},\,{s_1},\,{s_2},\,{s_3},\,{s_4},\, \ldots } \right\}$$ and respective probabilities of occurrence $$P = \left\{ {{1 \over 2},\,{1 \over 4},\,{1 \over...

Med
2016 PYQ

A digital communication system uses a repetition code for channel encoding/decoding. During transmission, each bit is repeated three times (0 is transmitted as 000, and 1 is transmitted as 111). It is assumed that the so...

Med
2014 PYQ

The capacity of band-limited additive white Gaussian noise (AWGN) channel is given by $$C = \,W\,\,{\log _2}\left( {1 + {P \over {{\sigma ^2}\,W}}} \right)$$ bits per second (bps), where W is the channel bandwidth, P is...

Med
2014 PYQ

The capacity of a Binary Symmetric Channel (BSC) with cross-over probability 0.5 is ______ .

Easy
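The BSC capacity is $C = 1 - H(p)$, so at $p = 0.5$ the output is statistically independent of the input and the capacity vanishes. A small numeric check:

```python
import math

def bsc_capacity(p):
    # C = 1 - H(p) for a binary symmetric channel with crossover probability p
    if p in (0.0, 1.0):
        return 1.0
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(bsc_capacity(0.5))  # 0.0
```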
2014 PYQ

A fair coin is tossed repeatedly until a 'Head' appears for the first time. Let L be the number of tosses to get this first 'Head'. The entropy H (L) in bits is _______________.

Med
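Here $L$ is geometric with $P(L=k) = (1/2)^k$, so $H(L) = \sum_k k\,(1/2)^k = 2$ bits. A truncated-sum sketch verifying the value:

```python
# L is geometric with P(L = k) = (1/2)^k, so
# H(L) = sum_k (1/2)^k * log2(2^k) = sum_k k * (1/2)^k = 2 bits;
# truncating the series is enough for a numeric check.
H = sum(k * (0.5 ** k) for k in range(1, 60))
print(round(H, 6))  # 2.0
```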
2012 PYQ

A source alphabet consists of N symbols with the probability of the first two symbols being the same. A source encoder increases the probability of the first symbol by a small amount $$\varepsilon $$ and decreases that o...

Med
2009 PYQ

A communication channel with AWGN operating at a signal-to-noise ratio SNR >> 1 and bandwidth B has capacity $${{C_1}}$$. If the SNR is doubled keeping B constant, the resulting capacity $${{C_2}}$$ is given by

Easy
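For SNR >> 1, $C = B\log_2(1+\mathrm{SNR}) \approx B\log_2(\mathrm{SNR})$, so doubling the SNR adds $B$ bits/s: $C_2 \approx C_1 + B$. A sketch with illustrative values for $B$ and the SNR (both are assumptions, not given in the question):

```python
import math

# For SNR >> 1, C = B*log2(1 + SNR) ~ B*log2(SNR),
# so doubling the SNR adds B bits/s: C2 ~ C1 + B.
B = 1.0e6          # assumed bandwidth in Hz (illustrative value)
snr = 1.0e4        # any SNR >> 1 works for the approximation
C1 = B * math.log2(1 + snr)
C2 = B * math.log2(1 + 2 * snr)
print(round((C2 - C1) / B, 3))  # ≈ 1.0
```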
2008 PYQ

Consider a Binary Symmetric Channel (BSC) with probability of error being 'p'. To transmit a bit, say 1, we transmit a sequence of three 1s. The receiver will interpret the received sequence to represent 1 if at least two...

Easy
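With majority decoding of the three repeated bits, a block is decoded wrongly when two or three of them flip, giving $P_{err} = 3p^2(1-p) + p^3 = 3p^2 - 2p^3$. A small sketch with an illustrative value of $p$:

```python
# With majority decoding of three repeated bits, the block is wrong
# when two or three bits flip: P_err = 3p^2(1-p) + p^3 = 3p^2 - 2p^3.
def repetition_error(p):
    return 3 * p**2 * (1 - p) + p**3

print(round(repetition_error(0.1), 6))  # 0.028
```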
2008 PYQ

A memoryless source emits n symbols each with a probability p. The entropy of the source as a function of n

Easy
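Since all n symbols are equiprobable (p = 1/n), the entropy is $H = \log_2 n$, which grows logarithmically with n. A quick numeric illustration:

```python
import math

# n equiprobable symbols (p = 1/n each): H = log2(n),
# so the entropy increases logarithmically with n.
for n in (2, 4, 8, 16):
    print(n, math.log2(n))
```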
2006 PYQ

A source generates three symbols with probabilities 0.25, 0.25, 0.50 at a rate of 3000 symbols per second. Assuming independent generation of symbols, the most efficient source encoder would have average bit rate as

Easy
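The most efficient encoder approaches the source entropy, $H = 0.25\cdot2 + 0.25\cdot2 + 0.5\cdot1 = 1.5$ bits/symbol, so at 3000 symbols/s the average bit rate is 4500 bits/s. A check:

```python
import math

# H = 0.25*2 + 0.25*2 + 0.5*1 = 1.5 bits/symbol; at 3000 symbols/s the
# most efficient encoder averages 1.5 * 3000 = 4500 bits/s.
p = [0.25, 0.25, 0.5]
H = -sum(pi * math.log2(pi) for pi in p)
print(H * 3000)  # 4500.0
```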
1992 PYQ

Source encoding in a data communication system is done in order to

Easy
1991 PYQ

A binary source has symbol probabilities 0.8 and 0.2. If extension coding (blocks of 4 symbols) is used, the lower and upper bounds on the average code word length are : (a)lower___________. (b) higher_________.

Med
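For the 4th extension of the source, Shannon's source-coding bound gives $4H(X) \le \bar L < 4H(X) + 1$ bits per block of four symbols. A numeric sketch of both bounds:

```python
import math

# H(X) for p = (0.8, 0.2); for the 4th extension, Shannon's bound gives
# 4*H(X) <= L_avg < 4*H(X) + 1 bits per block of 4 source symbols.
H = -(0.8 * math.log2(0.8) + 0.2 * math.log2(0.2))
lower = 4 * H
upper = 4 * H + 1
print(round(lower, 3), round(upper, 3))  # 2.888 3.888
```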
1990 PYQ

An image uses $$512\, \times \,512$$ picture elements. Each of the picture elements can take any of the 8 distinguishable intensity levels. The maximum entropy in the above image will be

Easy
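Maximum entropy occurs when every pixel independently takes one of the 8 levels with equal probability, so $H_{\max} = 512 \times 512 \times \log_2 8$ bits. A check:

```python
import math

# Each of the 512*512 pixels can independently take 8 equally likely
# levels, so H_max = 512 * 512 * log2(8) = 786432 bits.
h_max = 512 * 512 * math.log2(8)
print(int(h_max))  # 786432
```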
1989 PYQ

A source produces 4 symbols with probabilities $${1 \over 2},\,{1 \over 4},\,{1 \over 8}\,\,{\rm{and}}\,\,{1 \over 8}.$$ For this source, a practical coding scheme has an average codeword length of 2 bits/symbol. The efficienc...

Easy
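The coding efficiency is $\eta = H/\bar L$; here $H = 1.75$ bits/symbol for the dyadic probabilities $(1/2, 1/4, 1/8, 1/8)$, so with $\bar L = 2$ bits/symbol, $\eta = 87.5\%$. A check:

```python
import math

# H = 1.75 bits/symbol for probabilities (1/2, 1/4, 1/8, 1/8);
# with an average codeword length of 2 bits/symbol,
# efficiency = H / L = 1.75 / 2 = 87.5%.
p = [1/2, 1/4, 1/8, 1/8]
H = -sum(pi * math.log2(pi) for pi in p)
print(100 * H / 2)  # 87.5
```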