Information Theory
GATE Electronics & Communication · 31 questions across 17 years (1989-2025) · 43% recurrence rate
All 31 questions on Information Theory
Consider an additive white Gaussian noise (AWGN) channel with bandwidth $W$ and noise power spectral density $\frac{N_o}{2}$. Let $P_{a v}$ denote the average transmit power constraint. Which one of the following plots i...
$X$ and $Y$ are Bernoulli random variables taking values in $\{0,1\}$. The joint probability mass function of the random variables is given by: $$ \begin{aligned} & P(X=0, Y=0)=0.06 \\ & P(X=0, Y=1)=0.14 \\ & P(X=1, Y=0)...
The random variable $X$ takes values in $\{-1,0,1\}$ with probabilities $P(X=-1)=P(X=1)=\alpha$ and $P(X=0)=1-2\alpha$, where $0 \le \alpha \le \frac{1}{2}$. Let $g(\alpha)$ denote the entropy of $X$ (in bits), parameterized by $\alpha$. Whic...
The generator matrix of a $(6,3)$ binary linear block code is given by $$ G=\left[\begin{array}{llllll} 1 & 0 & 0 & 1 & 0 & 1 \\ 0 & 1 & 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1 & 1 & 0 \end{array}\right] $$ The minimum Hamming di...
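The question text is cut off here, but the generator matrix is given in full, so the standard computation can be sketched (a sketch, not the page's answer key): for a linear code, the minimum Hamming distance equals the minimum weight over the nonzero codewords, found by enumerating all $2^3$ messages.

```python
from itertools import product

# Generator matrix of the (6,3) binary linear block code from the question
G = [[1, 0, 0, 1, 0, 1],
     [0, 1, 0, 0, 1, 1],
     [0, 0, 1, 1, 1, 0]]

def encode(msg, G):
    # Codeword = msg . G over GF(2)
    return tuple(sum(msg[i] * G[i][j] for i in range(len(G))) % 2
                 for j in range(len(G[0])))

codewords = [encode(m, G) for m in product([0, 1], repeat=3)]
# For a linear code, d_min = minimum Hamming weight of a nonzero codeword
d_min = min(sum(c) for c in codewords if any(c))
print(d_min)  # 3
```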
A source transmits symbols from an alphabet of size 16. The value of maximum achievable entropy (in bits) is _______ .
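A quick check of this one (my computation, not the page's answer key): entropy is maximized by the uniform distribution, giving $\log_2 M$ bits for an $M$-symbol alphabet.

```python
import math

# Maximum entropy of an M-ary source is log2(M),
# achieved when all symbols are equiprobable
M = 16
H_max = math.log2(M)
print(H_max)  # 4.0
```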
The frequency of occurrence of 8 symbols (a-h) is shown in the table below. A symbol is chosen and it is determined by asking a series of "yes/no" questions which are assumed to be truthfully answered. The average number...
The frequency of occurrence of 8 symbols (a-h) is shown in the table below. A symbol is chosen and it is determined by asking a series of "yes/no" questions which are assumed to be truthfully answered. The average number...
Consider communication over a memoryless binary symmetric channel using a (7, 4) Hamming code. Each transmitted bit is received correctly with probability $(1-\varepsilon)$, and flipped with probability $\varepsilon$. For each...
Let H(X) denote the entropy of a discrete random variable X taking K possible distinct real values. Which of the following statements is/are necessarily true?
In a high school having equal number of boy students and girl students, 75% of the students study Science and the remaining 25% students study Commerce. Commerce students are two times more likely to be a boy than are Sc...
A binary random variable $X$ takes the value +2 or -2 . The probability $P(X=+2)=\alpha$. The value of $\alpha$ (rounded off to one decimal place), for which the entropy of $X$ is maximum, is $\_\_\_\_$ .
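A sketch of the standard argument (mine, not the official key): the binary entropy function $H(\alpha)=-\alpha\log_2\alpha-(1-\alpha)\log_2(1-\alpha)$ peaks at $\alpha=0.5$, which a simple grid sweep confirms.

```python
import math

def H2(a):
    # Binary entropy function H(a) = -a log2(a) - (1-a) log2(1-a)
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

# Sweep alpha over (0, 1) and find the maximizer
alphas = [i / 1000 for i in range(1, 1000)]
best = max(alphas, key=H2)
print(round(best, 1))  # 0.5
```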
Let $$\left( {{X_1},\,{X_2}} \right)$$ be independent random variables, $${X_1}$$ has mean 0 and variance 1, while $${X_2}$$ has mean 1 and variance 4. The mutual information I $$\left( {{X_1},\,{X_2}} \right)$$ between...
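Since the variables are stated to be independent, the mutual information is zero regardless of the means and variances: $I(X_1;X_2)=H(X_1)-H(X_1\mid X_2)=0$. A minimal discrete demo of the same fact (an illustration with assumed Bernoulli(0.5) marginals, not the question's variables):

```python
import math

# For independent variables the joint pmf factorizes, so every term
# log2( p(x,y) / (p(x) p(y)) ) in the mutual information is log2(1) = 0
px = [0.5, 0.5]
py = [0.5, 0.5]
I = sum(px[i] * py[j] * math.log2((px[i] * py[j]) / (px[i] * py[j]))
        for i in range(2) for j in range(2))
print(I)  # 0.0
```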
Which one of the following graphs shows the Shannon capacity (channel capacity) in bits of a memoryless binary symmetric channel with crossover probability P?
A discrete memoryless source has an alphabet $$({a_1},\,{a_2},\,{a_3},\,{a_4})\,$$ with corresponding probabilities$$\left( {{1 \over 2}\,\,,{1 \over 4},\,{1 \over 8},\,\,{1 \over 8}\,} \right)$$. The minimum required av...
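The truncated part asks for the minimum required average codeword length, which is bounded below by the source entropy; a sketch of that computation (mine, not the page's key) — with dyadic probabilities the bound is met exactly (e.g. Huffman lengths 1, 2, 3, 3):

```python
import math

# Source-coding bound: average codeword length >= H(X);
# dyadic probabilities make the bound achievable exactly
p = [1/2, 1/4, 1/8, 1/8]
H = -sum(pi * math.log2(pi) for pi in p)
print(H)  # 1.75
```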
A voice-grade AWGN (additive white Gaussian noise) telephone channel has a bandwidth of 4.0 kHz and two-sided noise power spectral density $${\eta \over 2} = 2.5\, \times \,{10^{ - 5}}$$ Watt per Hz. If information at th...
An analog baseband signal, band limited to 100 Hz, is sampled at the Nyquist rate. The samples are quantized into four message symbols that occur independently with probabilities $${p_1}$$ = $${p_4}$$ = 0.125 and $${p_2}...
A binary communication system makes use of the symbols “zero” and “one”. There are channel errors. Consider the following events: $${x_0}$$ : a " zero " is transmitted $${x_1}$$ : a " one " is transmitted $${y_0}$$ : a "...
Consider a discrete memoryless source with alphabet $$S = \left\{ {{s_0},\,{s_1},\,{s_2},\,{s_3},\,{s_4},\, \ldots } \right\}$$ and respective probabilities of occurrence $$P = \left\{ {{1 \over 2},\,{1 \over 4},\,{1 \over...
A digital communication system uses a repetition code for channel encoding/decoding. During transmission, each bit is repeated three times (0 is transmitted as 000, and 1 is transmitted as 111). It is assumed that the so...
The capacity of band-limited additive white Gaussian noise (AWGN) channel is given by $$C = \,W\,\,{\log _2}\left( {1 + {P \over {{\sigma ^2}\,W}}} \right)$$ bits per second (bps), where W is the channel bandwidth, P is...
The capacity of a Binary Symmetric Channel (BSC) with cross - over probability 0.5 is ______ .
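A sketch of the computation (not the official key): the BSC capacity is $C = 1 - H(p)$, and at crossover $p=0.5$ the output is independent of the input, so the capacity collapses to zero.

```python
import math

def bsc_capacity(p):
    # C = 1 - H(p) for a binary symmetric channel with crossover p
    if p in (0.0, 1.0):
        return 1.0
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(bsc_capacity(0.5))  # 0.0
```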
A fair coin is tossed repeatedly until a 'Head' appears for the first time. Let L be the number of tosses to get this first 'Head'. The entropy H (L) in bits is _______________.
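A sketch of this one (my computation): $L$ is geometric with $P(L=k)=2^{-k}$, so $-\log_2 P(L=k)=k$ and $H(L)=\sum_k k\,2^{-k}=2$ bits. A numerical check by truncating the series:

```python
# L ~ Geometric(1/2): P(L = k) = 2**-k, so -log2 P(L = k) = k and
# H(L) = sum over k of k * 2**-k, which sums to 2 bits
H = sum(k * 2**-k for k in range(1, 60))
print(round(H, 9))  # 2.0
```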
A source alphabet consists of N symbols with the probability of the first two symbols being the same. A source encoder increases the probability of the first symbol by a small amount $$\varepsilon $$ and decreases that o...
A communication channel with AWGN operating at a signal-to-noise ratio SNR >> 1 and bandwidth B has capacity $${{C_1}}$$. If the SNR is doubled keeping B constant, the resulting capacity $${{C_2}}$$ is given by
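The options are cut off, but the high-SNR approximation settles it: $C = B\log_2(1+\mathrm{SNR}) \approx B\log_2(\mathrm{SNR})$, so doubling the SNR adds $B\log_2 2 = B$ to the capacity, i.e. $C_2 \approx C_1 + B$. A numerical sketch (B and SNR values below are illustrative assumptions, not from the question):

```python
import math

# Illustrative values: B = 1 Hz, SNR = 1000 (>> 1)
B, snr = 1.0, 1000.0
C1 = B * math.log2(1 + snr)
C2 = B * math.log2(1 + 2 * snr)
# Doubling the SNR adds approximately B bits/s at high SNR
print(round(C2 - C1, 3))  # close to B = 1.0
```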
Consider a Binary Symmetric Channel (BSC) with probability of error being 'p'. To transmit a bit, say 1, we transmit a sequence of three 1s. The receiver will interpret the received sequence to represent 1 if at least two...
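The stem is truncated, but the stated decoding rule is majority vote over the three repeats, so a decoded bit is wrong when two or three repeats flip. A sketch of that error probability (mine, not the answer key):

```python
# Majority decoding of a 3-repetition code over a BSC(p): a bit is
# decoded wrongly when 2 or 3 of the three transmitted copies flip
def block_error(p):
    return 3 * p**2 * (1 - p) + p**3

print(round(block_error(0.1), 6))  # 0.028
```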
A memoryless source emits n symbols each with a probability p. The entropy of the source as a function of n
A source generates three symbols with probabilities 0.25, 0.25, 0.50 at a rate of 3000 symbols per second. Assuming independent generation of symbols, the most efficient source encoder would have average bit rate as
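A sketch of the standard computation (mine, not the official key): the most efficient encoder approaches the entropy rate, $H = 1.5$ bits/symbol, so at 3000 symbols/s the average bit rate is 4500 bits/s.

```python
import math

# Entropy of the source, then scale by the symbol rate
p = [0.25, 0.25, 0.50]
H = -sum(pi * math.log2(pi) for pi in p)  # 1.5 bits/symbol
bit_rate = 3000 * H
print(bit_rate)  # 4500.0
```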
Source encoding in a data communication system is done in order to
A binary source has symbol probabilities 0.8 and 0.2. If extension coding (blocks of 4 symbols) is used, the lower and upper bounds on the average code word length are: (a) lower ___________ (b) upper ___________.
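A sketch of the bounds (my computation, stated per source symbol; the question may expect per-block values, i.e. these times 4): Shannon's source-coding theorem gives $nH \le \bar L_{\text{block}} < nH + 1$ for blocks of $n=4$, so per symbol $H \le \bar L < H + \tfrac{1}{4}$.

```python
import math

# Binary source with P = (0.8, 0.2), coded in blocks of n = 4 symbols.
# Shannon bound per source symbol: H <= L_avg < H + 1/n
H = -(0.8 * math.log2(0.8) + 0.2 * math.log2(0.2))
lower = H          # ~0.7219 bits/symbol
upper = H + 1 / 4  # ~0.9719 bits/symbol
print(round(lower, 4), round(upper, 4))
```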
An image uses $$512\, \times \,512$$ picture elements. Each of the picture elements can take any of the 8 distinguishable intensity levels. The maximum entropy in the above image will be
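A sketch of the computation (mine, not the key): each pixel carries at most $\log_2 8 = 3$ bits (uniform over the 8 levels), and with independent pixels the image carries at most $512 \times 512 \times 3$ bits.

```python
import math

# Max entropy per pixel = log2(8) = 3 bits; scale by the pixel count
H_image = 512 * 512 * math.log2(8)
print(H_image)  # 786432.0
```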
A source produces 4 symbols with probabilities $${1 \over 2},\,{1 \over 4},\,{1 \over 8}$$ and $${1 \over 8}.$$ For this source, a practical coding scheme has an average codeword length of 2 bits/symbol. The efficienc...
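A sketch of the efficiency computation (mine, not the page's key): coding efficiency is the source entropy divided by the average codeword length, $\eta = H/\bar L = 1.75/2$.

```python
import math

# Coding efficiency = H(X) / average codeword length
p = [1/2, 1/4, 1/8, 1/8]
H = -sum(pi * math.log2(pi) for pi in p)  # 1.75 bits/symbol
efficiency = H / 2.0                      # the scheme uses 2 bits/symbol
print(efficiency)  # 0.875
```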