Information Theory
GATE Electronics & Communication · Information Theory · 1989-2025
Study anchor
Source-book anchor pending for this concept.
Practice action
Start with the latest PYQs in this concept.
The random variable $X$ takes values in $\{-1,0,1\}$ with probabilities $P(X=-1)=P(X=1)=\alpha$ and $P(X=0)=1-2\alpha$, where $0 < \alpha < 1/2$. Let $g(\alpha)$ denote the entropy of $X$ (i...
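The closing clause is cut off above; as a minimal sketch (not part of the question), the entropy of this three-valued pmf follows directly from the definition:

```python
import math

def g(alpha):
    # Entropy (in bits) of X with P(X=-1) = P(X=1) = alpha and P(X=0) = 1 - 2*alpha:
    # g(alpha) = -2*alpha*log2(alpha) - (1 - 2*alpha)*log2(1 - 2*alpha)
    terms = [alpha, alpha, 1 - 2 * alpha]
    return -sum(p * math.log2(p) for p in terms if p > 0)

print(g(0.25))     # 1.5 bits
print(g(1 / 3))    # log2(3) ≈ 1.585 bits, the maximum (uniform distribution)
```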
$X$ and $Y$ are Bernoulli random variables taking values in $\{0,1\}$. The joint probability mass function of the random variables is given by: $$ \begin{aligned} & P(X=0, Y=0)=0.0...
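The remaining joint-pmf entries are cut off in this listing; the sketch below only shows the generic computation of $I(X;Y)$ from a 2×2 joint pmf, with three made-up entries so it runs end to end:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a 2x2 joint pmf joint[x][y]."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return entropy(px) + entropy(py) - entropy([p for row in joint for p in row])

# Hypothetical completion of the pmf: only P(X=0, Y=0) = 0.06 is visible above;
# the other three entries are invented for illustration.
joint = [[0.06, 0.14],
         [0.24, 0.56]]
print(mutual_information(joint))   # ≈ 0 for this particular example, since it factorizes
```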
A source transmits symbols from an alphabet of size 16. The value of maximum achievable entropy (in bits) is _________.
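As a quick check, the maximum entropy of an $M$-ary source is $\log_2 M$, attained only for equiprobable symbols:

```python
import math

M = 16                  # alphabet size
print(math.log2(M))     # 4.0 bits, achieved when all 16 symbols are equally likely
```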
The frequency of occurrence of 8 symbols (a-h) is shown in the table below. A symbol is chosen and it is determined by asking a series of "yes/no" questions which are assumed to be...
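The frequency table itself is not reproduced in this listing. The optimal sequence of yes/no questions corresponds to a binary Huffman code over the eight symbols, and the expected number of questions equals the code's average length (never below the source entropy); a sketch with invented frequencies, purely to show the mechanics:

```python
import heapq

def avg_huffman_length(freqs):
    """Average codeword length (questions per symbol) of a binary Huffman code."""
    total = sum(freqs.values())
    heap = [(f, [s]) for s, f in freqs.items()]
    heapq.heapify(heap)
    depth = {s: 0 for s in freqs}
    while len(heap) > 1:
        f1, g1 = heapq.heappop(heap)
        f2, g2 = heapq.heappop(heap)
        for s in g1 + g2:          # every symbol in a merged group gets one bit deeper
            depth[s] += 1
        heapq.heappush(heap, (f1 + f2, g1 + g2))
    return sum(freqs[s] * depth[s] for s in freqs) / total

# Hypothetical frequencies; the actual table from the question is not shown here.
freqs = {'a': 20, 'b': 15, 'c': 15, 'd': 12, 'e': 12, 'f': 10, 'g': 8, 'h': 8}
print(avg_huffman_length(freqs))
```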
Let H(X) denote the entropy of a discrete random variable X taking K possible distinct real values. Which of the following statements is/are necessarily true?
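A quick numerical reminder of the standard bounds $0 \le H(X) \le \log_2 K$:

```python
import math

def H(p):
    return -sum(q * math.log2(q) for q in p if q > 0)

K = 4
print(H([1.0, 0.0, 0.0, 0.0]))          # 0.0, the lower bound (degenerate distribution)
print(H([1 / K] * K), math.log2(K))     # 2.0 2.0, the upper bound log2(K) (uniform)
```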
In a high school having an equal number of boy students and girl students, 75% of the students study Science and the remaining 25% study Commerce. Commerce students are two t...
A binary random variable $X$ takes the value +2 or -2. The probability $P(X=+2)=\alpha$. The value of $\alpha$ (rounded off to one decimal place), for which the entropy of $X$ is...
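The closing clause is cut off; assuming it asks for the $\alpha$ that maximizes the entropy, the binary entropy function peaks at $\alpha = 0.5$:

```python
import math

def Hb(a):
    # Binary entropy function in bits
    return -sum(p * math.log2(p) for p in (a, 1 - a) if p > 0)

print(Hb(0.5))   # 1.0 bit, the maximum, so alpha = 0.5
print(Hb(0.3))   # ≈ 0.881 bit, strictly below the peak
```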
Let $$\left( {{X_1},\,{X_2}} \right)$$ be independent random variables, $${X_1}$$ has mean 0 and variance 1, while $${X_2}$$ has mean 1 and variance 4. The mutual information I $$\...
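Since $X_1$ and $X_2$ are independent, the mutual information is zero irrespective of the quoted means and variances; a one-line reminder of why:

```python
# Independence means p(x1, x2) = p(x1) * p(x2) everywhere, so
# I(X1; X2) = E[ log2( p(x1, x2) / (p(x1) * p(x2)) ) ] = E[ log2(1) ] = 0 bits.
# The stated means and variances play no role in this value.
print(0.0)
```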
A discrete memoryless source has an alphabet $$({a_1},\,{a_2},\,{a_3},\,{a_4})\,$$ with corresponding probabilities $$\left( {{1 \over 2}\,\,,{1 \over 4},\,{1 \over 8},\,\,{1 \over...
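The question text is truncated, but the key quantity for this alphabet is the source entropy; a quick sketch:

```python
import math

p = [1/2, 1/4, 1/8, 1/8]
H = -sum(q * math.log2(q) for q in p)
print(H)   # 1.75 bits; a Huffman code with lengths (1, 2, 3, 3) achieves this average exactly
```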
A binary communication system makes use of the symbols “zero” and “one”. There are channel errors. Consider the following events: $${x_0}$$: a “zero” is transmitted $${x_1}$$ :...
Consider a discrete memoryless source with alphabet $$S = \left\{ {{s_0},\,{s_1},\,{s_2},\,{s_3},\,{s_4},\,......} \right\}$$ and respective probabilities of occurrence $$P = \left\...
The capacity of a Binary Symmetric Channel (BSC) with cross-over probability 0.5 is ______.
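A short sketch of the BSC capacity formula $C = 1 - H_b(p)$ at $p = 0.5$:

```python
import math

def bsc_capacity(p):
    # C = 1 - H_b(p) bits per channel use
    hb = -sum(q * math.log2(q) for q in (p, 1 - p) if q > 0)
    return 1 - hb

print(bsc_capacity(0.5))   # 0.0: at p = 0.5 the output tells us nothing about the input
```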
A fair coin is tossed repeatedly until a 'Head' appears for the first time. Let L be the number of tosses to get this first 'Head'. The entropy H(L) in bits is _______________.
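A quick check: $L$ is geometric with $P(L=k) = 2^{-k}$, so $H(L) = \sum_k k\,2^{-k} = 2$ bits:

```python
# P(L = k) = (1/2)**k for k = 1, 2, 3, ...  (first Head on toss k)
# H(L) = sum_k (1/2)**k * log2(2**k) = sum_k k * (1/2)**k = 2 bits
print(round(sum(k * 0.5 ** k for k in range(1, 200)), 6))   # 2.0
```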
A source alphabet consists of N symbols with the probability of the first two symbols being the same. A source encoder increases the probability of the first symbol by a small amou...
A memoryless source emits n symbols each with a probability p. The entropy of the source as a function of n
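Since the n symbols are equiprobable, a valid pmf forces $p = 1/n$, and the entropy is $\log_2 n$; a short sketch:

```python
import math

# n * p = 1, i.e. p = 1/n, so H = -n * p * log2(p) = log2(n):
# the entropy grows logarithmically with the alphabet size n.
for n in (2, 4, 8, 16):
    print(n, math.log2(n))
```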
A source generates three symbols with probabilities 0.25, 0.25, 0.50 at a rate of 3000 symbols per second. Assuming independent generation of symbols, the most efficient source enc...
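The question tail is cut off; assuming it asks for the output bit rate of the most efficient (Huffman) encoder, a sketch:

```python
# Huffman code for probabilities (0.50, 0.25, 0.25) has lengths (1, 2, 2),
# matching the source entropy of 1.5 bits/symbol because the pmf is dyadic.
probs, lengths = [0.50, 0.25, 0.25], [1, 2, 2]
avg_len = sum(p * l for p, l in zip(probs, lengths))   # 1.5 bits/symbol
print(avg_len * 3000)                                  # 4500.0 bits per second
```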
Source encoding in a data communication system is done in order to
An image uses $$512\, \times \,512$$ picture elements. Each of the picture elements can take any of the 8 distinguishable intensity levels. The maximum entropy in the above image w...
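A quick sketch of the maximum-entropy computation, assuming all 8 intensity levels are equiprobable and pixels are independent:

```python
import math

pixels = 512 * 512
bits_per_pixel = math.log2(8)    # 3 bits per pixel for 8 equally likely levels
print(pixels * bits_per_pixel)   # 786432.0 bits for the whole image
```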
A source produces 4 symbols with probabilities $${1 \over 2},\,{1 \over 4},\,{1 \over 8}\,\,and\,\,{1 \over 8}.$$ For this source, a practical coding scheme has an average codeword...
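The average codeword length of the practical scheme is cut off above; the sketch below uses a placeholder value only to show the efficiency formula $\eta = H/\bar{L}$:

```python
import math

p = [1/2, 1/4, 1/8, 1/8]
H = -sum(q * math.log2(q) for q in p)   # 1.75 bits/symbol
L_avg = 2.0                             # placeholder; the actual average length is cut off above
print(H / L_avg)                        # coding efficiency eta = H / L_avg
```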