- Two binary random variables X and Y are distributed according to the joint distribution given as P(X = Y = 0) = P(X = 0, Y = 1) = P(X = Y = 1) = 1/3. Then, [01D01]
- An independent discrete source transmits letters from an alphabet consisting of A and B with respective probabilities 0.6 and 0.4. If consecutive letters are statistically independent and two-symbol words are transmitted, then the probability of a word with different symbols is [01D02]
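A quick numerical check for the question above, assuming the stated independence of consecutive letters: a two-symbol word has differing symbols only for the orderings AB and BA.

```python
# Sketch: probability that a two-symbol word has different symbols,
# assuming independent letters with P(A) = 0.6, P(B) = 0.4.
p_a, p_b = 0.6, 0.4
p_diff = p_a * p_b + p_b * p_a  # word AB or word BA
```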
- A memoryless source emits 2000 binary symbols/sec and each symbol has a probability of 0.25 of being equal to 1 and 0.75 of being equal to 0. The minimum number of bits/symbol required for error-free transmission of this source is [01M01]
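The question above reduces to the binary entropy function: the source entropy is the minimum bits/symbol, and scaling by the symbol rate gives bits/sec. A minimal sketch:

```python
from math import log2

# Binary entropy of the source gives the minimum bits/symbol;
# multiplying by the symbol rate gives the minimum bits/sec.
p = 0.25
H = -(p * log2(p) + (1 - p) * log2(1 - p))  # ~0.8113 bits/symbol
rate = 2000 * H                             # ~1622.6 bits/sec
```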
- Which of the following channel matrices represents a symmetric channel? [01M02]
- The capacity of the channel with the channel matrix, where the xi's are the transmitted messages and the yj's are the received messages, is [01M03]
- Information rate of a source is [01S01]
- If `a` is an element of a Field `F`, then its additive inverse is [01S02]
- The minimum number of elements that a field can have is [01S03]
- Which of the following is correct? [01S04]
- If the output of a continuous source is limited to an average power of σ², then the maximum entropy of the source is [01S05]
- A convolutional encoder of code rate 1/2 consists of a two-stage shift register. The generator sequence of the top adder is (1,1,1) and that of the bottom adder is (1,0,1). The constraint length of the encoder is [02D01]
- The Parity Check Matrix of a (6,3) Systematic Linear Block code is
If the syndrome vector computed for the received code word is [1 1 0], then for error correction, which bit of the received code word is to be complemented? [02D02]
- The minimum number of bits per message required to encode the output of a source transmitting four different messages with probabilities 0.5, 0.25, 0.125 and 0.125 is [02M01]
- A Communication channel is represented by the channel Matrix given as
In the above matrix, rows correspond to the transmitter X and the columns correspond to the receiver Y. Then, the conditional entropy H(Y/X) in bits/message is [02M02]
- The channel matrix of a noiseless channel [02M03]
- Entropy of a source is [02S01]
- Relative to Hard decision decoding, soft decision decoding results in [02S02]
- Which of the following is the essential requirement of a source coding scheme? [02S03]
- The transition probabilities for a BSC will be represented using [02S04]
- A Field is [02S05]
- The constraint length of a convolutional encoder of code rate 1/3 is 5. If the input of the encoder is a 5-bit message sequence, the length of the output code word in bits is [03D01]
- A communication channel is represented by its channel matrix with rows representing the messages associated with the source and the columns representing the messages associated with the receiver given as
Its capacity in bits is [03D02]
- A Binary Erasure channel has P(0/0) = P(1/1) = p; P(k/0) = P(k/1) = q. Its capacity in bits/symbol is [03M01]
- When a pair of dice is thrown, the average amount of information contained in the message "The sum of the faces is 7", in bits, is [03M02]
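The dice question above is a self-information computation: six of the 36 equally likely outcomes sum to 7, so the information content is -log2(6/36).

```python
from math import log2

# Self-information of "the sum of the faces is 7" for two fair dice.
p = 6 / 36
info = -log2(p)  # ~2.585 bits
```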
- A source emits messages A and B with probabilities 0.8 and 0.2 respectively. The redundancy provided by the optimum source coding scheme for the above source is [03M03]
- Information content of a message [03S01]
- Under error-free reception, the syndrome vector computed for the received cyclic code word consists of [03S02]
- A continuous source will have maximum entropy associated if the pdf associated with its output is [03S03]
- Variable length source coding provides better coding efficiency, if all the messages of the source are [03S04]
- Shannon's limit deals with [03S05]
- The Parity Check Matrix of a (6,3) Systematic Linear Block code is
If the syndrome vector computed for the received code word is [0 1 1], then for error correction, which bit of the received code word is to be complemented? [04D01]
- A (7,4) cyclic code has a generator polynomial given as 1 + x + x³. If the error pattern is 0001000, the corresponding syndrome vector is [04D02]
- The memory length of a convolutional encoder is 3. If a 5-bit message sequence is applied as the input to the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [04M01]
- The code words of a systematic (6,3) linear block code are 001110, 010011, 011101, 100101, 101011, 110110, 111000. Which of the following is also a code word of the code? [04M02]
- The syndrome S(x) of a cyclic code is given by the remainder of the division, where V(x) is the transmitted code polynomial, E(x) is the error polynomial and g(x) is the generator polynomial. S(x) is also equal to [04M03]
- Source 1 is transmitting two messages with probabilities 0.2 and 0.8, and Source 2 is transmitting two messages with probabilities 0.5 and 0.5. Then [04S01]
- A source X and the receiver Y are connected by a noise free channel. Its capacity is [04S02]
- The entropy measure of a continuous source is a [04S03]
- Which of the following is correct? [04S04]
- Error free communication may be possible by [04S05]
- For the data word 1010 in a (7,4) non-systematic cyclic code with the generator polynomial 1 + x + x³, the code polynomial is [05D01]
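A minimal GF(2) check of the non-systematic construction v(x) = d(x)·g(x). Polynomials are held as integer bit masks (bit i = coefficient of x^i); reading the data word 1010 with the leftmost bit as the x⁰ coefficient is an assumed convention.

```python
def gf2_mul(a, b):
    """Multiply two GF(2) polynomials stored as bit masks (bit i = coeff of x^i)."""
    result = 0
    while b:
        if b & 1:
            result ^= a  # XOR is addition in GF(2)
        a <<= 1
        b >>= 1
    return result

d = 0b101   # d(x) = 1 + x^2, i.e. data word 1010 with leftmost bit = x^0 (assumed order)
g = 0b1011  # g(x) = 1 + x + x^3
v = gf2_mul(d, g)                   # 1 + x + x^2 + x^5
codeword = format(v, '07b')[::-1]   # coefficients of x^0 ... x^6
```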
- The output of a continuous source is a uniform random variable in the range . The entropy of the source in bits/sample is [05D02]
- For the source X transmitting four messages with probabilities 1/2, 1/4, 1/8 and 1/8, Maximum coding efficiency can be obtained by using [05M01]
- In Modulo-7 addition, 6+1 is equal to [05M02]
- A source is transmitting two messages A and B with probabilities 3/4 and 1/4 respectively. The coding efficiency of the first-order extension of the source is [05M03]
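A sketch for the question above, reading "first-order extension" as the source itself (one common textbook convention): each of the two messages takes one binary code bit, so efficiency is H/L with L = 1 bit/message.

```python
from math import log2

# Efficiency of one-bit-per-message coding for P(A) = 3/4, P(B) = 1/4.
H = -(0.75 * log2(0.75) + 0.25 * log2(0.25))  # ~0.8113 bits/message
efficiency = H / 1.0                          # ~81.13 %
```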
- The noise characteristic of a communication channel is given as
Rows represent the source and columns represent the receiver. The channel is a [05M04]
- The source coding efficiency can be increased by [05S01]
- The capacity of a channel with infinite band width is [05S02]
- The Hamming weight of the (6,3) linear block coded word 101011 is [05S03]
- The cascade of two Binary Symmetric Channels is a [05S04]
- In a (6,3) systematic linear block code, the number of 6-bit code words that are not useful is [06D01]
- The Parity check Matrix H of a (6,3) Linear systematic Block code is
Then [06D02]
- In a Binary Symmetric Channel, a transmitted 0 is received as 0 with a probability of 1/8. Then, the transition probability of the transmitted 0 is [06M01]
- A source transmitting `n` number of messages is connected to a noise free channel. The capacity of the channel is [06M02]
- There are four binary words given as 0000, 0001, 0011, 0111. Which of these cannot be a member of the parity check matrix of a (15,11) linear block code? [06M03]
- If X is the transmitter and Y is the receiver, and if the channel is noise-free, then the mutual information I(X,Y) is equal to [06S01]
- Which of the following is correct? [06S02]
- Which of the following is an FEC scheme? [06S03]
- A discrete source X is transmitting m messages and is connected to the receiver Y through a symmetric channel. The capacity of the channel is given as [06S04]
- If the received code word of a (6,3) linear block code is 100111 with an error in the bit, the corresponding error pattern will be [06S05]
- For the data word 1110 in a (7,4) non-systematic cyclic code with the generator polynomial 1 + x + x³, the code polynomial is [07D01]
- The output of a source is band limited to 6 kHz. It is sampled at a rate 2 kHz above the Nyquist rate. If the entropy of the source is 2 bits/sample, then the entropy of the source in bits/sec is [07D02]
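The arithmetic for the question above: the Nyquist rate is twice the 6 kHz bandwidth, the sampling rate runs 2 kHz above it, and the entropy rate is entropy per sample times samples per second.

```python
# Sketch: entropy rate from bandwidth and bits/sample.
fm = 6000
fs = 2 * fm + 2000  # Nyquist rate plus 2 kHz -> 14000 samples/sec
R = fs * 2          # 2 bits/sample -> bits/sec
```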
- When two fair dice are thrown simultaneously, the information content of the message "the sum of the faces is 12", in bits, is [07M01]
- The encoder of a (7,4) systematic cyclic code with generator polynomial g(x) = 1 + x² + x³ is basically a [07M02]
- A received code word of a (7,4) systematic cyclic code, 1000011, is corrected as 1000111. The corresponding error pattern is [07M03]
- The product of 5 and 6 in Modulo-7 multiplication is [07S01]
- Which of the following can be the generating polynomial for a (7,4) systematic Cyclic code? [07S02]
- The time domain behavior of a convolutional encoder of code rate 1/3 is defined in terms of a set of [07S03]
- Which of the following is correct? [07S04]
- A linear block code with Hamming distance 5 is [07S05]
- The channel capacity of a BSC with transition probability 1/2 is [08D01]
- White noise of PSD W/Hz is applied to an ideal LPF with one-sided bandwidth of 1 Hz. The two-sided output noise power of the channel is [08D02]
- A convolutional encoder of code rate 1/2 is a 3-stage shift register with a message word length of 6. The code word length obtained from the encoder (in bits) is [08M01]
- A source X with entropy 2 bits/message is connected to the receiver Y through a noise-free channel. The conditional entropy of the source given the receiver is H(X/Y) and the joint entropy of the source and the receiver is H(X,Y). Then [08M02]
- A channel with independent input and output acts as [08M03]
- Automatic Repeat Request is a [08S01]
- Channel coding [08S02]
- The information content available with a source is referred to as [08S03]
- In a Linear Block code [08S04]
- In Modulo-4 arithmetic, the product of 3 and 2 is [08S05]
- For the data word 1110 in a (7,4) non-systematic cyclic code with the generator polynomial 1 + x² + x³, the code polynomial is [09D01]
- In a (7,4) systematic linear block code, the number of 7-bit code words that are not useful to the user is [09D02]
- Which of the following is a valid source coding scheme for a source transmitting four messages? [09M01]
- A system has a bandwidth of 4 kHz and an S/N ratio of 28 at the input to the receiver. If the bandwidth of the channel is doubled, then [09M02]
- The memory length of a convolutional encoder is 4. If a 5-bit message sequence is applied as the input to the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [09M03]
- In Modulo-5 multiplication, the product of 4 and 3 is [09S01]
- Which of the following provides minimum redundancy in coding? [09S02]
- If C is the channel capacity, S is the signal input of the channel, and is the input noise PSD, then which of the following is Shannon's limit? [09S03]
- A communication channel is fed with an input signal x(t) and the noise in the channel is negligible. The power received at the receiver input is [09S04]
- The fundamental limit on the average number of bits/source symbol is [09S05]
- The Parity Check Matrix of a (6,3) Systematic Linear Block code is
If the syndrome vector computed for the received code word is [0 1 0], then for error correction, which bit of the received code word is to be complemented? [10D01]
- White noise of PSD is applied to an ideal LPF with one-sided bandwidth of B Hz. The filter provides a gain of 2. If the output power of the filter is 8η, then the value of B in Hz is [10D02]
- The memory length of a convolutional encoder is 5. If a 6-bit message sequence is applied as the input to the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [10M01]
- A source is transmitting four messages with equal probability. Then, for optimum source coding efficiency, [10M02]
- Which of the following is a valid source coding scheme for a source transmitting five messages? [10M03]
- In Modulo-7 addition, 6 + 4 is equal to [10S01]
- Which of the following provides minimum redundancy in coding? [10S02]
- Which of the following involves the effect of the communication channel? [10S03]
- Which of the following can be the generating polynomial for a (7,4) systematic Cyclic code? [10S04]
- Which of the following provides the facility to recognize the error at the receiver? [10S05]
- Which of the following coding schemes is linear ? [11D01]
- If the transition probability of messages 0 and 1 in a communication system is 0.1, the noise matrix of the corresponding Communication channel is [11D02]
- In a BSC, the rate of information transmission over the channel decreases as [11M01]
- A source X is connected to a receiver R through a lossless channel. Then [11M02]
- Which of the following is a valid source coding scheme for a source transmitting four messages? [11M03]
- The Hamming distance of a triple error correcting code is [11S01]
- A channel whose i/p is xi and output is yj is deterministic if [11S02]
- If a memoryless source of information rate R is connected to a channel with channel capacity C, then channel coding for the output of the source is based on which of the following statements? [11S03]
- Which of the following is correct? [11S04]
- The minimum source code word length of the message of a source is equal to [11S05]
- If the transition probability of messages 0 and 1 in a communication system is 0.2, the noise matrix of the corresponding Communication channel is [12D01]
- In a BSC, if the transition probability of the messages 0 and 1 is P, and if they are of equal transmission probability, then, the probability of these symbols to appear at the channel output is [12D02]
- The number of bits to be used by the efficient source encoder to encode the output of the source is equal to [12M01]
- A source X is connected to a receiver R through a deterministic channel. Then [12M02]
- Which of the following can be valid source coding scheme for a source transmitting 3 messages? [12M03]
- For an (n,k) cyclic code, E(x) is the error polynomial, g(x) is the generator polynomial, R(x) is the received code polynomial and C(x) is the transmitted code polynomial. Then, the syndrome polynomial S(x) is [12S01]
- If is the input noise PSD and S is the input signal power for a communication channel of capacity C, then which of the following is Shannon's limit? [12S02]
- The Hamming distance of an error correcting code capable of correcting 4 errors is [12S03]
- BCH codes capable of correcting single error are [12S04]
- Which of the following provides the facility to recognize the error at the receiver? [12S05]
- The output of a source is a continuous random variable uniformly distributed over (0,2). The entropy of the source in bits/sample is [13D01]
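For the uniform-source question above, the differential entropy of a uniform density over (a, b) is log2(b − a), which a one-liner confirms:

```python
from math import log2

# Differential entropy of U(0, 2) in bits/sample.
a, b = 0.0, 2.0
h = log2(b - a)
```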
- An AWGN low-pass channel with 4 kHz bandwidth is fed with white noise of PSD = 10 W/Hz. The two-sided noise power at the output of the channel is [13D02]
- A system has a bandwidth of 3 kHz and an S/N ratio of 29 dB at the input of the receiver. If the bandwidth of the channel gets doubled, then [13M01]
- A source is transmitting two symbols A and B with probabilities 7/8 and 1/8 respectively. The average source code word length can be decreased by [13M02]
- Non-Uniqueness of Huffman encoding results in [13M03]
- Shannon's limit is for [13S01]
- In Modulo-6 addition, the sum of 1 and 5 is [13S02]
- FEC and ARQ schemes of error control can be applied for the outputs of [13S03]
- The Hamming distance of an (n,k) systematic cyclic code is [13S04]
- Which of the following is affected by the communication channel? [13S05]
- The maximum average amount of information content, measured in bits/sec, associated with the output of a discrete information source transmitting 8 messages at 2000 messages/sec is [14D01]
- Which of the following coding schemes is linear ? [14D02]
- A communication source is connected to a receiver using a communication channel such that the uncertainty about the transmitted symbol at the receiver, after knowing the received symbol, is zero. Then, the information gained by the observer at the receiver is [14M01]
- X(t) and n(t) are the signal and the noise, each band limited to 2B Hz, applied to a communication channel band limited to B Hz. Then, the minimum number of samples/sec that should be transmitted to recover the input of the channel at its output is [14M02]
- The upper limit on the minimum distance of a linear block code is [14S01]
- Information rate of a source can be used to [14S02]
- A source is transmitting only one message. Then [14S03]
- If C is the code word and H is the Parity check Matrix of an (n,k) linear block code, then, [14S04]
- Theoretically, the entropy of a continuous random variable is [14S05]
- A convolutional encoder has a constraint length of 4, and for each input bit a two-bit word is output by the encoder. If the input message is of length 5, the exact code rate of the encoder is [15D01]
- In a message conveyed through a sequence of independent dots and dashes, the probability of occurrence of a dash is one third of that of a dot. The information content of a word with two dashes, in bits, is [15D02]
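For the dots-and-dashes question above: P(dash) = P(dot)/3 together with P(dot) + P(dash) = 1 gives P(dash) = 1/4, and the information of independent symbols adds.

```python
from math import log2

# Two independent dashes, each with probability 1/4.
p_dash = 1 / 4
info = -2 * log2(p_dash)  # bits for a word of two dashes
```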
- The voice-frequency modulating signal of a PCM system is quantized into 16 levels. If the signal is band limited to 3 kHz, the minimum symbol rate of the system is [15M01]
- A source is transmitting four messages with probabilities 1/2, 1/4, 1/8 and 1/8. To have 100 % transmission efficiency, the average source code word length of the message of the source should be [15M02]
- A source is transmitting six messages with probabilities 1/2, 1/4, 1/8, 1/16, 1/32 and 1/32. Then [15M03]
- The average source code word length per bit can be decreased by [15S01]
- Trade-off between bandwidth and signal-to-noise ratio results in [15S02]
- Binary erasure channel is an example of [15S03]
- In a symmetric channel, [15S04]
- Which of the following is correct? [15S05]
- A source generates three symbols with probabilities of 0.25, 0.25 and 0.5 at a rate of 3000 symbols/sec. Assuming independent generation of symbols, the most efficient source encoder would have an average bit rate of [16D01]
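The question above is the entropy rate: the most efficient encoder approaches H bits/symbol times the symbol rate.

```python
from math import log2

# Entropy of the three-symbol source, then scale by 3000 symbols/sec.
probs = [0.25, 0.25, 0.5]
H = -sum(p * log2(p) for p in probs)  # 1.5 bits/symbol
bit_rate = 3000 * H                   # bits/sec
```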
- A memoryless source emits 2000 binary symbols/sec and each symbol has a probability of 0.25 of being equal to 1 and 0.75 of being equal to 0. The minimum number of bits/sec required for error-free transmission of this source is [16D02]
- In a communication system, due to noise in the channel, an average of one symbol in each 100 received is incorrect. The symbol transmission rate is 1000 symbols/sec. The number of bits in error in the received symbols is [16M01]
- Which of the following is a valid source coding scheme for a source transmitting Six messages? [16M02]
- The encoder of a (15,11) systematic cyclic code requires [16M03]
- The distance between any code word and the all-zero code word of an (n,k) linear block code is referred to as [16S01]
- As per source coding Theorem, it is not possible to find any uniquely decodable code whose average length is [16S02]
- The coding efficiency due to second order extension of a source [16S03]
- Exchange between bandwidth and signal-to-noise ratio can be justified based on [16S04]
- A source X is connected to a receiver Y through a noise free channel. Its capacity is [16S05]
- A zero-memory source emits two messages A and B with probabilities 0.8 and 0.2 respectively. The entropy of the second-order extension of the source is [17D01]
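For the second-order extension above: the extended source emits pairs of symbols, and for a zero-memory source its entropy works out to twice the single-symbol entropy. A direct computation over the four pairs:

```python
from math import log2

# Entropy of the second-order extension of a zero-memory source.
p = {'A': 0.8, 'B': 0.2}
pair_probs = [p[a] * p[b] for a in p for b in p]  # AA, AB, BA, BB
H2 = -sum(q * log2(q) for q in pair_probs)        # ~1.4439 bits per pair
```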
- A signal amplitude X is a uniform random variable in the range (-1,1). Its differential entropy is [17D02]
- A communication channel is so noisy that the output Y of the channel is statistically independent of the input X. Then, [17M01]
- A transmitting terminal has 128 characters and the data sent from the terminal consist of independent sequences of equiprobable characters. The entropy of the above terminal in bits/character is [17M02]
- For a (7,4) systematic cyclic code, the generator polynomial is 1 + x + x³. Then, the syndrome vector corresponding to the error pattern 0000010 is [17M03]
- Which of the following is correct? [17S01]
- In a communication system, information lost in the channel is measured using [17S02]
- Capacity of a BSC with infinite band width is not infinity, because [17S03]
- For a noise free channel, I(X,Y) is equal to [17S04]
- The output of a continuous source is a Gaussian random variable with variance σ² and is band limited to fm Hz. The maximum entropy of the source is [17S05]
- If the generator polynomial of a (7,4) non-systematic cyclic code is given as g(x) = 1 + x + x² + x⁴, then the binary word corresponding to x²·g(x) + g(x) is [18D01]
- A (7,4) systematic cyclic code has a generator polynomial g(x) = 1 + x + x³, and the code polynomial is V(x) = x + . Then, the remainder of the division V(x)/g(x) is [18D02]
- The output of a continuous source is a uniform random variable over (0,1). Then [18M01]
- The generator sequence of an adder in a convolutional encoder is (1,1,1,1). It is its response to an input sequence of [18M02]
- In a communication system, the average amounts of uncertainty associated with the source, the sink, and the source and sink jointly are 1.0613, 1.5 and 2.432 bits/message respectively. Then the information transferred by the channel connecting the source and sink, in bits, is [18M03]
- The efficiency of transmission of information can be measured by [18S01]
- Binary Erasure channel is the mathematical modeling of [18S02]
- In which of the following matrices is the sum of each row one? [18S03]
- If T is the code vector and H is the Parity check Matrix of a Linear Block code, then the code is defined by the set of all code vectors for which [18S04]
- Which of the following is correct? [18S05]
- A BSC has a transition probability of P. The cascade of two such channels is [19D01]
- A source is transmitting four messages with probabilities 0.5, 0.25, 0.125 and 0.125. By using Huffman coding, the percentage reduction in the average source code word length is [19D02]
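A sketch of the Huffman construction for the question above, tracking only code-word lengths: the two least probable nodes are merged repeatedly, and every merge adds one bit to each symbol under the merged node. The result is compared against a 2-bit fixed-length code for four messages.

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Code-word lengths from a basic Huffman construction (sketch)."""
    tick = count()  # tie-breaker so heap never compares lists
    heap = [(p, next(tick), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1  # every symbol under the merge gains one bit
        heapq.heappush(heap, (p1 + p2, next(tick), s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, lengths))  # average code-word length
reduction = (2 - avg) / 2 * 100                   # % saved vs 2-bit fixed length
```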
- The parity polynomial in the generation of a systematic (7,4) cyclic code for the data word 1100 is 1 + x². The corresponding code word is [19M01]
- Which of the following are prefix free codes? [19M02]
- Which of the following is a single error correcting perfect code? [19M03]
- If X is the transmitted message and Y is the received message, then the average information content of the pair (X,Y) is equal to the average information of Y plus [19S01]
- The entropy H( ) is [19S02]
- If X and Y are the transmitter and the receiver, in a BSC, P(X = i/Y=j) measures [19S03]
- If X and Y are related in one-to-one manner, then, H(X/Y) in bits is [19S04]
- If the output of the channel is independent of the input, then [19S05]
- The parity check matrix of a linear block code is . Its Hamming distance is [20D01]
- A source with equally likely outputs is connected to a communication channel with channel matrix . The columns of the matrix represent the probabilities that a transmitted bit is identified as 0, a transmitted bit is unidentified, and a transmitted bit is identified as 1, respectively. Then, the probability that a bit is not identified is [20D02]
- The Hamming distance of the code vectors Ci and Cj is [20M01]
- The minimum number of parity bits required for the single error correcting linear block code for 11 data bits is [20M02]
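For the parity-bit question above, the Hamming bound for single-error correction requires the smallest r with 2^r ≥ k + r + 1:

```python
# Smallest number of parity bits r satisfying the Hamming bound
# 2^r >= k + r + 1 for single-error correction with k data bits.
k = 11
r = 1
while 2 ** r < k + r + 1:
    r += 1
# r = 4 yields the (15,11) Hamming code
```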
- A source X with symbol rate of 1000 symbols/sec is connected to a receiver Y using a BSC with transition probability P. The messages of the source are equally likely. Then, rate of information transmission over the channel in bits per sec is [20M03]
- Entropy coding is a [20S01]
- Which of the following is correct? [20S02]
- For a BSC with transition probability P, the bit error probability is [20S03]
- A (4,3) Parity check code can [20S04]
- A source with an information rate of 80 Kbps is connected to a communication channel of capacity 66.6 Kbps. Then [20S05]
Sunday, October 26, 2008
ONLINE EXAMINATIONS [Mid 2 - dc]