Download VTU BE 2020 Jan ECE Question Paper 2017 Scheme 5th Sem 17EC54 Information Theory and Coding

Download the Visvesvaraya Technological University (VTU) BE (Bachelor of Engineering) ECE (Electronics and Communication Engineering) 2017 Scheme January 2020 previous question paper for the 5th Sem subject 17EC54 Information Theory and Coding.

USN                                                                        17EC54

Fifth Semester B.E. Degree Examination, Dec.2019/Jan.2020
Information Theory and Coding

Time: 3 hrs.                                                      Max. Marks: 100

Note: Answer any FIVE full questions, choosing ONE full question from each module.

Module-1
1 a. Suppose you are planning a trip to Miami, Florida from Minneapolis in the winter time. You are receiving the following information from the Miami Weather Bureau:
(i) Mild and sunny day (ii) Cold day (iii) Possible snow flurries
Explain the amount of information content in each statement. (06 Marks)
b. The output of an information source consists of 128 symbols, 16 of which occur with probability 1/32 and the remaining 112 of which occur with probability 1/224. The source emits 1000 symbols/sec. Assuming that the symbols are chosen independently, find the average information rate of this source. (06 Marks)
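The arithmetic in Q1(b) is easy to check with a short Python sketch (symbol counts, probabilities and symbol rate taken directly from the question):

```python
from math import log2

# 16 symbols at probability 1/32, the remaining 112 at probability 1/224
probs = [1 / 32] * 16 + [1 / 224] * 112
assert abs(sum(probs) - 1.0) < 1e-9   # the probabilities must sum to 1

H = -sum(p * log2(p) for p in probs)  # source entropy in bits/symbol
R = 1000 * H                          # information rate at 1000 symbols/sec
print(f"H = {H:.4f} bits/symbol, R = {R:.1f} bits/sec")
```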
c. The state diagram of a stationary Markoff source is shown in Fig.Q1(c):
(i) Find the entropy of each state
(ii) Find the entropy of the source
(iii) Find G1 and G2, and verify that G1 ≥ G2 ≥ H.
Assume P(1) = P(2) = P(3) = 1/3.
[Fig.Q1(c): state diagram of the source] (08 Marks)
OR
2 a. What is self-information? Mention its various measuring units and also mention the reasons for choosing the logarithmic function. (06 Marks)
b. A binary source is emitting an independent sequence of 0's and 1's with probabilities P and 1 − P respectively. Plot the entropy of this source versus the probability P. (06 Marks)
c. For the first-order Markov statistical model shown in Fig.Q2(c):
(i) Find the probability of each state
(ii) Find H(S) and H(S²)
[Fig.Q2(c): state diagram, where A, B and C are the states] (08 Marks)
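For Q2(b), the required curve is the binary entropy function H(P) = −P log2 P − (1 − P) log2(1 − P); a minimal plotting sketch (matplotlib assumed available):

```python
import numpy as np
import matplotlib.pyplot as plt

P = np.linspace(0.001, 0.999, 500)                # avoid log2(0) at the ends
H = -P * np.log2(P) - (1 - P) * np.log2(1 - P)    # binary entropy, bits/symbol

plt.plot(P, H)
plt.xlabel("P")
plt.ylabel("H(P) bits/symbol")
plt.title("Entropy of a binary source vs. probability P")
plt.show()                                        # curve peaks at 1 bit for P = 0.5
```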
Module-2
3 a. Identify whether the following codes are uniquely decodable or not:
Symbols   Code A   Code B   Code C
S1        00       1        0
S2        01       01       100
S3        10       001      101
S4        11       00       111
Table Q3(a) (06 Marks)
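Questions like Q3(a) reduce to two mechanical checks: the Kraft inequality (a sum above 1 rules out unique decodability) and the prefix condition (a prefix-free code is instantaneous). A minimal sketch with the code words typed in from Table Q3(a):

```python
codes = {
    "Code A": ["00", "01", "10", "11"],
    "Code B": ["1", "01", "001", "00"],
    "Code C": ["0", "100", "101", "111"],
}

for name, words in codes.items():
    kraft = sum(2 ** -len(w) for w in words)
    prefix_free = not any(a != b and b.startswith(a)
                          for a in words for b in words)
    print(f"{name}: Kraft sum = {kraft:.3f}, prefix-free = {prefix_free}")
```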
b. Consider a Discrete Memoryless Source (DMS) with S = {X, Y, Z} and P = {0.6, 0.2, 0.2}. Find the code word for the message "YXZXY" using arithmetic coding. (06 Marks)
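A sketch of the interval-narrowing step of arithmetic coding for this source (symbol intervals assigned in the order X, Y, Z; the code word is then the binary expansion of any number, e.g. the midpoint, inside the final interval):

```python
# Cumulative intervals for the source S = {X, Y, Z}, P = {0.6, 0.2, 0.2}
probs = {"X": 0.6, "Y": 0.2, "Z": 0.2}
cum, lo = {}, 0.0
for s, p in probs.items():
    cum[s] = (lo, lo + p)
    lo += p

low, high = 0.0, 1.0
for s in "YXZXY":                     # message from the question
    span = high - low
    low, high = low + span * cum[s][0], low + span * cum[s][1]

print(f"final interval = [{low:.6f}, {high:.6f})")
print(f"tag (midpoint) = {(low + high) / 2:.6f}")
```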
c. An information source produces a sequence of independent symbols having the following probabilities:
Symbol        A      B      C      D      E      F      G
Probability   1/3    1/27   1/3    1/9    1/9    1/27   1/27
Construct the binary Huffman code and find its efficiency. (08 Marks)
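A compact way to cross-check Huffman questions (a standard heapq-based sketch; tie-breaking may differ from a hand construction, so individual code words can vary while the average length and efficiency stay the same):

```python
import heapq
from math import log2

def huffman(probs):
    """Return {symbol: code word} for a dict of symbol probabilities."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

probs = {"A": 1/3, "B": 1/27, "C": 1/3, "D": 1/9, "E": 1/9, "F": 1/27, "G": 1/27}
code = huffman(probs)
L = sum(p * len(code[s]) for s, p in probs.items())   # average code word length
H = -sum(p * log2(p) for p in probs.values())         # source entropy
print(code)
print(f"L = {L:.3f} bits/symbol, H = {H:.3f} bits/symbol, efficiency = {H/L:.2%}")
```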
OR
4 a. Write Shannon's encoding algorithm. (06 Marks)
b. Consider the following source with probabilities:
S = {A, B, C, D, E, F}, P = {0.4, 0.2, 0.2, 0.1, 0.08, 0.02}
Find the code words using the Shannon-Fano algorithm and also find its efficiency. (06 Marks)
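A recursive sketch of the Shannon-Fano split for this source (symbols must be sorted by descending probability; where two splits balance equally, either choice yields a valid Shannon-Fano code):

```python
def shannon_fano(items):
    """items: list of (symbol, prob) sorted by descending prob.
    Returns {symbol: code word} via the classic balanced two-way split."""
    if len(items) == 1:
        return {items[0][0]: ""}
    total, acc = sum(p for _, p in items), 0.0
    best, split = float("inf"), 1
    for i in range(1, len(items)):        # find the most balanced split point
        acc += items[i - 1][1]
        if abs(total - 2 * acc) < best:
            best, split = abs(total - 2 * acc), i
    code = {s: "0" + w for s, w in shannon_fano(items[:split]).items()}
    code.update({s: "1" + w for s, w in shannon_fano(items[split:]).items()})
    return code

src = [("A", 0.4), ("B", 0.2), ("C", 0.2), ("D", 0.1), ("E", 0.08), ("F", 0.02)]
print(shannon_fano(src))
```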
c. Consider the following discrete memoryless source:
S = {S0, S1, S2, S3, S4}, P = {0.55, 0.15, 0.15, 0.1, 0.05}
Compute the Huffman code by placing the composite symbol as high as possible. Also find the average code word length and the variance of the code word lengths. (08 Marks)
Module-3
5 a. What is the Joint Probability Matrix (JPM)? How is it obtained from the channel matrix? Also mention the properties of the JPM. (06 Marks)
b. For the communication channel shown in Fig.Q5(b), determine the mutual information and the information rate if rs = 1000 symbols/sec. Assume P(X1) = 0.6 and P(X2) = 0.4.
[Fig.Q5(b): channel diagram] (06 Marks)
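Once the JPM of Q5(a) is in hand, mutual information is mechanical: I(X;Y) = H(X) + H(Y) − H(X,Y). A minimal sketch with a hypothetical 2×2 JPM (the real entries must be read off the channel in Fig.Q5(b), which is not reproduced here; the rows below merely match P(X1) = 0.6, P(X2) = 0.4):

```python
import numpy as np

# Hypothetical joint probability matrix P(x, y): rows = inputs, cols = outputs
JPM = np.array([[0.48, 0.12],
                [0.08, 0.32]])

px = JPM.sum(axis=1)                    # input marginals P(x)
py = JPM.sum(axis=0)                    # output marginals P(y)
HX = -np.sum(px * np.log2(px))
HY = -np.sum(py * np.log2(py))
HXY = -np.sum(JPM * np.log2(JPM))
I = HX + HY - HXY                       # mutual information, bits/symbol
print(f"I(X;Y) = {I:.4f} bits/symbol, information rate = {1000 * I:.1f} bits/sec")
```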
c. Discuss the binary erasure channel and also prove that the capacity of a binary erasure channel is C = (1 − p)·rs bits/sec. (08 Marks)
OR
6 a. What is Mutual Information? Mention its properties. (06 Marks)
b. The noise characteristics of a channel are shown in Fig.Q6(b). Find the capacity of the channel if rs = 2000 symbols/sec using Muroga's method.
[Fig.Q6(b): channel noise diagram] (06 Marks)
c. State and prove the Shannon-Hartley law. (08 Marks)
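The Shannon-Hartley law states C = B log2(1 + S/N) for a band-limited AWGN channel; a quick numerical illustration (bandwidth and SNR values chosen only for illustration):

```python
from math import log2

B = 3000                      # bandwidth in Hz (illustrative value)
snr_db = 30                   # signal-to-noise ratio in dB (illustrative value)
snr = 10 ** (snr_db / 10)     # convert dB to a power ratio
C = B * log2(1 + snr)         # channel capacity in bits/sec
print(f"C = {C:.0f} bits/sec")
```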
Module-4
7 a. What are the advantages and disadvantages of error control coding? Discuss the methods of controlling errors. (06 Marks)
b. The parity check bits of a (7, 4) Hamming code are generated by
c5 = d1 + d3 + d4
c6 = d1 + d2 + d3
c7 = d2 + d3 + d4
where d1, d2, d3 and d4 are the message bits.
(i) Find G and H for this code.
(ii) Prove that GH^T = 0. (06 Marks)
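With the parity equations above, the systematic matrices are G = [I4 | P] and H = [P^T | I3], and GH^T = 0 can be verified in a few lines of mod-2 arithmetic:

```python
import numpy as np

# P read row-by-row from c5 = d1+d3+d4, c6 = d1+d2+d3, c7 = d2+d3+d4
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 1, 1],
              [1, 0, 1]])
G = np.hstack([np.eye(4, dtype=int), P])      # generator matrix [I4 | P]
H = np.hstack([P.T, np.eye(3, dtype=int)])    # parity check matrix [P^T | I3]
print((G @ H.T) % 2)                          # all-zero matrix confirms GH^T = 0
```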
c. Design a syndrome calculating circuit for a (7, 4) cyclic code with g(X) = 1 + X + X^3 and also calculate the syndrome of the received vector R = 1110101. (08 Marks)
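The syndrome itself is the remainder of R(X) divided by g(X) over GF(2), which a short long-division sketch can confirm (coefficients are listed lowest degree first here; the opposite bit order is an equally common convention):

```python
def poly_mod2(r, g):
    """Remainder of r(X) / g(X) over GF(2), coefficients low degree first."""
    r = r[:]                                   # work on a copy
    for i in range(len(r) - 1, len(g) - 2, -1):
        if r[i]:                               # cancel the current leading term
            for j, gj in enumerate(g):
                r[i - len(g) + 1 + j] ^= gj
    return r[: len(g) - 1]

g = [1, 1, 0, 1]             # g(X) = 1 + X + X^3
R = [1, 1, 1, 0, 1, 0, 1]    # received vector 1110101, assumed low degree first
print(poly_mod2(R, g))       # the 3-bit syndrome
```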
OR
8 a. For a systematic (6, 3) linear block code, the parity matrix P is given by
        [1 0 1]
    P = [0 1 1]
        [1 1 0]
(i) Find all possible code words.
(ii) Find the error detecting and correcting capability. (06 Marks)
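All 2^3 code words and the error handling capability follow mechanically from G = [I3 | P]; a short sketch:

```python
import numpy as np
from itertools import product

P = np.array([[1, 0, 1],
              [0, 1, 1],
              [1, 1, 0]])
G = np.hstack([np.eye(3, dtype=int), P])          # systematic G = [I3 | P]

codewords = [(np.array(m) @ G) % 2 for m in product([0, 1], repeat=3)]
for c in codewords:
    print("".join(map(str, c)))

# for a linear code, dmin = smallest weight of a nonzero code word
dmin = min(int(c.sum()) for c in codewords if c.any())
print(f"dmin = {dmin}: detects {dmin - 1} errors, corrects {(dmin - 1) // 2}")
```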
b. A (7, 4) cyclic code has the generator polynomial g(X) = 1 + X + X^3. Find the code vector in both systematic and non-systematic form for the message bits (1101). (06 Marks)
c. Draw the encoder circuit of a cyclic code using (n − k) bit shift registers and explain it. (08 Marks)
Module-5
9 a. Consider a (3, 1, 2) convolutional encoder with g(1) = 110, g(2) = 101 and g(3) = 111.
(i) Draw the encoder diagram.
(ii) Find the code word for the message sequence (11101) using the generator matrix and the transform domain approach. (16 Marks)
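The code word can be cross-checked by mod-2 convolution of the message with each generator sequence, interleaving the three outputs per time step (a sketch for the generators given above):

```python
def conv_encode(msg, gens):
    """(n, 1, m) convolutional encoding: mod-2 convolution of msg with each
    generator sequence, outputs interleaved one time step at a time."""
    m = max(len(g) for g in gens) - 1      # encoder memory
    out = []
    for i in range(len(msg) + m):          # extra steps flush the register
        for g in gens:
            bit = 0
            for j, gj in enumerate(g):
                if gj and 0 <= i - j < len(msg):
                    bit ^= msg[i - j]
            out.append(bit)
    return out

g1, g2, g3 = [1, 1, 0], [1, 0, 1], [1, 1, 1]   # g(1)=110, g(2)=101, g(3)=111
msg = [1, 1, 1, 0, 1]                          # message sequence from the question
print("".join(map(str, conv_encode(msg, [g1, g2, g3]))))
```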
b. Discuss BCH codes. (04 Marks)
OR
10 a. Consider the convolutional encoder shown in Fig.Q10(a).
(i) Write the impulse response and its polynomial.
(ii) Find the output corresponding to the input message (10111) using the time domain and transform domain approaches.
[Fig.Q10(a): encoder diagram] (16 Marks)
b. Write a note on Golay codes. (04 Marks)

This post was last modified on 02 March 2020