VTU BE 2020 Jan ECE Question Paper, 2017 Scheme, 5th Sem, 17EC54 Information Theory and Coding
(Visvesvaraya Technological University, B.E., ECE, January 2020 previous question paper)

This post was last modified on 02 March 2020

USN                                                                        17EC54

Fifth Semester B.E. Degree Examination, Dec.2019/Jan.2020
Information Theory and Coding

Time: 3 hrs.                                                       Max. Marks: 100

Note: Answer any FIVE full questions, choosing ONE full question from each module.

Module-1
1 a. Suppose you are planning a trip to Miami, Florida from Minneapolis in the winter time. You
     receive the following information from the Miami Weather Bureau:
     (i) Mild and sunny day (ii) Cold day (iii) Possible snow flurries
     Explain the amount of information content in each statement. (06 Marks)
  b. The output of an information source consists of 128 symbols, 16 of which occur with
     probability 1/32 and the remaining 112 of which occur with probability 1/224. The source
     emits 1000 symbols/sec. Assuming that the symbols are chosen independently, find the
     average information rate of this source. (06 Marks)
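
Illustrative aside (not part of the question paper): a minimal Python sketch of the computation Q1(b) asks for, assuming the usual definitions; the entropy in bits/symbol times the symbol rate gives the average information rate.

```python
import math

# 16 symbols of probability 1/32 and 112 symbols of probability 1/224
probs = [1 / 32] * 16 + [1 / 224] * 112
assert abs(sum(probs) - 1.0) < 1e-12  # sanity check: a valid distribution

H = -sum(p * math.log2(p) for p in probs)  # entropy, ~6.40 bits/symbol
r_s = 1000                                 # symbols/sec
R = r_s * H                                # average information rate
print(f"H = {H:.3f} bits/symbol, R = {R:.0f} bits/sec")
```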
  c. The state diagram of a stationary Markoff source is shown in Fig.Q1(c). Assume
     P(1) = P(2) = P(3) = 1/3.
     (i) Find the entropy of each state
     (ii) Find the entropy of the source
     (iii) Find G1 and G2, and verify that G1 ≥ G2 ≥ H.
     [Fig.Q1(c): state diagram, not reproduced here] (08 Marks)

OR
2 a. What is self information? Mention its various measuring units and also the reasons for
     choosing the logarithmic function. (06 Marks)
  b. A binary source emits an independent sequence of 0's and 1's with probabilities p and
     1 - p respectively. Plot the entropy of this source versus probability. (06 Marks)
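Illustrative aside (not part of the question paper): Q2(b)'s curve is the binary entropy function H(p) = -p log2(p) - (1 - p) log2(1 - p); a minimal plotting sketch, assuming matplotlib is available.

```python
import numpy as np
import matplotlib.pyplot as plt

# H(p) = -p*log2(p) - (1-p)*log2(1-p); H(0) = H(1) = 0 by convention
p = np.linspace(1e-9, 1 - 1e-9, 1000)  # avoid log2(0) at the endpoints
H = -p * np.log2(p) - (1 - p) * np.log2(1 - p)

plt.plot(p, H)
plt.xlabel("p = P(0)")
plt.ylabel("H(p), bits/symbol")
plt.title("Entropy of a binary source (maximum 1 bit at p = 0.5)")
plt.show()
```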

  c. For the first order Markov statistical model shown in Fig.Q2(c):
     (i) Find the probability of each state (ii) Find H(S) and H(S²)
     [Fig.Q2(c): state diagram, not reproduced here] where A, B, and C are the states. (08 Marks)
Module-2
3 a. Identify whether the following codes are uniquely decodable or not:
     Symbols   Code A   Code B   Code C
     S1        00       1        0
     S2        01       01       100
     S3        10       001      101
     S4        11       00       111
     Table Q3(a) (06 Marks)
  b. Consider a discrete memoryless source (DMS) with S = {X, Y, Z} and P = {0.6, 0.2, 0.2}.
     Find the code word for the message "YXZXY" using arithmetic coding. (06 Marks)
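
Illustrative aside (not part of the question paper): a minimal sketch of arithmetic-coding interval narrowing for Q3(b), assuming the conventional cumulative-interval formulation; a full coder would also emit the shortest binary tag inside the final interval.

```python
# Cumulative intervals for S = {X, Y, Z} with P = {0.6, 0.2, 0.2}
intervals = {"X": (0.0, 0.6), "Y": (0.6, 0.8), "Z": (0.8, 1.0)}

low, high = 0.0, 1.0
for sym in "YXZXY":
    lo_s, hi_s = intervals[sym]
    span = high - low
    low, high = low + span * lo_s, low + span * hi_s  # narrow the interval

# Any number in [low, high) identifies the message; a real coder would
# emit the shortest binary fraction lying inside this interval.
print(f"final interval: [{low:.6f}, {high:.6f})")
```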
  c. An information source produces a sequence of independent symbols having the following
     probabilities:
     Symbol        A     B      C     D    E    F      G
     Probability   1/3   1/27   1/3   1/9  1/9  1/27   1/27
     Construct the binary Huffman encoding and find its efficiency. (08 Marks)
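
Illustrative aside (not part of the question paper): a minimal Huffman construction for Q3(c) using Python's heapq; the efficiency is H(S) divided by the average code length L.

```python
import heapq
import math
from fractions import Fraction

probs = {"A": Fraction(1, 3), "B": Fraction(1, 27), "C": Fraction(1, 3),
         "D": Fraction(1, 9), "E": Fraction(1, 9), "F": Fraction(1, 27),
         "G": Fraction(1, 27)}
assert sum(probs.values()) == 1

# Build the Huffman tree bottom-up: repeatedly merge the two least probable
# nodes. Heap entries are (probability, tiebreak, {symbol: partial code}).
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    p1, _, c1 = heapq.heappop(heap)
    p2, _, c2 = heapq.heappop(heap)
    merged = {s: "0" + code for s, code in c1.items()}
    merged.update({s: "1" + code for s, code in c2.items()})
    heapq.heappush(heap, (p1 + p2, counter, merged))
    counter += 1
codes = heap[0][2]

L = sum(probs[s] * len(codes[s]) for s in probs)  # average code length
H = -sum(float(p) * math.log2(float(p)) for p in probs.values())
print(codes)
print(f"L = {float(L):.3f} bits/symbol, efficiency = {H / float(L):.3%}")
```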
OR
4 a. Write Shannon's encoding algorithm. (06 Marks)
b. Consider the following source with probabilities:
S = {A, B, C, D, E, F}   P = {0.4, 0.2, 0.2, 0.1, 0.08, 0.02}


Find the code words using Shannon-Fano algorithm and also find its efficiency. (06 Marks)
c. Consider the following discrete memoryless source:
S = {S0, S1, S2, S3, S4}   P = {0.55, 0.15, 0.15, 0.1, 0.05}
Compute the Huffman code by placing the composite symbol as high as possible. Also find the
average code word length and the variance of the code word lengths. (08 Marks)


Module-3
5 a. What is the Joint Probability Matrix (JPM)? How is it obtained from the Channel Matrix?
     Also mention the properties of the JPM. (06 Marks)
  b. For the communication channel shown in Fig.Q5(b), determine the Mutual Information and
     the Information Rate if rs = 1000 symbols/sec. Assume P(X1) = 0.6 and P(X2) = 0.4.
     [Fig.Q5(b): channel diagram, not reproduced here] (06 Marks)


  c. Discuss the Binary Erasure Channel and also prove that the capacity of a Binary Erasure
     Channel is C = (1 - p)·rs bits/sec, where p is the erasure probability. (08 Marks)
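
Illustrative aside (not part of the question paper): a numeric check of the BEC result via the mutual-information definition, a sketch assuming erasure probability p and equiprobable inputs (for the BEC the uniform input is capacity-achieving).

```python
import math

def bec_mutual_information(p_erase, p0=0.5):
    """I(X;Y) in bits/symbol for a binary erasure channel.

    Inputs 0/1 with P(X=0)=p0; the output equals the input with
    probability 1 - p_erase, or the erasure symbol 'e' otherwise.
    """
    px = {0: p0, 1: 1 - p0}
    def pyx(y, x):  # channel transition probabilities P(y|x)
        if y == "e":
            return p_erase
        return (1 - p_erase) if y == x else 0.0
    ys = [0, 1, "e"]
    py = {y: sum(px[x] * pyx(y, x) for x in px) for y in ys}
    I = 0.0
    for x in px:
        for y in ys:
            joint = px[x] * pyx(y, x)
            if joint > 0:
                I += joint * math.log2(joint / (px[x] * py[y]))
    return I

p = 0.2
print(bec_mutual_information(p), 1 - p)  # both ~0.8 bits/symbol; rate = rs * C
```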
OR
6 a. What is Mutual Information? Mention its properties. (06 Marks)
  b. The noise characteristics of a channel are shown in Fig.Q6(b). Find the capacity of the
     channel if rs = 2000 symbols/sec using Muroga's method.
     [Fig.Q6(b): channel noise matrix, not reproduced here] (06 Marks)


  c. State and prove the Shannon-Hartley Law. (08 Marks)
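
Illustrative note (not part of the question paper): the law to be stated, in its standard form for a band-limited AWGN channel, where B is the bandwidth in Hz and S/N is the signal-to-noise power ratio.

```latex
% Shannon-Hartley law: capacity of a band-limited AWGN channel
C = B \log_2\left(1 + \frac{S}{N}\right) \ \text{bits/sec}
```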
Module-4
7 a. What are the advantages and disadvantages of error control coding? Discuss the methods
     of controlling errors. (06 Marks)
  b. The parity check bits of a (7, 4) Hamming code are generated by
         c5 = d1 + d3 + d4
         c6 = d1 + d2 + d3
         c7 = d2 + d3 + d4
     where d1, d2, d3 and d4 are the message bits.
     (i) Find G and H for this code.
     (ii) Prove that GH^T = 0. (06 Marks)
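
Illustrative aside (not part of the question paper): a quick numeric check of (ii) under the parity equations above, building G = [I4 | P] and H = [P^T | I3] and verifying GH^T = 0 over GF(2). This matrix layout is one common convention, not necessarily the one expected in the exam.

```python
import numpy as np

# Parity equations: c5 = d1+d3+d4, c6 = d1+d2+d3, c7 = d2+d3+d4 (mod 2).
# Rows of P correspond to d1..d4; columns to c5, c6, c7.
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 1, 1],
              [1, 0, 1]])

G = np.hstack([np.eye(4, dtype=int), P])    # systematic generator [I4 | P]
H = np.hstack([P.T, np.eye(3, dtype=int)])  # parity check matrix [P^T | I3]

print((G @ H.T) % 2)  # all zeros => GH^T = 0 over GF(2)
```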

  c. Design a syndrome calculating circuit for a (7, 4) cyclic code with g(X) = 1 + X + X³ and
     also calculate the syndrome of the received vector R = 1110101. (08 Marks)
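
Illustrative aside (not part of the question paper): the syndrome is the remainder of R(X) divided by g(X) over GF(2); a minimal long-division sketch. The bit ordering is an assumption, here taken LSB-first, i.e. R = 1110101 read as r0..r6.

```python
def gf2_remainder(dividend, divisor):
    """Remainder of GF(2) polynomial division; index i holds the X^i coefficient."""
    r = list(dividend)
    deg_d = len(divisor) - 1
    for i in range(len(r) - 1, deg_d - 1, -1):
        if r[i]:  # cancel the leading term by XORing in the shifted divisor
            for j, dj in enumerate(divisor):
                r[i - deg_d + j] ^= dj
    return r[:deg_d]

g = [1, 1, 0, 1]             # g(X) = 1 + X + X^3
R = [1, 1, 1, 0, 1, 0, 1]    # assumed LSB-first: R(X) = 1 + X + X^2 + X^4 + X^6
print(gf2_remainder(R, g))   # syndrome = remainder coefficients [s0, s1, s2]
```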


OR
8 a. For a systematic (6, 3) linear block code, the parity matrix P is given by
             1 0 1
     [P] =   0 1 1
             1 1 0
     (i) Find all possible code words.
     (ii) Find the error detecting and correcting capability. (06 Marks)
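
Illustrative aside (not part of the question paper): enumerating the 2³ codewords from G = [I3 | P]; the minimum nonzero codeword weight d_min then gives the detection (d_min - 1) and correction (⌊(d_min - 1)/2⌋) capability.

```python
import itertools
import numpy as np

P = np.array([[1, 0, 1],
              [0, 1, 1],
              [1, 1, 0]])
G = np.hstack([np.eye(3, dtype=int), P])  # systematic generator [I3 | P]

codewords = [(np.array(m) @ G) % 2 for m in itertools.product([0, 1], repeat=3)]
d_min = min(int(c.sum()) for c in codewords if c.any())  # minimum Hamming weight
print(*codewords, sep="\n")
print(f"d_min = {d_min}: detects {d_min - 1} errors, corrects {(d_min - 1) // 2}")
```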
  b. A (7, 4) cyclic code has the generator polynomial g(X) = 1 + X + X³. Find the code vector
     in both systematic and non-systematic form for the message bits (1101). (06 Marks)
  c. Draw the encoder circuit of a cyclic code using (n - k) bit shift registers and explain it.
     (08 Marks)


Module-5
9 a. Consider a (3, 1, 2) convolution encoder with g(1) = 110, g(2) = 101 and g(3) = 111.
     (i) Draw the encoder diagram.
     (ii) Find the code word for the message sequence (11101) using the generator matrix and
     the transform domain approach. (16 Marks)
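
Illustrative aside (not part of the question paper): time-domain convolutional encoding for the generators above; the output interleaves the three streams v(1), v(2), v(3) per input bit. The bit ordering (g = 110 meaning the leftmost bit taps the current input) is an assumption.

```python
def conv_encode(msg, generators):
    """Rate-1/n convolutional encoding: GF(2) convolution of msg with each g."""
    K = max(len(g) for g in generators)   # constraint length
    T = len(msg) + K - 1                  # include flush (tail) bits
    out = []
    for t in range(T):
        for g in generators:              # one output bit per generator per t
            bit = 0
            for j, gj in enumerate(g):    # v(t) = sum_j g[j]*msg[t-j] (mod 2)
                if 0 <= t - j < len(msg):
                    bit ^= gj & msg[t - j]
            out.append(bit)
    return out

# g(1) = 110, g(2) = 101, g(3) = 111, message (11101)
g1, g2, g3 = [1, 1, 0], [1, 0, 1], [1, 1, 1]
print(conv_encode([1, 1, 1, 0, 1], [g1, g2, g3]))
```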
  b. Discuss BCH codes. (04 Marks)
OR
10 a. Consider the convolution encoder shown in Fig.Q10(a).
      (i) Write the impulse response and its polynomial.
      (ii) Find the output corresponding to the input message (10111) using the time and
      transform domain approaches.
      [Fig.Q10(a): encoder diagram, not reproduced here] (16 Marks)
   b. Write a note on Golay codes. (04 Marks)