Roll No.:
B.TECH.
THEORY EXAMINATION (SEM–IV) 2016-17
INFORMATION THEORY AND CODING
Time: 3 Hours
Max. Marks : 100
Note: Be precise in your answers. In case of numerical problems, assume data wherever it is not provided.
SECTION – A
1. Explain the following: 10 x 2 = 20
- Draw the block diagram of a communication system.
- Under what condition does entropy attain its maximum value? Write the expression for source efficiency.
- Which of the following codes is non-singular?
Source:  S1   S2   S3   S4
Code A:  00   001  101  110
Code B:  00   100  111  00
- List two important properties of mutual information.
- State the Shannon-Hartley theorem with its expression.
- List the properties of block codes.
- Find the Hamming weight of the two code vectors C1 = 0001010 and C2 = 1010101. (A verification sketch follows this list.)
- What are convolutional codes? How do they differ from block codes?
- Obtain an expression for the entropy of a zero-memory information source emitting an independent sequence of symbols.
- Why is the (23, 12) Golay code called a perfect code?
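A minimal Python sketch for the Hamming-weight item above; the helper name hamming_weight is illustrative:

```python
def hamming_weight(vector: str) -> int:
    """Hamming weight: the number of nonzero components of a code vector."""
    return sum(bit != '0' for bit in vector)

c1, c2 = "0001010", "1010101"
print(hamming_weight(c1))  # 2
print(hamming_weight(c2))  # 4
```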
SECTION – B
Attempt any five of the following questions: 5 x 10 = 50
- (i) A source emits one of four possible messages S1, S2, S3 and S4 with probabilities 4/11, 3/11, 2/11 and 2/11 respectively. Find the entropy of the source. List all the elements of the second extension of the source and hence show that H(S²) = 2H(S). (A verification sketch follows this question.)
(ii) Discuss the properties of entropy.
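A short numerical cross-check for part (i), assuming a memoryless source so that second-extension probabilities are products of the symbol probabilities:

```python
from itertools import product
from math import log2

p = [4/11, 3/11, 2/11, 2/11]  # P(S1)..P(S4) from the question

def entropy(probs):
    """H = -sum p log2 p, in bits per symbol."""
    return -sum(q * log2(q) for q in probs if q > 0)

h_s = entropy(p)

# Second extension: all 16 ordered pairs; the source is memoryless,
# so P(si, sj) = P(si) * P(sj).
h_s2 = entropy([a * b for a, b in product(p, repeat=2)])

print(f"H(S)   = {h_s:.4f} bits/symbol")
print(f"H(S^2) = {h_s2:.4f} bits/symbol (2 H(S) = {2 * h_s:.4f})")
```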
- (i) Discuss the extremal property of entropy with examples.
(ii) Explain the need for source coding in a communication system and discuss compact codes.
- (i) Consider the source S = {X1, X2, X3, X4, X5, X6} with probabilities P = {0.4, 0.2, 0.2, 0.1, 0.08, 0.02}. Find the code words using the Shannon-Fano algorithm and determine the efficiency of the code. (A construction sketch follows this question.)
(ii) Clearly explain the differential entropy of a continuous signal. How does it differ from the entropy of discrete signals?
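A sketch of the Shannon-Fano construction for this source. Tie-breaking at the split point varies between textbooks, so an equally valid code with the same average length may differ codeword by codeword:

```python
from math import log2

def shannon_fano(symbols):
    """Split a probability-sorted symbol list where the two halves'
    totals are closest, prefix '0' and '1', and recurse."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    best, split, acc = float("inf"), 1, 0.0
    for i in range(1, len(symbols)):
        acc += symbols[i - 1][1]
        if abs(total - 2 * acc) < best:
            best, split = abs(total - 2 * acc), i
    codes = {s: "0" + c for s, c in shannon_fano(symbols[:split]).items()}
    codes.update({s: "1" + c for s, c in shannon_fano(symbols[split:]).items()})
    return codes

probs = {"X1": 0.4, "X2": 0.2, "X3": 0.2, "X4": 0.1, "X5": 0.08, "X6": 0.02}
codes = shannon_fano(sorted(probs.items(), key=lambda kv: -kv[1]))
L = sum(probs[s] * len(c) for s, c in codes.items())
H = -sum(p * log2(p) for p in probs.values())
print(codes)
print(f"L = {L:.2f} bits/symbol, H = {H:.4f} bits/symbol, efficiency = {H / L:.1%}")
```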
- (i) Explain the properties of mutual information.
(ii) For a systematic (7, 4) linear block code, the parity matrix P is given by
P = [1 1 1]
    [1 1 0]
    [1 0 1]
    [0 1 1]
(A) Find all possible code vectors.
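A brute-force enumeration sketch. It assumes the systematic convention G = [I4 | P] with the four message bits first and the three parity bits last; some texts place the parity bits first:

```python
from itertools import product

# Parity sub-matrix P from the question (4 rows, 3 columns).
P = [[1, 1, 1],
     [1, 1, 0],
     [1, 0, 1],
     [0, 1, 1]]

# With G = [I4 | P], every codeword is c = m G (mod 2): the message
# followed by its three parity bits.
for m in product((0, 1), repeat=4):
    parity = [sum(m[i] * P[i][j] for i in range(4)) % 2 for j in range(3)]
    msg = "".join(map(str, m))
    print(msg, "->", msg + "".join(map(str, parity)))
```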
- (i) Discuss data compression techniques.
(ii) Consider the (4, 3, 2) convolutional code with input sequences u(1) = (101), u(2) = (110) and u(3) = (011). The corresponding input polynomials are u(1)(D) = 1 + D², u(2)(D) = 1 + D and u(3)(D) = D + D². Construct the codeword using the transform domain approach.
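The generator polynomials of the code are not given in the question, so the sketch below only illustrates the D-transform step of the transform domain approach: mapping each input sequence to its polynomial and multiplying polynomials over GF(2):

```python
def to_poly(bits):
    """Render a sequence (u0, u1, u2, ...) as u0 + u1 D + u2 D^2 + ..."""
    terms = ["1" if i == 0 else "D" if i == 1 else f"D^{i}"
             for i, b in enumerate(bits) if b]
    return " + ".join(terms) if terms else "0"

def poly_mul(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

for u in [(1, 0, 1), (1, 1, 0), (0, 1, 1)]:
    print(u, "->", to_poly(u))   # matches u(1)(D), u(2)(D), u(3)(D) above
print(poly_mul([1, 1], [1, 0, 1]))  # (1 + D)(1 + D^2) -> [1, 1, 1, 1]
```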
- (i) A transmitter has a source alphabet of five letters {a1, a2, a3, a4, a5} and the receiver an alphabet of four letters {b1, b2, b3, b4}. The joint probabilities of the system are given by
P(A, B) = [0.25  0     0     0]
          [0.10  0.30  0     0]
          [0     0.05  0.10  0]
          [0     0.05  0.10  0]
          [0     0.05  0     0]
Compute H(A), H(B), H(A, B) and I(A; B).
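A computation sketch for this question, using the identity I(A; B) = H(A) + H(B) - H(A, B):

```python
from math import log2

# Joint probability matrix P(A, B): rows a1..a5, columns b1..b4.
P = [[0.25, 0.00, 0.00, 0.00],
     [0.10, 0.30, 0.00, 0.00],
     [0.00, 0.05, 0.10, 0.00],
     [0.00, 0.05, 0.10, 0.00],
     [0.00, 0.05, 0.00, 0.00]]

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

pa = [sum(row) for row in P]        # marginal P(a_i): sum across each row
pb = [sum(col) for col in zip(*P)]  # marginal P(b_j): sum down each column
h_a, h_b = H(pa), H(pb)
h_ab = H([p for row in P for p in row])
i_ab = h_a + h_b - h_ab

print(f"H(A) = {h_a:.4f}, H(B) = {h_b:.4f}, "
      f"H(A,B) = {h_ab:.4f}, I(A;B) = {i_ab:.4f} bits")
```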
- (i) Discuss (a) a priori entropy, (b) a posteriori entropy and (c) equivocation.
(ii) Explain uniquely decodable codes and optimal codes.
- (ii) An information source produces sequences of independent symbols having the following probabilities. Construct a ternary code using the Huffman coding procedure and find its efficiency.
Symbol:      A    B     C    D    E    F     G
Probability: 1/3  1/27  1/3  1/9  1/9  1/27  1/27
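A sketch of the r-ary Huffman procedure for r = 3. With 7 symbols, 7 ≡ 1 (mod r - 1), so no dummy symbols are needed; exact fractions keep the merges tie-free:

```python
import heapq
from fractions import Fraction
from math import log

probs = {"A": Fraction(1, 3), "B": Fraction(1, 27), "C": Fraction(1, 3),
         "D": Fraction(1, 9), "E": Fraction(1, 9), "F": Fraction(1, 27),
         "G": Fraction(1, 27)}
r = 3

# Repeatedly merge the r least probable nodes, prefixing the ternary
# digits 0..r-1 to the codes of the merged subtrees.
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
tie = len(heap)
while len(heap) > 1:
    merged, total = {}, Fraction(0)
    for digit in range(min(r, len(heap))):
        p, _, codes = heapq.heappop(heap)
        total += p
        merged.update({s: str(digit) + c for s, c in codes.items()})
    heapq.heappush(heap, (total, tie, merged))
    tie += 1

codes = heap[0][2]
L = sum(probs[s] * len(c) for s, c in codes.items())
H3 = -sum(float(p) * log(float(p), 3) for p in probs.values())
print(codes)
print(f"L = {float(L):.4f} ternary digits, H_3 = {H3:.4f}, "
      f"efficiency = {H3 / float(L):.1%}")
```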
- (i) Explain the concept of shortened cyclic codes and burst-error-correcting codes.
(ii) A source produces a sequence of symbols having the following probabilities.
Symbol:      A     B     C    D     E
Probability: 0.25  0.25  0.2  0.15  0.15
SECTION – C
Attempt any two of the following questions: 2 x 15 = 30
- (a) A binary symmetric channel has the channel matrix given below, with source probabilities P(X1) = 2/3 and P(X2) = 1/3. Determine H(X), H(Y), H(Y|X) and the channel capacity.
P(Y|X) = [3/4  1/4]
         [1/4  3/4]
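A numerical sketch for part (a). The last line uses the standard BSC result C = 1 - H(p) with crossover probability p = 1/4, achieved with equiprobable inputs:

```python
from math import log2

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

px = [2/3, 1/3]
channel = [[3/4, 1/4],   # rows are P(y | x)
           [1/4, 3/4]]

py = [sum(px[i] * channel[i][j] for i in range(2)) for j in range(2)]
h_y_given_x = sum(px[i] * H(channel[i]) for i in range(2))
capacity = 1 - H([1/4, 3/4])  # BSC: C = 1 - H(p)

print(f"H(X) = {H(px):.4f}, H(Y) = {H(py):.4f}, "
      f"H(Y|X) = {h_y_given_x:.4f}, C = {capacity:.4f} bits")
```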
(b) Consider the four codes listed below. Identify the instantaneous codes using the Kraft-McMillan inequality theorem.
Source symbol   Code A   Code B   Code C   Code D
S1              0        0        0        0
S2              100      10       100      10
S3              110      110      110      110
S4              111      11       11       111
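A checking sketch for part (b). A code is instantaneous exactly when it is prefix-free; the Kraft-McMillan sum alone is only a necessary condition, so both tests are printed:

```python
from fractions import Fraction

codes = {"A": ["0", "100", "110", "111"],
         "B": ["0", "10", "110", "11"],
         "C": ["0", "100", "110", "11"],
         "D": ["0", "10", "110", "111"]}

def kraft_sum(words):
    """Kraft-McMillan sum: sum of 2^(-length) over all codewords."""
    return sum(Fraction(1, 2 ** len(w)) for w in words)

def is_prefix_free(words):
    """No codeword may be a prefix of another."""
    return not any(a != b and b.startswith(a) for a in words for b in words)

for name, words in codes.items():
    print(f"Code {name}: Kraft sum = {kraft_sum(words)}, "
          f"instantaneous = {is_prefix_free(words)}")
```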
- (a) Write a short note on:
- BCH codes and RS codes
- Golay codes
- Burst- and random-error-correcting codes
- (a) Discuss Hamming distance and minimum distance with good examples.
(b) Consider the (3, 1, 2) convolutional code with g(1) = (110), g(2) = (101) and g(3) = (111).
(i) Draw the encoder diagram and find the generator matrix.
(ii) Find the codeword corresponding to the information sequence (11101) using the time domain approach.
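An encoding sketch for part (b)(ii): each output stream is the GF(2) convolution of the information sequence with one generator sequence, and the codeword interleaves one bit per stream per time step (the interleaving order is the usual convention, assumed here):

```python
g = [(1, 1, 0), (1, 0, 1), (1, 1, 1)]  # g(1), g(2), g(3) from the question
u = [1, 1, 1, 0, 1]                    # information sequence
m = max(len(gi) for gi in g) - 1       # encoder memory (2)

def convolve_mod2(u, gi):
    """Time-domain encoding: v = u * g over GF(2)."""
    v = [0] * (len(u) + len(gi) - 1)
    for t in range(len(v)):
        v[t] = sum(u[k] for k in range(len(u))
                   if 0 <= t - k < len(gi) and gi[t - k]) % 2
    return v

streams = [convolve_mod2(u, gi) for gi in g]
codeword = [streams[j][t] for t in range(len(u) + m) for j in range(3)]
print("".join(map(str, codeword)))  # 111010001110100101011
```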