Download JNTUA B.Tech 4-2 R13 2018 July Supplementary 13A04805 Pattern Recognition and Application Question Paper

Download the JNTU Anantapur (JNTUA) B.Tech R13 4th Year 2nd Semester July 2018 Supplementary previous question paper for 13A04805 Pattern Recognition and Application.


Code: 13A04805


B.Tech IV Year II Semester (R13) Advanced Supplementary Examinations July 2018
PATTERN RECOGNITION & APPLICATION
(Electronics and Communication Engineering)

Time: 3 hours Max. Marks: 70

PART – A
(Compulsory Question)

*****
1 Answer the following: (10 X 02 = 20 Marks)
(a) List the various preprocessing steps in pattern recognition.
(b) What is associate memory?
(c) Differentiate between Posterior and Likelihood.
(d) Explain the Neyman–Pearson criterion.
(e) What is the difference between PCA and Fisher Linear Discriminant?
(f) Explain the normal distribution along with its sufficient statistics.
(g) Explain the LMS rule or Widrow-Hoff algorithm (a sketch follows this list).
(h) Explain Kesler's construction.
(i) List the three central issues in HMMs.
(j) Define and explain a dendrogram.
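
For item (g), a minimal illustrative sketch of the LMS (Widrow-Hoff) update w ← w + η(dₖ − wᵗxₖ)xₖ, written in Python; this is an editorial addition, not part of the question paper:

```python
import numpy as np

def lms(X, d, eta=0.01, epochs=50):
    """Widrow-Hoff / LMS rule: shrink the squared error (d_k - w^t x_k)^2
    with stochastic gradient steps  w <- w + eta * (d_k - w^t x_k) * x_k."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_k, d_k in zip(X, d):          # one pass over (sample, target) pairs
            w += eta * (d_k - w @ x_k) * x_k
    return w
```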

PART – B
(Answer all five units, 5 X 10 = 50 Marks)

UNIT – I

2 What is the importance of feature extraction in pattern recognition? Explain the significance of translation-, scale-, and rotation-invariant features.
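
As one concrete illustration of such features (an editorial sketch, not part of the paper): normalized central moments are translation- and scale-invariant by construction, and Hu's moment invariants built from them add rotation invariance. The helper name below is hypothetical:

```python
import numpy as np

def normalized_central_moment(img, p, q):
    """eta_pq for a grey-level image: centering on the centroid removes
    translation; dividing by a power of mu_00 removes scale (valid for p+q >= 2)."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()                                   # total "mass" of the image
    xbar, ybar = (x * img).sum() / m00, (y * img).sum() / m00
    mu_pq = ((x - xbar) ** p * (y - ybar) ** q * img).sum()
    return mu_pq / m00 ** (1 + (p + q) / 2)
```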
OR
3 Draw the flow chart of the design cycle and explain each step in detail, along with the problems associated with each step.

UNIT – II

4 Consider the minimax criterion for the zero-one loss function, that is, λ₁₁ = λ₂₂ = 0 and λ₁₂ = λ₂₁ = 1. Prove that in this case the decision regions will satisfy:

      ∫_R₂ p(x|ω₁) dx = ∫_R₁ p(x|ω₂) dx
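
An outline of the standard argument the question asks for (editorial addition): write the overall risk under zero-one loss as a function of the prior, then require it to be independent of that prior.

```latex
% Overall risk with zero-one loss, priors P(\omega_1) = 1 - P(\omega_2):
\begin{align*}
R &= P(\omega_1)\int_{R_2} p(x \mid \omega_1)\,dx
   + \bigl(1 - P(\omega_1)\bigr)\int_{R_1} p(x \mid \omega_2)\,dx \\
  &= \int_{R_1} p(x \mid \omega_2)\,dx
   + P(\omega_1)\Bigl[\int_{R_2} p(x \mid \omega_1)\,dx
   - \int_{R_1} p(x \mid \omega_2)\,dx\Bigr].
\end{align*}
% The minimax boundary makes R independent of the unknown prior, so the
% bracketed term must vanish, which is exactly the stated equality.
```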

OR
5 Consider the three-dimensional normal distribution p(x|ω) ~ N(μ, Σ), where:

      μ = (1, 2, 2)ᵗ   and   Σ = ( 1  0  0 )
                                  ( 0  5  2 )
                                  ( 0  2  5 )

  Find the probability density at the point x₀ = (0.5, 0, 1)ᵗ.
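
A quick numerical check of this density (editorial sketch; assumes NumPy and SciPy, and the reconstructed point x₀ = (0.5, 0, 1)ᵗ):

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, 2.0, 2.0])
sigma = np.array([[1.0, 0.0, 0.0],
                  [0.0, 5.0, 2.0],
                  [0.0, 2.0, 5.0]])
x0 = np.array([0.5, 0.0, 1.0])

# Closed form: p(x) = exp(-(x-mu)^t Sigma^-1 (x-mu)/2) / ((2*pi)^(d/2) |Sigma|^(1/2))
diff = x0 - mu
d = mu.size
p_manual = np.exp(-0.5 * diff @ np.linalg.solve(sigma, diff)) / \
           np.sqrt((2 * np.pi) ** d * np.linalg.det(sigma))

# Cross-check against SciPy's implementation.
p_scipy = multivariate_normal(mean=mu, cov=sigma).pdf(x0)
print(p_manual, p_scipy)
```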

UNIT – III

6 Explain the k-nearest-neighbor rule in pattern classification with examples and neat diagrams, including computational complexity issues.
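
A minimal sketch of the k-NN decision rule (editorial addition; brute force, which is where the complexity issues the question mentions come from):

```python
import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, x, k=3):
    """Label x by majority vote among its k nearest training samples."""
    dists = np.linalg.norm(X_train - x, axis=1)   # O(n*d): distance to every sample
    nearest = np.argsort(dists)[:k]               # O(n log n) with a naive full sort
    return Counter(y_train[nearest]).most_common(1)[0][0]
```

np.argpartition would cut the selection step to O(n); tree- or hash-based indexes reduce the query cost further at the price of preprocessing.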
OR
7 Explain in detail the steps involved in dimensionality reduction using the PCA algorithm, taking a suitable example.
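
The usual steps, condensed into a sketch (editorial addition): centre the data, form the covariance matrix, take the top eigenvectors, and project.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project X (n_samples x n_features) onto its top principal components."""
    X_centered = X - X.mean(axis=0)                   # 1. mean-centre
    cov = np.cov(X_centered, rowvar=False)            # 2. covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)            # 3. eigen-decomposition
    order = np.argsort(eigvals)[::-1][:n_components]  # 4. largest eigenvalues first
    return X_centered @ eigvecs[:, order]             # 5. project onto subspace
```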


UNIT – IV

8 Explain the applicability of linear discriminants to unimodal and multimodal problems in two dimensions through the following:
(i) Sketch two multimodal distributions for which a linear discriminant could give excellent, or possibly even optimal, classification accuracy.
(ii) Sketch two unimodal distributions for which even the best linear discriminant would give poor classification accuracy.
OR
9 Explain the gradient descent procedure for linear discriminant functions and describe the algorithm.
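
A minimal sketch of such a descent procedure, using the perceptron criterion as the criterion function (one common choice; editorial addition, not part of the paper):

```python
import numpy as np

def perceptron_gd(X, y, eta=0.1, epochs=100):
    """Batch gradient descent on the perceptron criterion
    Jp(a) = sum over misclassified samples of (-a^t z).
    X: n x d data, y: labels in {-1, +1}."""
    X_aug = np.hstack([X, np.ones((len(X), 1))])  # augment with bias term
    Z = X_aug * y[:, None]                        # flip negative samples so a^t z > 0 is correct
    a = np.zeros(X_aug.shape[1])                  # weight vector
    for _ in range(epochs):
        mis = Z[Z @ a <= 0]                       # currently misclassified samples
        if len(mis) == 0:
            break                                 # linearly separable: converged
        a += eta * mis.sum(axis=0)                # gradient step: a <- a + eta * sum(z)
    return a
```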

UNIT – V

10 Explain HMM computation using neat diagrams consisting of nodes and transition probabilities.
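
For reference, the evaluation step of that computation as code (editorial sketch; the transition matrix A, emission matrix B, and initial distribution pi are assumed known model parameters):

```python
import numpy as np

def hmm_forward(A, B, pi, obs):
    """P(observation sequence) via the forward algorithm.
    A: (N x N) state transitions, B: (N x M) emission probabilities,
    pi: (N,) initial state distribution, obs: observation indices."""
    alpha = pi * B[:, obs[0]]             # initialise with the first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]     # induction: propagate along edges, then emit
    return alpha.sum()                    # terminate: sum over final states
```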
OR
11 Explain the nearest-neighbour algorithm and the farthest-neighbour algorithm in detail, using neat diagrams.
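
These rules correspond to single-linkage and complete-linkage agglomerative clustering; a compact sketch using SciPy's hierarchical-clustering routines (editorial addition; the data here is a hypothetical toy set):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

X = np.random.rand(20, 2)                 # toy 2-D data

# Nearest-neighbour (single-link): inter-cluster distance = closest pair.
single_tree = linkage(X, method='single')

# Farthest-neighbour (complete-link): inter-cluster distance = farthest pair.
complete_tree = linkage(X, method='complete')

# Cut each dendrogram into 3 clusters.
labels_single = fcluster(single_tree, t=3, criterion='maxclust')
labels_complete = fcluster(complete_tree, t=3, criterion='maxclust')
```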

*****
