Code: 13A04805
B.Tech IV Year II Semester (R13) Advanced Supplementary Examinations July 2018
PATTERN RECOGNITION & APPLICATION
(Electronics and Communication Engineering)
Time: 3 hours
Max. Marks: 70
PART - A
(Compulsory Question)
*****
- Answer the following: (10 X 02 = 20 Marks)
- List the various preprocessing steps in pattern recognition.
- What is associative memory?
- Differentiate between Posterior and Likelihood.
- Explain Neyman–Pearson criterion.
- What is the difference between PCA and Fisher Linear Discriminant?
- Explain the normal distribution along with its sufficient statistics.
- Explain the LMS rule or Widrow-Hoff algorithm.
- Explain Kesler's construction.
- List the three central issues in HMMs.
- Define and explain the dendrogram.
PART - B
(Answer all five units, 5 X 10 = 50 Marks)
UNIT - I
- What is the importance of feature extraction in pattern recognition? Explain the significance of translation-, scale- and rotation-invariant features.
OR
Draw the flow chart of the design cycle and explain each step in detail along with the problems associated in each step.
UNIT - II
- Consider the minimax criterion for the zero-one loss function, that is, λ11 = λ22 = 0 and λ12 = λ21 = 1. Prove that in this case the decision regions will satisfy ∫_R2 p(x|ω1) dx = ∫_R1 p(x|ω2) dx.
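One way to sketch the required proof (the standard Bayes-risk argument): under zero-one loss the overall risk, written as a function of the prior P(ω1), is

```latex
R = P(\omega_1)\int_{R_2} p(x \mid \omega_1)\,dx
  + \bigl(1 - P(\omega_1)\bigr)\int_{R_1} p(x \mid \omega_2)\,dx .
```

The minimax regions are chosen so that R does not depend on the prior, so the coefficient of P(ω1) must vanish:

```latex
\frac{\partial R}{\partial P(\omega_1)}
  = \int_{R_2} p(x \mid \omega_1)\,dx - \int_{R_1} p(x \mid \omega_2)\,dx = 0 ,
```

which is the stated equality.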
OR
Consider the three-dimensional normal distribution p(x|ω) ~ N(μ, Σ) where:
μ = … and Σ = …
Find the probability density at the point x0 = (0.5, 0, 1)^T.
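As a minimal sketch of the computation, the density can be evaluated numerically once μ and Σ are known. The μ and Σ values below are placeholders only, since the vector and matrix are not reproduced above:

```python
import numpy as np

def mvn_density(x, mu, sigma):
    # N(mu, Sigma) density: |2*pi*Sigma|^(-1/2) * exp(-0.5 (x-mu)^T Sigma^-1 (x-mu))
    d = len(mu)
    diff = x - mu
    norm = 1.0 / np.sqrt((2.0 * np.pi) ** d * np.linalg.det(sigma))
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(sigma) @ diff)

# Placeholder values -- substitute the actual mu and Sigma from the question.
mu = np.array([1.0, 2.0, 2.0])
sigma = np.array([[1.0, 0.0, 0.0],
                  [0.0, 5.0, 2.0],
                  [0.0, 2.0, 5.0]])
x0 = np.array([0.5, 0.0, 1.0])
print(mvn_density(x0, mu, sigma))
```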
UNIT - III
- Explain the k-nearest-neighbor rule in pattern classification with examples and neat diagrams, including its computational complexity issues.
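A minimal sketch of the rule (brute-force search; the naive O(n·d) distance computation per query is the usual starting point for the complexity discussion):

```python
import numpy as np
from collections import Counter

def knn_classify(x, X_train, y_train, k=3):
    # Majority vote among the k training samples closest to x.
    dists = np.linalg.norm(X_train - x, axis=1)  # O(n*d) Euclidean distances
    nearest = np.argsort(dists)[:k]              # indices of the k smallest
    return Counter(y_train[nearest].tolist()).most_common(1)[0][0]

# Toy usage with two classes:
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_classify(np.array([0.2, 0.1]), X_train, y_train, k=3))  # -> 0
```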
OR
Explain in detail the various steps involved in dimensionality reduction using the PCA algorithm, with a suitable example.
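A minimal sketch of the usual steps (mean-centring, covariance estimation, eigendecomposition, projection), assuming plain NumPy:

```python
import numpy as np

def pca(X, n_components):
    # 1. Mean-centre the data matrix X (n_samples x d).
    Xc = X - X.mean(axis=0)
    # 2. Estimate the d x d covariance matrix.
    cov = np.cov(Xc, rowvar=False)
    # 3. Eigendecomposition (eigh, since cov is symmetric).
    eigvals, eigvecs = np.linalg.eigh(cov)
    # 4. Keep the eigenvectors with the largest eigenvalues.
    order = np.argsort(eigvals)[::-1][:n_components]
    W = eigvecs[:, order]
    # 5. Project the centred data onto the reduced subspace.
    return Xc @ W
```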
UNIT - IV
- Explain the applicability of linear discriminants to unimodal and multimodal problems in two dimensions through the following:
- Sketch two multimodal distributions for which a linear discriminant could give excellent, or possibly even optimal, classification accuracy.
- Sketch two unimodal distributions for which even the best linear discriminant would give poor classification accuracy.
OR
Explain the gradient descent procedure for linear discriminant functions and describe the algorithm.
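As an illustration, a minimal sketch of batch gradient descent on the perceptron criterion, one standard instance of the descent procedure (the learning rate and stopping rule here are assumptions):

```python
import numpy as np

def perceptron_descent(X, y, eta=0.1, epochs=100):
    # X: augmented samples (n x (d+1)), y: labels in {-1, +1}.
    Xn = X * y[:, None]              # flip class-2 samples ("normalisation")
    a = np.zeros(X.shape[1])         # weight vector
    for _ in range(epochs):
        mis = Xn[Xn @ a <= 0]        # samples the current a misclassifies
        if len(mis) == 0:
            break                    # training set linearly separated
        a += eta * mis.sum(axis=0)   # step opposite the gradient of J_p(a)
    return a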
UNIT - V
- Explain HMM computation using neat diagrams consisting of nodes and transition probabilities.
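A minimal sketch of the core HMM computation (the forward algorithm for the probability of an observation sequence in a discrete HMM), assuming NumPy:

```python
import numpy as np

def hmm_forward(A, B, pi, obs):
    # A: state-transition matrix (N x N), B: emission matrix (N x M),
    # pi: initial state distribution (N,), obs: observed symbol indices.
    alpha = pi * B[:, obs[0]]             # initialise with the first symbol
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]     # propagate one time step
    return alpha.sum()                    # P(observations | model)

# Toy 2-state, 2-symbol model:
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.5, 0.5])
print(hmm_forward(A, B, pi, [0, 1, 0]))
```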
OR
Explain the nearest-neighbour algorithm and the farthest-neighbour algorithm in detail, using neat diagrams.
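Assuming the question refers to agglomerative clustering with single (nearest-neighbour) and complete (farthest-neighbour) linkage, a minimal sketch:

```python
import numpy as np

def agglomerative(X, n_clusters, linkage="single"):
    # Start with each sample as its own cluster; repeatedly merge the
    # closest pair under the chosen inter-cluster distance.
    clusters = [[i] for i in range(len(X))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = [np.linalg.norm(X[p] - X[q])
                     for p in clusters[i] for q in clusters[j]]
                # single linkage: min pairwise distance; complete: max.
                dist = min(d) if linkage == "single" else max(d)
                if best is None or dist < best[0]:
                    best = (dist, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)    # merge the closest pair
    return clusters
```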
*****