Important RGPV Questions
AL603 (C) Pattern Recognition
VI Sem, AIML
UNIT 1 - Introduction and Mathematical Preliminaries
Q.1) What are the important components of learning?
Q.2) What do you mean by defuzzification?
Q.3) Define expectation and mean.
Q.4) What are dimension reduction methods? Explain the Principal Component Analysis (PCA) algorithm for dimension reduction and also write its limitations. (An illustrative PCA sketch is given after this unit's questions.)
Q.5) Explain multivariate normal density using mathematical notations.
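For Q.4, a minimal PCA sketch in Python (NumPy only, with hypothetical toy data) shows the usual steps: center the data, form the covariance matrix, take its top eigenvectors, and project.

```python
import numpy as np

def pca(X, n_components=2):
    """Minimal PCA sketch: project data onto the top principal components."""
    # Center the data, since PCA works on zero-mean features
    X_centered = X - X.mean(axis=0)
    # Covariance matrix of the features (columns of X)
    cov = np.cov(X_centered, rowvar=False)
    # Eigen-decomposition; eigh is appropriate because the covariance matrix is symmetric
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Keep the eigenvectors with the largest eigenvalues (most variance explained)
    order = np.argsort(eigvals)[::-1][:n_components]
    components = eigvecs[:, order]
    # Project the centered data onto the selected components
    return X_centered @ components

# Hypothetical 5-dimensional toy data reduced to 2 dimensions
X = np.random.rand(100, 5)
print(pca(X).shape)  # (100, 2)
```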
UNIT 2 - Pattern Recognition Basics: Bayesian Decision Theory
Q.1) What is a prior probability?
Q.2) If the discriminant functions are equal, to which class is the pattern assigned? Also give the optimal discriminant function for the two-class case.
Q.3) What are the different clustering techniques? Explain. Also explain the agglomerative clustering algorithm step by step.
Q.4) Explain supervised and unsupervised learning algorithms using a block diagram. Is clustering considered unsupervised learning? Justify.
Q.5) What is Bayesian decision theory? Explain two-category classification. (An illustrative decision-rule sketch is given after this unit's questions.)
Q.6) What do you mean by fuzzy decision making? Also discuss fuzzy classification in detail using suitable examples.
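For Q.5, a minimal sketch of the two-category Bayes decision rule, assuming made-up Gaussian class-conditional densities and priors (the numbers are illustrative only): assign x to the class with the larger posterior, i.e. the larger p(x | w_i) P(w_i).

```python
from scipy.stats import norm

# Assumed priors and class-conditional densities for two classes w1, w2
prior = {"w1": 0.6, "w2": 0.4}
likelihood = {"w1": norm(0.0, 1.0),   # p(x | w1) ~ N(0, 1)
              "w2": norm(2.0, 1.0)}   # p(x | w2) ~ N(2, 1)

def decide(x):
    """Pick the class with the larger unnormalized posterior p(x | w_i) * P(w_i)."""
    g1 = likelihood["w1"].pdf(x) * prior["w1"]
    g2 = likelihood["w2"].pdf(x) * prior["w2"]
    return "w1" if g1 >= g2 else "w2"

print(decide(0.5))  # w1: x is closer to the mean of w1
print(decide(2.5))  # w2: x is closer to the mean of w2
```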
UNIT 3 - Feature Selection and Extraction: Problem Statement and Uses
Q.1) What is Machine Learning? Why do we need Machine Learning?
Q.2) Write an algorithm for k-nearest neighbor estimation. Explain.
Q.3) Explain the difference between feature selection and feature extraction. Provide one real-world example where feature selection is preferred over feature extraction and justify why.
Q.4) Describe the steps of the Branch and Bound algorithm for feature selection. What are its advantages and limitations when applied to high-dimensional datasets?
Q.5) Compare Sequential Forward Selection (SFS) and Sequential Backward Selection (SBS) in terms of computational complexity and their suitability for a dataset with 100 features. Which would you choose for a real-time application and why?
Q.6) Using the Cauchy-Schwarz inequality, explain how you can measure the correlation between two features to reduce redundancy in a dataset. Provide a simple numerical example to illustrate. (A small numerical sketch is given after this unit's questions.)
Q.7) Define probabilistic separability-based and interclass distance-based criteria functions for feature selection. Which criterion would you use for a speech recognition task with overlapping class distributions, and why?
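For Q.6, a small numerical sketch with hypothetical feature values: the Pearson correlation is a centered inner product divided by the product of norms, which the Cauchy-Schwarz inequality bounds to [-1, 1]; a value near ±1 signals redundancy.

```python
import numpy as np

# Two hypothetical feature columns; x2 is roughly 2 * x1
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.1, 3.9, 6.2, 8.0, 9.9])

# Pearson correlation: centered inner product over the product of norms,
# guaranteed by Cauchy-Schwarz to lie in [-1, 1]
r = np.corrcoef(x1, x2)[0, 1]
print(round(r, 3))  # ~0.999, so the two features are highly redundant
```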
UNIT 4 - Visual Recognition: Human Visual Recognition System
Q.1) Explain components of a typical pattern recognition system with a neat diagram.
Q.2) How does the human visual recognition system inspire computational models for visual recognition? Discuss one specific feature of the human visual system that is emulated in convolutional neural networks (CNNs).
Q.3) Differentiate between low-level modeling, mid-level abstraction, and high-level reasoning in visual recognition. Provide an example of a task for each level in the context of autonomous driving.
Q.4) Explain the difference between semantic segmentation and instance segmentation. Name one algorithm for each and describe a scenario where instance segmentation is more appropriate than semantic segmentation.
Q.5) What is the role of context in scene understanding? Illustrate with an example how contextual information can improve object recognition in a cluttered indoor environment.
Q.6) Discuss the challenges of large-scale search and recognition in image databases. How can techniques like KD-trees or hashing address these challenges, and what are their limitations in egocentric vision applications? (An illustrative KD-tree search sketch is given after this unit's questions.)
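For Q.6, a minimal sketch of KD-tree-based nearest-neighbour search over a hypothetical database of image descriptors (random data stands in for real features); note that KD-tree performance degrades as the descriptor dimension grows.

```python
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical database of 10,000 image descriptors, 64-dimensional each
rng = np.random.default_rng(0)
database = rng.random((10_000, 64))

# Build the tree once, then answer nearest-neighbour queries without a full scan
tree = cKDTree(database)

query = rng.random(64)
distances, indices = tree.query(query, k=5)  # 5 nearest descriptors
print(indices)
```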
UNIT 5 - Recent Advancements in Pattern Recognition
Q.1) A roulette wheel has 38 slots: 18 red, 18 black, and 2 green. You play five games and always bet on red. What is the probability that you win all five games? (A worked calculation is given after this unit's questions.)
Q.2) Write short notes on:
i) Reinforcement learning
ii) Expectation Maximization
Q.3) Compare the performance of a Support Vector Machine (SVM) and a deep neural network (DNN) for image classification using metrics like accuracy and F1-score. Under what conditions might SVM outperform DNN?
Q.4) Define covariance and explain its role in feature selection. Given a 2×2 covariance matrix for two features, demonstrate how you would interpret its elements to decide feature relevance.
Q.5) What is data condensation, and how can it be applied to reduce the size of a large dataset like ImageNet? Discuss one specific method and its potential impact on model performance.
Q.6) Explain the Fuzzy C-Means (FCM) clustering algorithm and its advantages over hard clustering methods like K-means. Provide an example of a real-life dataset where FCM would be more suitable.
Q.7) Describe how t-SNE can be used for data visualization in pattern recognition. Discuss its limitations and suggest an alternative visualization technique for a high-dimensional dataset like COCO. (An illustrative t-SNE usage sketch is given after this unit's questions.)
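For Q.1, a worked calculation: each spin is independent and the chance of red on a single spin is 18/38, so winning five games in a row has probability (18/38)^5.

```python
# Probability of winning all five independent bets on red
p_single = 18 / 38
p_all_five = p_single ** 5
print(round(p_all_five, 4))  # ≈ 0.0238
```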
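For Q.7, a minimal t-SNE usage sketch with scikit-learn; the digits dataset stands in for a large image collection, and perplexity is an assumed, tunable neighbourhood-size parameter.

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

# 64-dimensional pixel features as a stand-in for high-dimensional image data
X, y = load_digits(return_X_y=True)

# Embed into 2-D for visualization; results vary with perplexity and random_state
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

plt.scatter(embedding[:, 0], embedding[:, 1], c=y, s=5, cmap="tab10")
plt.title("t-SNE embedding of 64-D digit features")
plt.show()
```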
— Best of Luck for Exam —