Download JNTUK M.Tech R19 CSE M.Tech Neural Networks Course Structure And Syllabus

Download JNTU Kakinada (Jawaharlal Nehru Technological University, Kakinada) M.Tech (Master of Technology) R19 CSE M.Tech Neural Networks Course Structure And Syllabus


JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA - 533 003, Andhra Pradesh, India


DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING


COURSE STRUCTURE & SYLLABUS
M.Tech CSE - NEURAL NETWORKS PROGRAMME

(Applicable for batches admitted from 2019-2020)







M. Tech. (NN) I SEMESTER

S.No | Course Code | Category | Course | L T P C
1 | MTNN1101 | PC | Program Core-1: Artificial Neural Networks | 3 0 0 3
2 | MTNN1102 | PC | Program Core-2: Advanced Data Structures | 3 0 0 3
3 | MTNN1103 | PE | Program Elective-1: 1. Machine Learning / 2. Intelligent Systems / 3. Expert Systems | 3 0 0 3
4 | MTNN1104 | PE | Program Elective-2: 1. Data Warehouse and Data Mining / 2. Recommender Systems / 3. Pattern Recognition | 3 0 0 3
5 | MTNN1105 | CC | Research Methodology and IPR | 2 0 0 2
6 | MTNN1106 | LB | Laboratory-1: Advanced Data Structures Lab | 0 0 4 2
7 | MTNN1107 | LB | Laboratory-2: Neural Networks Lab | 0 0 4 2
8 | MTNN1108 | AC | Audit Course-1* | 2 0 0 0
Total Credits: 18
M. Tech. (NN) II SEMESTER

S.No | Course Code | Category | Course | L T P C
1 | MTNN1201 | PC | Program Core-3: Soft Computing | 3 0 0 3
2 | MTNN1202 | PC | Program Core-4: Deep Learning | 3 0 0 3
3 | MTNN1203 | PE | Program Elective-3: 1. Computer Vision / 2. Big Data Analytics / 3. Remote Sensing | 3 0 0 3
4 | MTNN1204 | PE | Program Elective-4: 1. Cognitive Systems / 2. Knowledge Discovery / 3. Natural Language Processing | 3 0 0 3
5 | MTNN1205 | LB | Laboratory-3: Soft Computing Lab | 0 0 4 2
6 | MTNN1206 | LB | Laboratory-4: Deep Learning Lab | 0 0 4 2
7 | MTNN1207 | MP | Mini Project with Seminar | 2 0 0 2
8 | MTNN1208 | AC | Audit Course-2* | 2 0 0 0
Total Credits: 18



*Students have to choose any one audit course from the list below.
Audit Courses 1 & 2:
1. English for Research Paper Writing
2. Disaster Management
3. Sanskrit for Technical Knowledge
4. Value Education
5. Constitution of India
6. Pedagogy Studies
7. Stress Management by Yoga
8. Personality Development through Life Enlightenment Skills
M. Tech. (NN) III SEMESTER

S.No | Course Code | Category | Course | L T P C
1 | MTNN2101 | PE | Program Elective-5: 1. Reinforcement Learning / 2. Bio-Informatics / 3. Speech Processing / 4. MOOCS-I (NPTEL/SWAYAM - any 12-week programme) | 3 0 0 3
2 | MTNN2102 | OE | Open Elective: 1. MOOCS-II (NPTEL/SWAYAM - any 12-week interdisciplinary course, but not from the parent department) / 2. Courses offered by other departments in the college | 3 0 0 3
3 | MTNN2103 | PJ | Dissertation-I / Industrial Project # | 0 0 20 10
Total Credits: 16
# Students going for Industrial Project/Thesis will complete these courses through MOOCs.

Open Electives offered to Other Departments:
1. Python Programming
2. Artificial Intelligence
3. Machine Learning
4. Deep Learning

M. Tech. (NN) IV SEMESTER

S.No | Course Code | Category | Course | L T P C
1 | MTNN2201 | PJ | Dissertation-II | 0 0 32 16
Total Credits: 16




I Year - I Semester    L T P C: 3 0 0 3
Artificial Neural Networks

Course Objectives:
• The main objective of this course is to provide the student with a basic understanding of neural network fundamentals.
• Program the related algorithms and design the required and related systems.
Course Outcomes:
• Demonstrate ANN structure and activation functions.
• Define foundations, learning mechanisms and state-space concepts.
• Identify the structure and learning of perceptrons.
• Explain feed-forward networks, multi-layer feed-forward networks and the back propagation algorithm.
• Analyze Radial Basis Function networks, regularization theory, and regularized RBF networks.
UNIT-I: Introduction and ANN Structure, Biological neurons and artificial neurons. Model of an
ANN. Activation functions used in ANNs. Typical classes of network architectures.

UNIT-II: Mathematical Foundations and Learning Mechanisms. Re-visiting vector and matrix algebra, State-space concepts, Concepts of optimization, Error-correction learning, Memory-based learning, Hebbian learning, Competitive learning.
UNIT-III: Single layer perceptrons, Structure and learning of perceptrons, Pattern classifier - introduction and Bayes' classifiers, Perceptron as a pattern classifier, Perceptron convergence, Limitations of perceptrons.
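As a concrete illustration of the perceptron learning rule covered in this unit, the following minimal Python/NumPy sketch trains a single-layer perceptron with the error-correction update; the data set, learning rate and variable names are illustrative assumptions, not prescribed by the syllabus.

import numpy as np

# Toy linearly separable patterns (rows) with labels in {-1, +1}.
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])

w = np.zeros(X.shape[1])   # weight vector
b = 0.0                    # bias
eta = 0.1                  # learning rate

for epoch in range(20):
    errors = 0
    for xi, target in zip(X, y):
        predicted = 1 if np.dot(w, xi) + b >= 0 else -1
        if predicted != target:
            # Error-correction update: shift the decision boundary toward the pattern.
            w += eta * target * xi
            b += eta * target
            errors += 1
    if errors == 0:        # perceptron convergence on separable data
        break

print("weights:", w, "bias:", b)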

UNIT-IV: Feed forward ANN, Structures of Multi-layer feed forward networks. Back
propagation algorithm, Back propagation - training and convergence, Functional approximation
with back propagation. Practical and design issues of back propagation learning.
UNIT-V: Radial Basis Function Networks, Pattern separability and interpolation, Regularization theory, Regularization and RBF networks, RBF network design and training, Approximation properties of RBFs.

Text Books:
1. Simon Haykin, "Neural Networks: A comprehensive foundation", Second Edition, Pearson
Education Asia.
2. Satish Kumar, "Neural Networks: A classroom approach", Tata McGraw Hill, 2004.
Reference Books:
1. Robert J. Schalkoff, "Artificial Neural Networks", McGraw-Hill International Editions, 1997.






I Year - I Semester    L T P C: 3 0 0 3
Advanced Data Structures

Course Objectives:

• The student should be able to choose appropriate data structures, understand the ADT/libraries, and use them to design algorithms for a specific problem.
• Students should be able to understand the necessary mathematical abstraction to solve problems.
• To familiarize students with advanced paradigms and data structures used to solve algorithmic problems.
• Students should be able to come up with analysis of efficiency and proofs of correctness.

Course Outcomes:

After completion of the course, students will be able to:
• Understand the implementation of symbol tables using hashing techniques.
• Develop and analyze algorithms for red-black trees, B-trees and splay trees.
• Develop algorithms for text processing applications.
• Identify suitable data structures and develop algorithms for computational geometry problems.

UNIT-I: Dictionaries:
Definition, Dictionary Abstract Data Type, and Implementation of
Dictionaries. Hashing: Review of Hashing, Hash Function, Collision Resolution Techniques in
Hashing, Separate Chaining, Open Addressing, Linear Probing, Quadratic Probing, Double
Hashing, Rehashing, Extendible Hashing.

UNIT-II: Skip Lists:
Need for Randomizing Data Structures and Algorithms, Search and
Update Operations on Skip Lists, Probabilistic Analysis of Skip Lists, Deterministic Skip Lists.

UNIT-III: Trees:
Binary Search Trees, AVL Trees, Red Black Trees, 2-3 Trees, B-Trees, Splay
Trees.
UNIT-IV: Text Processing: String Operations, Brute-Force Pattern Matching, The Boyer-Moore Algorithm, The Knuth-Morris-Pratt Algorithm, Standard Tries, Compressed Tries, Suffix Tries, The Huffman Coding Algorithm, The Longest Common Subsequence Problem (LCS), Applying Dynamic Programming to the LCS Problem.
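As the unit closes with dynamic programming applied to the LCS problem, a short Python sketch of that formulation is given below; the function and variable names are the editor's assumptions, not part of the prescribed text.

def lcs_length(a, b):
    # dp[i][j] holds the LCS length of a[:i] and b[:j].
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

print(lcs_length("ABCBDAB", "BDCABA"))   # prints 4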

UNIT-V: Computational Geometry:
One Dimensional Range Searching, Two Dimensional
Range Searching, Constructing a Priority Search Tree, Searching a Priority Search Tree, Priority
Range Trees, Quad trees, k-D Trees. Recent Trends in Hashing, Trees, and various
computational geometry methods for efficiently solving the new evolving problem.

Text Books:

1. Mark Allen Weiss, Data Structures and Algorithm Analysis in C++, 2nd Edition, Pearson, 2004.
2. M. T. Goodrich, Roberto Tamassia, Algorithm Design, John Wiley, 2002.



I Year - I Semester    L T P C: 3 0 0 3
Machine Learning

Course Objectives:
• Identify problems that are amenable to solution by AI methods, and which AI methods may be suited to solving a given problem.
• Formalize a given problem in the language/framework of different AI methods (e.g., as a search problem, as a constraint satisfaction problem, as a planning problem, as a Markov decision process, etc.).
• Implement basic AI algorithms (e.g., standard search algorithms or dynamic programming).
• Design and carry out an empirical evaluation of different algorithms on a problem formalization, and state the conclusions that the evaluation supports.

Course Outcomes:
After the completion of the course, the student will be able to:
• Explain the definition and usage of the term 'the Internet of Things' in different contexts.
• Demonstrate various network protocols used in IoT.
• Analyze various key wireless technologies used in IoT systems, such as WiFi, 6LoWPAN, Bluetooth and ZigBee.
• Illustrate the role of big data, cloud computing and data analytics in IoT systems.
• Design a simple IoT system made up of sensors, wireless network connection, data analytics and display/actuators, and write the necessary control software.
UNIT-I: Introduction-Towards Intelligent Machines, Well posed Problems, Example of Applications in
diverse fields, Data Representation, Domain Knowledge for Productive use of Machine Learning,
Diversity of Data: Structured Unstructured, Forms of Learning, Machine Learning and Data Mining,
Basic Linear Algebra in Machine Learning Techniques.

UNIT-II: Supervised Learning - Rationale and Basics: Learning from Observations, Bias and Why Learning Works: Computational Learning Theory, Occam's Razor Principle and Overfitting Avoidance, Heuristic Search in Inductive Learning, Estimating Generalization Errors, Metrics for assessing regression, Metrics for assessing classification.

UNIT-III: Statistical Learning - Machine Learning and Inferential Statistical Analysis, Descriptive Statistics in learning techniques, Bayesian Reasoning: a probabilistic approach to inference, K-Nearest Neighbor Classifier, Discriminant functions and regression functions, Linear Regression with Least Square Error Criterion, Logistic Regression for Classification Tasks, Fisher's Linear Discriminant and Thresholding for Classification, Minimum Description Length Principle.
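To make the least-square-error criterion from this unit concrete, the following Python/NumPy sketch fits a straight line by solving the least-squares problem; the synthetic data and names are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 3.0 * x + 2.0 + rng.normal(scale=1.0, size=50)   # noisy line y = 3x + 2

X = np.column_stack([np.ones_like(x), x])            # design matrix with intercept column
theta, *_ = np.linalg.lstsq(X, y, rcond=None)        # minimises ||X theta - y||^2
print("intercept, slope:", theta)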

UNIT-IV: Support Vector Machines (SVM)-
Introduction, Linear Discriminant Functions for Binary
Classification, Perceptron Algorithm, Large Margin Classifier for linearly separable data, Linear Soft
Margin Classifier for Overlapping Classes, Kernel Induced Feature Spaces, Nonlinear Classifier,
Regression by Support vector Machines. Learning with Neural Networks: Towards Cognitive Machine,
Neuron Models, Network Architectures, Perceptrons, Linear neuron and the Widrow-Hoff Learning Rule,
The error correction delta rule.

UNIT-V: Decision Tree Learning
: Introduction, Example of classification decision tree, measures of
impurity for evaluating splits in decision trees, ID3, C4.5, and CART decision trees, pruning the tree,
strengths and weakness of decision tree approach.




Textbooks:
1. Applied Machine Learning, M. Gopal, McGraw Hill Education.
2. Machine Learning, Tom Mitchell, McGraw Hill.
References:
1. Introduction to Machine Learning with Python: A Guide for Data Scientists, Andreas C. Müller and Sarah Guido, O'Reilly.
2. Machine Learning: The Art and Science of Algorithms that Make Sense of Data, Peter Flach, Cambridge University Press.




I Year - I Semester    L T P C: 3 0 0 3
Intelligent Systems
Course Objectives:
• Understand the fine structure or deeper origin of knowledge.
• Generate intelligent behavior on the basis of statistical evidence.

Course Outcomes:
After completion of the course, students will be able to:
• Demonstrate data representation and logical operations.
• Analyze backward reasoning and solving problems by reduction.
• Identify tools for representation and reasoning, such as the Lisp programming language.
• Explain the architecture of real-time expert systems.
• Define qualitative simulation and Petri nets.
UNIT I: Knowledge Representation:
Data and knowledge: Data representation and data items in traditional databases, Data representation and data items in relational databases. Rules: Logical operations, Syntax and semantics of rules, Datalog rule sets, The dependence graph of Datalog rule sets, Objects. Solving problems by reasoning: The structure of the knowledge base, The reasoning algorithm, Conflict resolution, Explanation of the reasoning.
UNIT II: Rule Based Systems:
Forward reasoning: The method of forward reasoning, A simple case study of forward reasoning. Backward reasoning: Solving problems by reduction, The method of backward reasoning, A simple case study of backward reasoning, Bidirectional reasoning. Contradiction freeness: The notion of contradiction freeness, Testing contradiction freeness, The search problem of contradiction freeness. Completeness: The notion of completeness, Testing completeness, The search problem of completeness. Decomposition of knowledge bases: Strict decomposition, Heuristic decomposition.
UNIT III: Tools for Representation and Reasoning:
The Lisp programming language: The fundamental data types in Lisp, Expressions and their evaluation,
some useful Lisp primitives, some simple examples in Lisp, The Prolog programming language: The
elements of Prolog programs, The execution of Prolog programs, Built-in predicates, and Some simple
examples in Prolog. Expert system shells: Components of an expert system shell, Basic functions and
services in an expert system shell
UNIT IV: Real-Time Expert Systems:
The architecture of real-time expert systems: The real-time subsystem, The intelligent subsystem. Synchronization and communication between real-time and intelligent subsystems: Synchronization and communication primitives, Priority handling and time-out. Data exchange between the real-time and the intelligent subsystems: Loose data exchange, The blackboard architecture. Software engineering of real-time expert systems: The software lifecycle of real-time expert systems, Special steps and tools, An example of a real-time expert system.
UNIT V: Qualitative Reasoning and Petri Nets:
Sign and interval calculus, Qualitative simulation: Constraint type qualitative differential equations, The
solution of QDEs: the qualitative simulation algorithm: Initial data for the simulation, Steps of the
simulation algorithm, Simulation results. Qualitative physics, Signed directed graph (SDG) models, The
Notion of Petri nets, The firing of transitions, Special cases and extensions, The state-space of Petri nets



The use of Petri nets for intelligent control, The analysis of Petri nets: Analysis Problems for Petri Nets,
Analysis techniques.

Text Books:
1. Intelligent Control Systems - An Introduction with Examples by Katalin M. Hangos, Rozália Lakner, Miklós Gerzson, Kluwer Academic Publishers.
2. Intelligent Systems and Control: Principles and Applications by Laxmidhar Behera, Indrani Kar, Oxford University Press, 2009.
Reference Books:
1. Intelligent Systems and Technologies: Methods and Applications, Springer.
2. Intelligent Systems - Modeling, Optimization and Control, by Yung C. Shin and Chengying Xu, CRC Press, Taylor & Francis Group, 2009.




I Year - I Semester    L T P C: 3 0 0 3
Expert Systems

Course Objectives:
• In this course the student will learn the methodology used to transfer the knowledge of a human expert into an intelligent program that can be used to solve problems.

Course Outcomes:
After completing this course, the student should be able to:
• Apply the methodology to transfer human knowledge into an expert system.
• Apply knowledge representation.
• Design a knowledge base and implement a rule-based expert system.
• Evaluate expert system tools.
• Apply CLIPS for the implementation of an expert system.

UNIT-I: Introduction: What is AI? The Foundations of AI, What is an AI Technique? - Tic-Tac-Toe. Problems, Problem Spaces and Search: Defining the problem as a state space search, Production systems, Problem characteristics, Production system characteristics, Issues in the design of search programs.
UNIT-II: Heuristic Search Techniques: Generate-and-test, Hill climbing, Simulated Annealing, Best-First search, A* algorithm, AO* algorithm, Constraint satisfaction, Means-Ends Analysis.
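A compact Python sketch of the A* algorithm listed in this unit is shown below; the toy graph, heuristic values and function name are hypothetical, chosen only to illustrate the f = g + h ordering of the frontier.

import heapq

def a_star(graph, h, start, goal):
    # graph: node -> list of (neighbour, edge cost); h: admissible heuristic per node.
    frontier = [(h[start], 0, start, [start])]        # (f = g + h, g, node, path)
    visited = set()
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        if node in visited:
            continue
        visited.add(node)
        for nbr, cost in graph.get(node, []):
            if nbr not in visited:
                heapq.heappush(frontier, (g + cost + h[nbr], g + cost, nbr, path + [nbr]))
    return None, float("inf")

graph = {"S": [("A", 1), ("B", 4)], "A": [("B", 2), ("G", 5)], "B": [("G", 1)]}
h = {"S": 3, "A": 2, "B": 1, "G": 0}
print(a_star(graph, h, "S", "G"))   # (['S', 'A', 'B', 'G'], 4)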
UNIT-III: First-Order Logic: Syntax and Semantics, Extensions and Notational Variations, Using First-Order Logic, Representing change in the world, Deducing hidden properties of the world. Inference in First-Order Logic: Inference rules involving quantifiers, An example proof, Generalized Modus Ponens, Forward and Backward Chaining, Completeness, Resolution, Completeness of Resolution.
UNIT-IV:
Slot-and-Filler Structures Semantic Nets, Frames, And Conceptual Dependency. Game
Playing Overview, The Mini-max Search Procedure, Adding Alpha-Beta Cutoffs, Additional
Refinements, Iterative Deepening.
UNIT-V:
Natural Language Processing Introduction, Syntactic processing, Semantic analysis. Expert
Systems Representing and Using Domain Knowledge, Expert System Shells, Explanation, Knowledge
Acquisition.





Text Books:
1. Rich, Elaine and Knight, Kevin, Artificial Intelligence, Tata McGraw-Hill publications, 2nd Edition,
2006
2. Russell, Stuart and Norvig, Peter, Artificial Intelligence A Modern Approach, Pearson Education

Reference Books:

1. Eugene Charniak and Drew McDermott, Introduction to Artificial Intelligence, Addison Wesley,
Pearson Education, 2005
2. George F Luger, Artificial Intelligence Structures and Strategies for Complex Problem Solving,
Pearson Education Ltd., 2nd Edition, 2002.
3. Dan W Patterson, Introduction to Artificial Intelligence and Expert Systems, Prentice-Hall of India,
2001.





I Year - I Semester    L T P C: 3 0 0 3
Data Warehousing and Data Mining

Course Objectives:
• This course will introduce the concepts of data warehousing and data mining, giving a complete description of the principles.
• Students will be able to understand the architectures, applications, design and implementation of data mining and data warehousing concepts.
Course Outcomes:
• Understand the functionality of the various data mining and data warehousing components.
• Appreciate the strengths and limitations of various data mining and data warehousing models.
• Explain the techniques for analyzing various kinds of data.
• Describe different methodologies used in data mining and data warehousing.
• Compare different approaches to data warehousing and data mining with various technologies.
UNIT-I: Data Warehousing - Data Warehousing Components - Building a Data Warehouse - Mapping the Data Warehouse to a Multiprocessor Architecture - DBMS Schemas for Decision Support - Data Extraction, Cleanup, and Transformation Tools - Metadata.
UNIT-II: Business Analysis - Reporting and Query Tools and Applications - Tool Categories - The Need for Applications - Cognos Impromptu - Online Analytical Processing (OLAP) - Need - Multidimensional Data Model - OLAP Guidelines - Multidimensional versus Multirelational OLAP - Categories of Tools - OLAP Tools and the Internet.
UNIT-III: Data Mining - Introduction - Data - Types of Data - Data Mining Functionalities - Interestingness of Patterns - Classification of Data Mining Systems - Data Mining Task Primitives - Integration of a Data Mining System with a Data Warehouse - Issues - Data Preprocessing.
UNIT-IV: Association Rule Mining and Classification - Mining Frequent Patterns, Associations and Correlations - Mining Methods - Mining Various Kinds of Association Rules - Correlation Analysis - Constraint Based Association Mining - Classification and Prediction - Basic Concepts - Decision Tree Induction - Bayesian Classification - Rule Based Classification - Classification by Back Propagation - Support Vector Machines - Associative Classification - Lazy Learners - Other Classification Methods - Prediction.
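To illustrate the support and confidence measures that underlie the association rule mining topics above, here is a small brute-force Python sketch (a real Apriori implementation would prune candidates level by level); the transactions and threshold are hypothetical.

from itertools import combinations

transactions = [
    {"milk", "bread", "butter"},
    {"bread", "butter"},
    {"milk", "bread"},
    {"milk", "butter"},
]
min_support = 0.5
items = sorted(set().union(*transactions))

def support(itemset):
    # Fraction of transactions containing every item of the itemset.
    return sum(itemset <= t for t in transactions) / len(transactions)

frequent = {c: support(set(c)) for k in (1, 2)
            for c in combinations(items, k) if support(set(c)) >= min_support}

# Confidence of the rule {bread} -> {butter} = support({bread, butter}) / support({bread})
print(frequent)
print("confidence(bread -> butter) =", support({"bread", "butter"}) / support({"bread"}))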
UNIT-V: Clustering and Trends in Data Mining - Cluster Analysis - Types of Data - Categorization of Major Clustering Methods - K-means - Partitioning Methods - Hierarchical Methods - Density-Based Methods - Grid Based Methods - Model-Based Clustering Methods - Clustering High Dimensional Data - Constraint-Based Cluster Analysis - Outlier Analysis - Data Mining Applications.

Text Books:
1. Alex Berson and Stephen J. Smith, "Data Warehousing, Data Mining and OLAP", Tata McGraw-Hill Edition, Thirteenth Reprint 2008.
2. Jiawei Han and Micheline Kamber, "Data Mining Concepts and Techniques", Third Edition, Elsevier, 2012.




Reference Books:
1. Pang-Ning Tan, Michael Steinbach and Vipin Kumar, "Introduction to Data Mining", Pearson Education, 2007.
2. K.P. Soman, Shyam Diwakar and V. Ajay, "Insight into Data Mining Theory and Practice", Eastern Economy Edition, Prentice Hall of India, 2006.
3. G. K. Gupta, "Introduction to Data Mining with Case Studies", Eastern Economy Edition, Prentice Hall of India, 2006.
4. Daniel T. Larose, "Data Mining Methods and Models", Wiley-Interscience, 2006.



I Year - I Semester    L T P C: 3 0 0 3
Recommender Systems
Course Objectives:
• To learn techniques for making recommendations, including non-personalized, content-based, and collaborative filtering.
• To automate a variety of choice-making strategies with the goal of providing affordable, personal, and high-quality recommendations.

Course Outcomes:
• Design a recommender system for a particular application domain.
• Evaluate recommender systems on the basis of metrics such as accuracy, rank accuracy, diversity, product coverage, and serendipity.
• Explain user-based recommendation and knowledge-based recommender systems.
• Define opportunities for hybridization and monolithic hybridization.
• Identify hybridization designs: weighted, switching, mixed, and pipelined hybridization.
UNIT-I: Introduction: Overview of Information Retrieval, Retrieval Models, Search and Filtering
Techniques: Relevance Feedback, User Profiles, Recommender system functions, Matrix operations,
covariance matrices, Understanding ratings, Applications of recommendation systems, Issues with
recommender system.
UNIT-II: Content-based Filtering: High level architecture of content-based systems, Advantages and
drawbacks of content based filtering, Item profiles, Discovering features of documents, pre-processing
and feature extraction, Obtaining item features from tags, Methods for learning user profiles, Similarity
based retrieval, Classification algorithms.
UNIT-III: Collaborative Filtering: User-based recommendation, Item-based recommendation, Model
based approaches, Matrix factorization, Attacks on collaborative recommender systems. Types of
Recommender Systems: Recommender systems in personalized web search, knowledge-based
recommender system, Social tagging recommender systems, Trust-centric recommendations, Group
recommender systems
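The user-based collaborative filtering idea in this unit can be sketched in a few lines of Python/NumPy; the rating matrix below is hypothetical and 0 denotes an unrated item.

import numpy as np

R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4]], dtype=float)     # rows = users, columns = items

def cosine(u, v):
    mask = (u > 0) & (v > 0)                  # compare only co-rated items
    if not mask.any():
        return 0.0
    return float(u[mask] @ v[mask] / (np.linalg.norm(u[mask]) * np.linalg.norm(v[mask])))

user, item = 0, 2                             # predict user 0's rating for item 2
num, den = 0.0, 0.0
for other in range(R.shape[0]):
    if other != user and R[other, item] > 0:
        s = cosine(R[user], R[other])
        num += s * R[other, item]
        den += s

print("predicted rating:", round(num / den, 2) if den else None)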

UNIT-IV: Hybrid Approaches: Opportunities for hybridization; Monolithic hybridization design: Feature combination, Feature augmentation; Parallelized hybridization design: Weighted, Switching, Mixed; Pipelined hybridization design: Cascade, Meta-level; Limitations of hybridization strategies.

UNIT-V: Evaluating Recommender System: Introduction, General properties of evaluation research,
Evaluation designs: Accuracy, Coverage, confidence, novelty, diversity, scalability, serendipity,
Evaluation on historical datasets, Offline evaluations.
Text Books:
1. Jannach D., Zanker M. and Felfernig A., Recommender Systems: An Introduction, Cambridge University Press (2011), 1st ed.
2. Charu C. Aggarwal, Recommender Systems: The Textbook, Springer (2016), 1st ed.
Reference Books:
1. Ricci F., Rokach L., Shapira B., Kantor P.B., Recommender Systems Handbook, Springer (2011), 1st ed.
2. Manouselis N., Drachsler H., Verbert K., Duval E., Recommender Systems for Learning, Springer (2013), 1st ed.
I Year - I Semester    L T P C: 3 0 0 3
Pattern Recognition




Course Objectives:
• To implement pattern recognition and machine learning theories.
• To design and implement certain important pattern recognition techniques.
• To apply pattern recognition theories to applications of interest.
• To implement entropy minimization, clustering transformation and feature ordering.
Course Outcomes:
• Design systems and algorithms for pattern recognition (signal classification), with focus on sequences of patterns that are analyzed using, e.g., hidden Markov models (HMM).
• Analyze classification problems probabilistically and estimate classifier performance.
• Understand and analyze methods for automatic training of classification systems.
• Apply maximum-likelihood parameter estimation in relatively complex probabilistic models, such as mixture density models and hidden Markov models.
• Understand the principles of Bayesian parameter estimation and apply them in relatively simple probabilistic models.
UNIT- I: Introduction - Basic concepts, Applications, Fundamental problems in pattern Recognition
system design, Design concepts and methodologies, Examples of Automatic Pattern recognition
systems, Simple pattern recognition model, Decision and Distance Functions - Linear and generalized
decision functions, Pattern space and weight space, Geometrical properties, implementations of decision
functions, Minimum-distance pattern classifications.

UNIT-II: Probability - Probability of events, Random variables, Joint distributions and densities, Moments of random variables, Estimation of parameters from samples. Statistical Decision Making - Introduction, Bayes' theorem, Multiple features, Conditionally independent features, Decision boundaries, Unequal costs of error, Estimation of error rates, The leave-one-out technique, Characteristic curves, Estimating the composition of populations, Bayes' classifier for normal patterns.

UNIT-III: Non-Parametric Decision Making - Introduction, Histogram, kernel and window estimation, Nearest neighbor classification techniques, Adaptive decision boundaries, Adaptive discriminant functions, Minimum squared error discriminant functions, Choosing a decision-making technique. Clustering and Partitioning - Hierarchical Clustering: Introduction, Agglomerative clustering algorithm, The single-linkage, complete-linkage and average-linkage algorithms, Ward's method. Partitional clustering - Forgy's algorithm, K-means algorithm, Isodata algorithm.
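A minimal Python sketch of the nearest neighbour classification technique mentioned in this unit follows; the training patterns and the value of k are illustrative assumptions.

import numpy as np
from collections import Counter

train_X = np.array([[1.0, 1.0], [1.2, 0.8], [4.0, 4.2], [4.5, 3.8]])
train_y = np.array(["class1", "class1", "class2", "class2"])

def knn_predict(x, k=3):
    dists = np.linalg.norm(train_X - x, axis=1)     # Euclidean distances to all patterns
    nearest = train_y[np.argsort(dists)[:k]]        # labels of the k closest patterns
    return Counter(nearest).most_common(1)[0][0]    # majority vote

print(knn_predict(np.array([1.1, 0.9])))            # expected: class1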
UNIT-IV: Pattern Preprocessing and Feature Selection: Introduction, Distance measures, Clustering transformation and feature ordering, Clustering in feature selection through entropy minimization, Feature selection through orthogonal expansion, Binary feature selection.
UNIT-V: Syntactic Pattern Recognition & Applications of Pattern Recognition: Introduction, Concepts from formal language theory, Formulation of the syntactic pattern recognition problem, Syntactic pattern description, Recognition grammars, Automata as pattern recognizers, Application of pattern recognition techniques in biometrics, facial recognition, iris scanning, fingerprints, etc.
Text Books:
1. Gose, Johnsonbaugh, Jost, "Pattern Recognition and Image Analysis", PHI.
2. Tou, Gonzalez, "Pattern Recognition Principles", Pearson Education.
Reference Book:
1. Richard Duda, Peter Hart, David Stork, "Pattern Classification", John Wiley.



I Year - I Semester    L T P C: 0 0 4 2
Advanced Data Structures Lab

Course Objectives:
From the course the student will learn:
• OOP concepts needed for a specific problem.
• Various advanced data structure concepts such as arrays, stacks, queues, linked lists, graphs and trees.

Course Outcomes:
• Identify classes, objects, members of a class and relationships among them needed for a specific problem.
• Examine algorithm performance using a priori analysis and asymptotic notations.
• Organize and apply advanced data structures (arrays, stacks, queues, linked lists, graphs and trees) to solve complex problems.
• Apply and analyze the functions of a Dictionary.


List of Experiments

Experiment 1:
Implement Multi stacks.

Experiment 2:
Implement Double Ended Queue (Dequeues) & Circular Queues.
Experiment 3:
Implement various Recursive operations on Binary Search Tree.
Experiment 4:
Implement various Non-Recursive operations on Binary Search Tree.
Experiment 5:
Implement BFS for a Graph
Experiment 6:
Implement DFS for a Graph.
Experiment 7:
Implement Merge & Heap Sort of given elements.
Experiment 8:
Implement Quick Sort of given elements.
Experiment 9:
Implement various operations on AVL trees.
Experiment 10:
Implement B Tree operations.




Experiment 11:
Implement Binary Tree Traversal Techniques.
Experiment 12:
Implement Kruskal's algorithm to generate a min-cost spanning tree.
Experiment 13:
Implement Prim's algorithm to generate a min-cost spanning tree.
Experiment 14:
Implement functions of Dictionary using Hashing.
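For Experiment 14, one possible shape of the solution is a hash table with separate chaining; the sketch below (in Python, with illustrative class and method names) shows the insert, search and delete operations of such a Dictionary.

class HashDictionary:
    def __init__(self, buckets=8):
        self.table = [[] for _ in range(buckets)]

    def _bucket(self, key):
        return self.table[hash(key) % len(self.table)]

    def insert(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                  # update an existing key
                bucket[i] = (key, value)
                return
        bucket.append((key, value))       # collisions resolved by chaining

    def search(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return None

    def delete(self, key):
        idx = hash(key) % len(self.table)
        self.table[idx] = [(k, v) for k, v in self.table[idx] if k != key]

d = HashDictionary()
d.insert("apple", 10)
d.insert("banana", 20)
print(d.search("apple"), d.search("missing"))   # 10 None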



I Year - I Semester    L T P C: 0 0 4 2
Neural Networks Lab

Course Objectives:

• This course will serve as a comprehensive introduction to various topics in machine learning.
• At the end of the course the students should be able to design and implement machine learning solutions to classification, regression, and clustering problems, and be able to evaluate and interpret the results of the algorithms.

Course Outcomes:
• Create a custom feed-forward network.
• Design and construct network layers.
• Set transfer functions: each layer has its own transfer function, which is set through the net.layers{i}.transferFcn property.
• Apply discriminative learning models: Logistic Regression, Perceptrons, Artificial Neural Networks, Support Vector Machines.

Note: The experiments need to be implemented using MATLAB.

List of Experiments

Sample Problem Statement: Create a custom feed-forward network. It consists of the following sections:
1. Network Layers
? Constructing Layers
? Connecting Layers
? Setting Transfer Functions
2. Weights and Biases
3. Training Functions & Parameters
? The difference between train and adapt
? Performance Functions
? Train Parameters
4. Conclusion

1. Network Layers
• Constructing Layers
Assume you have an empty network object named `net' in your workspace:
>> net = network
Define the properties of the input layer:
>> net.numInputs = 1
Define the number of neurons in the input layer. This should, of course, be equal to the dimensionality of the data set. The appropriate property to set is net.inputs{i}.size, where i is the index of the input layer.
So, to make a network which takes 2-dimensional points as inputs, type:
>> net.inputs{1}.size = 2;




net.numLayers represents the total number of layers in the network, and net.layers{i}.size, which
sets the number of neurons in the ith layer. To build our example network, we define 2 extra layers
(a hidden layer with 3 neurons and an output layer with 1 neuron), using:
>> net.numLayers = 2;
>> net.layers{1}.size = 3;
>> net.layers{2}.size = 1;
• Connecting Layers
>> net.inputConnect(1) = 1;
>> net.layerConnect(2, 1) = 1;
>> net.outputConnect(2) = 1;
>> net.targetConnect(2) = 1;
• Setting Transfer Functions
Each layer has its own transfer function, which is set through the net.layers{i}.transferFcn property. So, to make the first layer use sigmoid transfer functions and the second layer use linear transfer functions:
>> net.layers{1}.transferFcn = 'logsig';
>> net.layers{2}.transferFcn = 'purelin';
2. Weights and Biases
Define which layers have biases by setting the elements of net.biasConnect to either 0 or 1, where net.biasConnect(i) = 1 means layer i has biases attached to it.
To attach biases to each layer in the network use:
>> net.biasConnect = [ 1 ; 1];
• Initialisation procedure for the weights

Reset all weights and biases by using:
>> net = init(net);

• Each layer of weights and biases uses its own initialisation routine:
>> net.initFcn = 'initlay';

• Set the initialisation for each set of weights and biases separately:

>> net.layers{i}.initFcn = 'initnw';

>> net.layers{i}.initFcn = 'initwb';

• Define the initialisation for the input weights:
>> net.inputWeights{1,1}.initFcn = 'rands';
• Define the initialisation for each set of biases:
>> net.biases{i}.initFcn = 'rands';
• Define the initialisation for the weight matrices:
>> net.layerWeights{i,j}.initFcn = 'rands';
where net.layerWeights{i,j} denotes the weights from layer j to layer i.
3. Training Functions & Parameters
• The difference between train and adapt
When using adapt, both incremental and batch training can be used. Which one is actually used depends on the format of your training set. If it consists of two matrices of input and target vectors, where P is the input matrix and T is the target vector:
>> P = [ 0.3 0.2 0.54 0.6 ; 1.2 2.0 1.4 1.5]
>> T = [ 0 1 1 0 ]
• Performance Functions
The performance function is set to the mean squared error:
>> net.performFcn = 'mse';
• Train Parameters
Train the network using a Gradient Descent with Momentum algorithm:
>> net.trainFcn = 'traingdm';
• Set the parameters:
>> net.trainParam.lr = 0.1;
>> net.trainParam.mc = 0.9;
lr is the learning rate, and mc is the momentum term.
Two other useful parameters are net.trainParam.epochs, which is the maximum number of times the complete data set may be used for training, and net.trainParam.show, which is the time between status reports of the training function.
>> net.trainParam.epochs = 1000;
>> net.trainParam.show = 100;




I Year - II Semester    L T P C: 3 0 0 3
Soft Computing
Course Objectives:
• Develop the skills to gain a basic understanding of neural network theory and fuzzy logic theory.
• Introduce students to artificial neural networks and fuzzy theory from an engineering perspective.
Course Outcomes:
• Comprehend fuzzy logic, the concept of fuzziness involved in various systems, and fuzzy set theory.
• Understand the concepts of fuzzy sets, knowledge representation using fuzzy rules, approximate reasoning, fuzzy inference systems, and fuzzy logic.
• Understand the fundamental theory and concepts of neural networks; identify different neural network architectures, algorithms, applications and their limitations.
• Understand appropriate learning rules for each of the architectures and learn several neural network paradigms and their applications.
• Reveal different applications of these models to solve engineering and other problems.
UNIT-I: Fuzzy Set Theory: Introduction to Neuro-Fuzzy and Soft Computing, Fuzzy Sets, Basic Definition and Terminology, Set-theoretic Operations, Member Function Formulation and Parameterization, Fuzzy Rules and Fuzzy Reasoning, Extension Principle and Fuzzy Relations, Fuzzy If-Then Rules, Fuzzy Reasoning, Fuzzy Inference Systems, Mamdani Fuzzy Models, Sugeno Fuzzy Models, Tsukamoto Fuzzy Models, Input Space Partitioning and Fuzzy Modeling.
UNIT-II: Optimization: Derivative-based Optimization, Descent Methods, The Method of Steepest Descent, Classical Newton's Method, Step Size Determination, Derivative-free Optimization, Genetic Algorithms, Simulated Annealing and Random Search - Downhill Simplex Search.
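The method of steepest descent listed above can be illustrated with a few lines of Python on a simple quadratic objective; the function, starting point and step size are assumptions made only for this sketch.

def grad(p):                       # gradient of f(x, y) = (x - 3)^2 + (y + 1)^2
    x, y = p
    return (2 * (x - 3), 2 * (y + 1))

p = (0.0, 0.0)                     # starting point
step = 0.1                         # fixed step size
for _ in range(100):
    g = grad(p)
    p = (p[0] - step * g[0], p[1] - step * g[1])

print(tuple(round(c, 3) for c in p))   # converges near the minimiser (3, -1)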

UNIT-III: Artificial Intelligence: Introduction, Knowledge Representation, Reasoning, Issues and Acquisition: Propositional and Predicate Calculus, Rule-Based Knowledge Representation, Symbolic Reasoning under Uncertainty, Basic Knowledge Representation Issues, Knowledge Acquisition. Heuristic Search: Techniques for Heuristic Search, Heuristic Classification. State Space Search: Strategies, Implementation of Graph Search based on Recursion, Pattern-directed Search, Production System and Learning.
UNIT-IV: Neuro-Fuzzy Modeling: Adaptive Neuro-Fuzzy Inference Systems, Architecture - Hybrid Learning Algorithm, Learning Methods that Cross-fertilize ANFIS and RBFN - Coactive Neuro-Fuzzy Modeling, Framework Neuron Functions for Adaptive Networks - Neuro-Fuzzy Spectrum.

UNIT-V: Applications Of Computational Intelligence:
Printed Character Recognition, Inverse
Kinematics Problems, Automobile Fuel Efficiency Prediction, Soft Computing for Color Recipe
Prediction.
Text Books:
1. J.S.R. Jang, C.T. Sun and E. Mizutani, "Neuro-Fuzzy and Soft Computing", PHI / Pearson Education, 2004.
2. N.P. Padhy, "Artificial Intelligence and Intelligent Systems", Oxford University Press, 2006.

References
:



1. Elaine Rich & Kevin Knight, Artificial Intelligence, Second Edition, Tata McGraw Hill Publishing Company, New Delhi, 2006.
2. Timothy J. Ross, "Fuzzy Logic with Engineering Applications", McGraw-Hill, 1997.
3. David E. Goldberg, "Genetic Algorithms: Search, Optimization and Machine Learning", Addison Wesley, N.Y., 1989.



I Year - II Semester    L T P C: 3 0 0 3
Deep Learning
Course Objectives:
At the end of the course, the students will be expected to:
• Learn deep learning methods for working with sequential data.
• Learn deep recurrent and memory networks.
• Apply such deep learning mechanisms to various learning problems.
• Learn deep Turing machines and the open issues in deep learning, and have a grasp of the current research directions.

Course Outcomes:
After the completion of the course, the student will be able to:
• Demonstrate the basic concepts, fundamental learning techniques and layers.
• Discuss neural network training and various random models.
• Explain different types of deep learning network models.
• Classify probabilistic neural networks.
• Implement tools for deep learning techniques.


UNIT-I: Introduction: Various paradigms of learning problems, Perspectives and Issues in deep learning
framework, review of fundamental learning techniques. Feed forward neural network: Artificial Neural
Network, activation function, multi-layer neural network
UNIT-II: Training Neural Network: Risk minimization, loss function, back propagation, regularization,
model selection, and optimization. Deep Neural Networks: Difficulty of training deep neural networks,
Greedy layer wise training.
UNIT-III: Deep Learning: Deep Feed Forward network, regularizations, training deep models, dropouts,
Convolution Neural Network, Recurrent Neural Network, and Deep Belief Network.
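To connect the Unit-III vocabulary (deep feed-forward layers, dropout as regularization) to code, here is a small Python/NumPy forward-pass sketch; the shapes, dropout rate and weight values are hypothetical.

import numpy as np

rng = np.random.default_rng(0)
x  = rng.normal(size=(4, 8))             # batch of 4 inputs with 8 features
W1 = rng.normal(size=(8, 16)) * 0.1      # hidden-layer weights
W2 = rng.normal(size=(16, 3)) * 0.1      # output-layer weights

def relu(z):
    return np.maximum(0.0, z)

def dropout(a, rate=0.5, training=True):
    if not training:
        return a
    mask = rng.random(a.shape) > rate     # drop units with probability `rate`
    return a * mask / (1.0 - rate)        # inverted dropout keeps the expected activation

h = dropout(relu(x @ W1))                 # regularised hidden representation
logits = h @ W2                           # raw output scores
print(logits.shape)                       # (4, 3)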
UNIT-IV: Probabilistic Neural Networks: Hopfield Net, Boltzmann Machine, RBMs, Sigmoid Net, Autoencoders.

UNIT-V: Applications: Object recognition, sparse coding, computer vision, natural language processing. Introduction to Deep Learning Tools: TensorFlow, Caffe, Theano, Torch.






Text Books:
1. Goodfellow, I., Bengio, Y., and Courville, A., Deep Learning, MIT Press, 2016.
2. Bishop, C. M., Pattern Recognition and Machine Learning, Springer, 2006.
Reference Books:
1. Yegnanarayana, B., Artificial Neural Networks, PHI Learning Pvt. Ltd, 2009.
2. Golub, G. H., and Van Loan, C. F., Matrix Computations, JHU Press, 2013.
3. Satish Kumar, Neural Networks: A Classroom Approach, Tata McGraw-Hill Education, 2004.
4. Raúl Rojas, Neural Networks: A Systematic Introduction, 1996.
5. Christopher Bishop, Pattern Recognition and Machine Learning, 2007.



I Year - II Semester    L T P C: 3 0 0 3
Computer Vision
Course Objectives:
• Be familiar with both the theoretical and practical aspects of computing with images.
• Have described the foundations of image formation, measurement, and analysis.
• Understand the geometric relationships between 2D images and the 3D world.
• Grasp the principles of state-of-the-art deep neural networks.

Course Outcomes:

• Develop the practical skills necessary to build computer vision applications.
• Gain exposure to object and scene recognition and categorization from images.
• Identify regularization theory and optical computation.
• Explain deformable curves and surfaces, snakes and active contours, and level set representations.
• Demonstrate object recognition methods, shape correspondence and shape matching, and Principal Component Analysis.
UNIT-I: Image Formation Models: Monocular imaging system, Orthographic & Perspective Projection, Camera model and Camera calibration, Binocular imaging systems
UNIT- II: Image Processing and Feature Extraction:
Image representations (continuous and discrete),
Edge detection
UNIT- III: Motion Estimation:
Regularization theory, Optical computation, Stereo Vision, Motion
estimation, Structure from motion
UNIT- IV: Shape Representation and Segmentation:
Deformable curves and surfaces, Snakes and
active contours, Level set representations, Fourier and wavelet descriptors, Medial representations, and
Multi resolution analysis
UNIT- V: Object recognition:
Hough transforms and other simple object recognition methods, Shape
correspondence and shape matching, Principal Component analysis, Shape priors for recognition
Text Books:
1. Computer Vision - A modern approach, by D. Forsyth and J. Ponce, Prentice
Hall
2. Robot Vision, by B. K. P. Horn, McGraw-Hill.

Reference Books:
1. Introductory Techniques for 3D Computer Vision, by E. Trucco and A. Verri,
Publisher: Prentice Hall.
I Year - II Semester    L T P C: 3 0 0 3
Big Data Analytics



Course Objectives:
• Optimize business decisions and create competitive advantage with Big Data analytics.
• Introduce the Java concepts required for developing MapReduce programs.
• Derive business benefit from unstructured data.
• Impart the architectural concepts of Hadoop and introduce the MapReduce paradigm.
• Introduce the programming tools Pig & Hive in the Hadoop ecosystem.

Course Outcomes:

• Demonstrate Java Collections for various data structures and generic types.
• Construct a Hadoop cluster with its building blocks and configure it.
• Illustrate MapReduce framework applications to process huge amounts of data.
• Explain the Hadoop I/O techniques, data integrity & compression for developing distributed systems such as serialization frameworks.
• Prepare Pig Latin scripts to process Big Data in Hadoop.
• Apply Hive to process Big Data in Hadoop.

UNIT-I:
Data structures in Java: Linked List, Stacks, Queues, Sets, Maps; Generics: Generic classes and
Type parameters, Implementing Generic Types, Generic Methods, Wrapper Classes, Concept of
Serialization

UNIT-II: Working with Big Data: Google File System, Hadoop Distributed File System (HDFS) - Building blocks of Hadoop (Namenode, Datanode, Secondary Namenode, Job Tracker, Task Tracker), Introducing and Configuring a Hadoop cluster (Local, Pseudo-distributed mode, Fully Distributed mode), Configuring XML files.

UNIT-III: Writing Map Reduce Programs: A Weather Dataset, Understanding Hadoop API for
MapReduce Framework (Old and New), Basic programs of Hadoop MapReduce: Driver code, Mapper
code, Reducer code, Record Reader, Combiner, Partitioner
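The syllabus targets Hadoop's Java API, but the map/shuffle/reduce data flow behind the driver, mapper and reducer code can be previewed with a short language-agnostic Python sketch over an in-memory word-count example (documents and names are hypothetical).

from collections import defaultdict

documents = ["big data needs big clusters", "hadoop processes big data"]

# Map phase: emit (word, 1) pairs.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group values by key (done automatically by the Hadoop framework).
grouped = defaultdict(list)
for key, value in mapped:
    grouped[key].append(value)

# Reduce phase: sum the counts for each word.
reduced = {word: sum(counts) for word, counts in grouped.items()}
print(reduced)   # e.g. {'big': 3, 'data': 2, ...}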
UNIT-IV:
Hadoop I/O: The Writable Interface, Writable Comparable and comparators, Writable Classes:
Writable wrappers for Java primitives, Text, Bytes Writable, Null Writable, Object Writable and Generic
Writable, Writable collections, Implementing a Custom Writable: Implementing a Raw Comparator for
speed, Custom comparators

UNIT-V:
Pig: Hadoop Programming Made Easier Admiring the Pig Architecture, Going with the Pig
Latin Application Flow, Working through the ABCs of Pig Latin, Evaluating Local and Distributed
Modes of Running Pig Scripts, Checking out the Pig Script Interfaces, Scripting with Pig Latin
Applying Structure to Hadoop Data with Hive: Saying Hello to Hive, Seeing How the Hive is Put
Together, Getting Started with Apache Hive, Examining the Hive Clients, Working with Hive Data
Types, Creating and Managing Databases and Tables, Seeing How the Hive Data Manipulation Language
Works, Querying and Analyzing Data






Text Books:

1. Big Java, 4th Edition, Cay Horstmann, John Wiley & Sons, Inc.
2. Hadoop: The Definitive Guide by Tom White, 3rd Edition, O'Reilly.
3. Hadoop in Action by Chuck Lam, Manning Publications.
4. Hadoop for Dummies by Dirk deRoos, Paul C. Zikopoulos, Roman B. Melnyk, Bruce Brown, Rafael Coss.

Reference Books:
1. Hadoop in Practice by Alex Holmes, MANNING Publ.
2. Hadoop MapReduce Cookbook, Srinath Perera, Thilina Gunarathne






I Year - II Semester    L T P C: 3 0 0 3
Remote Sensing
Course Objectives:
• To provide an opportunity for individuals to learn Remote Sensing and Geoinformation Science for the benefit of their professional career.
• This basic course in Remote Sensing and Geoinformation Science will allow graduates to build their knowledge and practical expertise in RS and GIS technologies with independent study and project experience at the certificate level.
Course Outcomes:
At the end of the course, the student will be able to:
• Select the type of remote sensing technique/data for the required purpose.
• Identify the earth surface features from satellite images.
• Analyze the energy interactions in the atmosphere and earth surface features.
• Perform corrections and process digital satellite data.
UNIT-I: Physics of Remote Sensing: Sources of Energy, Active and Passive Radiation, Electromagnetic Radiation - Reflectance, Transmission, Absorption, Thermal Emissions, Interaction with Atmosphere, Atmospheric windows, Spectral reflectance of Earth's surface features, Multi concept of Remote Sensing.

UNIT-II: Platforms:
Various types of platforms, different types of aircraft, manned and unmanned
spacecrafts used for data acquisition - characteristics of different types of platforms - airborne and space
borne.

UNIT-III: Data Acquisition Systems:
Optical, Thermal and Microwave; Resolutions - spatial, spectral,
radiometric and temporal, signal to noise ratio.

UNIT-IV: Image Processing: Data Products and Their Characteristics, Digital image formation, Digital image display mechanism, Image histograms, Look-up table data, Pre-processing - Atmospheric, Radiometric, Geometric Corrections - Basic Principles of Visual Interpretation, Equipment for Visual Interpretation, Ground Truth, Ground Truth Equipment.

UNIT-V Image enhancements
: Linear and non-linear Contrast enhancement techniques, density slicing,
pseudo colour images, spatial enhancement techniques (convolution filtering), spectral enhancement
techniques, Image algebra. Applications of Remote sensing in various Engineering and Science domains
such as Agriculture, Forest, Soil, Geology, LU/LC, Water Resources, Urban etc



Textbooks:
1. James B. Campbell & Randolph H. Wynne, Introduction to Remote Sensing, The Guilford Press, 2011.
2. Charles Elachi & Jakob van Zyl, Introduction to the Physics and Techniques of Remote Sensing, John Wiley & Sons, 2006.
3. Lillesand T.M. & Kiefer R.W., Remote Sensing and Image Interpretation, John Wiley and Sons, 2008.
4. Christian Mätzler, Thermal Microwave Radiation: Applications for Remote Sensing, The Institution of Engineering and Technology, London, 2006.
5. Rees, W. G., Physical Principles of Remote Sensing, Cambridge University Press, 2001.
6. Paul Curran P.J., Principles of Remote Sensing, ELBS Publications, 1985.

References:
1. Fundamentals of Remote Sensing by George Joseph, Third Edition.




I Year - II Semester    L T P C: 3 0 0 3
Cognitive Systems
Course Objectives:
• The purpose of this course is to give students a broad understanding of the field of cognitive science from an interdisciplinary perspective.
• The course discusses models of cognition that are based on empirical studies of human and artificial intelligence.

Course Outcomes:

• Identify how knowledge from computing, linguistics and psychology contributes to cognitive science.
• Define how philosophy can combine with these fields to give us an understanding of thinking.
• Analyze how the other courses in the program contribute to cognitive science.
• Explain the basis for specialization at the master's level.
• Reflect critically on issues that are central to cognitive science.
UNIT-I: Introduction to Knowledge Based Artificial Intelligence (KBAI) and Cognitive Systems: Where Knowledge-Based AI fits into AI as a whole, Cognitive systems: what are they? Cognitive Science and Cognition, AI and cognition: how are they connected?
UNIT- II:
Fundamentals: Semantic Networks, Generate & Test, Means-Ends Analysis, Problem
Reduction, Production Systems
UNIT-III:
Kinematics of Robot Manipulator: Introduction, General Mathematical Preliminaries on
Vectors& Matrices, Direct Kinematics problem, Geometry Based Direct kinematics problem, Co-ordinate
and vector transformation using matrices, Rotation matrix, Inverse Transformations, Problems,
Composite Rotation matrix, Homogenous Transformations, Robotic Manipulator Joint Co-Ordinate
System, Euler Angle & Euler Transformations, Roll-Pitch-Yaw(RPY) Transformation H Representation
& Displacement Matrices for Standard Configurations, Jacobian Transformation in Robotic
Manipulation. (SLE: Geometrical Approach to Inverse Kinematics.)
UNIT-IV:
Visuospatial Reasoning: Constraint Propagation, Visuospatial Reasoning. Design & Creativity: Configuration, Diagnosis, Design, Creativity. Metacognition: Learning by Correcting Mistakes, Meta-Reasoning, AI Ethics. Module II (Cognitive Systems) Introduction: Automation and Robotics, Historical Development, Definitions, Basic Structure of Robots, Robot Anatomy, Complete Classification of Robots, Fundamentals of Robot Technology, Factors related to Robot Performance, Basic Robot Configurations and their Relative Merits and Demerits, the Wrist & Gripper Subassemblies. Concepts of Basic Control Systems, Control Loops of Robotic Systems, Different Types of Controllers: Proportional, Integral, Differential, PID controllers. (SLE: Types of Drive Systems and their Relative Merits)






UNIT-V:
Robot Sensing & Vision: Various Sensors and their Classification, Use of Sensors and Sensor
Based System in Robotics, Machine Vision System, Description, Sensing, Digitizing, Image Processing
and Analysis and Application of Machine Vision System, Robotic Assembly Sensors and Intelligent
Sensors.
Text Books:

1. Chris Forsythe et al., Cognitive Systems: Human Cognitive Models in Systems Design, Kindle Edition.
2. G. F. Marcus, The Algebraic Mind - Integrating Connectionism & Cognitive Science, MIT Press.
3. Fu, Lee and Gonzalez, Robotics: Control, Sensing, Vision and Intelligence, McGraw Hill.
4. John J. Craig, Introduction to Robotics, Addison Wesley.

References:
1. Henrik Christensen, Cognitive Systems (Cognitive Systems Monographs), Springer.
2. Yoram Koren, Robotics for Engineers, McGraw Hill International, 1st edition, 1985.
3. Groover, Weiss, Nagel, Industrial Robotics, McGraw Hill International, 2nd edition, 2012.
4. Klafter, Chmielewski and Negin, Robotic Engineering - An Integrated Approach, PHI, 1st edition, 2009.



I Year - II Semester    L T P C: 3 0 0 3
Knowledge Discovery
Course Objectives:
• To learn the latest developments in knowledge discovery and data mining concepts and techniques.
• Theories and algorithms for data mining and knowledge discovery will be introduced.
• Relevant applications in specific domains such as medicine and health care will be covered.
Course Outcomes:
• Identify and distinguish data mining applications from other IT applications.
• Describe data mining algorithms and Web usage mining.
• Describe the applicability of data mining.
• Analyze data mining algorithms and techniques.
• Explain Web terminology and characteristics, and Web content mining.
UNIT-I: Introduction to Data Mining, Types of Data, Data Quality, Data Processing, Measures of Similarity and Dissimilarity, Exploring Data: Data Set, Summary Statistics, Visualization, OLAP and multidimensional data analysis.

UNIT-II: Classification: Basic Concepts, Decision Trees and model evaluation: General approach for
solving a classification problem, Decision Tree induction, Model over fitting: due to presence of noise,
due to lack of representation samples, Evaluating the performance of classifier. Nearest Neighborhood
classifier, Bayesian Classifier, Support vector Machines: Linear SVM, Separable and Non Separable case.

UNIT-III
: Association Analysis: Problem Definition, Frequent Item-set generation, rule generation,
compact representation of frequent item sets, FP-Growth Algorithms. Handling Categorical, Continuous
attributes, Concept hierarchy, Sequential, Sub graph patterns

UNIT-IV: Clustering: Overview, K-means, Agglomerative Hierarchical Clustering, DBSCAN, Cluster Evaluation: Overview, Unsupervised Cluster Evaluation using cohesion and separation, using the proximity matrix, Scalable Clustering Algorithms.

UNIT-V: Web data mining:
Introduction, Web terminology and characteristics, Web content mining,
Web usage mining, web structure mining, Search Engines Characteristics, Functionality, Architecture,
Ranking of WebPages, Enterprise search

Text Books:

1. Introduction to Data Mining: Pang-Ning Tan, Michael Steinbach, Vipin Kumar, Addison-Wesley.
2. Introduction to Data Mining with Case Studies: G.K. Gupta, Prentice Hall.

Reference Books:

1. Data Mining: Introductory and Advanced Topics, Margaret H. Dunham, Pearson, 2008.
2. Fundamentals of Data Warehouses, 2/e, Jarke, Lenzerini, Vassiliou, Vassiliadis, Springer.
3. Data Mining Theory and Practice, Soman, Diwakar, Ajay, PHI, 2006.
4. Data Mining: Concepts and Techniques, 2/e, Jiawei Han, Micheline Kamber, Elsevier, 2006.
I Year - II Semester    L T P C: 3 0 0 3
Natural Language Processing
Course Objectives:



• Make students understand the concepts of morphology, syntax, semantics and pragmatics of the language, so that they are able to give appropriate examples that illustrate the above-mentioned concepts.
• Teach them to recognize the significance of pragmatics for natural language understanding.
• Enable students to describe applications based on natural language processing and to show the points of syntactic, semantic and pragmatic processing.
Course Outcomes:
• Explain approaches to syntax and semantics in NLP.
• Demonstrate approaches to discourse, generation, dialogue and summarization within NLP.
• Explain current methods for statistical approaches to machine translation.
• Identify machine learning techniques used in NLP, including hidden Markov models and probabilistic context-free grammars.
• Explain context-free grammars, clustering and unsupervised methods, log-linear and discriminative models, and the EM algorithm as applied within NLP.
UNIT-I: Introduction: NLP tasks in syntax, semantics, and pragmatics. Applications such as
information extraction, question answering, and machine translation. The problem of ambiguity. The role
of machine learning. Brief history of the field.
UNIT-II: N-gram Language Models: The role of language models, Simple N-gram models, Estimating parameters and smoothing, Evaluating language models. Part of Speech Tagging and Sequence Labeling: Lexical syntax, Hidden Markov Models, Maximum Entropy Models, Conditional Random Fields.
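As an illustration of the unit above, the following is a minimal sketch of a bigram language model with add-one (Laplace) smoothing; the toy corpus and whitespace tokenisation are assumptions, not part of the syllabus.

# Illustrative sketch: bigram language model with add-one smoothing.
from collections import Counter

corpus = ["the cat sat on the mat", "the dog sat on the rug"]
tokens = [["<s>"] + s.split() + ["</s>"] for s in corpus]

unigrams = Counter(w for sent in tokens for w in sent)
bigrams = Counter((sent[i], sent[i + 1]) for sent in tokens for i in range(len(sent) - 1))
V = len(unigrams)                                   # vocabulary size used for smoothing

def bigram_prob(prev, word):
    """P(word | prev) estimated with add-one smoothing."""
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + V)

print(bigram_prob("the", "cat"))                    # probability of a seen bigram
print(bigram_prob("the", "rug"))                    # a rarer continuation gets lower probability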

UNIT-III: Syntactic parsing: Grammar formalisms and tree banks. Efficient parsing for context-free
grammars (CFGs). Statistical parsing and probabilistic CFGs (PCFGs). Lexicalized PCFGs.
UNIT-IV: Semantic Analysis:
Lexical semantics and word-sense disambiguation. Compositional
semantics. Semantic Role Labeling and Semantic Parsing.
UNIT- V: Information Extraction (IE) and Machine Translation (MT): Named entity recognition and
relation extraction. IE using sequence labeling. Basic issues in MT. Statistical translation, word
alignment, phrase based translation, and synchronous grammars. Dialogues: Turns and utterances,
grounding, dialogue acts and structures. Natural Language Generation: Introduction to language
generation, architecture, discourse planning (text schemata, rhetorical relations).





Text Books:
1. D. Jurafsky & J. H. Martin, "Speech and Language Processing – An Introduction to Language Processing, Computational Linguistics, and Speech Recognition", Pearson Education

References:

1. Allen, James. 1995. "Natural Language Understanding". Benjamin/Cummings, 2nd ed.
2. Bharathi, A., Vineet Chaitanya and Rajeev Sangal. 1995. "Natural Language Processing – A Paninian Perspective". Prentice Hall India, Eastern Economy Edition.
3. Eugene Charniak: "Statistical Language Learning", MIT Press, 1993.
4. Manning, Christopher and Hinrich Schutze. 1999. "Foundations of Statistical Natural Language Processing". MIT Press.

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India


I Year - II Semester    L T P C: 0 0 4 2
Soft Computing Lab
Course Objectives:
From the course the student will learn:
? Describe supervised and unsupervised learning differences.
? Describe the data science life cycle.
? Use machine learning to take data science into production.
? Introducing data science, with a focus on the job outlook and market requirements.
? Hands-on Applied Statistics Concepts using Python.
? Graphics and Data Visualization Libraries in Python.
? Machine Learning algorithms, Models and Case Studies with Python.

Course Outcomes:

? Use Deep Learning techniques to build concise representations of the meanings of words in all
significant languages
? Use machine learning methods to solve the real-world problems.
? Develop a feed forward, convolution and recurrent neural networks.
? Experiment with AI and data visualization techniques
? Examine map reduce, Naive Bayes and K-Means Clustering.

Experiment 1
To solve the real-world problems using the following machine learning methods:
a) Linear Regression b) Logistic Regression
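One possible starting point for Experiment 1 (illustrative only; the synthetic datasets and parameters below are assumptions, not prescribed by the lab):

# Illustrative sketch: linear regression on a synthetic regression task and
# logistic regression on a synthetic classification task with scikit-learn.
from sklearn.datasets import make_regression, make_classification
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.model_selection import train_test_split

# a) Linear Regression
Xr, yr = make_regression(n_samples=200, n_features=3, noise=10.0, random_state=0)
Xr_tr, Xr_te, yr_tr, yr_te = train_test_split(Xr, yr, random_state=0)
lin = LinearRegression().fit(Xr_tr, yr_tr)
print("Linear regression R^2:", lin.score(Xr_te, yr_te))

# b) Logistic Regression
Xc, yc = make_classification(n_samples=200, n_features=5, random_state=0)
Xc_tr, Xc_te, yc_tr, yc_te = train_test_split(Xc, yc, random_state=0)
log = LogisticRegression(max_iter=1000).fit(Xc_tr, yc_tr)
print("Logistic regression accuracy:", log.score(Xc_te, yc_te))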
Experiment 2
Implement Support Vector Machines

Experiment 3
Implement K-Means Clustering & PCA
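A possible shape for Experiment 3 (illustrative; the Iris dataset, number of components and number of clusters are assumptions):

# Illustrative sketch: project the data to two principal components with PCA,
# then cluster the projected points with K-Means.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

X, _ = load_iris(return_X_y=True)
X2 = PCA(n_components=2).fit_transform(X)                 # 2-D projection
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X2)
print("Cluster sizes:", [int((km.labels_ == k).sum()) for k in range(3)])
print("Inertia (within-cluster SSE):", km.inertia_)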

Experiment 4
Implementation of map reduce

Experiment 5
Implementation of Naive Bayes

Experiment 6
Exploratory Data Analysis for Classification using Pandas and Matplotlib

Experiment 7
Implement a program for Bias, Variance, and Cross Validation

Experiment 8
Implementation of feed forward neural networks
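One way to approach Experiment 8 (illustrative only; scikit-learn's MLPClassifier, the digits dataset and the layer sizes are assumptions):

# Illustrative sketch: a small feed-forward (multi-layer perceptron) classifier.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
mlp.fit(X_tr, y_tr)                                       # trained by backpropagation
print("Test accuracy:", mlp.score(X_te, y_te))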

Experiment 9
Implementation of convolution neural networks




Experiment 10
Implementation of recurrent neural networks

Experiment 11
Write a program to simulate a perceptron network for pattern classification and function approximation.

Experiment 12
Solve optimal relay coordination as a linear programming problem using Genetic Algorithm.

Experiment 13
Solve optimal relay coordination as a non-Linear programming problem using Genetic algorithm.

Experiment 14
Solve economic load dispatch problem using Genetic algorithm

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India


I Year - II Semester    L T P C: 0 0 4 2
Deep Learning Lab
Course Objectives:
From the course the student will learn:
? Describe supervised and unsupervised learning differences.
? Describe the data science life cycle.
? Use machine learning to take data science into production.
? Introducing data science, with a focus on the job outlook and market requirements.
? Hands-on Applied Statistics Concepts using Python.
? Graphics and Data Visualization Libraries in Python.
? Machine Learning algorithms, Models and Case Studies with Python.

Course Outcomes:

? Use Deep Learning techniques to build concise representations of the meanings of words in all significant languages
? Build a Voice Recognition application using a deep learning framework
? Develop feed forward, convolution and recurrent neural networks.
? Experiment with AI and data visualization techniques
? Examine an Object Recognition application
Get familiar with popular deep learning frameworks such as TensorFlow, PyTorch, Keras, etc., and use any one of them for applications like the following:

Experiment 1
Implement Face Recognition application using any one of frame works

Experiment 2
Implement Voice Recognition application using any one of frame works

Experiment 3
Implement Object Recognition application using any one of frame works

Experiment 4
Implement Object Counting application using any one of frame works

Experiment 5
Implement Sentiment Analysis application using any one of frame works
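A possible shape for Experiment 5, assuming Keras/TensorFlow is the chosen framework and using the IMDB reviews dataset bundled with Keras; the architecture, sequence length and epoch count are illustrative assumptions.

# Illustrative sketch: sentiment analysis with a small Keras model on IMDB.
import tensorflow as tf

(x_tr, y_tr), (x_te, y_te) = tf.keras.datasets.imdb.load_data(num_words=10000)
x_tr = tf.keras.preprocessing.sequence.pad_sequences(x_tr, maxlen=200)
x_te = tf.keras.preprocessing.sequence.pad_sequences(x_te, maxlen=200)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(10000, 32),          # learn word representations
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # positive / negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_tr, y_tr, epochs=2, batch_size=128, validation_split=0.1)
print("Test accuracy:", model.evaluate(x_te, y_te, verbose=0)[1])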

Experiment 6
Implement a Fake News Detection application using any one of the frameworks

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India


II Year - I Semester    L T P C: 3 0 0 3
Reinforcement Learning

Course Objectives:

? There have been many recent advances in the field of reinforcement learning.
? To provide exposure to these advances and facilitate in-depth discussions on chosen topics.
? Students are expected to have completed an introductory reinforcement learning course.
Course Outcomes:
? Gain Knowledge about advances in reinforcement learning and case studies
? Define Bayesian approach to data-efficient reinforcement learning with off-line data
? Demonstrate Inverse reinforcement learning and human supervision
? Describe deep Q network, deep actor critic
? Explain hierarchical frame works, option discovery
UNIT-I: Introduction- Recent Advances in Reinforcement Learning Atari Game Player, Alpha Go, and
other case studies.
UNIT-II: Model Based RL- Bayesian Approaches to Reinforcement Learning; Data-efficient
Reinforcement Learning; Learning with off-line data; Learning with incompletely specified models; RL
and planning.
UNIT-III: Human in the Loop RL- Learning with human supervision; imitation learning; inverse
reinforcement learning; learning from demonstration.
UNIT-IV: Representation Learning for RL- Deep Q network; Deep Actor-Critic; Representation and
policy transfer in RL
UNIT-V: Hierarchical RL- Hierarchical frameworks; Option discovery; safe state abstractions;
hierarchies for transfer.
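To make the value-update idea behind the Deep Q network of UNIT-IV concrete, the following is a minimal tabular Q-learning sketch on a made-up one-dimensional corridor environment; the environment, reward and hyperparameters are assumptions for illustration, not part of the syllabus.

# Illustrative sketch: tabular Q-learning on a toy corridor (reward only at the goal).
import random

N_STATES, GOAL = 6, 5                  # states 0..5, episode ends at the goal
actions = [-1, +1]                     # move left / move right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in actions}
alpha, gamma, epsilon = 0.1, 0.9, 0.2  # learning rate, discount, exploration

for episode in range(500):
    s = 0
    while s != GOAL:
        # epsilon-greedy action selection
        a = random.choice(actions) if random.random() < epsilon \
            else max(actions, key=lambda act: Q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s_next == GOAL else 0.0
        # Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s_next, b)] for b in actions) - Q[(s, a)])
        s = s_next

# greedy policy learned per state (should prefer moving right towards the goal)
print({s: max(actions, key=lambda act: Q[(s, act)]) for s in range(N_STATES)})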
Text Books:
1. Richard S. Sutton and Andrew G. Barto, Introduction to Reinforcement Learning, 2nd Edition, MIT Press, 2017. [Draft copies available now]
2. Neuro-Dynamic Programming, Dimitri Bertsekas and John G. Tsitsiklis, Athena Scientific, 1996.
Reference Books:
1. Reinforcement Learning: An Introduction, 2e, Richard S. Sutton and Andrew G. Barto

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India


II Year - I Semester    L T P C: 3 0 0 3
Bioinformatics

Course Objectives:

? The basic objective is to give students an introduction to the basic practical techniques
of bioinformatics.
? Emphasis will be given to the application of bioinformatics and biological databases to problem
solving in real research problems.

Course Outcomes:

? Demonstrate mastery of the core concepts of Bioinformatics, including computational biology,
database design
? Explain Gene Prediction and Regulatory element Prediction
? Illustrate protein structure prediction and protein classification
? Describe genome mapping, functional genomics
UNIT-I: Introduction to Bioinformatics and Biological Databases, Sequence alignment, Pairwise sequence alignment, multiple sequence alignment, database similarities.
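As a concrete illustration of pairwise sequence alignment from UNIT-I, the sketch below computes a Needleman-Wunsch global alignment score by dynamic programming; the scoring scheme and the example sequences are arbitrary assumptions.

# Illustrative sketch: global alignment score (Needleman-Wunsch, score only).
def global_alignment_score(a, b, match=1, mismatch=-1, gap=-2):
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):                 # gaps along the first sequence
        dp[i][0] = dp[i - 1][0] + gap
    for j in range(1, cols):                 # gaps along the second sequence
        dp[0][j] = dp[0][j - 1] + gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[-1][-1]

print(global_alignment_score("GATTACA", "GCATGCU"))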
UNIT-II: Molecular Phylogenetics – Basics, gene phylogeny vs. species phylogeny, tree construction methods and programs, advanced statistical approaches, profiles and hidden Markov models.
UNIT-III: Gene and Promoter Prediction – Gene prediction, promoter and regulatory element prediction, RNA structure prediction, protein motifs and domain prediction.
UNIT-IV: Structural Bioinformatics- Basics, Protein structure Visualization, comparison,
classification, protein secondary structure prediction, protein tertiary structure prediction.
UNIT-V: Genomics and Proteomics- Genome Mapping, Assembly, comparison, functional genomics,
proteomics.
Text Books:
1. Essential Bioinformatics, Jin Xiong, Cambridge University Press, 2006.
Reference Books:
1. "Bioinformatics: A Biologist's Guide to Biocomputing and the Internet" , 1e,
Eaton pub co,2000

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India


II Year - I Semester    L T P C: 3 0 0 3
Speech Processing
Course Objectives:
? Focuses on methods for recording speech and other vocal signals, for processing and modifying such recordings, and for synthesizing artificial speech.
? In addition to class discussion and short, hands-on exercises, each student develops, executes, and presents a hands-on term project.
Course Outcomes:
? Explain and apply knowledge of landmark findings and theories in cognitive science.
? Design, interpret, and evaluate simple behavioral and neuro scientific experiments.
? Interpret and appreciate formal and computational approaches in cognitive science.
? Describe Articulatory synthesis, Formant synthesis, LPC synthesis.
? To analyze a speech signal in terms of its frequency content.
UNIT I: Introduction – Production of speech, sound perception, speech analysis, speech coding, speech enhancement, speech synthesis, speech and speaker recognition. Signals and Linear Systems: Simple signals, Filtering and convolution; Frequency Analysis: Fourier Transform, spectra and correlation; Laplace Transform: Poles and Zeros; Discrete-Time Signals and Systems: Sampling, Frequency Transforms of Discrete-Time Signals, Decimation and Interpolation; Filters: Band-pass filters, Digital filters, Difference Equations and Interpolation. Speech Analysis: Introduction, Short-Time Speech Analysis: Windowing, Spectra of Windows, Wide- and Narrow-Band Spectrograms; Time-Domain Parameters: Signal Analysis in the Time Domain, Short-Time Average Energy and Magnitude, Short-Time Average Zero-Crossing Rate (ZCR), Short-Time Autocorrelation Function; Frequency-Domain (Spectral) Parameters: Filter-Bank Analysis, Short-Time Fourier Transform Analysis, Spectral Displays, Formant Estimation and Tracking.
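To make the time-domain parameters above concrete, the following sketch computes short-time average energy and zero-crossing rate over frames of a synthetic signal with NumPy; the sampling rate, frame length and test signal are assumptions for illustration.

# Illustrative sketch: frame-wise short-time average energy and ZCR.
import numpy as np

fs = 8000                                              # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 200 * t) + 0.1 * np.random.randn(t.size)   # toy "speech" signal

frame_len, hop = 256, 128
for start in range(0, x.size - frame_len, hop * 10):   # print a few frames only
    frame = x[start:start + frame_len]
    energy = np.sum(frame ** 2) / frame_len            # short-time average energy
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2 # approximate zero-crossing rate per sample
    print(f"frame @ {start}: energy={energy:.3f}, ZCR={zcr:.3f}")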
UNIT II: Speech Production and Acoustic Phonetics – Anatomy and Physiology of the Speech Organs: the Lungs and the Thorax, Larynx and Vocal Folds (cords), Vocal Tract; Articulatory Phonetics: Manner of Articulation, Structure of the Syllable, Voicing, Place of Articulation, Phonemes in Other Languages, Articulatory Models; Acoustic Phonetics: Spectrograms, Vowels, Diphthongs, Glides and Liquids, Nasals, Fricatives, Stops (Plosives), Variants of Normal Speech.
UNIT III: Linear Predictive Coding (LPC) Analysis – Basic Principles of LPC, Least-Squares Autocorrelation Method, Least-Squares Covariance Method, Computational Considerations, Spectral Estimation via LPC, Updating the LPC Model Sample by Sample, Window Considerations; Cepstral Analysis: Mathematical details of cepstral analysis, Applications for the spectrum, Mel-Scale Cepstrum; F0 Pitch Estimation: Time-domain F0 estimation methods, Short-time spectral methods.

UNIT IV: Speech synthesis- Introduction, Principles of speech synthesis: Types of stored speech units
to concatenate, Memory size, Synthesis method, Limited text voice response system, unrestricted- text
TTS systems. Synthesizer methods: Articulatory synthesis, Formant synthesis, LPC synthesis.
UNIT V: Introduction – Variability in speech signals, segmenting speech into smaller units, performance evaluation, databases for speech recognition, pattern recognition methods, pre-processing, parametric representation: parameters used in speech recognition, feature extraction; evaluation of similarity of speech patterns: frame-based distance measures, making ASR decisions, HMMs.
Speaker Recognition: Introduction, verification vs. recognition, recognition techniques: model evaluation, text dependence, statistical vs. dynamic features, stochastic models, vector quantization, similarity and distance measures, cepstral analysis; features that distinguish the speakers: measures of
the effectiveness of features, techniques to choose features, spectral features, prosodic features
Text Books:
1. Speech Communication, Douglas O'Shaughnessy, Universities Press
Reference Books:

1. Fundamentals of Speech Recognition, Lawrence Rabiner, Biing-Hwang Juang,Pearson Education
2. Speech and Language processing, Daniel Jurafsky, James H. Martin, Pearson Education
Open Electives offered to Other Departments
1. Python Programming 3. Machine Learning
2. Artificial Intelligence 4. Deep Learning



JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India


II Year - I Semester    L T P C: 3 0 0 3
Python Programming

Course Objectives:
? Knowledge and understanding of the different concepts of Python.
? Using the GUI Programming and Testing in real-time applications.
? Using package Python modules for reusability.

Course Outcomes: At the end of the course, the student will be able to
? Understand and comprehend the basics of python programming.
? Demonstrate the principles of structured programming and be able to describe, design, implement,
and test structured programs using currently accepted methodology.
? Explain the use of the built-in data structures list, sets, tuples and dictionary.
? Make use of functions and its applications.
? Identify real-world applications using oops, files and exception handling provided by python.

Syllabus:
UNIT-I: Introduction – History of Python, Python Language, Features of Python, Applications of Python, Using the REPL (Shell), Running Python Scripts, Variables, Assignment, Keywords, Input-Output, Indentation.
UNIT-II: Types, Operators and Expressions – Types: Integers, Strings, Booleans; Operators: Arithmetic Operators, Comparison (Relational) Operators, Assignment Operators, Logical Operators, Bitwise Operators, Membership Operators, Identity Operators; Expressions and order of evaluations; Control Flow: if, if-elif-else, for, while, break, continue, pass.

UNIT-III: Data Structures – Lists: Operations, Slicing, Methods; Tuples, Sets, Dictionaries, Sequences, Comprehensions.
UNIT-IV: Functions – Defining Functions, Calling Functions, Passing Arguments, Keyword Arguments, Default Arguments, Variable-length Arguments, Anonymous Functions, Fruitful Functions (Functions Returning Values), Scope of Variables in a Function: Global and Local Variables; Modules: Creating modules, import statement, from ... import statement, namespacing, Python packages, Introduction to PIP, Installing Packages via PIP, Using Python Packages; Errors and Exceptions: Difference between an error and an exception, Handling Exceptions, try-except block, Raising Exceptions, User-Defined Exceptions.
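A small illustrative example tying UNIT-IV together (a function with default and variable-length arguments, plus try/except with a user-defined exception); all names here are hypothetical, chosen only for the example.

# Illustrative example: *args, a keyword argument with a default, and exceptions.
class NegativeValueError(Exception):
    """Raised when a negative value is passed where it is not allowed."""

def weighted_sum(*values, weight=1.0):
    if any(v < 0 for v in values):
        raise NegativeValueError("values must be non-negative")
    return weight * sum(values)

try:
    print(weighted_sum(1, 2, 3, weight=0.5))   # prints 3.0
    print(weighted_sum(1, -2))                 # raises NegativeValueError
except NegativeValueError as err:
    print("Handled:", err)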
UNIT-V: Object Oriented Programming (OOP) in Python – Classes, the 'self' variable, Methods, Constructor Method, Inheritance, Overriding Methods, Data hiding; Brief Tour of the Standard Library: Operating System Interface, String Pattern Matching, Mathematics, Internet Access, Dates and Times, Data Compression, Multithreading, GUI Programming, Turtle Graphics; Testing: Why testing is required, Basic concepts of testing, Unit testing in Python, Writing Test Cases, Running Tests.





Text Books:
1. Fundamentals of Python First Programs, Kenneth. A. Lambert, Cengage
2. Introduction to Programming Using Python, Y. Daniel Liang, Pearson
Reference Books:
1. Introduction to Python Programming, Gowrishankar.S, Veena A, CRC Press
2. Think Python, Allen Downey, Green Tea Press
3. Core Python Programming, W. Chun, Pearson




JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India



II Year - I Semester    L T P C: 3 0 0 3
Artificial Intelligence

Course objectives:

? To learn the difference between optimal reasoning vs. human-like reasoning
? To understand the notions of state space representation, exhaustive search, heuristic search
along with the time and space complexities
? To learn different knowledge representation techniques
? To understand the applications of AI: namely Game Playing, Theorem Proving, Expert
Systems, Machine Learning and Natural Language Processing

Course Outcomes:

? Formulate an efficient problem space for a problem expressed in English.
? Select a search algorithm for a problem and characterize its time and space complexities.
? Experiment with knowledge using the appropriate techniques for Problem reduction
? Apply AI techniques to solve problems of Game Playing, Expert Systems, Machine Learning
and Natural Language Processing

Syllabus:
UNIT-I: Introduction to Artificial Intelligence: Introduction, history, intelligent systems, foundations of AI, applications, tic-tac-toe game playing, development of AI languages, current trends in AI.
UNIT-II: Problem solving: state-space search and control strategies:
Introduction, general problem
solving, characteristics of problem, exhaustive searches, heuristic search techniques, iterative-deepening
a*, constraint satisfaction
Problem reduction and game playing:
Introduction, problem reduction, game playing, alpha-beta
pruning, two-player perfect information games
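For illustration of alpha-beta pruning from UNIT-II, the sketch below evaluates a small hard-coded game tree; the tree structure and leaf values are made up purely for the example.

# Illustrative sketch: minimax with alpha-beta pruning on a nested-list game tree.
def alphabeta(node, alpha, beta, maximizing):
    if isinstance(node, int):               # leaf: static evaluation value
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:               # beta cut-off
                break
        return value
    value = float("inf")
    for child in node:
        value = min(value, alphabeta(child, alpha, beta, True))
        beta = min(beta, value)
        if alpha >= beta:                   # alpha cut-off
            break
    return value

tree = [[3, 5], [6, [9, 1]], [1, 2]]        # nested lists are internal nodes
print(alphabeta(tree, float("-inf"), float("inf"), True))   # prints 6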

UNIT-III: Logic Concepts: Introduction, propositional calculus, propositional logic, natural deduction system, axiomatic system, semantic tableau system in propositional logic, resolution refutation in propositional logic, predicate logic.
UNIT-IV: Knowledge representation:
Introduction, approaches to knowledge representation,
knowledge representation using semantic network, extended semantic networks for KR, knowledge
representation using frames
Advanced knowledge representation techniques: Introduction, conceptual dependency theory, script
structure, cyc theory, case grammars, semantic web
UNIT-V: Expert Systems and Applications: Introduction, phases in building expert systems, expert systems versus traditional systems, rule-based expert systems, blackboard systems, truth maintenance systems, applications of expert systems, list of shells and tools.






Text Books:

1. Artificial Intelligence, 1st edition, Saroj Kaushik, CENGAGE Learning, 2011
2. Artificial Intelligence: A Modern Approach, 2nd edition, Stuart Russell, Peter Norvig, PEA, 2009
3. Artificial Intelligence, 3rd edition, Elaine Rich, Kevin Knight, Shivashankar B. Nair, TMH, 2017
4. Introduction to Artificial Intelligence, 1st edition, Patterson, PHI, 2015

Reference Books:

1. Artificial Intelligence: Structures and Strategies for Complex Problem Solving, 5th edition, George F. Luger, PEA, 2008
2. Introduction to Artificial Intelligence, 1st edition, Ertel, Wolf Gang, Springer,2011
3. Artificial Intelligence, 1st edition, A new Synthesis, Nils J Nilsson, Elsevier, 1998




JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India



II Year - I Semester    L T P C: 3 0 0 3
Machine Learning

Course Objectives:

Machine Learning course will
? Develop an appreciation for what is involved in learning from data.
? Demonstrate a wide variety of learning algorithms.
? Demonstrate how to apply a variety of learning algorithms to data.
? Demonstrate how to perform evaluation of learning algorithms and model selection.

Course Outcomes:
After the completion of the course, student will be able to
? Apply Domain Knowledge for Productive use of Machine Learning and Diversity of Data.
? Demonstrate Supervised and Computational Learning
? Analyze Statistics in learning techniques and Logistic Regression
? Illustrate Support Vector Machines and the Perceptron Algorithm
? Design Multilayer Perceptron Networks and classification with decision trees
Syllabus:
Unit-I: Introduction-
Towards Intelligent Machines, Well posed Problems, Example of Applications in
diverse fields, Data Representation, Domain Knowledge for Productive use of Machine Learning,
Diversity of Data: Structured / Unstructured, Forms of Learning, Machine Learning and Data Mining,
Basic Linear Algebra in Machine Learning Techniques.

Unit-II: Supervised Learning – Rationale and Basics: Learning from Observations, Bias and Why Learning Works: Computational Learning Theory, Occam's Razor Principle and Overfitting Avoidance, Heuristic Search in Inductive Learning, Estimating Generalization Errors, Metrics for assessing regression, Metrics for assessing classification.

Unit-III: Statistical Learning-
Machine Learning and Inferential Statistical Analysis, Descriptive
Statistics in learning techniques, Bayesian Reasoning: A probabilistic approach to inference, K-Nearest
Neighbor Classifier. Discriminant functions and regression functions, Linear Regression with Least
Square Error Criterion, Logistic Regression for Classification Tasks, Fisher's Linear Discriminant and
Thresholding for Classification, Minimum Description Length Principle.

Unit-IV: Support Vector Machines (SVM)-
Introduction, Linear Discriminant Functions for Binary
Classification, Perceptron Algorithm, Large Margin Classifier for linearly separable data, Linear Soft
Margin Classifier for Overlapping Classes, Kernel Induced Feature Spaces, Nonlinear Classifier,
Regression by Support vector Machines.
Learning with Neural Networks: Towards Cognitive Machine, Neuron Models, Network Architectures,
Perceptrons, Linear neuron and the Widrow-Hoff Learning Rule, The error correction delta rule.
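As an illustration of the perceptron algorithm named in Unit-IV, the sketch below applies the classic perceptron learning rule to the linearly separable logical AND problem; the learning rate and epoch count are arbitrary assumptions.

# Illustrative sketch: perceptron learning rule on the AND problem.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])                      # AND is linearly separable
w, b, lr = np.zeros(2), 0.0, 0.1

for epoch in range(20):
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        # update the hyperplane only when the example is misclassified
        w += lr * (target - pred) * xi
        b += lr * (target - pred)

print("weights:", w, "bias:", b)
print("predictions:", [(1 if xi @ w + b > 0 else 0) for xi in X])   # [0, 0, 0, 1]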

Unit -V:
Multilayer Perceptron Networks and error back propagation algorithm, Radial Basis Functions
Networks. Decision Tree Learning: Introduction, Example of classification decision tree, measures of
impurity for evaluating splits in decision trees, ID3, C4.5, and CART decision trees, pruning the tree,
strengths and weakness of decision tree approach.





Textbooks:
1. Artificial Intelligence, 1st edition, Saroj Kaushik, CENGAGE Learning, 2011
2. Machine Learning: A Probabilistic Perspective, 1st edition, Kevin Murphy, MIT Press,2012
3. The Elements of Statistical Learning, 2nd edition, Trevor Hastie, Robert Tibshirani, Jerome
Friedman, Springer, 2009 (freely available online)

Reference Books:

1. Pattern Recognition and Machine Learning, 1st edition, Christopher Bishop, Springer,2007
2. Programming Collective Intelligence: Building Smart Web 2.0 Applications - 1st edition, Toby
Segaran,2007
3. Building Machine Learning Systems with Python ,1st edition, WilliRichert, Luis Pedro
Coelho,2013




JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India



II Year - I Semester    L T P C: 3 0 0 3
Deep Learning

Course Objectives:

At the end of the course, the students will be expected to:
? Learn deep learning methods for working with sequential data,
? Learn deep recurrent and memory networks,
? Apply such deep learning mechanisms to various learning problems.
? Learn deep Turing machines, the open issues in deep learning, and have a grasp of the current
research directions.

Course Outcomes:
After the completion of the course, student will be able to
? Demonstrate the basic concepts, fundamental learning techniques and layers.
? Discuss the Neural Network training, various random models.
? Explain different types of deep learning network models.
? Classify the Probabilistic Neural Networks.
? Implement tools on Deep Learning techniques.

UNIT-I: Introduction:
Various paradigms of learning problems, Perspectives and Issues in deep learning
framework, review of fundamental learning techniques.
Feed forward neural network: Artificial Neural Network, activation function, multi-layer neural network

UNIT-II: Training Neural Network:
Risk minimization, loss function, back propagation, regularization,
model selection, and optimization.
Deep Neural Networks: Difficulty of training deep neural networks, Greedy layer wise training.

UNIT-III: Deep Learning:
Deep Feed Forward network, regularizations, training deep models, dropouts,
Convolution Neural Network, Recurrent Neural Network, Deep Belief Network.
UNIT-IV: Probabilistic Neural Network: Hopfield Net, Boltzmann machine, RBMs, Sigmoid net,
Auto encoders.

UNIT V: Applications:
Object recognition, sparse coding, computer vision, natural language
processing.
Introduction to Deep Learning Tools: TensorFlow, Caffe, Theano, Torch.
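As a minimal illustration of the tools listed above, the sketch below trains a small feed-forward network on MNIST with TensorFlow/Keras; the architecture, dropout rate, epoch count and dataset choice are illustrative assumptions, not part of the syllabus.

# Illustrative sketch: a small Keras model with dropout regularization on MNIST.
import tensorflow as tf

(x_tr, y_tr), (x_te, y_te) = tf.keras.datasets.mnist.load_data()
x_tr, x_te = x_tr / 255.0, x_te / 255.0            # scale pixel values to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.2),                  # dropout as a regularizer
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_tr, y_tr, epochs=2, validation_split=0.1)
print("Test accuracy:", model.evaluate(x_te, y_te, verbose=0)[1])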

Text Books:

1. Goodfellow, I., Bengio, Y., and Courville, A., Deep Learning, MIT Press, 2016.
2. Bishop, C.,M., Pattern Recognition and Machine Learning, Springer, 2006.






Reference Books:

1. Yegnanarayana, B., Artificial Neural Networks, PHI Learning Pvt. Ltd., 2009.
2. Golub, G. H., and Van Loan, C. F., Matrix Computations, JHU Press, 2013.
3. Satish Kumar, Neural Networks: A Classroom Approach, Tata McGraw-Hill Education, 2004.
4. Neural Networks: A Systematic Introduction, Raúl Rojas, 1996.
5. Pattern Recognition and Machine Learning, Christopher Bishop, 2007.




JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India


AUDIT 1 and 2: ENGLISH FOR RESEARCH PAPER WRITING


Course objectives:
Students will be able to:
1. Understand how to improve writing skills and the level of readability
2. Learn about what to write in each section
3. Understand the skills needed when writing a Title
4. Ensure the good quality of the paper at the very first-time submission

Syllabus
Unit 1 (4 hours): Planning and Preparation, Word Order, Breaking up long sentences, Structuring Paragraphs and Sentences, Being Concise and Removing Redundancy, Avoiding Ambiguity and Vagueness
Unit 2 (4 hours): Clarifying Who Did What, Highlighting Your Findings, Hedging and Criticising, Paraphrasing and Plagiarism, Sections of a Paper, Abstracts, Introduction
Unit 3 (4 hours): Review of the Literature, Methods, Results, Discussion, Conclusions, The Final Check
Unit 4 (4 hours): Key skills needed when writing a Title, an Abstract, an Introduction, and a Review of the Literature
Unit 5 (4 hours): Skills needed when writing the Methods, the Results, the Discussion, and the Conclusions
Unit 6 (4 hours): Useful phrases; how to ensure the paper is as good as it could possibly be at first-time submission

Suggested Studies:

1. Goldbort R (2006) Writing for Science, Yale University Press (available on Google Books)
2. Day R (2006) How to Write and Publish a Scientific Paper, Cambridge University Press
3. Higham N (1998), Handbook of Writing for the Mathematical Sciences, SIAM.
4. Adrian Wallwork , English for Writing Research Papers, Springer New York Dordrecht
Heidelberg London, 2011


JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India
AUDIT 1 and 2: DISASTER MANAGEMENT


Course Objectives: -Students will be able to:
1. learn to demonstrate a critical understanding of key concepts in disaster risk reduction and
humanitarian response.
2. critically evaluate disaster risk reduction and humanitarian response policy and practice from
multiple perspectives.
3. develop an understanding of standards of humanitarian response and practical relevance in specific
types of disasters and conflict situations.
4. critically understand the strengths and weaknesses of disaster management approaches, planning
and programming in different countries, particularly their home country or the countries
they work in
Syllabus
Unit 1 (4 hours): Introduction – Disaster: Definition, Factors and Significance; Difference between Hazard and Disaster; Natural and Manmade Disasters: Difference, Nature, Types and Magnitude.
Unit 2 (4 hours): Repercussions of Disasters and Hazards – Economic Damage, Loss of Human and Animal Life, Destruction of Ecosystem. Natural Disasters: Earthquakes, Volcanisms, Cyclones, Tsunamis, Floods, Droughts and Famines, Landslides and Avalanches. Man-made Disasters: Nuclear Reactor Meltdown, Industrial Accidents, Oil Slicks and Spills, Outbreaks of Disease and Epidemics, War and Conflicts.
Unit 3 (4 hours): Disaster Prone Areas in India – Study of Seismic Zones; Areas Prone to Floods and Droughts, Landslides and Avalanches; Areas Prone to Cyclonic and Coastal Hazards with Special Reference to Tsunami; Post-Disaster Diseases and Epidemics.
Unit 4 (4 hours): Disaster Preparedness and Management – Preparedness: Monitoring of Phenomena Triggering a Disaster or Hazard; Evaluation of Risk: Application of Remote Sensing, Data from Meteorological and Other Agencies, Media Reports; Governmental and Community Preparedness.
Unit 5 (4 hours): Risk Assessment – Disaster Risk: Concept and Elements, Disaster Risk Reduction, Global and National Disaster Risk Situation; Techniques of Risk Assessment, Global Co-operation in Risk Assessment and Warning, People's Participation in Risk Assessment; Strategies for Survival.
Unit 6 (4 hours): Disaster Mitigation – Meaning, Concept and Strategies of Disaster Mitigation, Emerging Trends in Mitigation; Structural Mitigation and Non-Structural Mitigation; Programs of Disaster Mitigation in India.





Suggested Readings:
1. R. Nishith, Singh AK, "Disaster Management in India: Perspectives, Issues and Strategies", New Royal Book Company.
2. Sahni, Pardeep et al. (Eds.), "Disaster Mitigation: Experiences and Reflections", Prentice Hall of India, New Delhi.
3. Goel S. L., "Disaster Administration and Management: Text and Case Studies", Deep & Deep Publication Pvt. Ltd., New Delhi.

AUDIT 1 and 2: SANSKRIT FOR TECHNICAL KNOWLEDGE

Course Objectives

1. To get a working knowledge in illustrious Sanskrit, the scientific language in the world
2. Learning of Sanskrit to improve brain functioning
3. Learning of Sanskrit to develop the logic in mathematics, science & other subjects
enhancing the memory power
4. The engineering scholars equipped with Sanskrit will be able to explore the huge
knowledge from ancient literature
Syllabus
Unit 1 (4 hours): Alphabets in Sanskrit, Past/Present/Future Tense, Simple Sentences
Unit 2 (4 hours): Order, Introduction of roots, Technical information about Sanskrit Literature
Unit 3 (4 hours): Technical concepts of Engineering – Electrical
Unit 4 (4 hours): Technical concepts of Engineering – Mechanical
Unit 5 (4 hours): Technical concepts of Engineering – Architecture
Unit 6 (4 hours): Technical concepts of Engineering – Mathematics

Suggested reading

1. "Abhyaspustakam" ? Dr.Vishwas, Samskrita-Bharti Publication, New Delhi
2. "Teach Yourself Sanskrit" Prathama Deeksha-Vempati Kutumbshastri, Rashtriya Sanskrit
Sansthanam, New Delhi Publication
3. "India's Glorious Scientific Tradition" Suresh Soni, Ocean books (P) Ltd., New Delhi.
Course Outcomes
Students will be able to
1. Understanding basic Sanskrit language
2. Ancient Sanskrit literature about science & technology can be understood
3. Being a logical language will help to develop logic in students



JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India


AUDIT 1 and 2: VALUE EDUCATION

Course Objectives
Students will be able to
1. Understand value of education and self- development
2. Imbibe good values in students
3. Let them know about the importance of character
Syllabus
Unit 1 (4 hours): Values and self-development – Social values and individual attitudes; Work ethics, Indian vision of humanism; Moral and non-moral valuation; Standards and principles; Value judgements
Unit 2 (4 hours): Importance of cultivation of values – Sense of duty; Devotion, Self-reliance; Confidence, Concentration; Truthfulness, Cleanliness; Honesty, Humanity; Power of faith, National Unity; Patriotism; Love for nature; Discipline
Unit 3 (4 hours): Personality and Behaviour Development – Soul and Scientific attitude; Positive Thinking; Integrity and discipline; Punctuality, Love and Kindness; Avoid fault thinking
Unit 4 (4 hours): Free from anger, Dignity of labour; Universal brotherhood and religious tolerance; True friendship; Happiness vs. suffering, love for truth; Awareness of self-destructive habits; Association and Cooperation; Doing best for saving nature
Unit 5 (4 hours): Character and Competence – Holy books vs. blind faith; Self-management and Good health; Science of reincarnation; Equality, Nonviolence, Humility, Role of Women
Unit 6 (4 hours): All religions and the same message; Mind your Mind, Self-control; Honesty; Studying effectively

Suggested reading
1. Chakroborty, S.K., "Values and Ethics for Organizations: Theory and Practice", Oxford University Press, New Delhi
Course outcomes
Students will be able to
1. Knowledge of self-development
2. Learn the importance of Human values
3. Developing the overall personality


JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India
AUDIT 1 and 2: CONSTITUTION OF INDIA

Course Objectives:
Students will be able to:
1. Understand the premises informing the twin themes of liberty and freedom from a civil rights
perspective.
2. To address the growth of Indian opinion regarding modern Indian intellectuals' constitutional role and
entitlement to civil and economic rights as well as the emergence of nationhood in the early years of
Indian nationalism.
3. To address the role of socialism in India after the commencement of the Bolshevik Revolution in 1917
and its impact on the initial drafting of the Indian Constitution.
Syllabus
Unit 1 (4 hours): History of Making of the Indian Constitution – History, Drafting Committee (Composition & Working)
Unit 2 (4 hours): Philosophy of the Indian Constitution – Preamble, Salient Features
Unit 3 (4 hours): Contours of Constitutional Rights & Duties – Fundamental Rights: Right to Equality, Right to Freedom, Right against Exploitation, Right to Freedom of Religion, Cultural and Educational Rights, Right to Constitutional Remedies; Directive Principles of State Policy; Fundamental Duties
Unit 4 (4 hours): Organs of Governance – Parliament: Composition, Qualifications and Disqualifications, Powers and Functions; Executive: President, Governor, Council of Ministers; Judiciary: Appointment and Transfer of Judges, Qualifications, Powers and Functions
Unit 5 (4 hours): Local Administration – District's Administration head: Role and Importance; Municipalities: Introduction, Mayor and role of Elected Representative, CE of Municipal Corporation; Panchayati Raj: Introduction, PRI: Zila Panchayat, Elected officials and their roles, CEO Zila Panchayat: Position and role; Block level: Organizational Hierarchy (Different departments); Village level: Role of Elected and Appointed officials; Importance of grass root democracy


Unit 6 (4 hours): Election Commission – Election Commission: Role and Functioning; Chief Election Commissioner and Election Commissioners; State Election Commission: Role and Functioning; Institute and Bodies for the welfare of SC/ST/OBC and women

Suggested reading

1. The Constitution of India, 1950 (Bare Act), Government Publication.
2. Dr. S. N. Busi, Dr. B. R. Ambedkar framing of Indian Constitution, 1st Edition, 2015.
3. M. P. Jain, Indian Constitution Law, 7th Edn., Lexis Nexis, 2014.
4. D.D. Basu, Introduction to the Constitution of India, Lexis Nexis, 2015.
Course Outcomes:
Students will be able to:
1. Discuss the growth of the demand for civil rights in India for the bulk of Indians before the
arrival of Gandhi in Indian politics.
2. Discuss the intellectual origins of the framework of argument that informed the conceptualization of social reforms leading to revolution in India.
3. Discuss the circumstances surrounding the foundation of the Congress Socialist Party
[CSP] under the leadership of Jawaharlal Nehru and the eventual failure of the proposal of
direct elections through adult suffrage in the Indian Constitution.
4. Discuss the passage of the Hindu Code Bill of 1956.





JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India
AUDIT 1 and 2: PEDAGOGY STUDIES

Course Objectives:
Students will be able to:
1. Review existing evidence on the review topic to inform programme design and policy making undertaken by the DfID, other agencies and researchers.
2. Identify critical evidence gaps to guide the development.
Syllabus
Unit 1 (4 hours): Introduction and Methodology – Aims and rationale, Policy background, Conceptual framework and terminology; Theories of learning, Curriculum, Teacher education; Conceptual framework, Research questions; Overview of methodology and Searching.
Unit 2 (4 hours): Thematic overview – Pedagogical practices being used by teachers in formal and informal classrooms in developing countries; Curriculum, Teacher education.
Unit 3 (4 hours): Evidence on the effectiveness of pedagogical practices; Methodology for the in-depth stage: quality assessment of included studies; How can teacher education (curriculum and practicum) and the school curriculum and guidance materials best support effective pedagogy?; Theory of change; Strength and nature of the body of evidence for effective pedagogical practices.
Unit 4 (4 hours): Pedagogic theory and pedagogical approaches; Teachers' attitudes and beliefs and Pedagogic strategies.
Unit 5 (4 hours): Professional development – alignment with classroom practices and follow-up support; Peer support; Support from the head teacher and the community; Curriculum and assessment; Barriers to learning: limited resources and large class sizes.
Unit 6 (4 hours): Research gaps and future directions – Research design, Contexts, Pedagogy, Teacher education, Curriculum and assessment, Dissemination and research impact.





Suggested reading

1. Ackers J, Hardman F (2001) Classroom interaction in Kenyan primary schools, Compare,
31 (2): 245-261.
2. Agrawal M (2004) Curricular reform in schools: The importance of evaluation, Journal of
Curriculum Studies, 36 (3): 361-379.
3. Akyeampong K (2003) Teacher training in Ghana - does it count? Multi-site teacher
education research project (MUSTER) country report 1. London: DFID.
4. Akyeampong K, Lussier K, Pryor J, Westbrook J (2013) Improving teaching and learning of basic maths and reading in Africa: Does teacher preparation count? International Journal of Educational Development, 33 (3): 272-282.
5. Alexander RJ (2001) Culture and pedagogy: International comparisons in primary
education. Oxford and Boston: Blackwell.
6. Chavan M (2003) Read India: A mass scale, rapid, `learning to read' campaign.
7. www.pratham.org/images/resource%20working%20paper%202.pdf.

Course Outcomes:
Students will be able to understand:
1. What pedagogical practices are being used by teachers in formal and informal classrooms
in developing countries?
2. What is the evidence on the effectiveness of these pedagogical practices, in what
conditions, and with what population of learners?
3. How can teacher education (curriculum and practicum) and the school curriculum and
guidance materials best support effective pedagogy?


JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India
AUDIT 1 and 2: STRESS MANAGEMENT BY YOGA

Course Objectives

1. To achieve overall health of body and mind
2. To overcome stress
Syllabus
Unit 1 (5 hours): Definitions of the Eight parts of Yog (Ashtanga)
Unit 2 (5 hours): Yam and Niyam; Do's and Don'ts in life – Ahinsa, satya, astheya, bramhacharya and aparigraha
Unit 3 (5 hours): Yam and Niyam; Do's and Don'ts in life – Shaucha, santosh, tapa, swadhyay, ishwarpranidhan
Unit 4 (5 hours): Asan and Pranayam – Various yog poses and their benefits for mind & body
Unit 5 (4 hours): Regularization of breathing techniques and its effects – Types of pranayam

Suggested reading

1. "Yogic Asanas for Group Training – Part-I", Janardan Swami Yogabhyasi Mandal, Nagpur
2. "Rajayoga or Conquering the Internal Nature" by Swami Vivekananda, Advaita Ashrama (Publication Department), Kolkata
Course Outcomes:
Students will be able to:
1. Develop healthy mind in a healthy body thus improving social health also
2. Improve efficiency







JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY: KAKINADA
KAKINADA – 533 003, Andhra Pradesh, India
AUDIT 1 and 2: PERSONALITY DEVELOPMENT THROUGH LIFE
ENLIGHTENMENT SKILLS
Course Objectives
1. To learn to achieve the highest goal happily
2. To become a person with stable mind, pleasing personality and determination
3. To awaken wisdom in students
Syllabus
Unit 1 (4 hours): Neetisatakam – Holistic development of personality: Verses 19, 20, 21, 22 (wisdom); Verses 29, 31, 32 (pride & heroism); Verses 26, 28, 63, 65 (virtue)
Unit 2 (4 hours): Neetisatakam – Holistic development of personality: Verses 52, 53, 59 (don'ts); Verses 71, 73, 75, 78 (do's)
Unit 3 (4 hours): Approach to day-to-day work and duties – Shrimad Bhagwad Geeta: Chapter 2, Verses 41, 47, 48
Unit 4 (4 hours): Chapter 3, Verses 13, 21, 27, 35; Chapter 6, Verses 5, 13, 17, 23, 35; Chapter 18, Verses 45, 46, 48
Unit 5 (4 hours): Statements of basic knowledge – Shrimad Bhagwad Geeta: Chapter 2, Verses 56, 62, 68; Chapter 12, Verses 13, 14, 15, 16, 17, 18
Unit 6 (4 hours): Personality of Role model – Shrimad Bhagwad Geeta: Chapter 2, Verse 17; Chapter 3, Verses 36, 37, 42; Chapter 4, Verses 18, 38, 39; Chapter 18, Verses 37, 38, 63
Suggested reading
1. "Srimad Bhagavad Gita" by Swami Swarupananda Advaita Ashram (Publication Department),
Kolkata
2. Bhartrihari's Three Satakam (Niti-sringar-vairagya) by P.Gopinath, Rashtriya Sanskrit
Sansthanam, New Delhi.
Course Outcomes
Students will be able to
1. Study of Shrimad-Bhagwad-Geeta will help the student in developing his personality and
achieve the highest goal in life
2. The person who has studied Geeta will lead the nation and mankind to peace and prosperity
3. Study of Neetishatakam will help in developing versatile personality of students


