UNIT I INTRODUCTION TO MACHINE LEARNING
Review of linear algebra for machine learning – introduction and motivation for machine learning – examples of machine learning applications – Vapnik–Chervonenkis (VC) dimension – Probably Approximately Correct (PAC) learning – hypothesis spaces – inductive bias – generalization – bias-variance trade-off.
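The bias-variance trade-off above can be illustrated numerically. The sketch below (toy data, noise level, and polynomial degrees are assumptions for illustration, not part of the syllabus) compares an underfit, a reasonable, and a high-capacity polynomial fit:

```python
import numpy as np

# Toy regression problem: noisy samples of a sine curve (assumed example).
rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 20)
x_test = np.linspace(0, 1, 200)
true_f = lambda x: np.sin(2 * np.pi * x)
y_train = true_f(x_train) + rng.normal(0, 0.2, x_train.shape)

def train_test_mse(degree):
    """Fit a polynomial of the given degree by least squares; return train/test MSE."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - true_f(x_test)) ** 2)
    return train_mse, test_mse

for d in (1, 3, 9):
    tr, te = train_test_mse(d)
    # High bias (d=1): both errors are large because a line cannot follow the
    # sine. High variance (d=9): training error keeps falling as capacity
    # grows, while test error also reflects the fit to the noise.
    print(f"degree {d}: train MSE {tr:.4f}, test MSE {te:.4f}")
```

Training error is non-increasing in model capacity, while generalization error is not; that gap is the trade-off this unit formalizes.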
UNIT II SUPERVISED LEARNING 11
Linear regression models: least squares, single and multiple variables – Bayesian linear regression – gradient descent. Linear classification models: discriminant functions – perceptron algorithm – probabilistic discriminative model: logistic regression – probabilistic generative model: naive Bayes – maximum margin classifier: support vector machine – decision trees – random forests.
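As a concrete instance of the linear classification models in this unit, here is a minimal sketch of the perceptron algorithm (the toy data set is an assumption for illustration):

```python
import numpy as np

def perceptron(X, y, epochs=100):
    """Perceptron learning rule: on each mistake, w += y*x and b += y."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:   # misclassified (or on the boundary)
                w += yi * xi
                b += yi
                mistakes += 1
        if mistakes == 0:                # converged: every point classified correctly
            break
    return w, b

# Toy linearly separable data (assumed): -1 near the origin, +1 near (1, 1).
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 0.8]])
y = np.array([-1, -1, 1, 1])
w, b = perceptron(X, y)
predictions = np.sign(X @ w + b)
```

If the data are linearly separable, the update rule is guaranteed to stop making mistakes after finitely many passes (the perceptron convergence theorem).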
UNIT III ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING 9
Combining multiple learners: model combination schemes, voting – ensemble learning: bagging, boosting, stacking – unsupervised learning: K-means – instance-based learning: KNN – Gaussian mixture models and expectation–maximization.
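The K-means algorithm from this unit alternates an assignment step and an update step until the centers stop moving. A minimal sketch (the two-blob toy data set is an assumption for illustration):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Lloyd's algorithm: alternate nearest-center assignment and mean update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]   # init from data points
    for _ in range(iters):
        # Assignment step: label each point with its nearest center.
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # Update step: move each center to the mean of its assigned points.
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):           # converged
            break
        centers = new_centers
    return centers, labels

# Two well-separated blobs (toy data, assumed for illustration).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.2, (30, 2)), rng.normal(5, 0.2, (30, 2))])
centers, labels = kmeans(X, 2)
```

Each iteration can only decrease the within-cluster sum of squares, so the loop terminates, though only at a local optimum that depends on the initialization.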
UNIT IV NEURAL NETWORKS 9
Multilayer perceptrons – activation functions – network training – gradient descent optimization – stochastic gradient descent – error backpropagation – from shallow networks to deep networks – unit saturation (the vanishing gradient problem) – ReLU – hyperparameter tuning – batch normalization – regularization – dropout.
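The vanishing gradient problem from this unit can be shown with a few lines: backpropagation multiplies local derivatives layer by layer, and the sigmoid's derivative never exceeds 0.25, so the product shrinks geometrically with depth, while ReLU passes the gradient through unchanged for positive pre-activations. A small numerical sketch (the unit-weight, single-path setup is a simplifying assumption):

```python
import numpy as np

def chain_grad(f, df, x0, depth):
    """Product of local derivatives along a forward pass through `depth`
    stacked unit-weight layers, i.e. what backpropagation multiplies out."""
    g, x = 1.0, x0
    for _ in range(depth):
        g *= df(x)
        x = f(x)
    return g

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
d_sigmoid = lambda x: sigmoid(x) * (1.0 - sigmoid(x))   # never exceeds 0.25
relu = lambda x: max(0.0, x)
d_relu = lambda x: 1.0 if x > 0 else 0.0

# 20 sigmoid layers: the gradient shrinks roughly like 0.25 ** 20.
print(chain_grad(sigmoid, d_sigmoid, 1.0, 20))
# 20 ReLU layers with a positive input: the gradient passes through intact.
print(chain_grad(relu, d_relu, 1.0, 20))
```

This is why deep networks with saturating units train slowly in their early layers, and it motivates ReLU, careful initialization, and batch normalization.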
UNIT V DESIGN AND ANALYSIS OF MACHINE LEARNING EXPERIMENTS 8
Guidelines for machine learning experiments – cross-validation (CV) and resampling: K-fold CV, bootstrapping – measuring classifier performance – assessing a single classification algorithm and comparing two classification algorithms: t-test, McNemar's test, K-fold CV paired t-test.
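K-fold cross-validation, the backbone of this unit, can be sketched in a few lines: each fold serves as the test set exactly once while the remaining folds train the model. The nearest-centroid classifier and the two-blob data below are assumptions chosen only to make the sketch self-contained:

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Shuffle the indices 0..n-1 and split them into k nearly equal folds."""
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, k)

def cross_val_accuracy(X, y, fit, predict, k=5):
    """K-fold CV: each fold is held out once; the other folds train the model."""
    folds = kfold_indices(len(X), k)
    scores = []
    for i, test_idx in enumerate(folds):
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        model = fit(X[train_idx], y[train_idx])
        scores.append(np.mean(predict(model, X[test_idx]) == y[test_idx]))
    return np.array(scores)

# Toy setup (assumed): a nearest-centroid classifier on two separated blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (25, 2)), rng.normal(3, 0.3, (25, 2))])
y = np.repeat([0, 1], 25)

def fit(X_tr, y_tr):
    return np.array([X_tr[y_tr == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X_te):
    dists = ((X_te[:, None, :] - centroids[None]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)

scores = cross_val_accuracy(X, y, fit, predict, k=5)
print("per-fold accuracy:", scores, "mean:", scores.mean())
```

The per-fold scores, rather than a single number, are what the paired tests in this unit (K-fold CV paired t-test) operate on when comparing two algorithms.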