This course introduces the basic concepts of the pattern recognition field, including the Bayesian, minimum distance (Euclidean and Mahalanobis), and nearest neighbour classifiers. The course deals with the design of linear classifiers. The probability estimation property of the mean square solution, as well as the bias-variance dilemma, are only briefly mentioned. The basic philosophy underlying support vector machines is also explained; emphasis is placed on the linear separability issue, the perceptron algorithm, and the mean square and least squares solutions. The course also covers the design of nonlinear classifiers. A description of their rationale is given, and the students experiment with them using MATLAB. Issues related to pruning are introduced, with an emphasis on generalization. Emphasis is also given to Cover’s theorem and radial basis function (RBF) networks. Nonlinear support vector machines and decision trees are touched on only briefly, via a discussion of the basic philosophy behind their rationale. The ideas of template matching, dynamic programming (DP), and the Viterbi algorithm are presented.
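As a small illustration of one of the topics listed above, the following is a minimal sketch of the classic perceptron algorithm for linearly separable data (shown in Python rather than the MATLAB used in the course; the data here is hypothetical toy data chosen for the example):

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, epochs=100):
    """Classic perceptron: labels y in {-1, +1}; returns weight vector w
    with the bias folded in as the last component."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # augment with bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            if yi * (xi @ w) <= 0:   # misclassified (or on the boundary)
                w += lr * yi * xi    # nudge the hyperplane toward xi
                errors += 1
        if errors == 0:              # converged: data is linearly separable
            break
    return w

# Tiny linearly separable toy example (hypothetical data)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
preds = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
```

For linearly separable data the update loop is guaranteed to terminate (the perceptron convergence theorem); on non-separable data it cycles, which motivates the mean square and least squares alternatives mentioned above.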
Publication date: 19 Jumada al-Awwal 1447 AH
Last modified: 19 Jumada al-Awwal 1447 AH