Decision Theory | Department of Computer Engineering & Informatics

# Decision Theory

Course Code: CEID_ΝΕ5237
Period: Winter Semester
Credit Points: 5

## Course Outline

- Basic concepts and an example of machine perception; pattern recognition systems (sensors, segmentation and clustering, feature extraction, classification, post-processing); the system design cycle (data collection, feature selection, model selection, training, validation, computational complexity); learning and adaptation (supervised, unsupervised, and reinforcement learning).
- Bayes decision theory for continuous features (two-class classification); minimum error rate classification (minimax and Neyman-Pearson criteria); classifiers, discriminant functions, and decision surfaces (the two-category and multi-category cases).
- The normal probability density function (unimodal and multimodal densities, discriminant functions for the normal density); error probabilities and integrals; error bounds for normal densities (Chernoff bound, Bhattacharyya bound, signal detection theory and the receiver operating characteristic, ROC).
- Bayes decision theory for discrete features (independent discrete features).
- Maximum likelihood estimation (general principles, the Gaussian case); Bayesian estimation (conditional probabilities, parameter distributions); Bayesian parameter estimation (the Gaussian case, general theory); the curse of dimensionality; hidden Markov models.
- Non-parametric techniques: estimation of the probability density function; Parzen windows (convergence of the mean, convergence of the variance, applications); the k_n-nearest-neighbor estimation method; the nearest-neighbor rule (convergence, error rate, error bounds, computational complexity); metrics and nearest-neighbor classification (properties of metrics, tangent distance).
- Linear discriminant functions and decision surfaces (the two-class and multi-class cases); generalized discriminant functions; the linearly separable two-class case (steepest descent procedures); the perceptron algorithm (minimization of the criterion function, proof of convergence); relaxation procedures (the descent algorithm, proof of convergence); non-separable behavior; minimum squared-error procedures (the LMS algorithm); Ho-Kashyap procedures.
- Game theory: historical overview; basic concepts; classification of games; description and analysis of games; zero-sum games; the pure-strategy case; mixed strategies.
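To make the minimum-error-rate Bayes rule from the outline concrete, here is a minimal sketch for two one-dimensional Gaussian classes. The class names, parameters, and function names are purely illustrative, not part of the course material: with the evidence term canceled, the rule simply picks the class maximizing prior times likelihood.

```python
import math

def gaussian_pdf(x, mean, std):
    """Normal probability density function N(mean, std^2)."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def bayes_classify(x, params):
    """Minimum-error-rate rule: choose the class with the largest
    posterior; since the evidence p(x) is common to all classes,
    comparing prior * likelihood is sufficient."""
    scores = {c: prior * gaussian_pdf(x, mean, std)
              for c, (prior, mean, std) in params.items()}
    return max(scores, key=scores.get)

# Hypothetical setup: equal priors, equal variances, means 0 and 3,
# so the decision boundary falls at x = 1.5.
params = {"w1": (0.5, 0.0, 1.0), "w2": (0.5, 3.0, 1.0)}
print(bayes_classify(0.2, params))  # left of the boundary: class w1
print(bayes_classify(2.9, params))  # right of the boundary: class w2
```

With equal priors and variances the rule reduces to a nearest-mean classifier; changing the priors shifts the boundary toward the less probable class.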
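The perceptron algorithm listed in the outline can be sketched in a few lines. This is an illustrative fixed-increment version on toy data (the data, learning rate, and function name are assumptions for the example): the weight vector is corrected on every misclassified sample, and the loop terminates once the linearly separable classes are all on the correct side.

```python
def perceptron_train(samples, labels, lr=1.0, max_epochs=100):
    """Fixed-increment perceptron for two classes labeled +1 / -1:
    cycle through the samples and correct the weight vector on each
    misclassification; converges when the classes are linearly separable."""
    # Augmented feature vectors: append a constant 1 for the bias term.
    aug = [x + [1.0] for x in samples]
    w = [0.0] * len(aug[0])
    for _ in range(max_epochs):
        errors = 0
        for x, y in zip(aug, labels):
            # y * (w . x) <= 0 means the sample is misclassified.
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                errors += 1
        if errors == 0:  # a full pass with no corrections: done
            break
    return w

# Toy linearly separable data: class +1 above the line x2 = x1, class -1 below.
X = [[2.0, 3.0], [1.0, 2.0], [3.0, 1.0], [2.0, 0.0]]
y = [1, 1, -1, -1]
w = perceptron_train(X, y)
```

The convergence proof covered in the course bounds the number of corrections for separable data; on non-separable data the loop above would simply stop at `max_epochs`.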
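For the game theory topics, the mixed-strategy solution of a 2x2 zero-sum game has a standard closed form that makes the idea concrete. The sketch below (function name and example game chosen for illustration) uses the indifference condition for a payoff matrix [[a, b], [c, d]] with no saddle point.

```python
def solve_2x2_zero_sum(a, b, c, d):
    """Optimal mixed strategy for the row player of a 2x2 zero-sum
    game with payoff matrix [[a, b], [c, d]] and no saddle point:
    the row player mixes so the column player is indifferent."""
    denom = a - b - c + d
    p = (d - c) / denom              # probability of playing row 1
    value = (a * d - b * c) / denom  # value of the game
    return p, value

# Matching pennies: [[1, -1], [-1, 1]] has no pure-strategy solution;
# each player should mix 50/50, and the value of the game is 0.
p, v = solve_2x2_zero_sum(1, -1, -1, 1)
```

Games with a saddle point are solved directly by the pure-strategy (maximin) argument covered earlier in the outline; the formula above applies only when no such saddle point exists.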
