
Dr. Farid Melgani
Associate Professor of Telecommunications
Dept. of Information Engineering and Computer Science, University of Trento,
Via Sommarive, 14, I-38123, Trento, Italy
Phone: +39-0461-281573
Fax: +39-0461-282093
E-mail: melgani@disi.unitn.it
Farid Melgani received the State Engineer degree in electronics from the University of Batna, Algeria, in 1994, the M.Sc. degree in electrical engineering from the University of Baghdad, Iraq, in 1999, and the Ph.D. degree in electronic and computer engineering from the University of Genoa, Italy, in 2003.
From 1999 to 2002, he cooperated with the Signal Processing and Telecommunications Group, Department of Biophysical and Electronic Engineering, University of Genoa. Since 2002, he has been first an Assistant Professor and then an Associate Professor of telecommunications at the University of Trento, Italy, where he has taught pattern recognition, machine learning, radar remote-sensing systems, and digital transmission. He is currently the Head of the Intelligent Information Processing (I2P) Laboratory, Department of Information Engineering and Computer Science, University of Trento. His research interests lie in signal/image processing, pattern recognition, and machine-learning techniques applied to remote-sensing and biomedical signals/images (classification, regression, multitemporal analysis, and data fusion). He has coauthored more than 130 scientific publications and is a referee for several international journals.
Dr. Melgani has served on the scientific committees of several international conferences. He is an Associate Editor of the IEEE Geoscience and Remote Sensing Letters and a Senior Member of the IEEE.
Remote Sensing

Biosignals

Spectrophotometry

Teaching
Recognition Systems
A course for the Master's Degree in Telecommunications Engineering
1 semester (6 Credits)
Topics
1. Introduction
Basic concepts. Recognition systems. Design example. Multisensor recognition systems. Density Estimation. Regression and Interpolation. Application examples
2. Statistical Estimation Theory
Introduction. PDF Estimation. Maximum Likelihood Estimation. Bayesian Estimation. K-NN Estimation. Parzen Windows Estimation. Orthogonal Function Approximation. Expectation-Maximization Algorithm.
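As a toy illustration of the Parzen-window estimator listed in this chapter (an illustrative sketch, not course material; the sample data and window width are invented for the example):

```python
import math
import random

def parzen_density(x, samples, h):
    """Parzen-window estimate of the pdf at x, using a Gaussian kernel of width h."""
    n = len(samples)
    return sum(
        math.exp(-0.5 * ((x - xi) / h) ** 2) / (h * math.sqrt(2.0 * math.pi))
        for xi in samples
    ) / n

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(2000)]
# With enough samples, the estimate at 0 approaches the true N(0,1) peak, 1/sqrt(2*pi) ~ 0.399
estimate = parzen_density(0.0, samples, h=0.3)
print(round(estimate, 2))
```

Shrinking the window width h reduces bias but raises variance; in practice h is usually chosen by cross-validation.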
3. Feature Reduction
Introduction. Hughes Effect. Statistical Separability Measures. Sequential Forward/Backward Strategy. Sequential Forward/Backward Floating Strategy. Branch & Bound Strategy. Principal Component Analysis. Linear Discriminant Analysis.
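Principal Component Analysis, listed above, can be sketched in the simplest case: for 2-D data the sample covariance matrix is 2x2, so its leading eigenvalue and eigenvector have a closed form (an illustrative sketch with invented data, not course material):

```python
import math

def pca_2d(points):
    """First principal component of 2-D data, via the closed-form
    eigendecomposition of the 2x2 sample covariance matrix."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    a = sum((x - mx) ** 2 for x, _ in points) / n        # var(x)
    c = sum((y - my) ** 2 for _, y in points) / n        # var(y)
    b = sum((x - mx) * (y - my) for x, y in points) / n  # cov(x, y)
    lam = (a + c) / 2.0 + math.sqrt(((a - c) / 2.0) ** 2 + b ** 2)  # largest eigenvalue
    if abs(b) < 1e-12:  # covariance already diagonal: the axis is x or y
        return lam, ((1.0, 0.0) if a >= c else (0.0, 1.0))
    v = (b, lam - a)    # eigenvector for lam (b != 0 here)
    norm = math.hypot(v[0], v[1])
    return lam, (v[0] / norm, v[1] / norm)

# Points scattered around the line y = x: the principal axis is ~(0.707, 0.707)
pts = [(t, t + 0.01 * (-1) ** i) for i, t in enumerate(range(10))]
lam, axis = pca_2d(pts)
print(axis)
```

Projecting the data onto the leading eigenvectors is exactly the dimensionality reduction used to mitigate the Hughes effect.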
4. Supervised Classification
Introduction. Bayesian Classification. Minimum Risk Theory. Discriminant Functions. Supervised Signal Detection. Classifier Combination. Markov Random Fields.
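The Bayesian minimum-error rule covered in this chapter (assign x to the class maximizing prior times likelihood) can be sketched for two 1-D Gaussian classes; the class labels and parameters below are invented for illustration:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Likelihood of x under a 1-D Gaussian N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def bayes_classify(x, classes):
    """classes: list of (label, prior, mu, sigma); returns the label
    with the largest posterior prior * likelihood (Bayes minimum-error rule)."""
    return max(classes, key=lambda c: c[1] * gaussian_pdf(x, c[2], c[3]))[0]

# Two equiprobable classes centered at 0 and 3, unit variance
classes = [("w1", 0.5, 0.0, 1.0), ("w2", 0.5, 3.0, 1.0)]
print(bayes_classify(0.5, classes))  # → w1 (closer to the mean of w1)
print(bayes_classify(2.0, classes))  # → w2 (closer to the mean of w2)
```

With unequal misclassification costs, the same maximization over posteriors is replaced by a minimization of conditional risk, as developed in the Minimum Risk Theory part of the chapter.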
5. Artificial Neural Networks
Introduction. Artificial Neuron. Multilayer Perceptron. Radial Basis Function. Probabilistic Neural Network. Regression with Neural Networks. General Regression Neural Network. Self-Organizing Map. Adaptive Resonance Theory.
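As a toy illustration of the artificial neuron introduced in this chapter, a single perceptron trained with the classical error-correction rule can learn any linearly separable function, e.g. logical AND (an illustrative sketch, not course material):

```python
def train_perceptron(data, epochs=20, lr=0.1):
    """Train a single artificial neuron with the classic perceptron update:
    w <- w + lr * (target - output) * x, applied sample by sample."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Logical AND is linearly separable, so a single neuron suffices
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data]
print(preds)  # → [0, 0, 0, 1]
```

Non-separable problems such as XOR are precisely what motivates the multilayer perceptron treated next in the chapter.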
6. Support Vector Machines
Introduction. Kernel Representation. Generalization Theory. VC Dimension. Structural Risk Minimization. Margin-Based Bounds. Generalization for Regression. Maximal Margin Classification. Soft Margin Classification. Multiclass Support Vector Machines. Support Vector Regression.
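The chapter treats kernel SVMs in full generality; as a minimal sketch, the linear soft-margin primal can be minimized by stochastic sub-gradient descent (a Pegasos-style update, written here without a bias term for brevity; the data and hyperparameters are invented for illustration):

```python
import random

def svm_sgd(data, lam=0.01, epochs=200):
    """Linear soft-margin SVM trained by stochastic sub-gradient descent on
    the primal objective lam/2 * ||w||^2 + mean(max(0, 1 - y * w.x))."""
    w = [0.0, 0.0]
    t = 0
    random.seed(1)
    for _ in range(epochs):
        for x, y in random.sample(data, len(data)):  # shuffled pass over the data
            t += 1
            eta = 1.0 / (lam * t)                    # decreasing step size
            margin = y * (w[0] * x[0] + w[1] * x[1])
            w = [wi - eta * lam * wi for wi in w]    # regularization sub-gradient
            if margin < 1:                           # hinge-loss sub-gradient is active
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

# Two linearly separable clusters with labels +1 / -1
data = [((2.0, 2.0), 1), ((2.5, 1.5), 1), ((3.0, 2.5), 1),
        ((-2.0, -2.0), -1), ((-1.5, -2.5), -1), ((-2.5, -1.0), -1)]
w = svm_sgd(data)
separated = all((w[0] * x[0] + w[1] * x[1]) * y > 0 for x, y in data)
print(separated)
```

Replacing the inner products w.x with a kernel function yields the nonlinear machines covered under Kernel Representation, and the same hinge-loss idea extends to the epsilon-insensitive loss of Support Vector Regression.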
Examination Type: Oral
Instead of taking the complete oral examination, the student may choose a shorter oral examination combined with either a project related to a thesis or a detailed theoretical study of a topic of his/her choice covered during the course.
Reference Books
R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. 2nd Edition, New York: John Wiley & Sons Inc, 2001.
K. Fukunaga. Statistical Pattern Recognition. 2nd Edition, New York: Academic Press, 1990.
S. Theodoridis and K. Koutroumbas. Pattern Recognition. 1st Edition, Academic Press, 1999.
H. L. Van Trees. Detection, Estimation, and Modulation Theory, Parts I, II. John Wiley & Sons, 2002.
R. Rojas. Neural Networks: A Systematic Introduction. Berlin: Springer-Verlag, 1996.
N. Cristianini and J. Shawe-Taylor. An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge: Cambridge University Press, 2000.
V. N. Vapnik. Statistical Learning Theory. Wiley-Interscience, 1998.
