Search result: Catalogue data in Autumn Semester 2019

Mathematics Master Information
Application Area
Necessary and eligible only for the Master's degree in Applied Mathematics.
One of the specified application areas must be selected for the category Application Area of the Master's degree in Applied Mathematics. At least 8 credits are required in the chosen application area.
Information and Communication Technology
Number | Title | Type | ECTS | Hours | Lecturers
227-0427-00L | Signal Analysis, Models, and Machine Learning | W | 6 credits | 4G | H.-A. Loeliger
Abstract: Mathematical methods in signal processing and machine learning.
I. Linear signal representation and approximation: Hilbert spaces, LMMSE estimation, regularization and sparsity.
II. Learning linear and nonlinear functions and filters: neural networks, kernel methods.
III. Structured statistical models: hidden Markov models, factor graphs, Kalman filter, Gaussian models with sparse events.
Objective: The course is an introduction to some basic topics in signal processing and machine learning.
Content:
Part I - Linear Signal Representation and Approximation: Hilbert spaces, least squares and LMMSE estimation, projection and estimation by linear filtering, learning linear functions and filters, L2 regularization, L1 regularization and sparsity, singular-value decomposition and pseudo-inverse, principal-components analysis.
Part II - Learning Nonlinear Functions: fundamentals of learning, neural networks, kernel methods.
Part III - Structured Statistical Models and Message Passing Algorithms: hidden Markov models, factor graphs, Gaussian message passing, Kalman filter and recursive least squares, Monte Carlo methods, parameter estimation, expectation maximization, linear Gaussian models with sparse events.
Lecture notes: available.
Prerequisites / Notice: Prerequisites:
- local Bachelor students: the course "Discrete-Time and Statistical Signal Processing" (5th semester)
- others: solid basics in linear algebra and probability theory
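As a small illustration of the Part I topics above (least squares, L2 regularization, SVD and pseudo-inverse), the following Python sketch, which is not part of the course materials, solves a ridge-regularized least-squares problem via the singular-value decomposition; the matrix A, the data y, and the weight lam are invented for the example.

import numpy as np

def ridge_via_svd(A, y, lam):
    # Solve min_x ||A x - y||^2 + lam * ||x||^2 using the SVD of A.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    d = s / (s**2 + lam)           # shrinkage of each singular direction
    return Vt.T @ (d * (U.T @ y))  # lam = 0 recovers the pseudo-inverse solution

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
y = A @ x_true + 0.1 * rng.standard_normal(20)
print(ridge_via_svd(A, y, lam=0.1))  # approximately recovers x_true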
227-0101-00L | Discrete-Time and Statistical Signal Processing | W | 6 credits | 4G | H.-A. Loeliger
Abstract: The course introduces some fundamental topics of digital signal processing with an emphasis on applications in communications: discrete-time linear filters, inverse filters and equalization, the DFT, discrete-time stochastic processes, elements of detection theory and estimation theory, LMMSE estimation and LMMSE filtering, the LMS algorithm, and the Viterbi algorithm.
Objective: The course introduces some fundamental topics of digital signal processing with an emphasis on applications in communications. The two main themes are linearity and probability. In the first part of the course, we deepen our understanding of discrete-time linear filters. In the second part, we review the basics of probability theory and discrete-time stochastic processes. We then discuss some basic concepts of detection theory and estimation theory, as well as some practical methods, including LMMSE estimation and LMMSE filtering, the LMS algorithm, and the Viterbi algorithm. A recurrent theme throughout the course is the stable and robust "inversion" of a linear filter.
Content:
1. Discrete-time linear systems and filters:
state-space realizations, z-transform and spectrum,
decimation and interpolation, digital filter design,
stable realizations and robust inversion.

2. The discrete Fourier transform and its use for digital filtering.

3. The statistical perspective:
probability, random variables, discrete-time stochastic processes;
detection and estimation: MAP, ML, Bayesian MMSE, LMMSE;
Wiener filter, LMS adaptive filter, Viterbi algorithm.
Lecture notes: available.
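As a small illustration of the adaptive-filtering topic in item 3 above, the following Python sketch, which is not part of the course materials, runs the LMS algorithm to identify an unknown FIR filter from input/output data; the filter h, the step size mu, and the signal length are invented for the example.

import numpy as np

def lms(x, d, num_taps, mu):
    # LMS adaptation of an FIR filter w, driving the error d[n] - w^T u[n] toward zero.
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]  # [x[n], x[n-1], ..., x[n-M+1]]
        e = d[n] - w @ u                     # a-priori error
        w += mu * e * u                      # stochastic-gradient step
    return w

rng = np.random.default_rng(1)
h = np.array([0.5, -0.3, 0.2])         # unknown filter to be identified
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)]         # observed output of the unknown filter
print(lms(x, d, num_taps=3, mu=0.01))  # converges toward h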
227-0417-00L | Information Theory I | W | 6 credits | 4G | A. Lapidoth
Abstract: This course covers the basic concepts of information theory and of communication theory. Topics covered include the entropy rate of a source, mutual information, typical sequences, the asymptotic equipartition property, Huffman coding, channel capacity, the channel coding theorem, the source-channel separation theorem, and feedback capacity.
Objective: The fundamentals of information theory, including Shannon's source coding and channel coding theorems.
Content: The entropy rate of a source, typical sequences, the asymptotic equipartition property, the source coding theorem, Huffman coding, arithmetic coding, channel capacity, the channel coding theorem, the source-channel separation theorem, feedback capacity.
Literature: T.M. Cover and J. Thomas, Elements of Information Theory (second edition).
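As a small numerical companion to the channel-capacity topics above, the following Python sketch, which is not part of the course materials, evaluates the binary entropy function and the capacity of a binary symmetric channel, C = 1 - Hb(p), in bits per channel use; the crossover probability p = 0.11 is an arbitrary example value.

import math

def binary_entropy(p):
    # Hb(p) = -p*log2(p) - (1-p)*log2(1-p), with Hb(0) = Hb(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of the binary symmetric channel with crossover probability p.
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.11))  # about 0.5 bits per channel use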