Postponed to 2021

4th Modelling Symposium:

Introducing Deep Neural Networks

Organized by Felix Ball, Nico Marek & Tömme Noesselt

Funded by ...


We are pleased to announce the 4th Modelling Symposium, which once more provides a mix of theoretical content and application-oriented analyses. The next symposium will cover Deep Neural Networks (DNNs), including basic introductions to DNNs, common building blocks, design patterns and architectures, best practices, optimization, applications, etc. To this end, we welcome a new tutor: Prof. Dr. Sebastian Stober.


Goal: Please note that DNNs are complex; this course will help you get started with DNN analyses. The workshop provides a general introduction to DNNs covering a wide range of topics. After the four days you should have an overview of different DNNs, their strengths and weaknesses, which model parameters matter, and which ones you might have to tweak. The course will also help you decide which information/parameters can be important at a given step, and it will help you better understand the DNN literature (e.g. whether authors omitted important information about the presented models).

When & Where


  • New date! : 26.07.2021 - 30.07.2021


  • 27.07.2021; Start 7 pm
  • We will dine at the Hoflieferant. Please register for the dinner so we can book a table. The dinner is optional and self-paid.

Wednesday off!

  • There has to be some time to digest!

Location (might be subject to change but will be in Magdeburg)

  • Universitätsplatz campus, Gebäude 28, room 27

Detailed Program (subject to changes)

1st half of week



Day 1 (Basics and CNNs)

09.00 - 10.30: General introduction (machine learning basics)

11.00 - 12.30: Convolutional Neural Networks I  (Basics)

14.00 - 15.30: Convolutional Neural Networks II (Hands-on)

16.00 - 17.30: Convolutional Neural Networks III (Advanced)



17.40 - 18.30: OPTIONAL - Discussing your data models




Day 2 (common building blocks, design patterns and architectures)


09.00 - 10.30: Recurrent Neural Networks I  (Basics)

11.00 - 12.30: Recurrent Neural Networks II (Hands-on)

14.00 - 15.30: Attention mechanisms

16.00 - 17.30: Transformers



17.40 - 18.30: OPTIONAL - Discussing your data models

2nd half of week



Day 3 (best practices [BP], optimization and introspection)

09.00 - 10.30: Best practices, optimization and regularization techniques I (Basics)

11.00 - 12.30: Best practices, optimization and regularization techniques II (Hands-on)

14.00 - 15.30: Introspection I (Basics)

16.00 - 17.30: Introspection II (Hands-on)



17.40 - 18.30: OPTIONAL - Discussing your data models


Day 4 (Applications, transfer learning and sneak peek)


09.00 - 10.30: Present your data

11.00 - 12.30: Possible applications (EEG and fMRI)

14.00 - 15.30: Model compression and transfer learning

16.00 - 17.30: Sneak peek and summary

Software, Code, Equipment, & Requirements

All information will be regularly updated, so please check back for updates!


Software: Hands-on sessions will be based on Python and TensorFlow.
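To give a flavour of what the CNN hands-on sessions build on, here is a minimal NumPy sketch of the 2-D convolution operation at the heart of convolutional networks (single channel, "valid" padding). This is purely illustrative and not the course material; the actual code will be provided during the symposium.

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 2-D 'valid' convolution (strictly: cross-correlation, as used in CNNs)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Element-wise product of the patch and the kernel, summed up
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A tiny image with a vertical edge, and a vertical-edge-detector kernel
image = np.array([[0., 0., 1., 1.],
                  [0., 0., 1., 1.],
                  [0., 0., 1., 1.],
                  [0., 0., 1., 1.]])
kernel = np.array([[1., -1.],
                   [1., -1.]])
print(conv2d(image, kernel))  # strong response only at the 0-to-1 boundary
```

In the sessions, such operations are of course not written by hand but provided by TensorFlow layers; the sketch is only meant to show the underlying idea.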


Code & Equipment: The code will be provided during the symposium. We will use a computation cluster so you do not have to worry about software and installation.  All you need is a laptop. We will also provide power sockets. In case you are registering for the "data talk sessions", please also bring a VGA and HDMI adapter for your presentation.


Requirements: The hands-on sessions require general coding skills; they are not suited for absolute beginners. You should have already written some code, e.g. a data analysis or an experiment. You should know Python and NumPy: loops, conditionals, the different variable types, n-dimensional arrays, functions, etc. Please note that we do not have time to cover basic programming.
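As a rough self-check (our illustration, not an official entry test): if a short snippet like the following reads naturally to you, you meet the baseline. The function and variable names are made up for this example.

```python
import numpy as np

def zscore(data):
    """Standardize each column of a 2-D array to mean 0 and std 1."""
    return (data - data.mean(axis=0)) / data.std(axis=0)

rng = np.random.default_rng(0)
trials = rng.normal(loc=5.0, scale=2.0, size=(100, 3))  # 100 trials, 3 channels

z = zscore(trials)
for channel in range(z.shape[1]):
    if abs(z[:, channel].mean()) < 1e-9:
        print(f"channel {channel}: standardized")
```

If loops, conditionals, functions, and NumPy array operations like these are unfamiliar, please work through a basic Python/NumPy tutorial before the symposium.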


Literature: This course will cover a variety of topics related to DNNs. To enhance your experience and avoid being overwhelmed (e.g. if you have never heard of DNNs before), you should consider reading about DNNs beforehand. Here are suggestions for getting started with DNNs and computational models (more may follow):



Papers and books


  • Storrs & Kriegeskorte, "Deep Learning for Cognitive Neuroscience"
  • Kriegeskorte & Douglas, "Cognitive Computational Neuroscience"
  • Cichy & Kaiser, "Deep Neural Networks as Scientific Models"
  • Goodfellow, Bengio & Courville, "Deep Learning" (MIT Press)





"TensorFlow and Deep Learning without a PhD" (video series and codelab)


Speaker: Prof. Dr. Sebastian Stober

Photo credit: Jana Dünnhaupt / Universität Magdeburg

Sebastian Stober is an interdisciplinary researcher with a PhD in computer science and a background in (applied) machine learning, (music) information retrieval and cognitive neuroscience. He is especially interested in so-called “human-in-the-loop” scenarios, in which humans and machines learn from each other and jointly contribute to the solution of a problem. Since October 2018, he has been Professor for Artificial Intelligence at the Otto-von-Guericke-University Magdeburg. Before that, he headed a junior research group on Machine Learning in Cognitive Science at the University of Potsdam, and from 2013 to 2015 he was a post-doctoral fellow in the labs of Adrian Owen and Jessica Grahn at the Brain and Mind Institute at Western University in London, Ontario.


Registration will open Dec 2020 - Jan 2021

(includes information about costs, motivation letter, etc.)




Last update




Links to Associated Institutions