Zuse Research Seminar

Institute
ZIB
Head
Tim Conrad and Christoph Spiegel
Description
The Zuse Research Seminar at the Zuse Institute Berlin serves as an interdisciplinary forum for researchers in the fields of Applied Mathematics and Computer Science. Talks are plenary-style and last 45 minutes plus time for questions. They should provide both an accessible overview of the research field and some broadly understandable insights into the speaker's own research and how it relates to the field. Active discussion and questions are encouraged. Talks are predominantly given by researchers from the institute, with the aim of showcasing their department's work in an accessible way. We also invite a select number of external speakers each semester to present their research. The seminar is open to the public, and we encourage everyone interested to attend.
Number of talks
9
Tue, 02.07.24 at 11:00
Machine Learned Force Fields, Coarse Graining, HPC, & Beyond
Abstract. Machine learned force fields (MLFFs), particularly those using deep neural networks to model interaction potentials, are quickly becoming a powerful tool for modelling complex molecular systems at scale with both classical and quantum accuracy. In this talk, we present the development of transferable coarse-grained (CG) MLFFs for proteins using HPC resources, showing how machine learning can be applied easily and effectively on compute clusters to solve relevant chemical and physical problems. We furthermore discuss growing trends in the usage of MLFFs with HPC resources, including emerging datasets, hardware demands, and the integration of machine learned potentials with existing simulation software.
Tue, 28.05.24 at 13:00
Backpropagation and Nonsmooth Optimization for Machine Learning
Abstract. Backpropagation is the major workhorse for many machine learning algorithms. In this presentation, we will examine the theory behind backpropagation as provided by the technique of algorithmic differentiation. Subsequently, we will discuss how this classic derivative information can be used for nonsmooth optimization. Examples from retail will illustrate the application of the proposed nonsmooth optimization algorithm.
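The reverse-mode chain-rule sweep that algorithmic differentiation provides can be illustrated with a minimal scalar autodiff sketch in Python. All class and method names here are illustrative, not taken from any particular library or from the talk itself:

```python
class Var:
    """Minimal reverse-mode autodiff node: a value plus recorded parents."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents  # list of (parent, local_derivative)

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def backward(self):
        # Topologically order the computation graph, then sweep in
        # reverse, accumulating gradients via the chain rule.
        order, seen = [], set()
        def visit(node):
            if id(node) in seen:
                return
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            for parent, local in node.parents:
                parent.grad += local * node.grad

x = Var(3.0)
y = Var(2.0)
z = x * y + x          # z = x*y + x
z.backward()
print(z.value, x.grad, y.grad)   # dz/dx = y + 1 = 3, dz/dy = x = 3
```

The recorded local derivatives are exactly the intermediate quantities that algorithmic differentiation exploits; nonsmooth operations (e.g. `max`) would record one element of the subdifferential instead.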
Tue, 07.05.24 at 13:00
Geometric Deep Learning
Abstract. The increasing success of deep learning techniques during the last decade expresses a paradigm shift in machine learning and data science. While learning generic functions in high dimensions is a cursed estimation problem, many challenging tasks, such as protein folding or image-based diagnosis, have now been shown to be achievable with appropriate computational resources. These breakthroughs can be attributed to the fact that most tasks of interest are not actually generic; they possess inherent regularities derived from the effective low-dimensionality and structure of the physical world. In this talk, we will see how geometric concepts allow us to expose these regularities and how we can use them to incorporate prior (physical) knowledge into neural architectures.
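As a small illustration of one such geometric prior, the following NumPy sketch (our own toy setup, not from the talk) checks that a sum-pooled message-passing layer is invariant under relabelling of graph nodes, i.e. it respects the permutation symmetry of graph data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes with adjacency A, node features X, random weights W.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))

def gnn_readout(A, X, W):
    """One message-passing layer (neighbour sum + ReLU), then sum pooling."""
    H = np.maximum(A @ X @ W, 0.0)   # aggregate neighbour features
    return H.sum(axis=0)             # order-independent readout

# Relabel the nodes with a random permutation P: the graph is unchanged.
perm = rng.permutation(4)
P = np.eye(4)[perm]
out1 = gnn_readout(A, X, W)
out2 = gnn_readout(P @ A @ P.T, P @ X, W)
print(np.allclose(out1, out2))  # True: the readout respects the symmetry
```

The identity behind the check is `(P A Pᵀ)(P X) W = P (A X W)`, and both the elementwise ReLU and the sum over nodes commute with row permutations.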
Mon, 15.04.24 at 11:30
An Introduction to Conditional Gradients
Abstract. Conditional Gradient methods are an important class of methods to minimize (non-)smooth convex functions over (combinatorial) polytopes. Recently these methods received a lot of attention as they allow for structured optimization and hence learning, incorporating the underlying polyhedral structure into solutions. In this talk I will give a broad overview of these methods, their applications, as well as present some recent results both in traditional optimization and learning as well as in deep learning.
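A minimal sketch of the classic conditional gradient (Frank-Wolfe) iteration over the probability simplex, assuming NumPy; the function names and the toy projection problem are illustrative, not from the talk:

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=1000):
    """Conditional gradient method over the probability simplex.

    Each iteration calls a linear minimization oracle (LMO) over the
    feasible set -- for the simplex this just picks the vertex e_i with
    the smallest gradient entry -- and moves by a convex combination,
    so all iterates stay feasible without any projection step.
    """
    x = x0.copy()
    for t in range(steps):
        g = grad(x)
        i = int(np.argmin(g))            # LMO: best simplex vertex
        v = np.zeros_like(x)
        v[i] = 1.0
        gamma = 2.0 / (t + 2.0)          # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * v
    return x

# Project b onto the simplex by minimizing f(x) = 0.5 * ||x - b||^2,
# whose gradient is x - b.
b = np.array([0.5, 0.3, -0.1, 0.2])
x = frank_wolfe_simplex(lambda x: x - b, np.full(4, 0.25))
print(np.round(x, 3))   # close to the true projection (0.5, 0.3, 0.0, 0.2)
```

Because each iterate is a convex combination of few vertices, the solutions inherit the polytope's structure (here: sparsity), which is the "structured optimization" aspect mentioned above.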
Thu, 15.02.24 at 11:30
Multiobjective Shortest Path Problems
Abstract. In this talk we discuss new algorithms for the Multiobjective Shortest Path (MOSP) problem. The baseline algorithm, the Multiobjective Dijkstra Algorithm (MDA), has already been introduced in seminars at ZIB. New aspects discussed in this talk are its output-sensitive running time bound and how this bound compares to the ones derived for previously existing MOSP algorithms, a version of the MDA for One-to-One MOSP instances, and the usage of the MDA as a subroutine. The discussed applications in which the MDA acts as a subroutine are the Multiobjective Minimum Spanning Tree problem and the K-Shortest Simple Paths problem.
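The label-setting idea behind such algorithms can be sketched in Python as follows. This is a simplified illustration of the general MOSP principle, not the MDA itself; graph encoding and names are our own:

```python
import heapq

def mosp(graph, source):
    """Label-setting multiobjective shortest paths: labels are cost
    vectors, and a label survives at a node only if no stored label
    there dominates it (is componentwise <= and not equal)."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and a != b

    pareto = {v: [] for v in graph}    # efficient cost vectors per node
    heap = [((0, 0), source)]          # lexicographic extraction order
    while heap:
        cost, v = heapq.heappop(heap)
        if any(dominates(c, cost) or c == cost for c in pareto[v]):
            continue                   # dominated or duplicate: discard
        pareto[v].append(cost)
        for w, edge_cost in graph[v]:
            new = tuple(c + e for c, e in zip(cost, edge_cost))
            heapq.heappush(heap, (new, w))
    return pareto

# Tiny bi-objective instance: edge costs are (time, price) pairs.
graph = {
    's': [('a', (1, 4)), ('b', (3, 1))],
    'a': [('t', (1, 4))],
    'b': [('t', (1, 1))],
    't': [],
}
print(sorted(mosp(graph, 's')['t']))  # two incomparable s-t paths survive
```

With nonnegative costs, any label that dominates another is also lexicographically smaller, so labels extracted in lexicographic order can never be dominated later; this is what makes the label-setting scheme correct.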
Mon, 29.01.24 at 11:30
Thoughts on Machine Learning
Abstract. Techniques of machine learning (ML) and what is called “artificial intelligence” (AI) today find a rapidly increasing range of applications touching upon social, economic, and technological aspects of everyday life. They are also being used increasingly and with great enthusiasm to fill in gaps in our scientific knowledge by data-based modelling approaches. I have followed these developments over the past almost 20 years with interest and concern, and with mounting disappointment. This leaves me sufficiently worried to raise here a couple of pointed remarks.
Wed, 10.01.24 at 11:30
On the state of QUBO solving
Abstract. It is regularly claimed that quantum computers will bring breakthrough progress in solving challenging combinatorial optimization problems relevant in practice. In particular, Quadratic Unconstrained Binary Optimization (QUBO) problems are said to be the model of choice for use in (adiabatic) quantum systems during the NISQ era. Even the first commercial quantum-based systems are advertised to solve such problems, and QUBOs are certainly an interesting way of modeling combinatorial optimization problems. Theoretically, any Integer Program can be converted into a QUBO. In practice, however, there are some caveats. Furthermore, even for problems that can be nicely modeled as a QUBO, this might not be the most effective way to solve them. We review the state of QUBO solving on digital and quantum computers and give some insights regarding current benchmark instances and modeling.
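The conversion of a constrained binary program into a QUBO can be illustrated on a toy instance (an illustrative sketch; the penalty weight `P` is our assumption). Since x² = x for binary variables, the squared penalty expands into linear and quadratic terms only, so the result is a genuine QUBO:

```python
import itertools

# Toy integer program:  max x1 + 2*x2 + 3*x3  s.t.  x1 + x2 + x3 = 1,
# with x binary.  Penalty conversion to QUBO: fold the equality
# constraint into the objective as P * (x1 + x2 + x3 - 1)^2, with P
# chosen larger than any possible gain from violating the constraint.
P = 10.0

def qubo_value(x):
    x1, x2, x3 = x
    objective = -(x1 + 2 * x2 + 3 * x3)       # minimize the negation
    penalty = P * (x1 + x2 + x3 - 1) ** 2     # zero iff constraint holds
    return objective + penalty

# Brute force over all 2^3 binary vectors (a real solver or annealer
# would take over here).
best = min(itertools.product((0, 1), repeat=3), key=qubo_value)
print(best, qubo_value(best))   # (0, 0, 1) -3.0
```

The caveats mentioned in the abstract show up even here: the penalty weight must be tuned, and inequality constraints additionally require slack variables, both of which inflate the QUBO.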
Mon, 11.12.23 at 11:30
The few and the many
Abstract. The talk will give a short introduction to the complex dynamics of interacting systems of individual units, which can be particles (molecules, …) or agents (individual humans, media agents, …). We are interested in systems with at least two types of such units: one of which only a “few” individual units are present, and another of which there are “many”. For such systems we will review mathematical models on different levels: from the micro-level, in which all particles/agents are described individually, to the macro-level, where the “many” are modelled in an aggregated way. The effective dynamics given by these models will be illustrated by examples from cellular systems (neurotransmission processes) and opinion dynamics in social networks. You will be able to follow the talk even if you do not have any detailed knowledge about particles/agents or cellular/social processes (at least I hope).
Wed, 22.11.23 at 11:30
Sparse Personalized PageRank: New results on the 25 billion dollar eigenvector problem
Abstract. This talk will go over the basics of the PageRank problem, initially studied by the founders of Google, who built their search engine by applying it to the web graph, with links defining edges. We will then explain some of our results on the problem for undirected graphs, whose main application is finding local clusters in networks, a task that appears in many branches of science. We can now find local clusters fast, in a time that depends not on the whole graph but only on the local cluster itself. This is joint work with Sebastian Pokutta.
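The eigenvector computation behind personalized PageRank can be sketched with plain power iteration on a small undirected graph (an illustrative NumPy sketch of the classical dense approach, not the sparse local method presented in the talk):

```python
import numpy as np

def personalized_pagerank(A, seed, alpha=0.85, iters=100):
    """Power iteration for personalized PageRank: the random walk
    restarts at the seed distribution instead of the uniform one, so
    probability mass concentrates around the seed's local cluster."""
    P = A / A.sum(axis=0, keepdims=True)   # column-stochastic transitions
    pi = seed.copy()
    for _ in range(iters):
        pi = alpha * (P @ pi) + (1 - alpha) * seed
    return pi

# Undirected "barbell": two triangles joined by one edge; seed node 0.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
seed = np.zeros(6)
seed[0] = 1.0
pi = personalized_pagerank(A, seed)
print(np.round(pi, 3))  # mass concentrates on the seed's triangle {0, 1, 2}
```

The result already hints at local clustering: the seed's own triangle carries most of the mass. The point of the sparse methods discussed in the talk is to obtain such vectors without ever touching the full matrix.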