Featured Event

  • Bilevel Learning for Inverse Problems

    Wed, Jul 7, 2021, 12:00 pm
    Variational regularization techniques are dominant in the field of inverse problems. A drawback of these techniques is that they depend on a number of parameters which have to be set by the user. This issue can be addressed with machine learning, where these parameters are estimated from data. This is known as "Bilevel Learning" and has been successfully applied to many tasks, some as low-dimensional as learning a regularization parameter, others as high-dimensional as learning a sampling pattern in MRI. While mathematically appealing, this strategy leads to a nested optimization problem which is computationally difficult to handle (see the sketch below). In this talk we discuss several applications of bilevel learning for imaging as well as new computational approaches. There are quite a few open problems in this relatively recent field of study, some of which I will highlight along the way.
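
    A minimal sketch of the nested structure, in generic notation (A the forward operator, y_i the measurements, R_θ the parametrized regularizer, ℓ a task loss; these symbols are my assumptions, not fixed by the talk):

    ```latex
    % Upper level: fit the parameters \theta to training pairs (y_i, x_i^{true}).
    % Lower level: the variational reconstruction itself.
    \min_{\theta} \; \sum_i \ell\bigl(x_i^*(\theta), x_i^{\mathrm{true}}\bigr)
    \quad \text{s.t.} \quad
    x_i^*(\theta) \in \arg\min_x \; \tfrac{1}{2}\|Ax - y_i\|^2 + R_\theta(x)
    ```

    The upper-level objective depends on θ only through the lower-level minimizer x_i^*(θ), which is the source of the computational difficulty mentioned above.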
  • Variational models and gradient flows for graph clustering

    Wed, Jun 23, 2021, 12:00 pm
    Discrete graph-based variants of the Allen--Cahn and total variation variational models have proven to be successful tools for clustering and classification on graphs. In this talk we will study these models and the gradient flows derived from them. We will see deep connections between the various discrete gradient flows, as well as between the discrete gradient flows and their continuum counterparts; a representative energy and its flow are sketched below.
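
    For concreteness, a standard form of the discrete Ginzburg--Landau energy and its Allen--Cahn gradient flow, assuming symmetric edge weights ω_ij and a double-well potential W (notation mine, not necessarily the talk's):

    ```latex
    % Graph Ginzburg--Landau energy with interface parameter \varepsilon > 0
    E_\varepsilon(u) = \frac{1}{4}\sum_{i,j}\omega_{ij}(u_i - u_j)^2
      + \frac{1}{\varepsilon}\sum_i W(u_i), \qquad W(s) = (s^2 - 1)^2
    % Its gradient flow is the graph Allen--Cahn equation
    \frac{du_i}{dt} = -(\Delta u)_i - \frac{1}{\varepsilon}W'(u_i),
    \qquad (\Delta u)_i = \sum_j \omega_{ij}(u_i - u_j)
    ```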
  • CSML Poster Session Event

    Mon, May 2, 2022, 8:00 am

    The annual CSML Poster Session will be held in person or virtually. Watch this space for further details.

    The published date of the event is subject to change.

    Due date for independent work posters and papers TBA. Please check your email for details.

    Check out this article on 2021's poster session.

  • Real-Time Remote Sensing and Fusion Plasma Control: A Reservoir Computing Approach

    Thu, Jun 17, 2021, 11:00 am
    Nuclear fusion power is a potential source of safe, non-carbon-emitting and virtually limitless energy. The tokamak is a promising approach to fusion based on magnetic plasma confinement, constituting a complex physical system with many control challenges. However, plasma instabilities pose an existential threat to a reactor, a problem that has not yet been solved. Since current physical understanding is not sufficiently advanced to reliably predict instabilities, a promising way forward is artificial intelligence and data-driven models; a minimal reservoir computing sketch follows.
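
    For readers unfamiliar with the approach, here is a minimal echo state network, the simplest instance of reservoir computing, fit to a toy signal; every size and constant is an illustrative assumption, not a detail of the talk:

    ```python
    # A minimal echo state network (reservoir computing) sketch on a toy
    # 1-D signal. All sizes, the spectral radius, and the ridge parameter
    # are illustrative assumptions, not details from the talk.
    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_res = 1, 200
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))     # fixed input weights
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))       # fixed recurrent weights
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

    def run_reservoir(u):
        """Drive the fixed random reservoir with an input sequence u of
        shape (T, n_in); only the linear readout below is ever trained."""
        x, states = np.zeros(n_res), []
        for u_t in u:
            x = np.tanh(W_in @ u_t + W @ x)
            states.append(x.copy())
        return np.array(states)

    # Train the readout by ridge regression to predict the next signal value.
    T = 1000
    u = np.sin(0.1 * np.arange(T + 1))[:, None]      # toy input signal
    X, y = run_reservoir(u[:-1]), u[1:, 0]
    W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
    print("one-step training MSE:", np.mean((X @ W_out - y) ** 2))
    ```

    The appeal for real-time control is that training touches only the linear readout, so it is cheap enough to run online.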
  • Convergence of Stochastic Gradient Descent for analytic target functions

    Wed, Jun 9, 2021, 12:00 pm
    In this talk we discuss almost sure convergence of Stochastic Gradient Descent in discrete and continuous time for a given twice continuously differentiable target function F. As a first step, we give assumptions on the step sizes and perturbation size to ensure convergence of the target value F and the gradient f = DF, assuming that f is locally Hölder continuous. This result entails convergence of the iterates themselves in the case where F does not possess a continuum of critical points.
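
    For reference, the discrete-time recursion and the classical Robbins--Monro step-size conditions, of the general type that assumptions like those above refine (γ_n the step sizes, ε_n the stochastic perturbation; notation mine):

    ```latex
    X_{n+1} = X_n - \gamma_n \bigl( f(X_n) + \varepsilon_n \bigr),
    \qquad \sum_n \gamma_n = \infty, \qquad \sum_n \gamma_n^2 < \infty
    ```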
  • Smooth bilevel programming for sparse regularisation

    Wed, May 19, 2021, 12:00 pm
    Nonsmooth regularisers are widely used in machine learning for enforcing solution structures (such as the l1 norm for sparsity or the nuclear norm for low rank). State-of-the-art solvers are typically first-order methods or coordinate descent methods which handle nonsmoothness by careful smooth approximations and support pruning. In this work, we revisit the approach of iteratively reweighted least squares (IRLS) and show how a simple reparameterisation coupled with a bilevel resolution leads to a smooth unconstrained problem (see the identity sketched below). We are therefore able to exploit the machinery of smooth optimisation, such as BFGS, to obtain local superlinear convergence. The result is a highly versatile approach which is able to significantly outperform state-of-the-art methods for a wide range of problems.
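
    One classical identity behind IRLS-type reparameterisations, shown here for the l1 norm (a generic illustration, not necessarily the exact reparameterisation used in the talk):

    ```latex
    % Quadratic variational form of the l1 norm: minimizing over the
    % weights \eta_i > 0 recovers |x_i| exactly (at \eta_i = |x_i|).
    \|x\|_1 = \min_{\eta > 0} \; \frac{1}{2} \sum_i \left( \frac{x_i^2}{\eta_i} + \eta_i \right)
    ```

    Alternating minimisation over x and η recovers the classical IRLS iteration; the bilevel resolution described above instead eliminates the inner variable to obtain a smooth unconstrained problem.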
  • Transport information Bregman divergences

    Wed, May 12, 2021, 12:00 pm
    In this talk we discuss the intersection of optimal transport and information geometry. We study Bregman divergences in the space of probability densities endowed with the Wasserstein-2 metric. Several properties and dualities of transport Bregman divergences are provided. In particular, we derive the transport Kullback-Leibler (KL) divergence as a Bregman divergence of negative Boltzmann-Shannon entropy in Wasserstein-2 space. We also derive analytical formulas of the transport KL divergence for one-dimensional probability densities and Gaussian families.
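
    For orientation, the standard (Euclidean) Bregman divergence of a convex functional F; the transport variants studied here replace the Euclidean gradient with the Wasserstein-2 gradient (notation mine):

    ```latex
    D_F(p \,\|\, q) = F(p) - F(q) - \langle \nabla F(q),\, p - q \rangle
    ```

    Taking F to be the negative Boltzmann-Shannon entropy in the transport setting yields the transport KL divergence mentioned above.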
  • The efficiency of kernel methods on structured datasets

    Wed, May 5, 2021, 4:30 pm
    Inspired by the proposal of tangent kernels of neural networks (NNs), a recent research line aims to design kernels with better generalization performance on standard datasets. Indeed, a few recent works showed that certain kernel machines perform as well as NNs on certain datasets, despite theoretical results implying separations between the two in specific cases. Furthermore, it was shown that the induced kernels of convolutional neural networks perform much better than any earlier handcrafted kernels. These empirical results pose a theoretical challenge to understanding the performance gaps between kernel machines and NNs in different scenarios; a generic kernel-machine baseline is sketched below.
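
    For context, a generic kernel machine of the kind used in such comparisons; the Gaussian (RBF) kernel here is merely a stand-in for the neural tangent or convolutional kernels discussed in the talk:

    ```python
    # Kernel ridge regression with an RBF kernel on toy data. In the
    # comparisons described above, only the kernel function would change.
    import numpy as np

    def rbf_kernel(X, Z, gamma=1.0):
        """Gaussian kernel matrix k(x, z) = exp(-gamma * ||x - z||^2)."""
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def fit_krr(X, y, lam=1e-3, gamma=1.0):
        """Return a predictor Z -> K(Z, X) @ alpha, the ridge solution."""
        K = rbf_kernel(X, X, gamma)
        alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
        return lambda Z: rbf_kernel(Z, X, gamma) @ alpha

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    predict = fit_krr(X, y)
    print("training MSE:", np.mean((predict(X) - y) ** 2))
    ```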
  • Barriers to Deploying Deep Learning Models During the COVID-19 Pandemic

    Wed, Apr 28, 2021, 12:00 pm
    A promising application of deep learning models is assisting clinicians with interpreting X-ray and CT scans, especially when treating respiratory diseases. At the onset of the COVID-19 pandemic, radiologists had to quickly learn how to identify a new disease on chest X-rays and CT scans, and to use this information to decide how to allocate scarce resources like ventilators. Researchers around the world developed deep learning models to help clinicians with these decisions, and some models were deployed after only three weeks of testing.
