Previous Seminars

Analysis of Stochastic Gradient Descent in Continuous Time

Wed, Nov 4, 2020, 12:00 pm

Stochastic gradient descent is an optimisation method that combines classical gradient descent with random subsampling within the target functional. In this work, we introduce the stochastic gradient process as a continuous-time representation of stochastic gradient descent.
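For readers unfamiliar with the setup, the subsampling idea can be sketched in a few lines. This is a generic minibatch SGD toy on a least-squares objective, purely illustrative and not the continuous-time construction from the talk; the data, step size, and batch size here are our own choices:

```python
import numpy as np

# Toy target functional: f(w) = (1/n) * sum_i (x_i . w - y_i)^2.
# SGD replaces the full sum with a random subsample at each step.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true                    # noiseless targets for illustration

w = np.zeros(d)
step, batch = 0.05, 16
for _ in range(500):
    idx = rng.choice(n, size=batch, replace=False)        # random subsample
    grad = 2.0 * X[idx].T @ (X[idx] @ w - y[idx]) / batch  # minibatch gradient
    w -= step * grad

print(np.linalg.norm(w - w_true))
```

Because the data are noiseless, every minibatch gradient vanishes at the true weights, so even a constant step size drives the iterates close to them.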


Consistency of Cheeger cuts: Total Variation, Isoperimetry, and Clustering

Wed, Oct 21, 2020, 12:00 pm

Clustering unlabeled point clouds is a fundamental problem in machine learning. One classical method for constructing clusters on graph-based data is to solve for Cheeger cuts, which balance between finding clusters that require cutting few graph edges and finding clusters that are similar in size.
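As background for the graph-cut setting, a standard spectral relaxation (not necessarily the analysis presented in the talk) thresholds the second eigenvector of the graph Laplacian, the Fiedler vector; the example graph below is our own:

```python
import numpy as np

def fiedler_cut(W):
    """Split a graph in two by thresholding the Fiedler vector of L = D - W."""
    L = np.diag(W.sum(axis=1)) - W          # unnormalized graph Laplacian
    _, vecs = np.linalg.eigh(L)             # eigenvalues in ascending order
    return vecs[:, 1] >= 0.0                # sign of the second eigenvector

# Two 4-node cliques joined by a single bridge edge.
W = np.zeros((8, 8))
W[:4, :4] = 1.0
W[4:, 4:] = 1.0
np.fill_diagonal(W, 0.0)
W[3, 4] = W[4, 3] = 1.0
labels = fiedler_cut(W)
print(labels)
```

On this toy graph the sign pattern of the Fiedler vector recovers the two cliques while cutting only the single bridge edge, a balance between cut size and cluster size of the kind Cheeger cuts formalize.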


CITP Seminar: Tal Zarsky – When Small Change Makes a Big Difference: Algorithmic Equity Among Similarly Situated Individuals

Tue, Oct 20, 2020, 12:30 pm



Online Optimization & Energy

Thu, Oct 15, 2020, 12:30 pm

Online optimization is a powerful framework in machine learning that has seen numerous applications to problems in energy and sustainability. In my group at Caltech, we began by applying online optimization to ‘right-size’ capacity in data centers nearly a decade ago; by now, tools from online optimization have been applied to develop...

Hydrological modeling in the era of big data and artificial intelligence

Wed, Oct 14, 2020, 8:00 pm

Nowadays, all sorts of sensors, from ground to space, collect a huge volume of data about the Earth. Recent advances in artificial intelligence (AI) provide unprecedented opportunities for data-driven hydrological modeling using such “Big Earth Data”. However, many critical issues remain to be addressed. For example, there is a lack of efficient...

Geometric Insights into Spectral Clustering by Graph Laplacian Embeddings

Wed, Sep 23, 2020, 12:00 pm

We present new theoretical results for procedures identifying coarse structures in a given data set by means of appropriate spectral embeddings. We combine ideas from spectral geometry, metastability, optimal transport, and spectral analysis of weighted graph Laplacians to describe the embedding geometry.
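As a small illustration of a graph Laplacian embedding (our own toy example, not the construction analyzed in the talk), one can embed each vertex by the leading eigenvectors of the symmetric normalized Laplacian and observe the coarse structure directly:

```python
import numpy as np

# Similarity graph on two well-separated 1-D point clouds.
rng = np.random.default_rng(3)
pts = np.concatenate([rng.normal(0.0, 0.1, 20), rng.normal(5.0, 0.1, 20)])
W = np.exp(-(pts[:, None] - pts[None, :]) ** 2)
np.fill_diagonal(W, 0.0)

# Symmetric normalized Laplacian: I - D^{-1/2} W D^{-1/2}.
d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
L_sym = np.eye(len(pts)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
vals, vecs = np.linalg.eigh(L_sym)          # ascending eigenvalues
emb = vecs[:, :2]                           # embed vertices in 2-D

spread = emb[:20].std(axis=0).max()         # within-cluster spread
gap = np.linalg.norm(emb[:20].mean(axis=0) - emb[20:].mean(axis=0))
print(spread, gap)
```

Vertices of a well-separated cluster land nearly on a single point of the embedding, which is the geometric picture spectral clustering exploits.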


Towards a Secure Collaborative Learning Platform

Tue, Sep 22, 2020, 12:30 pm

Multiple organizations often wish to aggregate their sensitive data and learn from it, but they cannot do so because they cannot share their data. For example, banks wish to run joint anti-money laundering algorithms over their aggregate transaction data because criminals hide their traces across different banks.

Bio: Raluca Ada Popa is an...

Analysis of Gradient Descent on Wide Two-Layer ReLU Neural Networks

Wed, Aug 26, 2020, 12:00 pm

In this talk, we propose an analysis of gradient descent on wide two-layer ReLU neural networks that leads to sharp characterizations of the learned predictor. The main idea is to study the dynamics as the width of the hidden layer goes to infinity; in this limit, the dynamics take the form of a Wasserstein gradient flow.
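To fix ideas about the finite-width object being scaled, here is a rough mean-field-style sketch (our own toy, with arbitrary width, target, and step size; it is not the talk's analysis):

```python
import numpy as np

# Two-layer ReLU network in mean-field scaling on 1-D inputs:
#   f(x) = (1/m) * sum_j a_j * relu(w_j * x),
# trained by full-batch gradient descent with gradients rescaled by m,
# so the dynamics have a nontrivial limit as the width m grows.
rng = np.random.default_rng(1)
m, n = 512, 32
X = np.linspace(-1.0, 1.0, n)
y = np.abs(X)                     # target: |x| = relu(x) + relu(-x)

w = rng.normal(size=m)            # input weights
a = rng.normal(size=m)            # output weights

def predict(X):
    H = np.maximum(np.outer(X, w), 0.0)     # (n, m) hidden activations
    return H @ a / m

lr = 0.5
for _ in range(2000):
    H = np.maximum(np.outer(X, w), 0.0)
    r = H @ a / m - y                        # residuals
    grad_a = H.T @ r / n                     # m-rescaled gradients
    grad_w = ((H > 0) * r[:, None]).T @ X * a / n
    a -= lr * grad_a
    w -= lr * grad_w

print(np.mean((predict(X) - y) ** 2))        # training loss after GD
```

In this scaling the width m only sets how finely the empirical distribution of neurons resolves its infinite-width limit, which is the regime the talk studies.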


Uniform Error Estimates for the Lanczos Method

Mon, Aug 24, 2020, 1:30 pm

The computation of extremal eigenvalues of large, sparse matrices has proven to be one of the most important problems in numerical linear algebra.
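For context, the basic Lanczos iteration (a generic textbook version without reorthogonalization, with our own test matrix and parameters) can be sketched as follows:

```python
import numpy as np

def lanczos(A, k, seed=0):
    """Reduce symmetric A to a k-by-k tridiagonal T via the Lanczos iteration."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    q = rng.normal(size=n)
    q /= np.linalg.norm(q)                  # random unit starting vector
    q_prev = np.zeros(n)
    beta = 0.0
    alphas, betas = [], []
    for _ in range(k):
        v = A @ q - beta * q_prev           # three-term recurrence
        alpha = q @ v
        v -= alpha * q
        beta = np.linalg.norm(v)
        alphas.append(alpha)
        betas.append(beta)
        if beta < 1e-12:                    # invariant subspace found
            break
        q_prev, q = q, v / beta
    return np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)

rng = np.random.default_rng(2)
B = rng.normal(size=(100, 100))
A = (B + B.T) / 2                           # symmetric test matrix
T = lanczos(A, 30)
ritz_max = np.linalg.eigvalsh(T).max()
true_max = np.linalg.eigvalsh(A).max()
print(abs(ritz_max - true_max))
```

Even at modest k, the extremal Ritz values of the small tridiagonal matrix closely track the extremal eigenvalues of A; this is the kind of behavior that error estimates for the method quantify.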


A Few Thoughts on Deep Network Approximation

Wed, Aug 12, 2020, 12:00 pm

Deep network approximation is a powerful tool of function approximation via composition. We will present a few new thoughts on deep network approximation from the point of view of scientific computing in practice: given an arbitrary width and depth of neural networks, what is the optimal approximation rate of various function...