Upcoming Events


No upcoming events found.

Events Archive

Geometric Insights into Spectral Clustering by Graph Laplacian Embeddings

We present new theoretical results for procedures identifying coarse structures in a given data set by means of appropriate spectral embeddings. We combine ideas from spectral geometry, metastability, optimal transport, and spectral analysis of weighted graph Laplacians to describe the embedding geometry.

Location: https://www.oneworldml.org/upcoming-events
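
For a concrete picture of the kind of pipeline analyzed in the talk, here is a minimal sketch of spectral clustering through a graph Laplacian embedding. The Gaussian kernel, the symmetric normalized Laplacian, the bandwidth, and the two-blob toy data are illustrative choices, not details taken from the talk.

```python
import numpy as np

def spectral_embedding(X, k, sigma=1.0):
    """Bottom-k eigenvectors of a symmetric normalized graph Laplacian built from X (n x d).

    The Gaussian kernel bandwidth sigma and the number of eigenvectors k are
    illustrative choices.
    """
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))        # weighted graph on the data
    np.fill_diagonal(W, 0.0)
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(X)) - D_inv_sqrt @ W @ D_inv_sqrt  # L = I - D^{-1/2} W D^{-1/2}
    eigvals, eigvecs = np.linalg.eigh(L)              # eigenvalues in ascending order
    return eigvecs[:, :k]                             # low-lying spectrum encodes coarse structure

# Toy usage: two well-separated blobs split by the sign of the second eigenvector.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(3.0, 0.3, (50, 2))])
emb = spectral_embedding(X, k=2)
labels = (emb[:, 1] > 0).astype(int)                  # crude 2-way spectral cut
```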

Towards a Secure Collaborative Learning Platform

Multiple organizations often wish to aggregate their sensitive data and learn from it, but they cannot do so because they cannot share their data. For example, banks wish to run joint anti-money laundering algorithms over their aggregate transaction data because criminals hide their traces across different banks. Bio: Raluca Ada Popa is an...
Location: https://princeton.zoom.us/j/97190696906

DataX Workshop Series: Synthetic Control Methods | Day 2 (POSTPONED UNTIL 2021)

This event has been postponed until 2021. More information will be provided when available.

DataX Workshop Series: Synthetic Control Methods | Day 1 (POSTPONED UNTIL 2021)

This event has been postponed until 2021. More information will be provided when available.

Analysis of Gradient Descent on Wide Two-Layer ReLU Neural Networks

In this talk, we propose an analysis of gradient descent on wide two-layer ReLU neural networks that leads to sharp characterizations of the learned predictor. The main idea is to study the training dynamics as the width of the hidden layer goes to infinity, where they become a Wasserstein gradient flow.

Location: https://www.oneworldml.org/home
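
As a concrete instance of the setting studied, the sketch below trains a finitely wide two-layer ReLU network by gradient descent on a toy 1-D regression task, using the 1/m mean-field scaling of the output; the talk concerns the limit of such dynamics as the width m goes to infinity. The width, step size, data, and iteration count are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression task (placeholder data, not from the talk)
n = 64
X = np.linspace(-1.0, 1.0, n)[:, None]
y = np.sin(3.0 * X[:, 0])

# Two-layer ReLU network in the mean-field parametrization:
#   f(x) = (1/m) * sum_j a_j * relu(w_j . x + b_j)
m = 512                                   # hidden width (the "wide" regime)
W = rng.normal(size=(m, 1))
b = rng.normal(size=m)
a = rng.normal(size=m)

# With the 1/m scaling, per-neuron gradients are O(1/m), so the step size is
# scaled with the width, matching the time parametrization of the mean-field limit.
lr = 0.1 * m
for step in range(1000):
    pre = X @ W.T + b                     # (n, m) pre-activations
    h = np.maximum(pre, 0.0)              # ReLU
    pred = h @ a / m
    err = pred - y

    # Gradients of the squared loss 0.5 * mean(err**2)
    grad_a = h.T @ err / (m * n)
    dh = np.outer(err, a / m) * (pre > 0.0)
    grad_W = dh.T @ X / n
    grad_b = dh.sum(axis=0) / n

    a -= lr * grad_a
    W -= lr * grad_W
    b -= lr * grad_b
```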

Uniform Error Estimates for the Lanczos Method

Abstract: The computation of extremal eigenvalues of large, sparse matrices has proven to be one of the most important problems in numerical linear algebra.

Location: https://princeton.zoom.us/j/96788606925
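
As a reminder of the algorithm whose error estimates are at issue, here is a minimal sketch of k steps of the Lanczos iteration on a symmetric matrix, with full reorthogonalization for clarity; the eigenvalues (Ritz values) of the resulting tridiagonal matrix approximate the extremal eigenvalues. The matrix, starting vector, and iteration count are placeholders.

```python
import numpy as np

def lanczos(A, k, rng=None):
    """Run k steps of the Lanczos iteration on a symmetric matrix A.

    Returns the k x k tridiagonal matrix T whose eigenvalues approximate the
    extremal eigenvalues of A. Full reorthogonalization is used for clarity;
    production codes are more careful about its cost.
    """
    rng = rng or np.random.default_rng(0)
    n = A.shape[0]
    Q = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k - 1)

    q = rng.normal(size=n)
    Q[:, 0] = q / np.linalg.norm(q)
    for j in range(k):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)   # reorthogonalize against all previous vectors
        if j + 1 < k:
            beta[j] = np.linalg.norm(w)              # assumes no early breakdown, fine for a sketch
            Q[:, j + 1] = w / beta[j]

    return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

# Toy usage: extremal Ritz values of a random symmetric matrix.
rng = np.random.default_rng(1)
M = rng.normal(size=(200, 200))
A = (M + M.T) / 2.0
ritz = np.linalg.eigvalsh(lanczos(A, k=30, rng=rng))
# ritz[0] and ritz[-1] approximate the smallest and largest eigenvalues of A.
```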

A Few Thoughts on Deep Network Approximation

Deep network approximation is a powerful tool for function approximation via composition. We will present a few new thoughts on deep network approximation from the point of view of scientific computing in practice: given an arbitrary width and depth of neural networks, what is the optimal approximation rate of various function...

Location: https://www.oneworldml.org/upcoming-events
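
To make the objects in that question concrete, the sketch below assembles a fully connected ReLU network of prescribed width and depth and evaluates its sup-norm error against a placeholder target; the approximation rate asks how small the best achievable value of this error can be as a function of width and depth. The target function, initialization, and sizes are illustrative only.

```python
import numpy as np

def relu_net(x, params):
    """Evaluate a fully connected ReLU network on inputs x of shape (batch, d_in)."""
    h = x
    for W, b in params[:-1]:
        h = np.maximum(h @ W + b, 0.0)   # hidden layers: affine map + ReLU
    W, b = params[-1]
    return h @ W + b                     # linear output layer

def random_params(d_in, width, depth, rng):
    """Random parameters for a ReLU network with `depth` hidden layers of size `width`."""
    sizes = [d_in] + [width] * depth + [1]
    return [(rng.normal(size=(fan_in, fan_out)) / np.sqrt(fan_in), np.zeros(fan_out))
            for fan_in, fan_out in zip(sizes[:-1], sizes[1:])]

# The quantity behind the "approximation rate" question: how small can
#   max_x |f(x) - relu_net(x, params)|
# be made over all parameter choices, as a function of width and depth?
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)[:, None]
f = np.cos(2.0 * np.pi * x[:, 0])                     # placeholder target function
params = random_params(d_in=1, width=32, depth=4, rng=rng)
sup_error = np.max(np.abs(f - relu_net(x, params)[:, 0]))
```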

Data Wrangling: How to Keep Your Data Workflows Orderly and Efficient

This webinar will provide several practical considerations to help you better manage your research data between the points of collection and analysis. We will review the principles of open research and cover best practices for documentation and metadata generation amidst collation, aggregation, and cleaning tasks.

Tags: Seminars
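
As one concrete, hypothetical example of the kind of practice covered, the sketch below applies a few explicit cleaning steps to a raw CSV with pandas and writes a small metadata record alongside the derived file, so the provenance of the cleaned data stays documented. All file names, column names, and cleaning steps are placeholders.

```python
import json
from datetime import datetime, timezone

import pandas as pd

RAW_PATH = "data/raw/survey_responses.csv"        # hypothetical input file
CLEAN_PATH = "data/clean/survey_responses.csv"    # derived, cleaned file
META_PATH = "data/clean/survey_responses.meta.json"

# Load the raw data, then apply explicit, reproducible cleaning steps.
raw = pd.read_csv(RAW_PATH)
clean = (
    raw.drop_duplicates()
       .dropna(subset=["respondent_id"])          # hypothetical key column
       .rename(columns=str.lower)
)
clean.to_csv(CLEAN_PATH, index=False)

# Record what was done, to what, and when, next to the derived data.
metadata = {
    "source": RAW_PATH,
    "created": datetime.now(timezone.utc).isoformat(),
    "rows_in": len(raw),
    "rows_out": len(clean),
    "steps": ["drop_duplicates", "dropna(respondent_id)", "lowercase column names"],
}
with open(META_PATH, "w") as f:
    json.dump(metadata, f, indent=2)
```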

Tradeoffs between Robustness and Accuracy

Standard machine learning produces models that are highly accurate on average but that degrade dramatically when the test distribution deviates from the training distribution. While one can train robust models, this often comes at the expense of standard accuracy (on the training distribution).

Location: https://www.oneworldml.org/upcoming-events
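
As one generic illustration of a robust-training procedure and of the tradeoff it creates (not the method from the talk), the sketch below trains logistic regression on FGSM-style adversarially perturbed inputs; the perturbation budget, data, and step sizes are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (placeholder distribution)
n, d = 500, 20
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(d)
eps = 0.1          # adversarial perturbation budget (placeholder)
lr = 0.1
for step in range(500):
    # FGSM-style perturbation: move each input along the sign of its loss gradient.
    grad_x = np.outer(sigmoid(X @ w) - y, w)      # d(loss_i)/d(x_i) for logistic loss
    X_adv = X + eps * np.sign(grad_x)

    # Gradient step on the logistic loss over the perturbed inputs.
    p = sigmoid(X_adv @ w)
    grad_w = X_adv.T @ (p - y) / n
    w -= lr * grad_w

# Robust accuracy typically improves while accuracy on the clean inputs X may drop
# relative to standard training; that gap is the tradeoff under discussion.
clean_acc = np.mean((sigmoid(X @ w) > 0.5) == (y > 0.5))
```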

Thematic Day on the Mean Field Training of Deep Neural Networks

12pm: Roberto I. Oliveira – TBA

1pm: Konstantinos Spiliopoulos – Mean field limits of neural networks: typical behavior and fluctuations

2pm: Huy Tuan Pham – A general framework for the mean field limit of multilayer neural networks

Location: https://www.oneworldml.org/thematic-days/mean-field-training-of-multi-layer-networks
