Featured Event

  • Geometric Insights into Spectral Clustering by Graph Laplacian Embeddings

    Wed, Sep 23, 2020, 12:00 pm

    We present new theoretical results for procedures identifying coarse structures in a given data set by means of appropriate spectral embeddings. We combine ideas from spectral geometry, metastability, optimal transport, and spectral analysis of weighted graph Laplacians to describe the embedding geometry. Our analysis focuses on 1) studying the embedding step of data clustering and 2) comparing the spectra of graph and continuum Laplacians, linking the original spectral clustering problem with a continuum counterpart.
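
    A rough illustrative sketch (not part of the talk abstract) of the embedding step discussed above: build a weighted graph Laplacian from a point cloud, take its low-lying eigenvectors, and cluster the embedded points. Python with numpy, scipy, and scikit-learn assumed available; the function name and parameters below are hypothetical.

      import numpy as np
      from scipy.linalg import eigh
      from scipy.spatial.distance import cdist
      from sklearn.cluster import KMeans

      def spectral_embed(X, k, sigma=1.0):
          """Embed points X (n x d) into R^k via the k lowest Laplacian eigenvectors."""
          W = np.exp(-cdist(X, X) ** 2 / (2 * sigma ** 2))   # Gaussian edge weights
          np.fill_diagonal(W, 0.0)
          L = np.diag(W.sum(axis=1)) - W                     # unnormalized graph Laplacian
          _, vecs = eigh(L, subset_by_index=[0, k - 1])      # k smallest eigenpairs
          return vecs                                        # rows = embedded data points

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
      labels = KMeans(n_clusters=2, n_init=10).fit_predict(spectral_embed(X, 2))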

  • Analysis of Gradient Descent on Wide Two-Layer ReLU Neural Networks

    Wed, Aug 26, 2020, 12:00 pm

    In this talk, we propose an analysis of gradient descent on wide two-layer ReLU neural networks that leads to sharp characterizations of the learned predictor. The main idea is to study the dynamics when the width of the hidden layer goes to infinity, which is a Wasserstein gradient flow. While this dynamics evolves on a non-convex landscape, we show that its limit is a global minimizer if initialized properly. We also study the "implicit bias" of this algorithm when the objective is the unregularized logistic loss.
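
    A rough illustrative sketch (not part of the talk abstract) of the finite-width object being analyzed: full-batch gradient descent on a wide two-layer ReLU network with the unregularized logistic loss. Plain numpy; the data, width, and learning rate below are arbitrary placeholders.

      import numpy as np

      rng = np.random.default_rng(0)
      n, d, m = 200, 2, 2000                     # samples, input dim, hidden width
      X = rng.normal(size=(n, d))
      y = np.sign(X[:, 0] * X[:, 1])             # labels in {-1, +1}

      W = rng.normal(size=(m, d)) / np.sqrt(d)   # hidden-layer weights
      a = rng.choice([-1.0, 1.0], size=m) / m    # output weights (mean-field scaling)
      lr = 1.0

      for _ in range(500):
          H = np.maximum(X @ W.T, 0.0)           # ReLU activations, shape (n, m)
          f = H @ a                              # network outputs
          g = -y / (1.0 + np.exp(y * f)) / n     # dLoss/df for the logistic loss
          a -= lr * (H.T @ g)
          W -= lr * (((g[:, None] * (X @ W.T > 0)) * a).T @ X)

      print("training loss:", np.mean(np.log1p(np.exp(-y * f))))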

  • Uniform Error Estimates for the Lanczos Method

    Mon, Aug 24, 2020, 1:30 pm

    The computation of extremal eigenvalues of large, sparse matrices has proven to be one of the most important problems in numerical linear algebra. Krylov subspace methods are a powerful class of techniques for this problem, most notably the Arnoldi process for non-symmetric matrices and the Lanczos method for symmetric matrices. The theory of convergence for the Lanczos method is well understood, but much less is known about uniform error estimates for a specific number of iterations.
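
    A rough illustrative sketch (not part of the talk abstract) of the Lanczos iteration for a symmetric matrix, with extremal eigenvalue estimates read off from the resulting tridiagonal matrix. Plain numpy; the matrix size and iteration count are arbitrary placeholders.

      import numpy as np

      def lanczos(A, v0, k):
          """Run k Lanczos steps on symmetric A; return the tridiagonal matrix T."""
          alphas, betas = [], []
          q_prev = np.zeros(A.shape[0])
          q = v0 / np.linalg.norm(v0)
          beta = 0.0
          for _ in range(k):
              w = A @ q - beta * q_prev          # three-term recurrence
              alpha = q @ w
              w -= alpha * q
              beta = np.linalg.norm(w)
              alphas.append(alpha)
              betas.append(beta)
              if beta < 1e-12:                   # invariant subspace found
                  break
              q_prev, q = q, w / beta
          return np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)

      rng = np.random.default_rng(0)
      A = rng.normal(size=(500, 500)); A = (A + A.T) / 2
      ritz = np.linalg.eigvalsh(lanczos(A, rng.normal(size=500), 30))
      print(ritz[0], ritz[-1])                   # approximate smallest/largest eigenvalues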

  • A Few Thoughts on Deep Network Approximation

    Wed, Aug 12, 2020, 12:00 pm

    Deep network approximation is a powerful tool for function approximation via composition. We will present a few new thoughts on deep network approximation from the point of view of scientific computing in practice: given arbitrary network width and depth, what is the optimal approximation rate for various function classes? Does the curse of dimensionality exist for generic functions? Can we obtain exponential convergence for generic functions?

  • Tradeoffs between Robustness and Accuracy

    Wed, Jul 29, 2020, 12:00 pm

    Standard machine learning produces models that are highly accurate on average but that degrade dramatically when the test distribution deviates from the training distribution. While one can train robust models, this often comes at the expense of standard accuracy (on the training distribution). We study this tradeoff in two settings, adversarial examples and minority groups, creating simple examples that highlight generalization issues as a major source of this tradeoff.
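
    A rough illustrative sketch (not part of the talk abstract) of the adversarial-example setting mentioned above: a fast-gradient-sign perturbation of a fixed linear logistic classifier, shifting a test point off the training distribution until its prediction flips. Plain numpy; the classifier, test point, and budget are arbitrary placeholders.

      import numpy as np

      w, b = np.array([1.5, -2.0]), 0.3           # a fixed linear classifier (hypothetical)
      x, y = np.array([0.8, 0.1]), 1.0            # one test point with label +1

      def loss_grad_x(x):
          """Gradient of the logistic loss with respect to the input x."""
          margin = y * (w @ x + b)
          return -y * w / (1.0 + np.exp(margin))

      eps = 0.5                                   # L-infinity perturbation budget
      x_adv = x + eps * np.sign(loss_grad_x(x))   # fast-gradient-sign step
      print(np.sign(w @ x + b), np.sign(w @ x_adv + b))   # prediction flips from +1 to -1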

  • Managing Research Data

    Thu, Jul 23, 2020, 12:00 pm

    This webinar will cover tips on how to keep track of your data files more efficiently, organize them better, and manage your data, code, and other research materials to save yourself headaches down the road.

    This event is part of the Princeton Research Data Service Workshop Series.

  • Molecular Simulation with Machine Learning

    Mon, Jul 13, 2020 (All day) to Tue, Jul 14, 2020 (All day)

    A two-day virtual workshop covering theory and hands-on tutorials on the software package for molecular simulation with machine learning (ML) tools developed at the Computational Chemical Science Center “Chemistry in Solution and at Interfaces” (http://chemlabs.princeton.edu/ccsc/). The package includes codes to construct and use deep neural network models of the potential energy surface and electronic properties of multi-atomic systems that reproduce the results of electronic density functional theory.

  • Microsoft Azure Two-Part Cloud Computing Workshop

    Wed, Jun 17, 2020, 12:00 pm

    This is a hands-on, two-hour applied workshop where attendees will learn new concepts by building their solutions on Azure and interacting directly with the instructors.
