Upcoming Events

Generalized Lagrangian Networks

Fri, Jan 31, 2020, 12:00 pm

Even though neural networks enjoy widespread use, they still struggle to learn the basic laws of physics. How might we endow them with better inductive biases? Recent work (Greydanus et al. 2019) proposed Hamiltonian Neural Networks (HNNs), which use a neural network to learn the Hamiltonian of a physical system from data.
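
For orientation, here is a minimal illustrative sketch of the core HNN idea (in PyTorch; this is not the speaker's code, and all names, sizes, and the toy spring data are placeholders): a network outputs a single scalar H(q, p), and training penalizes the mismatch between its symplectic gradient and observed time derivatives.

    # Illustrative sketch only, in the spirit of HNNs (Greydanus et al. 2019):
    # a network outputs a scalar H(q, p), and training makes its symplectic gradient
    # match observed time derivatives. Toy data: ideal spring, H = (q^2 + p^2)/2.
    import torch
    import torch.nn as nn

    class HNN(nn.Module):
        def __init__(self, dim=1, hidden=64):
            super().__init__()
            # Scalar-valued network: (q, p) -> H
            self.net = nn.Sequential(
                nn.Linear(2 * dim, hidden), nn.Tanh(),
                nn.Linear(hidden, hidden), nn.Tanh(),
                nn.Linear(hidden, 1),
            )

        def time_derivatives(self, q, p):
            # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq
            q, p = q.requires_grad_(True), p.requires_grad_(True)
            H = self.net(torch.cat([q, p], dim=-1)).sum()
            dHdq, dHdp = torch.autograd.grad(H, (q, p), create_graph=True)
            return dHdp, -dHdq

    model = HNN(dim=1)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    q, p = torch.randn(128, 1), torch.randn(128, 1)   # placeholder states
    dq_true, dp_true = p.clone(), -q.clone()          # true derivatives for the spring
    for step in range(200):
        dq_pred, dp_pred = model.time_derivatives(q, p)
        loss = ((dq_pred - dq_true) ** 2 + (dp_pred - dp_true) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()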

Location: 26 Prospect Ave, Classroom 103

Events Archive

Solving Inverse Problems with Data-driven Priors

I will present a Bayesian machine learning architecture that combines a physically motivated parameterization and an analytic error model for the likelihood with a deep generative model providing a powerful data-driven prior for complex signals.
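
As a rough illustration of this kind of architecture (a sketch of the general recipe only, not the speaker's model), one common approach is MAP inference that pairs a Gaussian likelihood from an analytic error model with a pretrained generator acting as the data-driven prior. The forward operator A, the generator G, the noise level sigma, and all shapes below are stand-ins.

    # Sketch of the general recipe only (not the speaker's architecture): MAP inference
    # for an inverse problem y = A(x) + noise, pairing a Gaussian likelihood from an
    # analytic error model with a pretrained generator G as the data-driven prior.
    # A, G, sigma, and all shapes below are placeholders.
    import torch

    def map_estimate(y, A, G, sigma=0.1, latent_dim=32, steps=500, lr=1e-2):
        """Optimize the generator's latent code z so that x = G(z) explains the data y."""
        z = torch.zeros(latent_dim, requires_grad=True)
        opt = torch.optim.Adam([z], lr=lr)
        for _ in range(steps):
            x = G(z)                                          # signal proposed by the prior
            nll = ((A(x) - y) ** 2).sum() / (2 * sigma ** 2)  # Gaussian negative log-likelihood
            neg_log_prior = 0.5 * (z ** 2).sum()              # standard-normal latent prior
            loss = nll + neg_log_prior
            opt.zero_grad(); loss.backward(); opt.step()
        return G(z).detach()

    # Toy usage with a stand-in linear forward model and "generator".
    A_mat = torch.randn(20, 64)
    G = torch.nn.Sequential(torch.nn.Linear(32, 64), torch.nn.Tanh(), torch.nn.Linear(64, 64))
    x_true = G(torch.randn(32)).detach()
    y = A_mat @ x_true + 0.1 * torch.randn(20)
    x_hat = map_estimate(y, lambda x: A_mat @ x, G, sigma=0.1)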

Location: 26 Prospect Ave, Classroom 103

Barks, Bubbles and Brownies!

We will have therapy dogs available to help calm you, bubble tea to hydrate you, and brownies to assist with that chocolate fix.

RSVP not necessary, but the bubble tea is first come, first served!

Machine Learning for the Sciences

Taking place every other Friday. Lunch will be provided.

Location: 26 Prospect Ave, Auditorium 103
Tags: Seminars

Grad Students: Interested in Data Science?

The Center for Statistics and Machine Learning is holding an informal information session for graduate students about its certificate program.

Lunch will be served!

Location: 26 Prospect Ave

Recent Advances in Non-Convex Distributed Optimization and Learning

We consider a class of distributed non-convex optimization problems in which a number of agents, connected by a communication network, collectively optimize a sum of (possibly non-convex and non-smooth) local objective functions. This type of problem has recently gained popularity, especially in the application of distributed...
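
For readers new to this setting, below is a minimal sketch of one standard baseline, decentralized gradient descent (DGD). It is not one of the algorithms from the talk; the mixing matrix, local objectives, and step size are toy placeholders chosen only to show the agent/network structure.

    # Minimal sketch of one standard baseline for this setting -- decentralized gradient
    # descent (DGD) -- not the algorithms from the talk. Each agent i keeps its own copy
    # x_i of the decision variable, averages with its neighbors through a doubly
    # stochastic mixing matrix W, and takes a step along its local gradient.
    import numpy as np

    def dgd(local_grads, W, dim, steps=1000, lr=0.01):
        n = len(local_grads)
        X = np.zeros((n, dim))                                  # row i = agent i's iterate
        for _ in range(steps):
            local = np.stack([g(X[i]) for i, g in enumerate(local_grads)])
            X = W @ X - lr * local                              # mix with neighbors + local step
        return X.mean(axis=0)                                   # agents (approximately) agree

    # Toy usage: 3 agents with simple quadratic local objectives f_i(x) = ||x - t_i||^2.
    W = np.array([[0.50, 0.25, 0.25],
                  [0.25, 0.50, 0.25],
                  [0.25, 0.25, 0.50]])
    targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
    grads = [lambda x, t=t: 2 * (x - t) for t in targets]       # gradients of f_i
    x_hat = dgd(grads, W, dim=2)                                # near the average of t_i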
Location: B205 Engineering Quadrangle
Tags: Seminars

Diving into TensorFlow 2.0

Description: Please join us for this 90-minute workshop, taught at an intermediate level. We will briefly introduce TensorFlow 2.0, then dive into writing a few flavors of neural networks. Attendees will need a laptop and an internet connection.
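
As a taste of the kind of material such a workshop typically covers (this is not the workshop's actual notebook, just a standard tf.keras example), a few lines of TensorFlow 2.x are enough to define and train a small classifier:

    # A taste of the kind of material covered (not the workshop's actual notebook):
    # defining and training a small classifier with TensorFlow 2.x / tf.keras.
    import tensorflow as tf

    # Standard toy dataset; scale pixel values to [0, 1].
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0

    # One "flavor" of model: a plain fully connected network via the Sequential API.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=2, validation_data=(x_test, y_test))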

Location: Lewis Science Library 138
Tags: Seminars

Can learning theory resist deep learning?

Location: CS 105
Tags: Seminars

Convergence Rates of Stochastic Algorithms in Nonsmooth Nonconvex Optimization

Location: B205 Engineering Quadrangle
Tags: Seminars

Exploration by Optimization in Partial Monitoring

Location: Sherrerd 101
Tags: Seminars
