Upcoming Events

Introduction to Data Analysis using Python
Fri, Oct 7, 2022, 10:00 am

This workshop will get students started in data analysis using the pandas Python package. It will briefly cover different components of data analysis and connect them with the goal of extracting meaning from data. We will go over an example to illustrate the data analysis process from beginning to end.
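
The workshop's own notebooks are distributed in the session; as a rough illustration of the kind of pandas workflow described above, here is a minimal sketch (the file name and column names are hypothetical):

import pandas as pd

# Hypothetical CSV; any tabular dataset with a categorical and a numeric column works.
df = pd.read_csv("penguins.csv")

# Inspect: shape, column types, and summary statistics.
print(df.shape)
print(df.describe())

# Clean: drop rows with missing values in the columns of interest.
df = df.dropna(subset=["species", "body_mass_g"])

# Extract meaning: average body mass per species, sorted from heaviest to lightest.
summary = (
    df.groupby("species")["body_mass_g"]
      .mean()
      .sort_values(ascending=False)
)
print(summary)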

A Language-Based Model of Organizational Identification Demonstrates How Within-Person Changes in Identification Relate to Network Position
Mon, Oct 10, 2022, 12:00 pm

Shifting attachments to social groups are a constant in the modern era, and they are especially pronounced in the contemporary workplace. What accounts for variation in the strength of organizational identification?

Location
Aaron Burr Hall 219
Data Visualization in Python
Tue, Oct 11, 2022, 4:30 pm

This workshop provides an introduction to effective data visualization in Python. The training focuses on three plotting packages: Matplotlib, Seaborn and Plotly. Examples may include simple static 1D plots, 2D contour maps, heat maps, violin plots, and box plots. The session may also touch on more advanced interactive plots.
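
The training materials are the instructors' own; as a rough illustration of the plot types named above, here is a minimal Matplotlib/Seaborn sketch on randomly generated data:

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

rng = np.random.default_rng(0)
fig, axes = plt.subplots(1, 3, figsize=(12, 3))

# Simple static 1D line plot.
x = np.linspace(0, 2 * np.pi, 200)
axes[0].plot(x, np.sin(x))
axes[0].set_title("1D line plot")

# 2D contour map of a Gaussian bump.
xx, yy = np.meshgrid(np.linspace(-2, 2, 100), np.linspace(-2, 2, 100))
axes[1].contourf(xx, yy, np.exp(-(xx**2 + yy**2)))
axes[1].set_title("2D contour map")

# Violin plot of two groups with different spreads (Seaborn).
data = pd.DataFrame({
    "group": ["A"] * 100 + ["B"] * 100,
    "value": np.concatenate([rng.normal(0, 1, 100), rng.normal(1, 2, 100)]),
})
sns.violinplot(x="group", y="value", data=data, ax=axes[2])
axes[2].set_title("Violin plot")

plt.tight_layout()
plt.show()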

The Limits of the Quantitative Approach to Discrimination
Tue, Oct 11, 2022, 5:00 pm

Discrimination is obvious to the people facing discrimination. Given this, do we even need quantitative studies to test if it exists? Regardless of the answer, quantitative studies such as ProPublica’s “Machine Bias” have had a galvanizing effect on racial justice, especially in the context of automated decision-making. 

Location
East Pyne 010
Understanding Reasons for Differences in Intervention Effects Across Sites
Tue, Oct 25, 2022, 12:00 pm

I am an epidemiologist with research interests in developing and applying causal inference methods to understand social and contextual influences on mental health, substance use, and violence in disadvantaged, urban areas of the United States.

Location
300 Wallace Hall
Caught up in Neural Nets? When (and How) to use Classical Machine Learning in Your Research
Thu, Nov 10, 2022, 4:30 pm

In this workshop, participants will learn the basics of various classical machine learning techniques and discuss which types of problems each technique is best suited to address.
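
Which techniques the session covers is up to the presenters; as one concrete example of classical machine learning with scikit-learn, here is a minimal sketch comparing two standard models on a small tabular dataset:

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Small tabular data: the regime where classical methods are often the right tool.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")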

Events Archive

Bridging the Gap Between Your Laptop and Cloud Computing

Part 1: Introduction to some tools that computer programmers typically use to write and debug code effectively.

Part 2: Introduction to cloud computing (creating and managing cloud computing resources), and how to use tools that let you write code locally while seamlessly running it on powerful cloud resources.

Princeton Data Science Coffee Chats

Princeton Data Science is hosting coffee chats on Saturday, October 1 and Sunday, October 2. Fill out the form below to be paired with a fellow student interested in data science and get free coffee at Small World Coffee. This is a great opportunity to receive or give mentorship, or simply meet other students who are interested in data science.

Introduction to Deep Learning with TensorFlow

Please join us for this intro to Deep Learning workshop. You'll learn about the basics of neural networks with diagrams and code examples in Keras, and work through tutorials to help you get started. You'll need a laptop and an internet connection. There's nothing to install in advance; we'll use Colab for all examples. We'll cover the basics …
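
The Colab notebooks used in the workshop are its own; as a rough flavor of the Keras basics described, here is a minimal sketch training a small dense network on MNIST:

from tensorflow import keras

# Load and normalize MNIST.
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected network.
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=2, validation_split=0.1)
print(model.evaluate(x_test, y_test))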

Fundamentals of Deep Learning for Multi-GPUs (9/27 and 9/28)

This 2-day workshop teaches you techniques for training deep neural networks on multi-GPU technology to shorten the training time required for data-intensive applications. 
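
The course follows NVIDIA's own Deep Learning Institute materials; as one common way to express data-parallel multi-GPU training in TensorFlow (not necessarily the approach taught in the course), here is a minimal sketch:

import tensorflow as tf
from tensorflow import keras

# MirroredStrategy replicates the model across all visible GPUs and splits
# each batch among them; it falls back to a single device if none are found.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

# Scale the global batch size with the number of replicas so each GPU
# keeps the same per-device batch size.
global_batch = 64 * strategy.num_replicas_in_sync

(x_train, y_train), _ = keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0

# Model and optimizer must be created inside the strategy scope.
with strategy.scope():
    model = keras.Sequential([
        keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
        keras.layers.Flatten(),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

model.fit(x_train, y_train, batch_size=global_batch, epochs=1)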

Intro to Data Analysis using R

This workshop will get participants started in data analysis using R/RStudio. It will briefly cover different components of data analysis and connect them with the goal of extracting meaning from data. We will go over an example to illustrate the data analysis process from beginning to end.

Completing large low rank matrices with only few observed entries: A one-line algorithm with provable guarantees

Suppose you observe very few entries from a large matrix. Can we predict the missing entries, say assuming the matrix is (approximately) low rank? We describe a very simple method to solve this matrix completion problem. We show our method is able to recover matrices from very few entries and/or ill-conditioned matrices where many other popular methods fail.
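
The speaker's one-line algorithm is not reproduced here; as a simple baseline that illustrates the problem itself, here is a minimal sketch of iterative SVD imputation (fill in the missing entries with a rank-r approximation and repeat) on a synthetic low-rank matrix:

import numpy as np

rng = np.random.default_rng(0)

# Synthetic rank-2 ground truth and a mask of observed entries (~30%).
n, r = 100, 2
M = rng.normal(size=(n, r)) @ rng.normal(size=(r, n))
mask = rng.random((n, n)) < 0.3

# Iterative SVD imputation: replace missing entries with the current
# rank-r approximation while keeping observed entries fixed.
X = np.where(mask, M, 0.0)
for _ in range(200):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    low_rank = (U[:, :r] * s[:r]) @ Vt[:r]
    X = np.where(mask, M, low_rank)

print("relative error:", np.linalg.norm(X - M) / np.linalg.norm(M))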

Location
Fine Hall
New Results on Universal Dynamic Regret Minimization for Learning and Control

Universal dynamic regret is a natural metric for the performance of an online learner in nonstationary environments. The optimal dynamic regret for strongly convex and exponentially concave (exp-concave) losses, however, had been open for nearly two decades. In this talk, I will cover some recent advances on this problem from my group that largely settled this open problem.
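
The results in the talk are the speaker's own; as a concrete reminder of what dynamic regret measures, here is a minimal sketch computing the dynamic regret of online gradient descent against the per-round minimizers on a slowly drifting sequence of quadratic losses (one admissible comparator sequence in the universal setting):

import numpy as np

rng = np.random.default_rng(0)
T = 1000

# Round-t loss: f_t(x) = 0.5 * (x - theta_t)^2, with a slowly drifting target.
theta = np.cumsum(rng.normal(scale=0.01, size=T))

x, eta = 0.0, 0.1
learner_loss = 0.0
for t in range(T):
    learner_loss += 0.5 * (x - theta[t]) ** 2
    x -= eta * (x - theta[t])   # online gradient descent step

# The per-round minimizer x_t* = theta_t incurs zero loss, so the
# dynamic regret here is just the learner's cumulative loss.
print(f"dynamic regret over {T} rounds: {learner_loss:.3f}")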

Location
B205 Engineering Quad
Events for Academic Year 2022-2023

Events at the Center for Statistics and Machine Learning are currently being scheduled. Please check back here later this month for events.

The Role of Relative Entropy in Supervised Machine Learning

In this talk, recent results on various aspects of the Empirical Risk Minimization (ERM) problem with Relative Entropy Regularization (ERM-RER) are presented. The regularization is with respect to a sigma-finite measure, instead of a probability measure, which provides greater flexibility for including prior knowledge about the models. Special cases of this general formulation include the ERM problem with (discrete or differential) entropy regularization and the information-risk minimization problem.
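
The talk's sigma-finite formulation goes beyond the standard setting; as background, the familiar probability-measure version of ERM with relative entropy regularization, together with its well-known Gibbs-measure solution, can be written as:

% ERM-RER, probability-measure case: penalize the model distribution P by
% its relative entropy to a reference measure Q, with weight \lambda > 0.
\min_{P} \; \mathbb{E}_{\theta \sim P}\!\left[\hat{L}_n(\theta)\right] + \lambda\, D(P \,\|\, Q),
\qquad
dP^{\star}(\theta) \;\propto\; e^{-\hat{L}_n(\theta)/\lambda}\, dQ(\theta),

where \hat{L}_n is the empirical risk on n samples; the talk's generalization allows the reference measure Q to be sigma-finite rather than a probability measure.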

Location
B205 Engineering Quadrangle