Theory Seminar @ Yale


The goal of this joint theory seminar between the Departments of CS, SDS, and Math at Yale is to disseminate and discuss recent advances in theoretical computer science, data science, and machine learning.

Usual meeting time and place:
Fridays 11-12
Room 107, 24 Hillhouse

Van Vu on September 20, 2019

Speaker: Van Vu, Department of Mathematics, Yale University

Title:  A few thoughts about random matrices 

Abstract:
We would like to discuss a few problems that we have been working on lately:
(1) Random perturbation of a large matrix (how does it affect the eigenvalues and eigenvectors?).
(2) Products of random matrices (spectral distribution).
(3) How often is a random ±1 matrix normal?
(4) Spectral properties of preferential attachment graphs.
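For problem (1), a classical quantitative statement is Weyl's inequality: perturbing a symmetric matrix moves each eigenvalue by at most the operator norm of the perturbation. A quick numerical check (the matrix size and noise scale are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Symmetric base matrix A and a small symmetric perturbation E.
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
E = 0.01 * rng.standard_normal((n, n))
E = (E + E.T) / 2

eig_A = np.linalg.eigvalsh(A)
eig_AE = np.linalg.eigvalsh(A + E)

# Weyl's inequality: |lambda_i(A+E) - lambda_i(A)| <= ||E||_op for every i.
op_norm_E = np.linalg.norm(E, 2)       # largest singular value of E
max_shift = np.max(np.abs(eig_AE - eig_A))
```

The eigenvectors are more delicate: their stability depends on eigenvalue gaps, not just on the perturbation size, which is part of what makes the problem interesting.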

Zhou Fan and Cheng Mao on October 4, 2019

Title: Spectral Graph Matching

Abstract:
We’ll discuss some recent work with Yihong Wu and Jiaming Xu on the “graph matching” problem of identifying the vertex matching between two graphs with correlated edges. We propose a simple spectral algorithm, which is related to a popular convex relaxation approach for this problem and also to the behavior of eigenvector moment flows for random matrices. We’ll describe a theoretical result establishing conditions under which this algorithm exactly recovers the vertex matching between two correlated Erdős–Rényi random graphs, and discuss some of the ideas of its proof and some open questions.
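The flavor of such a spectral algorithm can be sketched as follows. This is a simplified toy version in the spirit of spectral graph matching, not necessarily the authors' exact algorithm or regime: the Cauchy-kernel weighting, the bandwidth eta, and the noise-free (exactly isomorphic) test graphs are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(1)
n = 40

# Erdos-Renyi graph A and an isomorphic copy B (noise-free toy case):
# B[i, j] = A[perm[i], perm[j]].
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, 1); A = A + A.T
perm = rng.permutation(n)
P = np.eye(n)[perm]
B = P @ A @ P.T

lam, U = np.linalg.eigh(A)
mu, V = np.linalg.eigh(B)

# Similarity matrix built from eigenvector outer products, weighted by a
# Cauchy kernel in the eigenvalue differences (eta is a tuning parameter).
eta = 0.2
ones = np.ones(n)
W = 1.0 / ((lam[:, None] - mu[None, :]) ** 2 + eta ** 2)
coef = W * np.outer(U.T @ ones, V.T @ ones)
X = U @ coef @ V.T

# Round the similarity matrix to a permutation by linear assignment.
row, col = linear_sum_assignment(-X)

# Fraction of correctly matched vertices (col[i] is the proposed match
# in B for vertex i of A; correct when perm[col[i]] == i).
acc = np.mean(perm[col] == np.arange(n))
```

With correlated rather than identical graphs, the interesting question is how much edge noise such a procedure tolerates, which is what the theoretical result addresses.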

Nisheeth Vishnoi on October 25, 2019

Title: Entropy, Optimization, and Symmetry

Rick Kenyon on November 8, 2019

Title: Orientations and the Graph Laplacian

Abstract:

Given a large graph-with-boundary, a harmonic function is a useful and easy way to understand (or draw) the graph. However, it can be problematic: large sets of vertices may have similar function values. One way to try to avoid this difficulty is to assign the edges different conductances. Can we choose a conductance function on the edges so that the harmonic function “disentangles” the graph in some sense?

The answer is “yes” for the appropriate definitions of these concepts.
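As a concrete instance of the objects involved: given edge conductances, the harmonic extension of prescribed boundary values solves a weighted graph Laplacian system. A minimal sketch on a path graph (the graph, boundary set, and conductance values here are illustrative choices):

```python
import numpy as np

# Path graph 0-1-2-3-4 with boundary {0, 4}; one conductance per edge.
n = 5
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
c = np.array([1.0, 2.0, 4.0, 8.0])  # illustrative conductances

# Weighted graph Laplacian L = D - W.
L = np.zeros((n, n))
for (u, v), cond in zip(edges, c):
    L[u, u] += cond; L[v, v] += cond
    L[u, v] -= cond; L[v, u] -= cond

boundary = [0, 4]
interior = [1, 2, 3]
h = np.zeros(n)
h[0], h[4] = 0.0, 1.0  # prescribed boundary values

# Harmonic extension: solve L_II h_I = -L_IB h_B, so that (L h) vanishes
# on the interior, i.e. each interior value is the conductance-weighted
# average of its neighbors.
L_II = L[np.ix_(interior, interior)]
L_IB = L[np.ix_(interior, boundary)]
h[interior] = np.linalg.solve(L_II, -L_IB @ h[boundary])
```

Changing the conductances changes where the interior values crowd together, which is exactly the degree of freedom the talk exploits.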

Harry Zhou on November 15, 2019

Title: Optimality of Spectral Clustering for Gaussian Mixture Models

Abstract:
Spectral clustering is one of the most popular algorithms for grouping high-dimensional data. It is easy to implement and computationally efficient. Despite its popularity and successful applications, its theoretical properties have not been fully understood. The spectral clustering algorithm is often used as a consistent initializer for more sophisticated clustering algorithms such as EM. However, in this talk, we show that spectral clustering is actually already optimal for Gaussian mixture models when the number of clusters is fixed and consistent clustering is possible. Although spectral gap conditions are widely assumed in the literature to analyze spectral clustering, these conditions are not needed in this work to establish its optimality.
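The vanilla procedure in question (project the data onto the top-k singular subspace, then run k-means) can be sketched as follows; the Gaussian means, dimensions, and deterministic k-means initialization below are illustrative choices, not part of the analyzed setting:

```python
import numpy as np

rng = np.random.default_rng(2)
n_per, d, k = 100, 50, 2

# Two spherical Gaussians in R^d with well-separated means.
mu = np.zeros((k, d)); mu[0, 0], mu[1, 0] = -3.0, 3.0
X = np.vstack([mu[j] + rng.standard_normal((n_per, d)) for j in range(k)])
truth = np.repeat(np.arange(k), n_per)

# Spectral step: project samples onto the span of the top-k right
# singular vectors of the data matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Y = X @ Vt[:k].T  # n x k low-dimensional embedding

# k-means via a few Lloyd iterations (initialized with one point from
# each true cluster for determinism in this toy example).
centers = Y[[0, n_per]]
for _ in range(20):
    labels = np.argmin(((Y[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([Y[labels == j].mean(0) for j in range(k)])

# Clustering accuracy up to the label permutation (k = 2 here).
acc = max(np.mean(labels == truth), np.mean(labels == 1 - truth))
```

The optimality result concerns the error rate of exactly this kind of two-step procedure, without spectral gap assumptions.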

Dan Spielman on November 22, 2019

Title: Balancing covariates in randomized experiments using the Gram–Schmidt walk 

Abstract:
We use the Gram-Schmidt walk, a recent advance in algorithmic discrepancy theory, to derive a new class of designs for randomized controlled trials.    The goal of our designs is to produce very accurate estimates of treatment effects when those effects are largely explained by linear functions of subject covariates, while also supplying strong worst-case guarantees.  The tradeoff between these two is controlled by a parameter that can be set by the experimenter.  Mathematically, our major contribution is a proof that the Gram-Schmidt walk produces a 1-subgaussian random vector.

In this talk, I will explain the experimental design problem we address, the Gram-Schmidt walk algorithm, and the major ideas behind our proofs.  This is joint work with Chris Harshaw, Fredrik Savje, and Peng Zhang.
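The Gram-Schmidt walk itself is short enough to sketch. The version below assumes the simplest variant (start at the all-zeros fractional assignment, always pivot on the largest alive index); the covariates, scaling, and tolerances are illustrative, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 20, 5

# Covariate vectors b_1..b_n as columns, scaled to Euclidean norm <= 1.
B = rng.standard_normal((d, n))
B /= np.maximum(np.linalg.norm(B, axis=0), 1.0)

x = np.zeros(n)  # fractional assignment, driven to {-1, +1}^n
pivot = n - 1

for _ in range(10 * n):  # each step freezes at least one coordinate
    A = np.where(np.abs(x) < 1 - 1e-9)[0]  # "alive" coordinates
    if A.size == 0:
        break
    if pivot not in A:
        pivot = A.max()
    rest = A[A != pivot]

    # Step direction: u[pivot] = 1; the other alive coordinates are set
    # by least squares to make the covariate movement B @ u small.
    u = np.zeros(n)
    u[pivot] = 1.0
    if rest.size > 0:
        sol, *_ = np.linalg.lstsq(B[:, rest], -B[:, pivot], rcond=None)
        u[rest] = sol

    # Largest step sizes keeping x inside [-1, 1]^n in each direction.
    nz = np.nonzero(u)[0]
    au = np.abs(u[nz])
    d_plus = np.min(np.where(u[nz] > 0, 1 - x[nz], 1 + x[nz]) / au)
    d_minus = np.min(np.where(u[nz] > 0, 1 + x[nz], 1 - x[nz]) / au)

    # Randomized step chosen so that E[x] is preserved at every iteration.
    if rng.random() < d_minus / (d_plus + d_minus):
        x = np.clip(x + d_plus * u, -1.0, 1.0)
    else:
        x = np.clip(x - d_minus * u, -1.0, 1.0)

z = np.sign(x)  # final +/-1 treatment assignment
```

The subgaussianity result mentioned above is about the distribution of the final vector z produced by this kind of randomized rounding.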
