
Publications

* indicates equal contribution or alphabetic order.

On Transferring Transferability: Towards a Theory for Size Generalization

Eitan Levin*, Yuxin Ma*, Mateo Díaz, Soledad Villar

Preprint.

We study the properties that allow machine learning models to generalize their performance across input dimensions.

transferability · size generalization · graph neural networks · equivariant machine learning · any-dimensional learning

Nonlinear Laplacians: Tunable principal component analysis under directional prior information

Yuxin Ma, Dmitriy Kunisky

Preprint.

We study a new class of spectral algorithms for low-rank estimation based on a tunable nonlinear deformation of an observed matrix.

principal component analysis · random matrix theory · spiked matrix models · low-rank estimation