Welcome to Shuyang Ling's homepage.
Contact:
New York University Shanghai, Office Number: S750
567 West Yangsi Road
Pudong New District Shanghai, 200126
Email: sl3635 at nyu dot edu
Phone: +86(21)20596120.
About me:
I am currently a Tenure-Track Assistant Professor of Data Science at New York University Shanghai and an NYU Global Network Assistant Professor. Here are my Google Scholar page and my faculty profile page at NYU Shanghai.
Ph.D. students: Ph.D. positions will be available for fall 2025.
Post-doctoral Research Fellow Hiring: The Shanghai Frontiers Science Center of Artificial Intelligence and Deep Learning at NYU Shanghai has several post-doctoral positions available, with competitive salary and benefits. The post-doctoral research fellows are expected to work on topics related to the mathematical foundations of data science and machine learning. Applicants with a background in applied mathematics, optimization, probability and statistics, or computer science are very welcome to send their CV, representative works, and research statement to me.
Seminar and reading groups:
Research Interests:
Mathematics of signal processing and machine learning
Iterative algorithms, convex and non-convex optimization, optimization landscape
Compressive sensing, low-rank matrix recovery, blind deconvolution, group synchronization, spectral methods in data science
Inverse problems in image processing and signal processing
Computational harmonic analysis, random matrix, spectral graph theory
Teaching:
News:
New grant, Nov 2024: I feel fortunate to have been selected for the Shanghai Rising-Star Program (A-type) by the Science and Technology Commission of Shanghai Municipality, PI: Dec 2024 - Nov 2027.
New grant, Nov 2024: I feel fortunate to receive a new grant (General Program of the Natural Science Foundation of Shanghai) from the Science and Technology Commission of Shanghai Municipality, PI: Oct 2024 - Sep 2027.
New release, Sep 2024: ‘‘Beyond unconstrained features: neural collapse for shallow neural networks with general data", jointly done with my student Wanli Hong, provides a complete picture of when the neural collapse configuration emerges in shallow neural networks for both general and structured data.
New release, Aug 2024: ‘‘Uncertainty quantification of spectral estimator and MLE for orthogonal group synchronization", jointly done with my student Samuel Ziliang Zhong, derives the approximate distributions of the MLE and the spectral estimator for orthogonal group synchronization under additive Gaussian noise.
New release, Aug 2024: ‘‘On the exactness of SDP relaxation for quadratic assignment problem", studies the exactness of semidefinite relaxations (SDRs) for the quadratic assignment problem.
Paper acceptance, July 2024: ‘‘Improved theoretical guarantee for rank aggregation via spectral method", joint work with my student Samuel Ziliang Zhong, has been accepted by Information and Inference: A Journal of the IMA.
Paper acceptance, May 2024: ‘‘Neural collapse for unconstrained feature model under cross-entropy loss with imbalanced data", jointly done with my Ph.D. student Wanli Hong, has been accepted by Journal of Machine Learning Research.
New release, Feb 2024: ‘‘Cross entropy versus label smoothing: a neural collapse perspective", jointly done with Li Guo, Keith Ross and other co-authors, studies the neural collapse under the label smoothing loss function.
New release, Dec 2023: ‘‘Local geometry determines global landscape in low-rank factorization for synchronization". This work shows that the Burer-Monteiro factorization enjoys a benign optimization landscape for very small rank, under nearly identical conditions to those that ensure the exact recovery of the SDP relaxation.
News, Sept 2023: Quanta Magazine featured our work ‘‘On the landscape of synchronization networks: a perspective from nonconvex optimization" (jointly done with Ruitu Xu and Afonso S. Bandeira) regarding synchronization and the Kuramoto model on dense and random networks.
New release, Sep 2023: ‘‘Neural collapse for unconstrained feature model under cross-entropy loss with imbalanced data", joint work with my student Wanli Hong. Our work considers the neural collapse for the unconstrained feature model with imbalanced data. We characterize the NC under the cross-entropy loss and analyze the exact threshold when the minority collapse occurs.
New release, Sep 2023: ‘‘Improved theoretical guarantee for rank aggregation via spectral method", joint work with my student Samuel Ziliang Zhong. We provide an entry-wise error bound for the spectral ranking (regular and normalized) algorithm. This work gives a rigorous analysis of these two algorithms and a tight error bound on the mismatched pairs, with near-optimal sampling complexity.
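To illustrate the kind of spectral ranking algorithm analyzed above, here is a minimal NumPy sketch in the spirit of rank-centrality-style spectral ranking for the Bradley-Terry-Luce model; the function names and the fully observed, dense comparison setting are illustrative assumptions, not the paper's exact algorithm.

```python
# Illustrative sketch, not the paper's algorithm: spectral ranking builds a
# Markov chain from pairwise comparison outcomes and ranks items by its
# stationary distribution.
import numpy as np

def spectral_rank(Y, d):
    """Y[i, j] = empirical fraction of comparisons in which item j beats item i.
    d = normalization constant (>= max degree; here simply n).
    Returns stationary scores (higher score = stronger item)."""
    P = Y / d
    np.fill_diagonal(P, 0.0)
    # self-loop probability makes each row sum to 1 (a valid Markov chain)
    np.fill_diagonal(P, 1.0 - P.sum(axis=1))
    # stationary distribution: leading left eigenvector of P
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    return np.abs(pi) / np.abs(pi).sum()
```

In the noiseless BTL setting with weights w (where j beats i with probability w_j/(w_i + w_j)), detailed balance gives a stationary distribution exactly proportional to w, so the ranking is recovered.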
Paper acceptance, Apr 2023: ‘‘Near-optimal bounds for generalized orthogonal Procrustes problem via generalized power method", has been accepted by Applied and Computational Harmonic Analysis.
Paper acceptance, Sep 2022: ‘‘Solving orthogonal group synchronization via convex and low-rank optimization: tightness and landscape analysis", has been accepted by Mathematical Programming, Series A.
Paper acceptance, Feb 2022: ‘‘Near-optimal performance bounds for orthogonal and permutation group synchronization via spectral methods", has been accepted by Applied and Computational Harmonic Analysis.
New grant, Dec 2021: Together with Lei Li (PI, SJTU) and Zhenfu Wang (PKU), we received funding from the National Key R&D Program of China (equally shared) for ‘‘Mean-field limit for stochastic many body system: Algorithm analysis and application", Dec 2021 - Nov 2026.
New release, Dec 2021: ‘‘Near-optimal bounds for generalized orthogonal Procrustes problem via generalized power method": we consider the generalized orthogonal Procrustes problem with additive Gaussian noise, and investigate when the generalized power method and SDP relaxation can recover the least squares estimator of the orthogonal matrices and the underlying point cloud. Our result improves on previous work and is near-optimal with respect to the information-theoretic limit.
Paper acceptance, Dec 2021: ‘‘Improved performance guarantees for orthogonal group synchronization via generalized power method", has been accepted by SIAM Journal on Optimization.
New release, Jun 2021: ‘‘Generalized power method for generalized orthogonal Procrustes problem: global convergence and optimization landscape analysis": established a sufficient condition for the tightness of the SDP relaxation and the global convergence of the generalized power method for the well-known generalized orthogonal Procrustes problem. We also showed that the optimization landscape of the associated Burer-Monteiro factorization is benign. It is worth noting that the condition is purely algebraic and can be applied to various statistical settings. This result partially answered an open question posed by Bandeira, Khoo, and Singer in 2014. Submitted, 2021.
Paper acceptance, Apr 2021: ‘‘Strong consistency, graph Laplacians, and the stochastic block model", joint work with Shaofeng Deng and Thomas Strohmer, has been accepted by the Journal of Machine Learning Research.
New release, Dec 2020: ‘‘Improved performance guarantees for orthogonal group synchronization via generalized power method": greatly improved the theoretical guarantees of the generalized power method (GPM) for solving the orthogonal group synchronization problem in the presence of Gaussian noise. The GPM is guaranteed to converge to the MLE linearly and globally. Submitted, 2020.
New grant, Sep 2020: ‘‘Optimization methods in group synchronization on complex networks and graph clustering: theory and application". Funding Agency: NSFC 12001372, PI: Jan 2021 - Dec 2023.
New release, Aug 2020: ‘‘Near-optimal performance bounds for orthogonal and permutation group synchronization via spectral methods": discusses the performance guarantees of spectral methods for orthogonal/permutation group synchronization. The work provides a block-wise error bound for the eigenvectors. Submitted, 2020.
New release, Jun 2020: ‘‘Solving orthogonal group synchronization via convex and low-rank optimization: tightness and landscape analysis": discusses the convex relaxation and the Burer-Monteiro factorization of orthogonal group synchronization. Submitted, 2020.
New release, Apr 2020: ‘‘Strong consistency, graph Laplacians, and the stochastic block model": provides a near-optimal performance bound for (normalized) Laplacian-based spectral clustering for community detection under the stochastic block model. Joint work with Shaofeng Deng and Thomas Strohmer. Submitted, 2020.
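A minimal sketch of the kind of algorithm analyzed here: normalized-Laplacian spectral clustering on a two-community stochastic block model. The sampler, parameter values, and sign-based rounding are illustrative assumptions, not the paper's exact algorithm or conditions.

```python
# Illustrative sketch: spectral clustering of a two-community SBM by the sign
# of the second-smallest eigenvector of the normalized Laplacian.
import numpy as np

def sbm(n, p, q, rng):
    """Adjacency matrix of a two-community SBM: edge probability p within a
    community, q across; the first n//2 nodes form community 0."""
    labels = np.repeat([0, 1], n // 2)
    prob = np.where(labels[:, None] == labels[None, :], p, q)
    A = (rng.random((n, n)) < prob).astype(float)
    A = np.triu(A, 1)
    return A + A.T, labels

def spectral_cluster(A):
    """Round the Fiedler-type eigenvector of L = I - D^{-1/2} A D^{-1/2}."""
    deg = A.sum(axis=1)
    Dinv = np.diag(1.0 / np.sqrt(deg))
    L = np.eye(len(A)) - Dinv @ A @ Dinv
    _, V = np.linalg.eigh(L)            # ascending eigenvalues
    return (V[:, 1] > 0).astype(int)    # second-smallest eigenvector, sign rounding
```

In a well-separated regime (p much larger than q), this recovers the two communities up to a global label swap.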
New release, Apr 2020: ‘‘On the critical coupling of the finite Kuramoto model on dense networks": characterizes the critical coupling for non-identical Kuramoto oscillators on general dense networks. Submitted, 2020.
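For readers unfamiliar with the model, here is a minimal simulation sketch of the finite Kuramoto model on the complete graph: above the critical coupling, non-identical oscillators phase-lock and the order parameter stays close to 1. The integrator, step size, and parameter values are illustrative assumptions.

```python
# Illustrative sketch: Euler integration of the Kuramoto model
#   dtheta_i/dt = omega_i + (K/n) * sum_j sin(theta_j - theta_i)
# on the complete graph with n oscillators.
import numpy as np

def simulate_kuramoto(omega, K, T=20.0, dt=0.01, theta0=None):
    """Integrate the phases forward from theta0 (default: all zeros)."""
    n = len(omega)
    theta = np.zeros(n) if theta0 is None else theta0.copy()
    for _ in range(int(T / dt)):
        diff = theta[None, :] - theta[:, None]   # diff[i, j] = theta_j - theta_i
        theta = theta + dt * (omega + (K / n) * np.sin(diff).sum(axis=1))
    return theta

def order_parameter(theta):
    """r = |mean of exp(i*theta)|; r close to 1 indicates phase synchronization."""
    return abs(np.exp(1j * theta).mean())
```

With frequencies of spread about 1 and coupling well above critical (e.g. K = 5), the simulated oscillators lock into a tight phase cluster.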
New grant, Nov 2019: Shanghai Eastern Scholar for Young Professionals. Funding Agency: Shanghai Municipal Education Commission, PI: Jan 2020 - Dec 2022.
Oct 2019: The Ph.D. Program in Data Science at NYU Shanghai has just launched! Click here for more details! Students interested in the mathematical foundations of data science, including optimization and signal processing, are welcome to apply. (I cannot answer every email inquiry regarding the Ph.D. program.)
Sep 2019: I joined NYU Shanghai as an Assistant Professor Faculty Fellow of Data Science.
May 2019: ‘‘On the landscape of synchronization networks: a perspective from nonconvex optimization" (joint work with Ruitu Xu and Afonso S. Bandeira) is accepted to SIAM Journal on Optimization.
Apr 2019: ‘‘Certifying global optimality of graph cuts via semidefinite relaxation: a performance guarantee for spectral clustering" (joint work with Thomas Strohmer) is accepted to Foundations of Computational Mathematics.
Mar 2019: ‘‘Learning from their mistakes: self-calibrating sensors" (joint work with Benjamin Friedlander and Thomas Strohmer) is now available on SIAM News.