Grants (by year)
External grants:
PI: Shanghai STCSM Rising Star Program (Type A), Grant No. 24QA2706200, Dec 2024 - Nov 2027
PI: Shanghai STCSM General Program, Grant No. 24ZR1455300, Oct 2024 - Sep 2027
National Key R&D Program of China, Grant No. 2021YFA1002800, Dec 2021 - Nov 2026 (jointly held with Lei Li, SJTU, PI, and Zhenfu Wang, PKU)
PI: National Young Talent Program, Jan 2021 - Dec 2023
PI: NSFC, Grant No. 12001372, Jan 2021 - Dec 2023
PI: Shanghai Eastern Scholar Program (Youth Track), Jan 2020 - Dec 2022
PI: Shanghai Overseas Talent Program, 2020
Institutional grants:
PI: NYU Shanghai Boost Funds, Jan 2022 - Aug 2027
PI: NYU Shanghai Boost Funds, Jan 2021 - Dec 2026
Publications (by year)
Submitted
Beyond unconstrained features: neural collapse for shallow neural networks with general data.
W. Hong and S. Ling, submitted, 2024. (arXiv version).
Uncertainty quantification of spectral estimator and MLE for orthogonal group synchronization.
Z. S. Zhong and S. Ling, submitted, 2024. (arXiv version).
On the exactness of SDP relaxation for quadratic assignment problem.
S. Ling, submitted, 2024. (arXiv version).
Cross entropy versus label smoothing: a neural collapse perspective.
L. Guo, K. Ross, Z. Zhao, A. George, S. Ling, Y. Xu, Z. Dong, submitted, 2024. (arXiv version).
Generalized orthogonal Procrustes problem under arbitrary adversaries.
S. Ling, submitted, 2024. (arXiv version) (significantly revised in 2024).
Local geometry determines global landscape in low-rank factorization for synchronization.
S. Ling, submitted, 2023. (arXiv version)(Talk recording by IMA at UMN, on YouTube).
On the critical coupling of the finite Kuramoto model on dense networks.
S. Ling, submitted, 2020. (arXiv version)
Journal Publications
Improved theoretical guarantee for rank aggregation via spectral method. Z. S. Zhong, S. Ling, Information and Inference: A Journal of the IMA, 13(3):1-36, 2024. (arXiv version)(Final).
Neural collapse for unconstrained feature model under cross-entropy loss with imbalanced data. W. Hong, S. Ling, Journal of Machine Learning Research 25(192):1-48, 2024. (arXiv version)(Journal)(Talk recording on YouTube).
Near-optimal bounds for generalized orthogonal Procrustes problem via generalized power method. S. Ling, Applied and Computational Harmonic Analysis, 66, 62-100, 2023. (arXiv version)(Final)(Talk recording on YouTube)(Code demo)
Solving orthogonal group synchronization via convex and low-rank optimization: tightness and landscape analysis. S. Ling, Mathematical Programming, Series A, 200, 589-628, 2023. (arXiv version)(Final)
Near-optimal performance bounds for orthogonal and permutation group synchronization via spectral methods. S. Ling, Applied and Computational Harmonic Analysis 60, 20-52, 2022. (arXiv version)(Final)
Improved performance guarantees for orthogonal group synchronization via generalized power method. S. Ling, SIAM Journal on Optimization, 32(2):1018-1048, 2022. (arXiv version)(Final)
Strong consistency, graph Laplacians, and the stochastic block model. S. Deng, S. Ling, T. Strohmer, Journal of Machine Learning Research, 22(117):1-44, 2021. (arXiv version)(Final)
Certifying global optimality of graph cuts via semidefinite relaxation: a performance guarantee for spectral clustering. S. Ling, T. Strohmer, Foundations of Computational Mathematics, 20(3):368-421, 2020. (arXiv version)(Final)(Slides)
When do birds of a feather flock together? k-means, proximity, and conic programming. X. Li, Y. Li, S. Ling, T. Strohmer, K. Wei, Mathematical Programming, Series A, 179(1):295-341, 2020. (arXiv version)(Final)(Slides)
On the landscape of synchronization networks: a perspective from nonconvex optimization. S. Ling, R. Xu, A. S. Bandeira, SIAM Journal on Optimization, 29(3):1879-1907, 2019. (arXiv version)(Final)(Talk recording at the CMO)
Rapid, robust, and reliable blind deconvolution via nonconvex optimization. X. Li, S. Ling, T. Strohmer, K. Wei, Applied and Computational Harmonic Analysis, 47(3):893-934, 2019. (arXiv version)(Final)(Slides)(Talk recording at the CMO)
Regularized gradient descent: a nonconvex recipe for fast joint blind deconvolution and demixing. S. Ling, T. Strohmer, Information and Inference: A Journal of the IMA, 8(1):1-49, 2019. (arXiv version)(Final)(Slides)
Self-calibration and bilinear inverse problems via linear least squares. S. Ling, T. Strohmer, SIAM Journal on Imaging Sciences, 11(1):252-292, 2018. (arXiv)(Final)
Blind deconvolution meets blind demixing: algorithms and performance bounds. S. Ling, T. Strohmer, IEEE Transactions on Information Theory, 63(7):4497-4520, 2017. (arXiv version)(Final)(Slides)
Self-calibration and biconvex compressive sensing. S. Ling, T. Strohmer, Inverse Problems, 31(11):115002, 2015. (arXiv version)(Final)(Slides)
(SIAM Student Paper Award 2017)
Backward error and perturbation bounds for high order Sylvester tensor equation. X. Shi, Y. Wei, S. Ling, Linear and Multilinear Algebra, 61(10):1436-1446, 2013. (Final)
Conference Proceedings
Fast blind deconvolution and blind demixing via nonconvex optimization. S. Ling, T. Strohmer, International Conference on Sampling Theory and Applications (SampTA), pp. 114-118, 2017. (Final)
You can have it all – Fast algorithms for blind deconvolution, self-calibration, and demixing. S. Ling, T. Strohmer, Mathematics in Imaging, MW1C.1, 2017. (Final)
Simultaneous blind deconvolution and blind demixing via convex programming. S. Ling, T. Strohmer, 50th Asilomar Conference on Signals, Systems and Computers, pp. 1223-1227, 2016. (Final)
Other publications
Bilinear Inverse Problems: Theory, Algorithms, and Applications. S. Ling, PhD dissertation, University of California, Davis, 2017. (Manuscript)(Slides)
Learning from their mistakes: self-calibrating sensors. B. Friedlander, S. Ling, T. Strohmer, SIAM News, 52(2), 2019. (Final)