My research focuses on deep learning theory, PDE learning, numerical PDEs, and image processing.
Deep learning theory: I develop approximation theories and statistical learning theories for deep neural networks on various problems, especially when the data have low-dimensional structures.
PDE learning: I design efficient and robust algorithms for learning PDEs from noisy data sets.
Numerical PDEs: I focus on using the level set method and operator-splitting method to solve various problems and nonlinear PDEs. My recent work proposes operator-splitting-based numerical solvers for Monge-Ampère type equations.
Image processing: I design image regularization models and efficient algorithms based on operator-splitting methods.
Talk Presentation
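To illustrate the "functions varying along an active subspace" setting that several of the talks below address, here is a minimal sketch (illustrative only, not my method from those works): a function on R^5 of the form f(x) = g(a·x) varies only along the direction a, and that direction can be recovered from the eigen-decomposition of the averaged outer product of gradients. The function g, the direction a, and the sample sizes are all made-up assumptions for the demo.

```python
import numpy as np

# Illustrative example: a function on R^5 that varies only along one
# "active" direction a, i.e. f(x) = g(a @ x) with g = sin here.
rng = np.random.default_rng(0)
d = 5
a = np.array([3.0, 0.0, 4.0, 0.0, 0.0]) / 5.0  # unit-norm active direction

def grad_f(X):
    # For f(x) = sin(a @ x), the gradient is cos(a @ x) * a,
    # so every gradient vector lies along the active direction a.
    t = X @ a
    return np.cos(t)[:, None] * a[None, :]

# Active-subspace recovery: eigen-decompose C = E[grad f grad f^T],
# estimated by a Monte Carlo average over random samples.
X = rng.standard_normal((2000, d))
G = grad_f(X)
C = G.T @ G / len(X)
eigvals, eigvecs = np.linalg.eigh(C)      # eigenvalues in ascending order
a_hat = eigvecs[:, -1]                    # top eigenvector spans the active subspace

alignment = abs(a_hat @ a)
print(round(alignment, 4))  # close to 1.0: the active direction is recovered
```

Since every gradient here is an exact multiple of a, the matrix C is rank one and its top eigenvector recovers a up to sign; with noisy or approximate gradients, the recovery is only approximate.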
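The operator-splitting idea mentioned above can be sketched on a toy problem (this is a generic Lie-splitting demo, not my Monge-Ampère or image-processing solvers): to evolve u' = -(a + b)u over one time step, solve the two sub-problems u' = -a u and u' = -b u in succession. The coefficients and step count below are arbitrary choices for the demo.

```python
import math

# Lie operator splitting for the scalar ODE u' = -(a + b) u:
# each time step solves the two sub-problems u' = -a u and u' = -b u
# exactly, one after the other.
a, b = 1.0, 2.0
T, n_steps = 1.0, 100
tau = T / n_steps

u = 1.0
for _ in range(n_steps):
    u *= math.exp(-a * tau)  # sub-step 1: exact solve of u' = -a u
    u *= math.exp(-b * tau)  # sub-step 2: exact solve of u' = -b u

exact = math.exp(-(a + b) * T)
print(abs(u - exact) < 1e-12)  # True: the operators commute, so the split is exact
```

For commuting scalar operators the splitting is exact; for genuinely nonlinear PDEs the sub-steps only approximate the full flow, and the appeal is that each sub-problem is much simpler to solve.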
Learning functions varying along an active subspace.
SIAM Student Seminar, Georgia Institute of Technology, Feb. 2020
Approximate functions varying along an active subspace.
Workshop on New Trends in Machine Learning and Numerical PDEs, Hong Kong Baptist University, Dec. 2019
A level set based variational principal flow method for nonparametric dimension reduction on Riemannian manifolds.
Scientific Computing Seminars, University of Houston, Nov. 2018
Poster Presentation
Learning functions varying along an active subspace.
2020 Georgia Scientific Computing Symposium, Emory University, Feb. 2020.
Approximate functions varying along an active subspace.
Workshop on Recent Developments on Mathematical/Statistical approaches in DAta Science (MSDAS),
The University of Texas at Dallas, May 2019.
A level set based variational principal flow method for nonparametric dimension reduction on Riemannian manifolds.
2019 Georgia Scientific Computing Symposium, Georgia Institute of Technology, Feb. 2019.
A level set based variational principal flow method for nonparametric dimension reduction on Riemannian manifolds.
Meeting the Statistical Challenges in High Dimensional Data and Complex Networks,
National University of Singapore, Feb. 2018.