Research

Enhanced Cyclic Coordinate Descent Methods for Elastic Net Penalized Linear Models

NeurIPS 2025 (submitted), Duke University, May 2025

A theoretically grounded optimization method that enhances classical cyclic coordinate descent by unrolling updates across coordinate blocks and applying a Taylor-expansion-based curvature correction. ECCD achieves up to a 13× speedup while keeping relative error below 10⁻⁵ across logistic and Poisson GLMs, outperforming glmnet, biglasso, and ncvreg on high-dimensional benchmarks.

Tags: Generalized Linear Model, Block Coordinate Descent, Stable, Accelerating
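For context, here is a minimal sketch of the classical cyclic coordinate descent baseline that ECCD builds on, specialized to the Gaussian (squared-error) elastic net; it is not the ECCD algorithm itself, and all names and defaults are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def elastic_net_cd(X, y, lam=0.1, alpha=0.5, n_iter=100, tol=1e-8):
    """Cyclic coordinate descent for
    (1/2n)||y - Xb||^2 + lam * (alpha*||b||_1 + (1-alpha)/2*||b||_2^2)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).mean(axis=0)   # per-coordinate curvature X_j^T X_j / n
    resid = y - X @ beta             # running residual, updated in place
    for _ in range(n_iter):
        max_delta = 0.0
        for j in range(p):
            old = beta[j]
            # partial correlation of coordinate j with its own residual
            rho = X[:, j] @ resid / n + col_sq[j] * old
            beta[j] = soft_threshold(rho, lam * alpha) / (col_sq[j] + lam * (1 - alpha))
            if beta[j] != old:
                resid -= X[:, j] * (beta[j] - old)
                max_delta = max(max_delta, abs(beta[j] - old))
        if max_delta < tol:          # converged: a full sweep barely moved
            break
    return beta
```

Keeping the residual vector updated in place is what makes each coordinate update O(n); ECCD's block unrolling and curvature correction accelerate exactly this inner loop.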

SADA: Stability-guided Adaptive Diffusion Acceleration

ICML 2025, Duke University, January 2025

A training-free diffusion acceleration framework that jointly exploits step-wise and token-wise sparsity through a unified stability criterion. SADA achieves a ≥ 1.8× speedup while maintaining LPIPS ≤ 0.10 and FID ≤ 4.5, significantly outperforming prior acceleration methods on SD-2, SDXL, Flux, and ControlNet.

Tags: Generative Model, Numerical Method, Stable, Accelerating
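To illustrate the general step-wise caching idea (not SADA's actual criterion or implementation), here is a toy sampling loop that skips the expensive network call when a crude stability proxy — the relative change between the last two outputs — falls below a threshold; `eps_fn`, the Euler-style update, and all thresholds are illustrative assumptions.

```python
import numpy as np

def cached_denoise_loop(eps_fn, x, timesteps, step_size=0.1, stab_tol=0.05):
    """Toy step-wise caching: reuse the previous model output whenever the
    last two outputs were nearly identical (a crude stability proxy)."""
    prev_eps, prev_prev_eps = None, None
    n_calls = 0
    for t in timesteps:
        stable = (
            prev_eps is not None and prev_prev_eps is not None
            and np.linalg.norm(prev_eps - prev_prev_eps)
                <= stab_tol * np.linalg.norm(prev_eps)
        )
        if stable:
            eps = prev_eps        # reuse cached output, skip the model call
        else:
            eps = eps_fn(x, t)    # expensive network evaluation
            n_calls += 1
        x = x - step_size * eps   # simple Euler-style update
        prev_prev_eps, prev_eps = prev_eps, eps
    return x, n_calls
```

A real system would also need a fallback when the trajectory destabilizes again; SADA's contribution is a principled, unified criterion covering both step-wise and token-wise sparsity rather than a fixed heuristic like this one.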

TMfPOFD: Trimmed Mean for Partially Observed Functional Data

Bachelor's Thesis, University of Science and Technology of China, Mathematics Department, May 2024

In this thesis, I introduce the concept of a trimmed mean for partially observed functional data, prove the strong consistency of the estimator, and present results from simulation experiments.

Tags: Functional Data, Trimmed Mean, Data Depth, Strong Consistency, R Language
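The general flavor of a depth-trimmed mean can be sketched as follows. This is a toy Python illustration, not the estimator or depth notion from the thesis (which uses R): curves are rows on a common grid, NaN marks unobserved points, and the "depth" proxy here is simply the negative mean distance to the pointwise median over each curve's observed points.

```python
import numpy as np

def trimmed_mean_pofd(curves, trim=0.2):
    """Toy depth-trimmed mean for partially observed curves.
    curves: (n_curves, n_grid) array, NaN = unobserved grid point.
    trim:   fraction of the most outlying curves to discard."""
    med = np.nanmedian(curves, axis=0)               # cross-sectional median
    dist = np.nanmean(np.abs(curves - med), axis=1)  # outlyingness per curve
    keep = dist <= np.quantile(dist, 1.0 - trim)     # drop the most outlying fraction
    return np.nanmean(curves[keep], axis=0)          # pointwise mean of kept curves
```

Trimming by a depth ranking before averaging is what gives the estimator its robustness to contaminated curves; the thesis's contribution is defining this rigorously for partial observation and proving strong consistency.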