Yixiao Wang

Download CV

About Me

I am a second-year master’s student in the Department of Statistical Science at Duke University. Prior to this, I earned my undergraduate degree in Mathematics and Statistics from the University of Science and Technology of China (USTC), where I was admitted through the School of the Gifted Young, a selective four-year program for exceptionally talented students under the age of 16.

During my time at USTC, I served as a teaching assistant for Linear Algebra B1, where I was recognized as one of the top TAs for my dedication to supporting student learning.

Outside of academics, I enjoy creating art, particularly origami (see my artwork).

Selected Publications


SADA: Stability-guided Adaptive Diffusion Acceleration

Ting Jiang*, Yixiao Wang*, Hancheng Ye*, Zishan Shao, Jingwei Sun, Jingyang Zhang, Zekai Chen, Jianyi Zhang, Yiran Chen, Hai Li
ICML 2025


Enhanced Cyclic Coordinate Descent for Elastic Net GLMs

Yixiao Wang*, Zishan Shao*, Ting Jiang, Aditya Devarakonda
NeurIPS 2025 (Submitted)


ZEUS: Zero-shot Efficient Unified Sparsity for Generative Models

Yixiao Wang*, Ting Jiang*, Zishan Shao*, Hancheng Ye, Jingwei Sun, Mingyuan Ma, Jianyi Zhang, Yiran Chen, Hai Li
2025


FlashSVD: Memory-Efficient Inference with Streaming for Low-Rank Models

Zishan Shao, Yixiao Wang, Qinsi Wang, Ting Jiang, Zhixu Du, Hancheng Ye, Danyang Zhuo, Yiran Chen, Hai Li
AAAI 2026 (Submitted)

Research Interests

My research interests lie at the intersection of statistical machine learning, deep learning, and generative modeling, with an emphasis on rigorous theoretical foundations, interpretability, and real-world relevance. I am particularly interested in developing principled frameworks that provide formal guarantees on model behavior while remaining applicable to practical machine learning challenges.

In traditional machine learning, I focus on regression models, modern optimal tree-based methods, and the theory behind integrating the two; I am especially interested in understanding and improving the statistical properties of such hybrid approaches. In deep learning, my work explores generative models and attention-based architectures, aiming to uncover their mathematical structure and to offer insights into their generalization, expressivity, and reliability. Ultimately, my goal is to advance machine learning methods that are both theoretically sound and impactful in practice.