Yixiao Wang

Download CV

I am a second-year master's student in Statistical Science at Duke University. I work with Prof. Cynthia Rudin in the Interpretable Machine Learning Lab, focusing on interpretable machine learning theory, and with Prof. Anru Zhang on deep learning theory. Beyond these projects, I have collaborated with several other faculty members and students, as detailed in the Research Experience section of my Research page. I am deeply grateful for the guidance and support of these brilliant mentors and collaborators.

Prior to Duke, I earned my undergraduate degree in Mathematics (Probability and Statistics track) at the University of Science and Technology of China (USTC), where I was admitted through the School of the Gifted Young, a selective program for exceptionally talented students under the age of 17.

Selected Publications

SADA Project Figure

SADA: Stability-guided Adaptive Diffusion Acceleration

Ting Jiang*, Yixiao Wang*, Hancheng Ye*, Zishan Shao, Jingwei Sun, Jingyang Zhang, Zekai Chen, Jianyi Zhang, Yiran Chen, Hai Li
ICML 2025

ECCD Project Figure

Enhanced Cyclic Coordinate Descent for Elastic Net GLMs

Yixiao Wang*, Zishan Shao*, Ting Jiang, Aditya Devarakonda
NeurIPS 2025

ZEUS Project Figure

ZEUS: Zero-shot Efficient Unified Sparsity for Generative Models

Yixiao Wang*, Ting Jiang*, Zishan Shao*, Hancheng Ye, Jingwei Sun, Mingyuan Ma, Jianyi Zhang, Yiran Chen, Hai Li
ICLR 2026 (under review)

LSA Time Series Project Figure

Why Do Transformers Fail to Forecast Time Series In-Context?

Yufa Zhou*, Yixiao Wang*, Surbhi Goel, Anru Zhang
NeurIPS 2025 Workshop on WCTD (acceptance rate 40%)
ICLR 2026 (under review)

Reasoning Flow Project Figure

The Geometry of Reasoning: Flowing Logics In Representation Space

Yufa Zhou*, Yixiao Wang*, Xunjian Yin*, Shuyan Zhou, Anru Zhang
ICLR 2026 (under review)

FlashSVD Project Figure

FlashSVD: Memory-Efficient Inference with Streaming for Low-Rank Models

Zishan Shao, Yixiao Wang, Qinsi Wang, Ting Jiang, Zhixu Du, Hancheng Ye, Danyang Zhuo, Yiran Chen, Hai Li
AAAI 2026 (under review)

* Equal contribution

Research Interests

My research interests lie at the intersection of statistical machine learning, deep learning, and generative modeling, with an emphasis on rigorous theoretical foundations, interpretability, and real-world relevance. I am particularly interested in developing principled frameworks that provide formal guarantees on model behavior while remaining applicable to practical machine learning challenges.

In traditional machine learning, I focus on regression models, modern optimal tree-based methods, and the theoretical underpinnings of their integration, including understanding and enhancing the statistical properties of these hybrid approaches. In deep learning, my work explores generative models and attention-based architectures, aiming to uncover their mathematical structure and provide insights into their generalization, expressivity, and reliability. Ultimately, my goal is to advance machine learning methods that are both theoretically sound and impactful in practice.