MIAO'S GROUP

Graduate Students

My research interests are currently shifting toward Efficient Foundation Models, spanning architecture, algorithms, and systems. My current students (including co-supervised students) are working on:

  • Efficient Learning

Jiarui Jiang (2026 PhD, Attention Mechanism, 2 NeurIPS), Shiyuan Ren (2024 MSc, Sparse Training), Hongyi Wan (2023 MSc, Continual Learning and Unlearning, 1 ICML), Haomiao Qiu (2024 PhD, Continual Learning and Unlearning, 1 ICLR + 1 NeurIPS), Xiaodong Qu (2024 PhD, Robust VLA), Yanda Chen (2023 MSc, Dataset Distillation, 1 CVPR)

  • Model Compression and Inference Acceleration

Jiaqi Zhao (2024 PhD, Quantization, 2 ACL + 1 TPAMI), Chao Zeng (2023 MSc, Quantization, 1 NN + 1 PR), Ming Wang (2023 MSc, Activation Sparsity, 1 EMNLP), Yixuan Dong (2024 MSc, Sparse Kernel and System), Ruizi Han (2025 PhD, Sparse Attention), Zhiyi Wang (2026 MSc, Linear Attention), Yongfan Lin (2026 MSc, Quantization)

  • Multimodal Applications

Yanyi Lv (2025 PhD, Diffusion Language Model), Kailin Luo (2026 MSc, Diffusion Language Model), Zijia Song (2026 PhD, Multimodal Understanding), Ao Jin (2025 PhD, Long Video Understanding), Qianlong Xiang (2023 PhD, Diffusion Model, 2 CVPR), Junpeng Jiang (2024 MSc, Long Video Generation), Lexiao Zou (2024 MSc, GUI Agent, 1 ICME), Tianzong Zhang (2025 PhD, Multi-Agent), Qian Liang (2025 MSc, Multi-Agent), Lirong Jie (2025 MSc, GUI Agent, 1 NeurIPS), Ji Shi (2024 MSc, Code Agent), Liyang Zheng (2023 MSc, TinyML)

Undergraduates

Weizhe Chen (Neural Architecture Search), Xichen Zeng (Quantization), Zichen Li (Quantization), Jizhihui Liu (Sparse Attention), Letian Chen (Diffusion Language Model), Futing Sun (Diffusion Language Model), Yanhua Jiao (Sparse Attention), Lejia Chen (Sparse Attention)

Alumni

Hongrong Cheng (PhD, now Associate Professor at UESTC), Xinle Wu (PhD, now Research Fellow at NUS), Xin Zheng (PhD, now Lecturer at RMIT)