Chang Zou

Yingcai Honors College, University of Electronic Science and Technology of China
shenyizou@outlook.com


Decoding nature, shaping futures.

Chengdu, Sichuan, China

I am Zou Chang, an undergraduate student at the Yingcai Honors College, University of Electronic Science and Technology of China (UESTC), majoring in Artificial Intelligence under the Fundamental Science Program in Mathematics and Physics (Class of 2022). I expect to receive my Bachelor’s degree in 2026.

Currently, I am an intern at the EPIC-Lab, led by Professor Linfeng Zhang, in the School of Artificial Intelligence at Shanghai Jiao Tong University. My primary research interests include, but are not limited to, accelerating image and video generation models.

I also have a strong interest in areas such as Large Language Models (LLMs) and Multimodal Large Language Models (MLLMs). I welcome opportunities for collaboration and discussion!

news

Dec 29, 2024 🚀🚀 We release our work DuCa on accelerating diffusion transformers for FREE, achieving nearly lossless 2.50× acceleration on OpenSora! 🎉 DuCa also overcomes a key limitation of ToCa by fully supporting FlashAttention, enabling broader compatibility and further efficiency gains.
Oct 12, 2024 🚀🚀 We release our work ToCa on accelerating diffusion transformers for FREE, achieving nearly lossless 2.36× acceleration on OpenSora!

selected publications

  1. Accelerating Diffusion Transformers with Token-wise Feature Caching
    Chang Zou, Xuyang Liu, Ting Liu, and 2 more authors
    arXiv preprint arXiv:2410.05317, 2024
  2. Accelerating Diffusion Transformers with Dual Feature Caching
    Chang Zou, Evelyn Zhang, Runlin Guo, and 4 more authors
    arXiv preprint arXiv:2412.18911, 2024