Tran Hoai Chau

Hoai Chau holds a BSc from the University of Science, Ho Chi Minh City. He is currently a Research Assistant at VinUniversity under the guidance of Prof. Doan Dang Khoa and Prof. Heng Ji (UIUC). His research interests span low-resource learning and deep-learning model compression.

His current research focuses on techniques for compressing Transformer-based models, including Large Language Models (LLMs) and Vision Transformers (ViTs), through approaches such as quantization, token merging, KV cache compression, and more efficient decoding algorithms.

Research Interests:

Deep Learning

Efficient AI


Publications

DetectVul: A statement-level code vulnerability detection for Python

Hoai-Chau Tran, Anh-Duy Tran, Kim-Hung Le

Future Generation Computer Systems, 2025

Accelerating Transformers with Spectrum-Preserving Token Merging

Hoai-Chau Tran, Duy MH Nguyen, Duy M Nguyen, Trung-Tin Nguyen, Ngan Le, Pengtao Xie, Daniel Sonntag, James Y Zou, Binh T Nguyen, Mathias Niepert

2024

Energy Minimizing-based token merging for accelerating Transformers

Hoai-Chau Tran, Duy Minh Ho Nguyen, Manh-Duy Nguyen, Ngan Hoang Le, Binh T Nguyen