Biography
I am a Ph.D. student in the NGai Lab at HKU.
Before that, I received my master's degree from Tsinghua University in 2023.
During my master's studies, I was a member of the IIGroup at Tsinghua University,
supervised by Prof. Yujiu Yang.
In 2020, I received my bachelor's degree from the Department of Automation at Tsinghua University.
My main research interests lie in efficient methods for Large Language Models.
I am looking for internship/visiting opportunities; please feel free to contact me if you are interested.
Recent News
- 2024.09 - One paper was accepted by EMNLP 2024.
- 2024.06 - A blog post about improving LoRA was released. [Eng | CN]
- 2024.04 - A blog post about rethinking KL divergence in KD for LLMs was released. [Eng | CN]
- 2024.03 - One paper was accepted to the Findings of NAACL 2024.
- 2023.02 - Two papers were accepted by ICASSP 2023.
- 2022.10 - One long research paper was accepted by WSDM 2023.
Publications & Preprints
- A Survey on the Honesty of Large Language Models
- Mixture-of-Subspaces in Low-Rank Adaptation
- Rethinking Kullback-Leibler Divergence in Knowledge Distillation for Large Language Models
- Weight-Inherited Distillation for Task-Agnostic BERT Compression
- Modeling Fine-grained Information via Knowledge-aware Hierarchical Graph for Zero-shot Entity Retrieval
- Edge-free but Structure-aware: Prototype-Guided Knowledge Distillation from GNNs to MLPs
  - Taiqiang Wu, Zhe Zhao, Jiahao Wang, Xingyu Bai, Lei Wang, Ngai Wong, Yujiu Yang
  - arXiv 2023 [pdf]
Internship
- 2021.03~2022.05, Tencent
- 2022.05~2023.05, Tencent Rhino-Bird Research Program
Services
- ARR 2022, Reviewer
- ARR 2023, Reviewer
- ARR 2024, Reviewer
(Last updated in June 2024)