Biography
I am a Ph.D. student in Prof. Ngai Wong's group at HKU. Before that, I received my master's degree from Tsinghua University in 2023, where I studied in the IIGroup under the supervision of Prof. Yujiu Yang. In 2020, I received my bachelor's degree from the Department of Automation at Tsinghua University. My main research interests lie in pre-trained language models (PLMs) and model compression. Recently, I have been focusing on model editing for large language models.
Recent News
- 2024.04 - One blog post about the KL divergence in knowledge distillation for LLMs is released.
- 2024.03 - One paper is accepted by NAACL 2024 (Findings).
- 2023.02 - Two papers are accepted by ICASSP 2023.
- 2022.10 - One research long paper is accepted by WSDM 2023.
Publications
- Weight-Inherited Distillation for Task-Agnostic BERT Compression
- Modeling Fine-grained Information via Knowledge-aware Hierarchical Graph for Zero-shot Entity Retrieval
- Prompt-based Model for Acronym Disambiguation via Negative Sampling
- SynGen: A Syntactic Plug-and-play Module for Generative Aspect-based Sentiment Analysis
- Chengze Yu*, Taiqiang Wu*, Jiayi Li, Xingyu Bai, Yujiu Yang
- ICASSP 2023 (CCF B, research paper) [pdf]
- TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities
- Zhe Zhao, Yudong Li, Cheng Hou, Jing Zhao, Rong Tian, Weijie Liu, Yiren Chen, Ningyuan Sun, Haoyan Liu, Weiquan Mao, Han Guo, Weigang Guo, Taiqiang Wu, Tao Zhu, Wenhang Shi, Chen Chen, Shan Huang, Sihong Chen, Liqun Liu, Feifei Li, Xiaoshuai Chen, Xingwu Sun, Zhanhui Kang, Xiaoyong Du, Linlin Shen, Kimmo Yan.
- ACL 2023 industrial paper [pdf] [code]
- Edge-free but Structure-aware: Prototype-Guided Knowledge Distillation from GNNs to MLPs
- Taiqiang Wu, Zhe Zhao, Jiahao Wang, Xingyu Bai, Lei Wang, Ngai Wong, Yujiu Yang
- arXiv 2023 [pdf]
Internship
- 2021.03~2022.05, Tencent
- 2022.05~2023.05, Tencent Rhino-Bird Research Plan
Services
- ARR 2022, Reviewer
- ARR 2023, Reviewer
- ARR 2024, Reviewer
(Last updated on Mar.15, 2024)