My primary research interests are in domain-specific large language models (LLMs) and natural language processing (NLP). I have also contributed to other AI-related research projects, including applications in AI for Science.
Publications
Benchmarking for Domain-Specific LLMs: A Case Study on Academia and Beyond
Rubing Chen, Jiaxin Wu, Jian Wang, Xulu Zhang, Wenqi Fan, Chenghua Lin, Xiao-Yong Wei*, Qing Li
Findings of the Conference on Empirical Methods in Natural Language Processing (EMNLP Findings), 2025
arXiv | Paper | GitHub
Honors and Awards
[2025-06] Best Project Award Competition 2025, First Runner-up
[2025-06] Au Bak Ling Charity Trust Scholarship 2024/25
[2023-05, 2025-05] Interdisciplinary Contest in Modeling (ICM), Meritorious Winner
[2024-12] The 4th China Mobile ‘Wutong Cup’ National Finals, Second Prize
[2022-2024] Dean’s Honours List, Department of Computing, PolyU (3 consecutive years)
[2022-09] Undergraduate Research Innovation Scheme Scholarship 2022/23