Hanxu Hu


I'm a first-year PhD student (starting Fall 2024) at the University of Zurich, co-supervised by Prof. Rico Sennrich and Prof. Ivan Titov (University of Edinburgh), and I work closely with Dr. Jannis Vamvas.

I'm a research intern at Shanghai AI Lab. Previously, I was a member of EdinburghNLP advised by Prof. Edoardo Ponti and a research assistant at WestlakeNLP advised by Prof. Yue Zhang. My research has covered instruction tuning, in-context learning, and cross-lingual transfer.

I'm interested in scalable methods for both the post-training and pre-training stages of Large Language Models, and I aim to become a full-stack Large Language Model researcher.

I often reread The Bitter Lesson and learn a lot from it.


Recent Preprints

Source-primed Multi-turn Conversation Helps Large Language Models Translate Documents

Hanxu Hu, Jannis Vamvas, and Rico Sennrich

arXiv 2503.10494

Generalizing From Short to Long: Effective Data Synthesis for Long-Context Instruction Tuning

Wenhao Zhu, Pinzhen Chen, Hanxu Hu, Shujian Huang, Fei Yuan, Jiajun Chen, Alexandra Birch

arXiv 2502.15592

A Comprehensive Multilingual Evaluation Suite for Large Language Models

Xu Huang, Wenhao Zhu, Hanxu Hu, Conghui He, Lei Li, Shujian Huang, Fei Yuan

arXiv 2502.07346

Selected Papers

You can find all papers on my Google Scholar page.

Fine-tuning Large Language Models with Sequential Instructions

Hanxu Hu*, Simon Yu*, Pinzhen Chen*, Edoardo M. Ponti

NAACL 2025 Main

Project Page / arXiv

Chain-of-Symbol Prompting For Spatial Reasoning in Large Language Models

Hanxu Hu*, Hongyuan Lu*, Huajian Zhang, Yun-Ze Song, and Yue Zhang

COLM 2024

Code / arXiv

Meta-Learning For Multi-Modal Cross-lingual Transfer

Hanxu Hu and Frank Keller

MRL at EMNLP 2023 (Best Paper Nomination)

Code / arXiv

Service

Acknowledgments

I am lucky to work with many enthusiastic, intelligent, and hardworking peers, such as Pinzhen Chen (University of Edinburgh), Simon Yu (NEU), Chenmien Tan (University of Edinburgh), Wenhao Zhu (Nanjing University), and Zeyu Huang (University of Edinburgh). I have learnt a lot from them.