About me

Hi! I am a senior undergraduate student majoring in Computer Science & Mathematics at The Hong Kong University of Science and Technology (HKUST). My research interest lies in deep learning theory: developing theory to understand and explain the behavior of modern deep learning algorithms. Recently, I’ve been investigating empirically how LLMs work. I’ve written some notes on what I’ve learned, so feel free to check them out!

I am interested in applying for PhD positions in Computer Science / Statistics / Operations Research starting in Fall 2025. I enjoy chatting with like-minded people, so feel free to connect if you share similar research interests!

I am very fortunate to be mentored by Wei Hu at Michigan and Yiping Lu at Northwestern, who have always provided support and suggestions. I am also very grateful to Tong Zhang (previously at HKUST) for giving me the opportunity to start my research in my sophomore year.

Education

  • BSc in Computer Science & Mathematics, Hong Kong University of Science and Technology, 2025 (Expected)
  • Exchange (Computer Science), École Polytechnique Fédérale de Lausanne (EPFL), Spring 2024

Research experience

  • July 2024 ~ Present: Research Internship
    • Supervised by Professor Wei Hu @ Michigan
    • Working on understanding the mechanism of in-context learning in a clean and controlled setting
  • July 2023 ~ Present: Remote Collaboration
    • Supervised by Professor Yiping Lu @ Northwestern
    • Worked on benign overfitting for physics-informed neural networks (PINNs); a first-authored paper is under review
    • An interesting finding of this work is that, with a sufficiently smooth inductive bias, an overparametrized kernel interpolator can attain the optimal convergence rate (see the sketch after this list)
    • Organized the Deep Learning Theory Reading Group within the research group
  • Oct 2022 ~ Nov 2023: Undergraduate Research Opportunities Program (UROP) @ HKUST
    • Supervised by Professor Tong Zhang @ HKUST
    • Coauthored a paper accepted to ICLR 2024
    • An interesting finding of this work is that ensembling models that rely on spurious features can improve out-of-distribution (OOD) performance, contrary to the common belief in OOD generalization (e.g., IRM) that models should retain invariant features and discard spurious ones
    • (This is where my research journey began! I am very thankful to Yong Lin, Yifan Hao, and Lu Tan for guiding me.)
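To make the kernel-interpolation claim above a bit more concrete, here is a minimal sketch of the setting in standard illustrative notation (the RKHS $\mathcal{H}_K$ of a kernel $K$ and a sample $(x_i, y_i)$), not the paper's exact theorem statement:

```latex
% Minimal sketch (illustrative notation, not the paper's exact statement).
% Given n samples (x_i, y_i), the minimum-norm interpolator in the RKHS H_K is
\hat{f} \;=\; \arg\min_{f \in \mathcal{H}_K} \|f\|_{\mathcal{H}_K}
\quad \text{subject to} \quad f(x_i) = y_i, \quad i = 1, \dots, n.
% The finding above says that when K encodes a sufficiently smooth inductive bias,
% this overparametrized interpolator can still attain the optimal convergence rate,
% despite fitting the training data exactly.
```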

Academics

  • Cumulative GPA: 4.099/4.3, Rank: 1
  • Graduate-level courses: COMP5212 Machine Learning (A), MATH5411 Advanced Probability Theory (A), CS439 Optimization in Machine Learning (6/6, at EPFL), CS552 Modern NLP (5.5/6, at EPFL)
  • Selected Undergraduate courses: COMP3711 Design and Analysis of Algorithms (A+), MATH4335 Optimization (A), MATH4063 Functional Analysis (A), MATH3312 Numerical Analysis (A+), COMP2012H Honors OOP and Data Structure (A+), MATH3043 Honors Real Analysis (A+), MATH2431 Honors Probability (A+), CS251 Theory of Computation (5.5/6, at EPFL)

Review experience

  • Conference / Journal: ICLR 2025, AISTATS 2025, NeurIPS 2024, TMLR
  • Workshop: ICML 2024 Workshop on Theoretical Foundations of Foundation Models (TF2M), ICLR 2024 Workshop on Bridging the Gap Between Practice and Theory in Deep Learning (BGPT)

Scholarships and Awards

  • Summer Research Sponsorship (HKD$20,000 from CS, HKD$5,000 from Math)
  • Tin Ka Ping Scholarship (Exchange) ’24 (HKD$20,000)
  • HKUST Epsilon Fund Award ’24 (HKD$5,000, for top students in the math department at HKUST, <5 undergraduates each year)
  • HKUST Study Abroad Funding Support ’24 (HKD$10,000)
  • Hong Kong Government Scholarship ’22 (HKD$40,000 per year, for students with GPA > 3.95)
  • Chern Class Entry & Talent Scholarship ’22, ’23, ’24 (for top students in the math department at HKUST)

Academic Activities

These are activities that have greatly broadened my horizons (I am always grateful for having the chance to grow and learn!):

  • Heidelberg Laureate Forum, Heidelberg, Germany, Sep 2024
  • LeT-All Mentorship Workshop, Learning Theory Alliance, Online, June 2024
  • International Conference on Learning Representations (ICLR), Vienna, May 2024
  • Conference on Parsimony and Learning (CPAL), Hong Kong, Jan 2024
  • Discover Citadel & Citadel Securities, Hong Kong, Apr 2023