Experience

  1. Graduate Research Assistant

    Georgia Institute of Technology
    • Research Topics: Large Language Models (WebConf'24, ACL'24), LLM Safety (in preparation for NAACL'25), Multimodal Models (ACL'24), Recommender Systems and Dynamic Graph Mining (KDD'23), Social Network Analysis (CIKM'24, KDD'23), Fair Graph Mining (CIKM'24).
    • Advisor: Dr. Srijan Kumar.
  2. Research Intern

    J.P. Morgan AI Research
    • Research Topics: Multimodal Large Language Models (MLLMs), Infographics Understanding.
  3. Research Intern

    Adobe Inc.
    • Research Topics: Multimodal Large Language Model (MLLM) Fine-tuning, Web UI and Video Tutorial Understanding.
  4. Research Intern

    Microsoft Research Asia
    • Research Topics: Large Language Models (EMNLP'24, ICML'24, ICML'23, AAAI'23), LLM Agents (EMNLP'24, ICML'24), Scientometric Analysis (in preparation for NAACL'25), Computational Social Science, Misinformation Detection (KDD'22, AAAI'22), Few-shot Learning (ACL'24, AAAI'23), Explainable AI (AAAI'22).
    • Advisors: Dr. Xiting Wang, Dr. Jindong Wang, and Dr. Xing Xie.
  5. Undergraduate Research Assistant

    UCLA
    • Research Topics: Large Language Models (EMNLP'24), Graph Neural Networks and Data Mining (WWW'23), LLM Fine-tuning (under review at KDD'25), Recommender Systems (WWW'23).
    • Advisors: Dr. Yizhou Sun and Dr. Wei Wang.
  6. Software Engineer Intern

    Amazon
    • Worked on the IAR team within Fulfillment by Amazon (FBA);
    • Designed and implemented IAR Manual Analysis, a scalable and efficient workflow using AWS Step Functions and AWS Lambda; the service automates the aggregation of data points from sources such as Amazon S3 and DynamoDB for SageMaker ML model training, handling over 16,000 requests per summary stage (see the sketch after this list);
    • Automated the deployment of the workflow across all AWS Realms (EU/FE/NA) through CloudFormation;
    • Established a DataCraft pipeline to enable automatic data ingestion from DynamoDB into the Andes dataset catalog, promoting broader internal adoption of these datasets across cross-functional teams and enhancing data accessibility;
    • Performed an ablation analysis on the inventory reconciliation model, identifying key bottlenecks and improving model performance.
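
    A minimal sketch of the kind of Step Functions Lambda task described above, assuming a boto3-based handler and hypothetical bucket, table, and key names; the actual IAR Manual Analysis internals are not reproduced here:

      import json

      import boto3

      # Hypothetical resource names, for illustration only.
      S3_BUCKET = "example-iar-manual-analysis"
      DDB_TABLE = "example-iar-adjustments"

      s3 = boto3.client("s3")
      dynamodb = boto3.resource("dynamodb")

      def handler(event, context):
          """Aggregate one request's data points from S3 and DynamoDB."""
          request_id = event["request_id"]

          # Raw analysis record staged in S3 by an upstream state.
          obj = s3.get_object(Bucket=S3_BUCKET, Key=f"requests/{request_id}.json")
          s3_record = json.loads(obj["Body"].read())

          # Matching adjustment row from DynamoDB.
          # (DynamoDB numbers come back as Decimal; a real handler would convert
          # them to JSON-safe types before returning.)
          ddb_item = dynamodb.Table(DDB_TABLE).get_item(
              Key={"request_id": request_id}
          ).get("Item", {})

          # Merge both sources into one training-ready data point and hand it
          # back to the state machine (e.g., inside a Map state fan-out).
          return {"request_id": request_id, **s3_record, **ddb_item}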

Education

  1. Ph.D. in Computer Science

    Georgia Institute of Technology (GaTech)
    Research focus on Large Language Models, Multimodal Learning, and Social Computing
B.S. in Computer Science

    University of California, Los Angeles (UCLA)
    Graduated with honors

Skills & Hobbies
Technical Skills
Large Language Models (LLMs)
Multimodal LLMs
Natural Language Processing
Graph Neural Networks
Social Computing
Hobbies
Hiking
Rabbit
Photography

Awards
Neural Networks and Deep Learning
Coursera ∙ November 2023
I studied the foundational concepts of neural networks and deep learning. By the end, I was familiar with the significant technological trends driving the rise of deep learning and could build, train, and apply fully connected deep neural networks; implement efficient (vectorized) networks; identify key parameters in a neural network’s architecture; and apply deep learning to my own applications.
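
For illustration, a minimal NumPy sketch of the kind of vectorized, fully connected network covered in the course; the toy data, layer sizes, and hyperparameters below are my own assumptions rather than course material:

  import numpy as np

  rng = np.random.default_rng(0)

  # Toy binary-classification data: 200 samples, 2 features.
  X = rng.normal(size=(200, 2))
  y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

  # One hidden layer (8 ReLU units) and a sigmoid output unit.
  W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros((1, 8))
  W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros((1, 1))

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  lr = 0.5
  for step in range(500):
      # Forward pass, vectorized over the whole batch.
      Z1 = X @ W1 + b1
      A1 = np.maximum(Z1, 0.0)                  # ReLU
      A2 = sigmoid(A1 @ W2 + b2)                # predicted probabilities

      # Binary cross-entropy loss.
      loss = -np.mean(y * np.log(A2 + 1e-9) + (1 - y) * np.log(1 - A2 + 1e-9))

      # Backward pass: gradients of the loss w.r.t. each parameter.
      dZ2 = (A2 - y) / len(X)                   # sigmoid + cross-entropy
      dW2 = A1.T @ dZ2
      db2 = dZ2.sum(axis=0, keepdims=True)
      dZ1 = (dZ2 @ W2.T) * (Z1 > 0)             # ReLU derivative
      dW1 = X.T @ dZ1
      db1 = dZ1.sum(axis=0, keepdims=True)

      # Plain gradient-descent step.
      W1 -= lr * dW1; b1 -= lr * db1
      W2 -= lr * dW2; b2 -= lr * db2

  print(f"final loss {loss:.3f}, accuracy {np.mean((A2 > 0.5) == y):.2f}")
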
Blockchain Fundamentals
edX ∙ July 2023

Learned:

  • How to synthesize my own blockchain solutions
  • The specific mechanics of Bitcoin, in depth
  • Bitcoin’s real-life applications; how to attack and destroy Bitcoin, Ethereum, smart contracts, and Dapps; and alternatives to Bitcoin’s Proof-of-Work consensus algorithm
Object-Oriented Programming in R
DataCamp ∙ January 2023
Object-oriented programming (OOP) specifies relationships between functions and the objects they act on, which helps manage complexity in code. This intermediate-level course introduced OOP in R using the S3 and R6 systems: S3 is a practical day-to-day R programming tool that simplifies everyday functions, while R6 is especially useful for industry-specific analyses, working with web APIs, and building GUIs.

Languages
English (100%)
Chinese (100%)
Japanese (25%)