Andy Dong

Applied Scientist

About Me

I am an Applied Scientist and Tech Lead at Amazon, conducting research at the intersection of Personalization, Recommendation, and Large Language Models. My work bridges natural language understanding, information retrieval, and foundation-model fine-tuning with generative recommendation, and is grounded in extensive experience building end-to-end, scalable ML systems for real-world Ads, Search, and Recommendation applications at global scale.

My current research interests center on Large Language Models and their integration into retrieval-augmented generation (RAG) and agentic systems. Building on my experience deploying large-scale industrial ML products, I am particularly interested in how foundation models can be adapted to reason over structured and unstructured knowledge, enable grounded personalization, and support multi-agent collaboration, advancing both the science and practice of information retrieval and human-AI interaction.

Experience

News

Publications

  1. Graph Collaborative Signals Denoising and Augmentation for Recommendation
    SIGIR
    Ziwei Fan, Ke Xu, Zhang Dong, Hao Peng, Jiawei Zhang, Philip S. Yu
    Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR), 2023.
  2. Exploring Information Retrieval for Personalized Teaching Support
    HCII
    Nanjie Rao, Sharon Lynn Chu, Zeyuan Jing, Huan Kuang, Yunjie Tang, Zhang Dong
    International Conference on Human-Computer Interaction (HCII), 2022.
  3. Cross-Document Contextual Coreference Resolution in Knowledge Graphs
    Preprint
    Zhang Dong, Mingbang Wang, Le Dai, Jiyuan Li, Xingzu Liu, Ruilin Nong
    Preprint
  4. End-to-End Dialog Neural Coreference Resolution: Balancing Efficiency and Accuracy in Large-Scale Systems
    Preprint
    Zhang Dong, Songhang Deng, Mingbang Wang, Le Dai, Jiyuan Li, Xingzu Liu, Ruilin Nong
    Preprint
  5. Enhancing Coreference Resolution with Pretrained Language Models: Bridging the Gap Between Syntax and Semantics
    Preprint
    Xingzu Liu, Mingbang Wang, Zhang Dong, Le Dai, Jiyuan Li, Ruilin Nong
    Preprint

Services
