Experience

  1. Visiting Researcher - Recommender Systems

    Meta AI
    • Mentors: Qiang Zhang, Yinglong Xia
    • A year-long collaboration between BAIR and the Modern Recommender Systems team at Meta, as part of the AIM Program.
    • Worked on generative recommender systems such as HSTU and HLLM.
  2. Research Intern - Time Series

    Apple AIML
    • Mentors: Dr. Lauren Hannah, Dr. Haraldur Hallgrímsson
    • Worked on time series modeling with causal structures.
  3. AI Resident - Time Series

    Google X - Mineral.ai
    • Mentors: Dr. Yawen Zhang, Dr. Ming Zheng, Dr. Kezhen Chen
    • Project 1 - Multimodality: enabled an LLM to understand time series data and write textual descriptions.
    • Project 2 - Foundation models: analyzed time series pretraining, studying how the availability of downstream data affects finetuning performance, and proposed a strong baseline.
  4. Research Intern - ML for Systems

    ByteDance
    • Mentor: Dr. Tieying Zhang
    • Worked on the ByteBrain team, which focuses on ML for infrastructure.
    • Primarily involved in two projects: (1) time series representation learning via contrastive learning, along with interpretable time series forecasting that attends to the most relevant candidate sequences in the history; (2) virtual machine rescheduling with reinforcement learning, using transformers and risk-seeking RL.
    • Also contributed to a third project on job scheduling in the cloud.
  5. Applied Scientist Intern - Time Series

    Amazon A9
    • Mentors: Dr. Anbo Chen, Dr. Shan Kang
    • Worked on deep-learning-based time series forecasting of web traffic data on the Amazon Advertising Technology team.
    • Extended the applicability of transformers to long-term predictions on small datasets. Built a visualization tool for monitoring feature importance over the course of training.
  6. Research Intern - Time Series

    WeWork
    • Mentors: Dr. Yun Chi, Dr. Haixun Wang
    • Predicted conference room occupancy for WeWork offices; improved forecasting accuracy by 30% compared to the original implementation.
    • Introduced a non-autoregressive transformer that lowers the time complexity for multi-step forecasting from O(L) to O(1).
    • Proposed a novel training technique for transformers on small-scale datasets.
  7. Student Researcher - Neural Network Quantization

    RIPS REU Program, IPAM, UCLA
    • Mentors: Dr. Nicholas Malaya, Dr. Alan Lee
    • Sponsored by AMD, the project explored the resiliency of convolutional neural networks to reduced numerical precision.
    • Showed via simulated experiments that larger batch sizes can lead to poorer results under low precision, and delivered preliminary proofs of error scaling in forward and backward propagation.

Education

  1. PhD at BAIR

    UC Berkeley
    • GPA: 3.80/4.0
    • Supported by the National Science Foundation Graduate Research Fellowship (NSF-GRFP).
    • MEng Capstone Faculty Mentorship Award.
  2. BS Mathematics

    UC Santa Barbara (College of Creative Studies)
    • GPA: 3.89/4.0
  3. BS Computing

    UC Santa Barbara (College of Creative Studies)
    • Advisor: Professor Xifeng Yan.
    • Alumni Association Scholarship, SURF research fellowship, URCA undergraduate research grant (×2).

Skills & Hobbies

  Technical Skills
    • Python & PyTorch
    • Machine Learning
    • Cloud Computing (AWS/GCP)
  Hobbies
    • Hiking in the Rockies
    • Building Custom PCs
    • Sci-Fi Reading

Awards

  1. NSF-GRFP

    National Science Foundation (NSF) ∙ January 2020
    • Recognizes and supports outstanding STEM graduate students for three years.
  2. MEng Capstone Mentorship Award

    UC Berkeley ∙ May 2022
    • Awarded to two faculty instructors each year.
  3. Alumni Association Scholarship

    UC Santa Barbara ∙ January 2018
    • Awarded $2000 to each of the 33 students in 2018.

Languages

  • English: 100%
  • Chinese: 100%