A year-long collaboration between BAIR and the Modern Recommender Systems team at Meta, as part of the AIM Program.
Worked on generative recommender systems such as HSTU and HLLM.
Research Intern - Time Series
Apple AIML
Mentors: Dr. Lauren Hannah, Dr. Haraldur Hallgrímsson
Worked on time series with causal structures.
AI Resident - Time Series
Google X - Mineral.ai
Mentors: Dr. Yawen Zhang, Dr. Ming Zheng, Dr. Kezhen Chen
Project 1 - Multimodality: enabled LLMs to understand time series data and generate textual descriptions of it.
Project 2 - Foundation models: analyzed time series pretraining, studying how the availability of downstream data affects fine-tuning performance, and proposed a strong baseline.
Research Intern - ML for Systems
ByteDance
Mentor: Dr. Tieying Zhang
Worked on the ByteBrain team, which focuses on ML for infrastructure.
Primarily involved in two projects: (1) time series representation learning with contrastive learning, along with interpretable time series forecasting by attending to the most relevant candidate sequences in the history; (2) virtual machine rescheduling with reinforcement learning, combining transformers and risk-seeking RL.
I was also involved in a third project on job scheduling in the cloud.
Applied Scientist Intern - Time Series
Amazon A9
Mentors: Dr. Anbo Chen, Dr. Shan Kang
Worked on time series forecasting of web traffic data on the Amazon Advertising Technology team with deep learning.
Extended the transformer's applicability to long-term prediction on small datasets. Built a visualization tool for monitoring feature importance over the course of training.
Research Intern - Time Series
WeWork
Mentors: Dr. Yun Chi, Dr. Haixun Wang
Predicted conference room occupancy for WeWork offices; improved forecasting accuracy by 30% compared to the original implementation.
Introduced a non-autoregressive transformer that lowers the time complexity for multi-step forecasting from O(L) to O(1).
Proposed a novel technique for training transformers on small-scale datasets.
Student Researcher - Neural Network Quantization
RIPS REU Program, IPAM, UCLA
Mentors: Dr. Nicholas Malaya, Dr. Alan Lee
Sponsored by AMD, the project explored the resilience of convolutional neural networks to reduced numerical precision.
Through simulated experiments, showed that larger batch sizes may lead to poorer results under low precision, and delivered preliminary proofs of error scaling in forward and backward propagation.
Education
PhD at BAIR
UC Berkeley
GPA: 3.80/4.0
Supported by National Science Foundation Graduate Research Fellowship (NSF-GRFP).
MEng Capstone Faculty Mentorship Award.
BS Mathematics
UC Santa Barbara (College of Creative Studies)
GPA: 3.89/4.0
BS Computing
UC Santa Barbara (College of Creative Studies)
Advisor: Professor Xifeng Yan.
Alumni Association Scholarship, SURF research fellowship, and two URCA undergraduate research grants.