I am a second-year PhD student in Computer Science at the University of Southern California (USC), where I am co-advised by Prof. Robin Jia and Prof. Vatsal Sharan. Before that, I received my master's degree in Computer Science from the University of California, San Diego. I am broadly interested in natural language processing and theoretical computer science.

Research

Machine Learning

  • Pretrained Large Language Models Use Fourier Features to Compute Addition (arXiv link).
    Tianyi Zhou, Deqing Fu, Vatsal Sharan, and Robin Jia
    NeurIPS 2024.

  • Deja Vu: Contextual Sparsity for Efficient LLMs at Inference Time (arXiv link).
    Zichang Liu, Jue Wang, Tri Dao, Tianyi Zhou, Binhang Yuan, Zhao Song, Anshumali Shrivastava, Ce Zhang, Yuandong Tian, Christopher Ré, and Beidi Chen
    ICML 2023 (Oral).

  • H2O: Heavy-Hitter Oracle for Efficient Generative Inference of Large Language Models (arXiv link).
    Zhenyu Zhang, Ying Sheng, Tianyi Zhou, Tianlong Chen, Lianmin Zheng, Ruisi Cai, Zhao Song, Yuandong Tian, Christopher Ré, Clark Barrett, Zhangyang Wang, and Beidi Chen
    NeurIPS 2023.

Theoretical Computer Science (Author names in alphabetical order)

  • Space-Efficient Interior Point Method, with applications to Linear Programming and Maximum Weight Bipartite Matching (arXiv link).
    Sixue Liu, Zhao Song, Hengjie Zhang, Lichen Zhang, and Tianyi Zhou
    ICALP 2023.

  • Fast Heavy Inner Product Identification Between Weights and Inputs in Neural Network Training (arXiv link).
    Lianke Qin, Saayan Mitra, Zhao Song, Yuanyuan Yang, and Tianyi Zhou
    IEEE Big Data 2023.

  • Algorithm and Hardness for Dynamic Attention Maintenance in Large Language Models (arXiv link).
    Jan van den Brand, Zhao Song, and Tianyi Zhou
    ICML 2024.

  • The Closeness of In-Context Learning and Weight Shifting for Softmax Regression (arXiv link).
    Shuai Li, Zhao Song, Yu Xia, Tong Yu, and Tianyi Zhou
    NeurIPS 2024.