Yongyi is a third-year Ph.D. student at the University of Michigan, advised by Prof. Wei Hu. He was an intern at NTT Research at Harvard, advised by Dr. Hidenori Tanaka, with whom he continues to collaborate closely. His research focuses on understanding the foundations and principles of deep learning, spanning areas such as deep learning theory, the science of deep learning, and mechanistic interpretability. He is also broadly interested in other areas of computer science, including quantization and graph neural networks.
Yongyi received his Bachelor of Science from Fudan University, where he was supervised by Prof. Xipeng Qiu. He also interned at the Amazon Shanghai AI Lab, where he was fortunate to be advised by Dr. David Wipf and Prof. Zengfeng Huang.
Beyond academic research, Yongyi harbors a passion for mathematics, Chinese classical literature, and XiaoXue. Feel free to reach out if you share these interests.
In-Context Learning of Representations Core Francisco Park*, Andrew Lee*, Ekdeep Singh Lubana*, Yongyi Yang*, Maya Okawa, Kento Nishi, Martin Wattenberg, Hidenori Tanaka
ICLR 2025
Dynamics of Concept Learning and Compositional Generalization Yongyi Yang, Core Francisco Park, Ekdeep Singh Lubana, Maya Okawa, Wei Hu, Hidenori Tanaka
arXiv preprint
Practical and Asymptotically Optimal Quantization of High-Dimensional Vectors in Euclidean Space for Approximate Nearest Neighbor Search Jianyang Gao, Yutong Gou, Yuexuan Xu, Yongyi Yang, Cheng Long, Raymond Chi-Wing Wong
arXiv preprint
HERTA: A High-Efficiency and Rigorous Training Algorithm for Unfolded Graph Neural Networks Yongyi Yang, Jiaming Yang, Wei Hu, Michał Dereziński
arXiv preprint
Going Beyond Linear Mode Connectivity: The Layerwise Linear Feature Connectivity Zhanpeng Zhou, Yongyi Yang, Xiaojiang Yang, Junchi Yan, Wei Hu
NeurIPS 2023
Are Neurons Actually Collapsed? On the Fine-Grained Structure in Neural Representations Yongyi Yang, Jacob Steinhardt, Wei Hu
ICML 2023
Descent Steps of a Relation-Aware Energy Produce Heterogeneous Graph Neural Networks Hongjoon Ahn, Yongyi Yang, Quan Gan, David Wipf, Taesup Moon
NeurIPS 2022
Transformers from an Optimization Perspective Yongyi Yang, Zengfeng Huang, David Wipf
NeurIPS 2022
Why Propagate Alone? Parallel Use of Labels and Features on Graphs Yangkun Wang, Jiarui Jin, Weinan Zhang, Yongyi Yang, Jiuhai Chen, Quan Gan, Yong Yu, Zheng Zhang, Zengfeng Huang, David Wipf
ICLR 2022
Graph Neural Networks Inspired by Classical Iterative Algorithms Yongyi Yang, Tang Liu, Yangkun Wang, Jinjing Zhou, Quan Gan, Zhewei Wei, Zheng Zhang, Zengfeng Huang, David Wipf
ICML 2021 (long talk)
Implicit vs Unfolded Graph Neural Networks Yongyi Yang, Yangkun Wang, Tang Liu, Zengfeng Huang, David Wipf
arXiv preprint
Relation of the Relations: A New Paradigm of the Relation Extraction Problem Zhijing Jin*, Yongyi Yang*, Xipeng Qiu, Zheng Zhang
arXiv preprint
(Last updated: 01/07/2025)