Yizhong Wang is a final-year PhD candidate at the University of Washington, advised by Noah Smith and Hannaneh Hajishirzi. He has also been a student researcher at the Allen Institute for Artificial Intelligence (AI2) for the past two years, co-leading the post-training efforts for building fully open language models (OLMo). His research focuses on fundamental data challenges in AI development and on data-centric algorithms, particularly for building more general-purpose models. His work, including Super-NaturalInstructions, Self-Instruct, and Tülu, is widely used in building today's large language models. He has won multiple paper awards, including the ACL 2024 Best Theme Paper Award, the CCL 2020 Best Paper Award, and an ACL 2017 Outstanding Paper Award. He also serves on the program committees of top NLP and ML conferences and was an area chair for EMNLP 2024. Previously, he received his Master's degree from Peking University and his Bachelor's degree from Shanghai Jiao Tong University. He has interned at AI2, Meta AI, Microsoft Research Asia, and Baidu NLP.