About Me

I’m currently working as an AI researcher at Together AI, where I investigate pre- and post-training methods for language models, with a focus on inference optimization and agentic applications.

I received my PhD from Stanford University (advised by Prof. Matthias Ihme). I was also a Graduate Fellow (’23) at the Stanford Institute for Human-Centered AI, and I am affiliated with the Center for Open and REproducible Science, the Predictive Science Academic Alliance Program, and the Stanford Flow Physics and Computational Engineering Group.

My thesis focused on improving our understanding of flow physics and energy systems using AI for Science, High-Performance Computing, and Scientific Machine Learning techniques.

I’ve also curated terabytes of high-fidelity 3D fluid data, available at https://blastnet.github.io/.

Check out my CV for more detailed info.

Research Overview

I love building and improving large-scale computational systems and models that can help solve our biggest problems.

Currently, I am focused on building efficient and intelligent AI systems that can solve problems in industry. Some of my work has resulted in the fastest inference of open-weight large language models such as DeepSeek-R1 [info].

During my PhD, I explored the use of scientific machine learning and high-performance computing to build predictive models for space exploration, climate, and efficient energy systems [info].

Selected Publications*

M. Ihme, W.T. Chung. Artificial Intelligence as a Catalyst for Combustion Science and Engineering. Accepted in Proc. Combust. Inst. 40, 2024. (Equal Contribution. Presented as a plenary lecture at the 40th International Symposium on Combustion, Milan, 2024.)

W.T. Chung, B. Akoush, P. Sharma, A. Tamkin, K.S. Jung, J.H. Chen, J. Guo, D. Brouzet, M. Talei, B. Savard, A.Y. Poludnenko, M. Ihme. Turbulence in Focus: Benchmarking Scaling Behavior of 3D Volumetric Super-Resolution with BLASTNet 2.0 Data. Adv. Neural Inf. Process. Syst. (NeurIPS) 36, 2023. [Press]

P. Sharma, W.T. Chung, B. Akoush, M. Ihme. A Review of Physics-informed ML in Fluid Mechanics. Energies 16(5):2343, 2023.

W.T. Chung, K.S. Jung, J.H. Chen, M. Ihme. The Bearable Lightness of Big Data: Towards Massive Public Datasets in Scientific Machine Learning. In: ICML AI Sci. Workshop (ICML AI4Science), 2022.

D.D. Wu, W.T. Chung, M. Ihme. ML4LM: Machine Learning for Safely Landing on Mars. In: NeurIPS Mach. Learn. Phys. Sci. Workshop (NeurIPS ML4PS), 2022.

W.T. Chung, K.S. Jung, J.H. Chen, M. Ihme. BLASTNet: A Call for Community-Involved Big Data in Combustion Machine Learning. Appl. Energy Combust. Sci. 12:100087, 2022.

M. Ihme, W.T. Chung, A.A. Mishra. Combustion Machine Learning: Principles, Progress, and Prospects. Prog. Energy Combust. Sci. 91:101010, 2022.

W.T. Chung, A.A. Mishra, M. Ihme. Interpretable Data-driven Methods for Subgrid-scale Closure in LES for Transcritical LOX/GCH4 Combustion. Combust. Flame 239:111758, 2022.

W.T. Chung, A.A. Mishra, N. Perakis, M. Ihme. Accelerating High-fidelity Combustion Simulations with Classification Algorithms. In: Proc. AAAI Spring Symp. Combin. Artif. Intell. Mach. Learn. Phys. Sci. (AAAI MLPS), 2021.

W.T. Chung, A.A. Mishra, N. Perakis, M. Ihme. Data-assisted Combustion Simulations with Dynamic Submodel Assignment using Random Forests. Combust. Flame 227:172-185, 2021.

W.T. Chung, A.A. Mishra, N. Perakis, M. Ihme. Random Forests for Accelerating Turbulent Combustion Simulations. In: NeurIPS Mach. Learn. Phys. Sci. Workshop (NeurIPS ML4PS), 2020.

*See my Google Scholar for a complete list.

Acknowledgments

I am grateful to my research collaborators at Together AI, Stanford, SLAC, TUM, NASA, LLNL, Sandia, UniMelb, UConn, and PolyMontréal.

Thank you to the following funding sources for making my research possible: