Yu Shi

Researcher

Research Interests

AI4Science · Machine Learning Systems
12 Publications · 0 Invited Talks

About

Yu Shi is currently a researcher at Zhongguancun Academy. His research focuses on multimodal scientific foundation models and high-performance machine learning systems, in particular the joint pretraining of machine learning force fields and structure-generation models and their multimodal integration with large language models. He has also worked extensively on performance bottlenecks in machine learning systems, including low-precision and distributed training strategies and GPU operator design. His work has been published in top-tier machine learning conferences and natural science journals. He has delivered lectures on Advanced Machine Learning at institutions including the Institute of Computing Technology, Chinese Academy of Sciences (ICT-CAS) and Tsinghua University, and is a long-term contributor to open-source machine learning tools such as LightGBM.

Education

Master's Degree

Institute of Interdisciplinary Information Sciences, Tsinghua University

Bachelor's Degree in Computer Science

Shanghai Jiao Tong University

Publications

Predicting equilibrium distributions for molecular systems with deep learning

Nature Machine Intelligence, 2024 · 140 citations

MatterSim: A deep learning atomistic model across elements, temperatures and pressures

arXiv preprint, 2024 · 97 citations

Benchmarking Graphormer on large-scale molecular modeling datasets

arXiv preprint, 2022 · 88 citations

Gradient boosting with piece-wise linear regression trees

International Joint Conferences on Artificial Intelligence, 2018 · 75 citations

The impact of large language models on scientific discovery: a preliminary study using GPT-4

arXiv preprint, 2023 · 70 citations

LightGBM: Light gradient boosting machine

R package version, 2022 · 62 citations

Scalable emulation of protein equilibrium ensembles with generative deep learning

Science, 2025 · 48 citations

Quantized training of gradient boosting decision trees

Advances in Neural Information Processing Systems, 2022 · 33 citations

From Static to Dynamic Structures: Improving Binding Affinity Prediction with Graph‐Based Deep Learning

Advanced Science, 2024 · 8 citations

NatureLM: Deciphering the language of nature for scientific discovery

arXiv e-prints, 2025 · 7 citations

Physical consistency bridges heterogeneous data in molecular multi-task learning

Advances in Neural Information Processing Systems, 2024 · 2 citations

E2Former: A Linear-time Efficient and Equivariant Transformer for Scalable Molecular Modeling

arXiv preprint, 2025