Selected Publications
Diffusion (Language) Models
- K. Rojas, J. Lin, K. Rasul, A. Schneider, Y. Nevmyvaka, M. Tao, W. Deng. Improving Reasoning for Diffusion Language Models via Group Diffusion Policy Optimization. 2025.
- H. Zheng, X. Liu, X. Kong, N. Jiang, Z. Hu, W. Luo, W. Deng, G. Lin. Ultra-Fast Language Generation via Discrete Diffusion Divergence Instruct. 2025. [机器之心] [X]
- W. Deng, W. Luo, Y. Tan, M. Biloš, Y. Chen, Y. Nevmyvaka, T. Q. Chen. Variational Schrödinger Diffusion Models. ICML 2024.
- W. Deng, Y. Chen, T. Yang, H. Du, Q. Feng, T. Q. Chen. Reflected Schrödinger Bridge for Constrained Generative Modeling. UAI 2024 (Oral, acceptance rate 3.8%).
- Y. Chen, W. Deng, S. Fang, F. Li, T. Yang, Y. Zhang, K. Rasul, S. Zhe, A. Schneider, Y. Nevmyvaka. Provably Convergent Schrödinger Bridge with Applications to Probabilistic Time Series Imputation. ICML 2023.
Sampling
- J. Liang, Q. Zhang, W. Deng, Q. Song, G. Lin. Bayesian Federated Learning with Hamiltonian Monte Carlo: Algorithm and Theory. Journal of Computational and Graphical Statistics, 2024.
- B. Hao, T. Lattimore, W. Deng. Information Directed Sampling for Sparse Linear Bandits. NeurIPS 2021 (Spotlight, acceptance rate 3%).
- W. Deng, G. Lin, F. Liang. An Adaptively Weighted Stochastic Gradient MCMC Algorithm for Monte Carlo Simulation and Global Optimization. Statistics and Computing, 32:58, 2022. [code]
- W. Deng, Q. Feng, L. Gao, F. Liang, G. Lin. Non-convex Learning via Replica Exchange Stochastic Gradient MCMC. ICML 2020. [code] [slides]
- W. Deng, G. Lin, F. Liang. A Contour Stochastic Gradient Langevin Dynamics Algorithm for Simulations of Multi-modal Distributions. NeurIPS 2020. [code] [blog] [slides] [poster] [video] [知乎]