Michael Jordan

Professor

Biography

Michael I. Jordan is the Pehong Chen Distinguished Professor in the Department of Electrical Engineering and Computer Science and the Department of Statistics at the University of California, Berkeley. He received his master's degree in mathematics from Arizona State University and earned his PhD in Cognitive Science in 1985 from the University of California, San Diego. He was a professor at MIT from 1988 to 1998. His research interests bridge the computational, statistical, cognitive, biological, and social sciences. Prof. Jordan is a member of the National Academy of Sciences, a member of the National Academy of Engineering, a member of the American Academy of Arts and Sciences, and a Foreign Member of the Royal Society. He is a Fellow of the American Association for the Advancement of Science. He was a Plenary Lecturer at the International Congress of Mathematicians in 2018. He received the Ulf Grenander Prize from the American Mathematical Society in 2021, the IEEE John von Neumann Medal in 2020, the IJCAI Research Excellence Award in 2016, the David E. Rumelhart Prize in 2015, and the ACM/AAAI Allen Newell Award in 2009. He gave the Inaugural IMS Grace Wahba Lecture in 2022, the IMS Neyman Lecture in 2011, and an IMS Medallion Lecture in 2004. He is a Fellow of the AAAI, ACM, ASA, CSS, IEEE, IMS, ISBA, and SIAM.

In 2016, Prof. Jordan was named the "most influential computer scientist" worldwide in an article in Science, based on rankings from the Semantic Scholar search engine.

Education

  • 1985, Ph.D., Cognitive Science, UC San Diego
  • 1980, M.S., Mathematics, Arizona State University
  • 1978, B.S., Psychology, Louisiana State University

Selected Publications

  • B. Shi, S. Du, W. Su, and M. Jordan, "Understanding the acceleration phenomenon via high-resolution differential equations," Mathematical Programming, vol. 5, pp. 634-648, April 2022.
  • M. Jagadeesan, A. Wei, M. Jordan, and J. Steinhardt, "Learning equilibria in matching markets from bandit feedback," in Advances in Neural Information Processing Systems, 2021.
  • A. Adhikari, J. DeNero, and M. Jordan, "Interleaving computational and inferential thinking: Data science for undergraduates at Berkeley," Harvard Data Science Review, vol. 2, pp. 1-24, Sep. 2021.
  • A. El Alaoui, F. Krzakala, and M. Jordan, "Fundamental limits of detection in the spiked Wigner model," Annals of Statistics, vol. 48, pp. 863-885, July 2021.
  • M. Muehlebach and M. Jordan, "Optimization with momentum: Dynamical, control-theoretic, and symplectic perspectives," Journal of Machine Learning Research, vol. 22, pp. 1-50, July 2021.
  • W. Mou, Y.-A. Ma, M. Wainwright, P. Bartlett, and M. Jordan, "High-order Langevin diffusion yields an accelerated MCMC algorithm," Journal of Machine Learning Research, vol. 22, pp. 1-48, March 2021.
  • X. Dai and M. Jordan, "Learning strategies in decentralized matching markets under uncertain preferences," Journal of Machine Learning Research, vol. 22, pp. 1-50, March 2021.
  • J. D. Lee, M. Jordan, B. Recht, and M. Simchowitz, "Gradient descent only converges to minimizers," in Proceedings of the 29th Conference on Learning Theory (COLT), New York, USA, June 2016, pp. 1246-1257.
  • X. Pan, D. Papailiopoulos, S. Oymak, B. Recht, K. Ramchandran, and M. Jordan, "Parallel correlation clustering on big graphs," in Advances in Neural Information Processing Systems 28, 2015, pp. 82-90.
  • X. Pan, J. E. Gonzalez, S. Jegelka, T. Broderick, and M. Jordan, "Optimistic concurrency control for distributed unsupervised learning," in Advances in Neural Information Processing Systems 26, 2013, pp. 1403-1411.
  • B. Taskar, S. Lacoste-Julien, and M. Jordan, "Structured prediction, dual extragradient and Bregman projections," Journal of Machine Learning Research, vol. 7, pp. 1627-1653, Dec. 2006.
  • F. R. Bach and M. Jordan, "Learning spectral clustering, with application to speech separation," Journal of Machine Learning Research, vol. 7, pp. 1963-2001, Dec. 2006.
  • Y. W. Teh, M. Jordan, M. J. Beal, and D. M. Blei, "Hierarchical Dirichlet processes," Journal of the American Statistical Association, vol. 101, no. 476, pp. 1566-1581, Dec. 2006.
  • M. Jordan, "Graphical models," Statistical Science: Special Issue on Bayesian Statistics, vol. 19, no. 1, pp. 140-155, Feb. 2004.
  • D. M. Blei, A. Y. Ng, and M. Jordan, "Latent Dirichlet allocation," Journal of Machine Learning Research, vol. 3, pp. 993-1022, Jan. 2003.
  • M. Jordan, Z. Ghahramani, T. S. Jaakkola, and L. K. Saul, "An introduction to variational methods for graphical models," Machine Learning, vol. 37, no. 2, pp. 183-233, Nov. 1999.
  • D. Wolpert, Z. Ghahramani, and M. Jordan, "An internal forward model for sensorimotor integration," Science, vol. 269, pp. 1880-1882, Sep. 1995.
  • M. Jordan and R. A. Jacobs, "Hierarchical mixtures of experts and the EM algorithm," Neural Computation, vol. 6, no. 2, pp. 181-214, March 1994.

Awards, Memberships and Fellowships