Ernest Chong


Assistant Professor

Office #: 1.502-14
Pillar / Cluster: Information Systems Technology and Design
Research Areas: Computing Theory, Artificial and Augmented Intelligence, Data Science


I received my PhD in Mathematics from Cornell University in August 2015, under the supervision of Edward Swartz. Prior to joining SUTD, I was a research scientist in the Data Analytics Department at the Institute for Infocomm Research, a research institute within the Agency for Science, Technology and Research (A*STAR), and I was also an adjunct assistant professor in the Division of Mathematical Sciences at Nanyang Technological University. As the grand winner of the Singapore National Science Talent Search 2003, I was awarded the A*STAR National Science Scholarship, which fully funded both my undergraduate and PhD studies.


Education
  • Ph.D. in Mathematics, Cornell University, August 2015
  • M.S. in Mathematics, Cornell University, May 2013
  • B.A. in Mathematics, Cornell University, May 2009 (summa cum laude, distinction in all subjects, accelerated graduation in 3 years)

Detailed Research Interests

My current research deals with both Mathematics and Artificial Intelligence (AI).

Primary interests in Mathematics

  • algebraic, geometric, topological, and enumerative combinatorics
  • combinatorial and computational commutative algebra

Primary interests in AI

  • theoretical and computational aspects of machine learning, especially deep learning
  • unsupervised and semi-supervised learning, architecture hyperparameters of neural networks

A common theme in my research is understanding numerical or algebraic data via some underlying combinatorial structure, and I find it useful to draw ideas from other fields such as discrete geometry, algebraic geometry, and algebraic topology. Within AI, I am particularly interested in developing an algebraic theory for deep learning that incorporates methods from computational commutative algebra. I am also very interested in methods for reducing the computational cost of various learning algorithms while maintaining good learning performance.