Dr. Salman Habib
Argonne National Laboratory / Kavli Institute, University of Chicago
102 Cardwell Hall
The current model of the dynamics and content of the Universe fits a wide range of observational datasets to a surprising degree of success with only six free parameters. Nevertheless, several ingredients of the model, namely dark energy, dark matter, and the nature of the initial density perturbations, are embarrassingly poorly understood at a fundamental level. To help resolve these mysteries, future cosmological surveys aim to put the cosmological model to the test by delivering data with statistical errors so small that our ability to make theoretical predictions at the required accuracy is called into question. In this talk, I will describe efforts to meet this challenge using a combination of leading-edge supercomputing-based cosmological simulations, statistical methods, and machine learning, with the aim of (1) producing synthetic universes that have all of the scale and complexity of the observed universe, and (2) providing the ability to make detailed and accurate cosmological predictions essentially instantaneously.