Understanding the Universe through Big Data analysis and machine learning
Contact person: Per Barth Lilje
Keywords: Cosmology, Extragalactic astrophysics, The Universe, Galaxies
Research group: Cosmology and Extragalactic Astrophysics
Institute of Theoretical Astrophysics
One of the key goals of modern astrophysics is to understand cosmic structure formation on scales ranging from the entire observable Universe down to a single galaxy. To achieve this, many ground- and space-based experiments and observatories are currently being planned and deployed that will gather huge data volumes in the coming years, and massive simulations are being constructed to interpret these data. Today, traditional and machine-learning/AI methods are used in parallel; they typically have different and complementary strengths and weaknesses, and understanding the scope of each approach is essential for future work. The University of Oslo plays a key role in all these aspects, and we invite applicants to submit independent project ideas that focus on cutting-edge Big Data analysis and simulation techniques for one or more next-generation astrophysics experiments, using traditional methods, AI/machine-learning methods, or combinations thereof.
Relevant topics:
- 3D mapping of cosmic structure: build a 3D model of cosmic structures from the Solar system to Cosmic Dawn using either classical likelihood or machine learning techniques coupled to ray tracing with state-of-the-art GPUs (e.g., https://www.linuxlinks.com/POV-Ray/).
- Commander4: implement a massively parallelized Gibbs sampler (Commander4; https://github.com/Cosmoglobe/Commander) for Planck HFI (https://www.esa.int/Science_Exploration/Space_Science/Planck) and Simons Observatory (https://simonsobservatory.org/) data (a schematic toy Gibbs sampler is sketched after this list).
- Cosmoglobe-LIM: develop an end-to-end iterative global Bayesian Gibbs sampler for Line Intensity Mapping (LIM) experiments such as COMAP2 (https://comap.caltech.edu/).
- Euclid: using simulation data together with machine-learning techniques to create emulators for theoretical predictions of Euclid (https://www.euclid-ec.org) observables. Examples that tie to work currently done at ITA include void observables, weak-lensing shear maps, and the non-linear matter power spectrum in models of dark energy and gravity beyond LCDM (see the toy emulator sketch after this list).
- Euclid+MOONS: applying data-mining approaches to create Virtual Observatory (VO; https://www.ivoa.net) compatible data archives and tools that enable astrophysicists to exploit the ESA Euclid space telescope data (https://www.euclid-ec.org) in combination with ground-based multi-object extragalactic spectroscopic surveys (e.g. from the MOONS instrument on the ESO VLT: https://vltmoons.org).
- LIGO, LISA: use of machine learning to develop new techniques for analyzing gravitational-wave data from LIGO (https://www.ligo.caltech.edu/) and LISA (https://lisa.nasa.gov/). This includes methods for improving the accuracy of source parameter estimation, identifying new types of gravitational-wave sources, and exploring the implications of gravitational waves for fundamental physics (a toy parameter-estimation sketch follows this list).
- LiteBIRD: use of machine learning techniques in the search for inflationary gravitational waves with the JAXA-led LiteBIRD satellite experiment (https://www.isas.jaxa.jp/en/missions/spacecraft/future/litebird.html).
- N-body and hydro-simulations on exascale HPC systems: use of machine learning algorithms to improve the accuracy and efficiency of the simulations, and to extract new insights from the simulation data (https://www.space-coe.eu/index.php); a toy N-body integrator is sketched below.
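
To make the Gibbs-sampling idea behind the Commander4 and Cosmoglobe-LIM topics concrete, the sketch below alternates between drawing a signal map and a noise variance from their conditional distributions for a toy one-channel data model d = s + n. The simplified model, the priors, and all variable names are illustrative assumptions; this is a minimal sketch of the Gibbs-sampling pattern, not the Commander4 algorithm or code.

```python
import numpy as np

# Toy data model: d = s + n, with signal s ~ N(0, S) (prior variance S known)
# and noise n ~ N(0, sigma2) with unknown variance sigma2.
# The sampler alternates draws from P(s | d, sigma2) and P(sigma2 | d, s),
# which is the basic structure of a Gibbs sampler (schematic only).

rng = np.random.default_rng(42)

n_pix = 1000
true_sigma2 = 4.0
S = 1.0                                   # assumed (known) signal prior variance
s_true = rng.normal(0.0, np.sqrt(S), n_pix)
d = s_true + rng.normal(0.0, np.sqrt(true_sigma2), n_pix)

n_steps = 2000
sigma2 = 1.0                              # initial guess
chain = np.empty(n_steps)

for i in range(n_steps):
    # 1) Draw the signal from its Gaussian conditional P(s | d, sigma2).
    var_s = 1.0 / (1.0 / S + 1.0 / sigma2)
    mean_s = var_s * d / sigma2
    s = rng.normal(mean_s, np.sqrt(var_s))

    # 2) Draw the noise variance from its inverse-gamma conditional
    #    P(sigma2 | d, s), assuming a Jeffreys prior on sigma2.
    resid = d - s
    sigma2 = np.sum(resid**2) / rng.chisquare(n_pix)

    chain[i] = sigma2

print(f"posterior mean of sigma2 ~ {chain[n_steps // 2:].mean():.2f} "
      f"(true value {true_sigma2})")
```

In Commander-style analyses the same pattern is applied to far larger parameter spaces (sky-signal amplitudes, spectral parameters, instrument parameters), which is what motivates massive parallelization.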
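The Euclid emulator topic can be illustrated with a minimal sketch: a small neural-network regressor (here scikit-learn's MLPRegressor, an assumed choice) is trained to reproduce a toy analytic "observable" as a function of two cosmological-like parameters. In a real project the training set would come from simulations and the target would be, e.g., a non-linear power spectrum or a void statistic; the toy function and parameter ranges below are purely illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Toy stand-in for an expensive simulation: an "observable" y(Omega_m, sigma_8).
# In a real emulator the training set would come from N-body/hydro simulations.
def toy_observable(omega_m, sigma_8):
    return sigma_8**2 * omega_m**0.55 + 0.1 * np.sin(10.0 * omega_m)

rng = np.random.default_rng(0)
n_train = 2000
omega_m = rng.uniform(0.2, 0.4, n_train)
sigma_8 = rng.uniform(0.7, 0.9, n_train)
X = np.column_stack([omega_m, sigma_8])
y = toy_observable(omega_m, sigma_8)

X_train, X_test, y_train, y_test = train_test_split(X, y,
                                                    test_size=0.2,
                                                    random_state=0)

# A small fully connected network as the emulator.
emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000,
                        random_state=0)
emulator.fit(X_train, y_train)

rel_err = np.abs(emulator.predict(X_test) - y_test) / np.abs(y_test)
print(f"median relative emulation error: {np.median(rel_err):.3e}")
```

The appeal of this approach is that, once trained, the emulator can be evaluated essentially instantly inside an MCMC likelihood, replacing simulations that would otherwise be far too expensive to run at every sampled parameter point.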
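For the LIGO/LISA topic, the following toy example shows the general pattern of ML-based parameter estimation: simulate labelled signals, train a network, and predict source parameters from noisy data. The sinusoidal "waveform", noise level, and network choice are assumptions for illustration only and are not taken from any actual gravitational-wave pipeline.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy "parameter estimation": recover the frequency of a noisy sinusoidal
# signal directly from the time series with a neural network. Real analyses
# use physical waveform models, matched filtering and far more sophisticated
# architectures; this only shows the simulate -> train -> predict pattern.

rng = np.random.default_rng(3)
n_samples, n_time = 4000, 256
t = np.linspace(0.0, 1.0, n_time)                      # 1 s of "data"

freqs = rng.uniform(20.0, 60.0, n_samples)             # the "source parameter"
signals = np.sin(2.0 * np.pi * freqs[:, None] * t[None, :])
data = signals + rng.normal(0.0, 0.5, signals.shape)   # additive noise

net = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=1000, random_state=0)
net.fit(data[:3000], freqs[:3000])

pred = net.predict(data[3000:])
rms = np.sqrt(np.mean((pred - freqs[3000:])**2))
print(f"rms frequency error: {rms:.2f} Hz")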
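Finally, as a point of reference for the N-body topic, here is a self-contained toy direct-summation N-body integrator with a kick-drift-kick leapfrog scheme in normalized units (G = 1, softened gravity). It is unrelated to any specific SPACE CoE code: production codes use tree or particle-mesh algorithms on HPC systems, and machine-learning components would typically emulate or augment such simulations rather than replace them. All parameter values below are arbitrary.

```python
import numpy as np

# Toy direct-summation N-body integrator (G = 1, softened gravity),
# advanced with a kick-drift-kick leapfrog scheme. Illustrative only.

def accelerations(pos, mass, softening=0.05):
    # Pairwise separations: diff[i, j] = pos[j] - pos[i]
    diff = pos[None, :, :] - pos[:, None, :]
    dist2 = np.sum(diff**2, axis=-1) + softening**2
    inv_d3 = dist2**-1.5
    np.fill_diagonal(inv_d3, 0.0)                 # no self-force
    # a_i = sum_j m_j (r_j - r_i) / (|r_ij|^2 + eps^2)^{3/2}
    return np.einsum('ij,ijk->ik', mass[None, :] * inv_d3, diff)

rng = np.random.default_rng(1)
n_part, dt, n_steps = 64, 0.01, 500
pos = rng.normal(0.0, 1.0, (n_part, 3))
vel = rng.normal(0.0, 0.1, (n_part, 3))
mass = np.full(n_part, 1.0 / n_part)

acc = accelerations(pos, mass)
for _ in range(n_steps):
    vel += 0.5 * dt * acc                         # kick
    pos += dt * vel                               # drift
    acc = accelerations(pos, mass)
    vel += 0.5 * dt * acc                         # kick

print("final half-mass radius:",
      np.median(np.linalg.norm(pos - pos.mean(axis=0), axis=1)))
```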
External partners:
- SINTEF