Proposed projects
Available as 2022-2023 semester projects.
relativistic cosmic rays, thermal and kinetic feedback from SN explosions, and magnetic fields. The pivotal role of the magnetic field in star formation, galactic outflows, and ISM dynamics is widely appreciated but not well understood. On length scales of ~10 to ~100 pc, magnetic fields are expected to have a significant impact on the dynamics of the ISM, especially on the formation of the dense cold clouds that lead to star formation. Moreover, the interpretation of many crucial radio observables (synchrotron emission and Faraday rotation) depends sensitively on the distribution of magnetic fields in the various ISM phases. This project aims to address this issue by statistically analyzing the multi-phase structure of the ISM, its dynamical evolution, and its interaction with magnetic fields of dynamically significant strength, mainly by computing Betti numbers from magnetohydrodynamics (MHD) simulations of the ISM. By doing so, we seek to gain valuable insights into the role of magnetic fields in shaping the ISM and its physical processes.
https://ui.adsabs.harvard.edu/abs/2015AN....336..991B/abstract
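As an illustration of the topological statistics involved, the sketch below counts the connected components (the zeroth Betti number) of overdense regions in a density cube. It is a minimal stand-in that assumes gridded density fields from the MHD simulations; the function name and toy cube are illustrative, not project code.

```python
import numpy as np
from scipy import ndimage

def betti0(density, threshold):
    """Zeroth Betti number: the number of connected components of the
    superlevel set {density > threshold}."""
    mask = density > threshold
    _, n_components = ndimage.label(mask)  # face-connected 3D components
    return n_components

# Toy stand-in for an MHD density cube
rng = np.random.default_rng(0)
cube = ndimage.gaussian_filter(rng.normal(size=(64, 64, 64)), sigma=2)
for t in (0.0, 0.05, 0.1):
    print(t, betti0(cube, t))
```

Tracking how such numbers change with the threshold (and, in full persistent-homology analyses, across all thresholds) characterizes the multi-phase structure of the simulated ISM.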
Supervisors: LASTRO/CVLab/eSpace (Prof. Jean-Paul Kneib/Stephan Hellmich/Andrew Price)
Type of Project: Master project (can be adapted for TP-IV)
Duration: 14 weeks (Official start/end date: September 19-December 22)
Submission of final report: January 15
Final Presentation: TBD
Prerequisites:
- Open to coding in Python
- Interest in image rendering
- Familiarity with rendering (Blender) and camera basics (i.e., the pinhole camera model) is a plus.
- Interest in computer vision or machine learning is a plus.
Context
As part of a collaborative research project between CVLab and LASTRO, we are exploring novel techniques for determining the rotational and physical properties of space debris. For this purpose, we are currently developing methods for the detection and extraction of space debris observations from large astronomical data archives. These archives contain observational data over a 10-year period and include a large number of serendipitous satellite and space debris observations. In the astronomical images, these objects appear as characteristic streaks, most of which cross the entire detector during the several minutes of exposure time.
In order to monitor the performance of our streak detection methods, we incorporate synthetic streaks into the data before processing. This allows us to determine the detection efficiency and to verify orbit determination routines. Currently, these synthetic streaks are randomly generated features that do not reflect any information on orbit, size, or shape. Generating synthetic streaks that incorporate this information would make them more realistic and would improve our algorithm's robustness in the analysis of real streaks.
The goal of our analysis is to obtain as much information as possible from the observed space objects. An important property that we want to determine is the rotation rate and spin axis, i.e., the tumbling state. This tumbling state can in principle be obtained from the intensity profile of the observed streak. However, for certain objects or tumbling states, we might not have enough data for a robust analysis. A priori knowledge of the size and shape of the object can be used to generate synthetic data that constrain the tumbling state of the observed objects.
Project Scope
The goal of this project is to develop a tool that allows the insertion of realistic synthetic observations of space objects into astronomical images. These synthetic observations should be based on an artificial population of space objects that resembles the real population as closely as possible. This population is then used to implant synthetic streaks into real data. While the observatory location, telescope and instrument determine which objects are visible at the time of observation, the object shape, observing geometry (illumination conditions), rotation, atmospheric extinction and seeing define the precise appearance of the synthetic streak. The final outcome of this project will be the synthetic population of space objects and a tool that inserts the synthetic streaks into real data.
Tasks
- Familiarization with astronomical data archives (data products, instruments, sensors, environment that influences the appearance of space objects on astronomical images)
- Implementation of a rendering engine (Blender)
- Generation of a synthetic population of space objects, incorporating orbits, shapes, sizes and rotations
- Development of a tool to implant synthetic streaks on astronomical images
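As a minimal sketch of what the implanting tool in the last task might do, the snippet below deposits flux along a straight trajectory in a NumPy image and blurs it with a Gaussian PSF. The linear trajectory, constant flux, and Gaussian seeing are simplifying assumptions; a realistic tool would also modulate the brightness along the trail according to the tumbling state.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def add_streak(image, x0, y0, x1, y1, total_flux, psf_sigma):
    """Implant a simplified synthetic streak: deposit flux along a straight
    trajectory, then blur with a Gaussian PSF to mimic the seeing."""
    streak = np.zeros_like(image, dtype=float)
    n = 10000                                   # samples along the trail
    t = np.linspace(0.0, 1.0, n)
    xs = (x0 + t * (x1 - x0)).astype(int)
    ys = (y0 + t * (y1 - y0)).astype(int)
    ok = (xs >= 0) & (xs < image.shape[1]) & (ys >= 0) & (ys < image.shape[0])
    np.add.at(streak, (ys[ok], xs[ok]), total_flux / n)
    return image + gaussian_filter(streak, psf_sigma)

sky = np.random.normal(100.0, 5.0, (256, 256))  # toy background frame
frame = add_streak(sky, -10, 40, 270, 200, total_flux=5e4, psf_sigma=2.0)
```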
Supervisors: M. Hirschmann (Faculty)
Type of Project: Master project (can be adapted for TP-IV)
Project description: Are new JWST observations of massive galaxies at z>10 questioning our LambdaCDM cosmological model?
Thanks to the new, revolutionary James Webb Space Telescope, galaxies have been discovered to form earlier in cosmic history and are likely more massive than previously thought and predicted by state-of-the-art cosmological simulations assuming a LambdaCDM cosmogony. A number of potential solutions to resolve this tension have been discussed in the literature, such as less efficient stellar feedback/UV radiation background, higher star formation efficiency, the existence of massive (PopIII) stars producing more UV photons, no significant dust attenuation etc., but hardly any quantitative study has been conducted so far to test these scenarios. In this context, the student would take advantage of a modern galaxy formation model applied to merger trees from a large dark-matter-only simulation to explore the impact of different stellar feedback and star formation models on number counts of UV-bright galaxies at z > 10 and the related UV luminosity function, confronted with new JWST data. These model developments will be used to create novel mock galaxy catalogues for high-redshift galaxy populations to provide an interpretative framework for current and future high-redshift galaxy surveys, such as with JWST.
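For illustration, the UV luminosity function can be estimated from a mock catalogue in a few lines; the magnitudes, volume, and binning below are placeholders rather than values from the project.

```python
import numpy as np

def uv_luminosity_function(M_uv, volume_mpc3, bin_edges):
    """Binned luminosity function phi(M) [mag^-1 Mpc^-3] from absolute
    UV magnitudes of mock galaxies in a comoving volume."""
    counts, edges = np.histogram(M_uv, bins=bin_edges)
    return counts / (volume_mpc3 * np.diff(edges))

# Toy catalogue; real magnitudes would come from the mock galaxy catalogues
rng = np.random.default_rng(1)
phi = uv_luminosity_function(rng.normal(-19.0, 1.5, 2000),
                             volume_mpc3=1e6,
                             bin_edges=np.arange(-23.0, -16.0, 0.5))
```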
Supervisors: M. Hirschmann (Faculty), R. Tress (PostDoc)
Type of Project: Master project (can be adapted for TP-IV)
Project description: Understanding the interstellar medium of barred galaxies via next-generation, idealised simulations
The PHANGS survey [1] is observing nearby galaxies at unprecedentedly high resolution with different revolutionary telescopes, such as ALMA and JWST, to study their complex interstellar medium in great detail. Many of those galaxies are barred disc galaxies. These barred structures are responsible for efficiently driving copious gas towards the centre, inducing complex gas motions, extreme star formation and potentially fuelling a central active galactic nucleus. To robustly understand and interpret these observations, at GalSpec, we conduct high-resolution, idealised magnetohydrodynamic simulations of such galaxies. In this context, the student would work on the generation of a model of the background stellar and dark matter potential in which the gas evolves, stars form and explode etc. (by developing their own Python routines). This can be done by constructing an analytic fixed background potential in a similar way as in [2]. The IR photometric data of these observed galaxies (which is a proxy for the old stellar component) can be fitted with a disc component, a bulge component, and an exponential bar component. Parameters can then be adjusted with simple isothermal simulations of the gas in such a background potential by comparing to the observed galaxy morphology. These results will be used to conduct a variety of novel, idealised MHD simulations of "PHANGS"-like galaxies to provide a novel, interpretative framework for the observed PHANGS galaxies.
References:
[1] https://sites.google.com/view/phangs/home?pli=1
[2] https://ui.adsabs.harvard.edu/abs/2020MNRAS.499.4455T/abstract
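As a sketch of what an analytic background potential might look like, the snippet below combines a Miyamoto-Nagai disc with a Plummer bulge and derives the circular velocity curve. The parameters are illustrative, and a bar term (e.g. a Ferrers-like component) would be added in the same fashion before fitting to the IR photometry.

```python
import numpy as np

G = 4.301e-6  # gravitational constant [kpc (km/s)^2 / Msun]

def miyamoto_nagai(R, z, M, a, b):
    """Miyamoto-Nagai disc potential [(km/s)^2]; R, z, a, b in kpc."""
    return -G * M / np.sqrt(R**2 + (a + np.sqrt(z**2 + b**2))**2)

def plummer(R, z, M, a):
    """Plummer sphere, a simple stand-in for the bulge component."""
    return -G * M / np.sqrt(R**2 + z**2 + a**2)

def background_potential(R, z):
    # Illustrative parameters; in the project they would be fitted to the
    # IR photometry of a PHANGS galaxy, and a bar term would be added.
    return miyamoto_nagai(R, z, 5e10, 3.0, 0.3) + plummer(R, z, 1e10, 0.5)

# Circular velocity curve from v_c^2 = R * dPhi/dR (numerical derivative)
R = np.linspace(0.1, 15.0, 200)
dR = 1e-4
vc = np.sqrt(R * (background_potential(R + dR, 0.0)
                  - background_potential(R - dR, 0.0)) / (2 * dR))
```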
Supervisors: M. Hirschmann (Faculty), M. Farcy (PostDoc)
Type of Project: Master project (can be adapted for TP-IV)
Project description: Which physical process(es) can suppress star formation in galaxies at cosmic dawn?
Recent observations have discovered an increasingly large population of massive, quiescent galaxies as early as z>4, which has now been spectroscopically confirmed by first data from the James Webb Space Telescope. The origin of the suppression of star formation less than about 2 billion years after the Big Bang remains an unsolved puzzle, and is debated to be related to specific physical processes such as merger events and starbursts, supernova explosions, AGN feedback etc. Interestingly, modern, state-of-the-art cosmological simulations largely fail to reproduce the observed number densities of quiescent, massive galaxies. This failure may be related to the uncertain sub-grid models adopted, but also to the definition of quiescent galaxies in observations. The proposed project will be based on cosmological simulations of high-redshift galaxies, and aims to explore how different tracers and definitions of quiescence change the comparison of simulations to novel JWST observations of massive quiescent galaxies at z>4. Results will be used for the development of improved feedback models and the conduction of new cosmological simulations. The student will learn to work with modern cosmological simulations, and gain experience in using and developing python packages.
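As a toy example of how the definition of quiescence enters the analysis, the function below flags galaxies using one common specific-SFR criterion; both the threshold and the choice of SFR tracer are exactly the kinds of definition choices the project will vary.

```python
import numpy as np
from astropy.cosmology import Planck18

def is_quiescent(sfr, mstar, z):
    """Flag galaxies as quiescent with one common convention,
    sSFR < 0.2 / t_H(z). Both the SFR tracer (instantaneous vs.
    time-averaged) and the threshold are illustrative choices."""
    ssfr = np.asarray(sfr) / np.asarray(mstar)         # [1/yr]
    t_hubble = 1.0 / Planck18.H(z).to("1/yr").value    # Hubble time [yr]
    return ssfr < 0.2 / t_hubble
```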
Supervisors: M. Hirschmann (Faculty), Adele Plat (PostDoc)
Type of Project: Master project (can be adapted for TP-IV)
Project description: Origin of extreme emission-line galaxies at cosmic dawn
New observations with the James Webb Space Telescope have unveiled a population of very high-redshift galaxies with extremely elevated emission-line ratios (e.g. in [OIII]/Hβ or [OIII]/[OII], originating from ionised gas in galaxies) compared to those of present-day galaxies. The physical origin of these elevated line ratios is highly debated and could be linked to different extreme conditions of the interstellar medium and various ionisation conditions in galaxies at the earliest cosmic epochs. With observations alone, however, it can be very difficult to robustly address this puzzle, i.e. to disentangle the influence of different ISM and radiation properties on line emission. Thus, this project aims at *theoretically* exploring which ISM and ionisation properties are able to cause extreme emission-line galaxies at the earliest cosmic epochs, consistent with the new JWST observations. For that, novel emission-line catalogues of simulated galaxies (based on different cosmological simulations) will be employed and compared. Results will be important for the further development and improvement of emission-line modelling of simulated galaxies. The student will learn to work with modern cosmological simulations, photo-ionisation models, and gain experience in using and developing python packages.
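A trivial sketch of the kind of selection applied to such emission-line catalogues; the column names and ratio thresholds are placeholders only.

```python
import numpy as np

def extreme_emitters(L_o3, L_hb, L_o2, o3hb_min=10.0, o32_min=10.0):
    """Select 'extreme' emitters by line-ratio cuts on catalogue columns
    of [OIII]5007, Hbeta and [OII]3727 luminosities; the thresholds here
    are illustrative, not the values used in the project."""
    return (L_o3 / L_hb > o3hb_min) & (L_o3 / L_o2 > o32_min)
```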
Proposed by: David Harvey (Faculty), Yves Revaz (Faculty)
Simulating the formation of galaxies is vital to our understanding of how the Universe formed and the underlying physics behind it. By simulating galaxies we can test theories and probe the nature of the elusive dark matter. However, simulating galaxies can take a long time, in particular owing to the treatment of the cooling due to molecules and metals. Indeed, an accurate treatment requires solving many non-linear, inter-dependent equations. If we can speed up the estimation of metal cooling in galaxies we will be able to dramatically speed up our simulations. In a completely new and unique way, this master's project will aim to use deep learning to quickly and precisely estimate the abundances and cooling of a set of atomic species at play during the formation of galaxies. By constructing a model that can bypass the need to solve these complicated equations we can hopefully speed up the simulations and hence open up the possibility to probe new models of dark matter. The student will get hands-on experience with simulations, machine learning, and deep learning, and gain experience in using python packages such as TensorFlow and the GRACKLE libraries.
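A minimal sketch of the emulator idea, assuming training pairs (gas properties to cooling rate) precomputed with GRACKLE; the architecture, inputs, and random placeholder data below are illustrative, not the project design.

```python
import numpy as np
import tensorflow as tf

# Placeholder training set: inputs (density, temperature, metallicity,
# redshift) and target (log cooling rate) would come from GRACKLE calls.
X = np.random.rand(10000, 4).astype("float32")
y = np.random.rand(10000, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=256, verbose=0)
# Once trained, model(X) replaces the expensive non-linear solve.
```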
Most galaxies contain a supermassive black hole (SMBH) between a few million and a few hundred million times the mass of the Sun. These SMBHs accrete mass at the very center of their host galaxy, a process that emits tremendous amounts of light. This is called a quasar, whose luminosity can outshine that of its host galaxy. But how do the properties of quasars relate to those of their host galaxies, in particular their mass properties? Do quasars power galaxies? Or are they formed only in the most massive galaxies? Many of the answers to these questions rely on relations between the mass of SMBHs and their host galaxies, and the evolution of these relations with redshift. We propose to use existing hydrodynamical cosmological simulations to establish such relations, with a particular focus on the dark matter content of quasar host galaxies, as this is now a measurable quantity thanks to gravitational lensing.
- Three quasi-stellar objects acting as strong gravitational lenses (Courbin et al. 2012)
- Concordance between Observations and Simulations in the Evolution of the Mass Relation between Supermassive Black Holes and Their Host Galaxies ( Ding et al. 2022)
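For illustration, the mass relation can be measured per simulation snapshot with a simple least-squares fit; the parametrization below is a common convention, not necessarily the one that will be adopted.

```python
import numpy as np

def fit_mbh_mstar(log_mstar, log_mbh):
    """Least-squares fit of log M_BH = alpha + beta * (log M* - 11),
    a common parametrization of the black hole-host mass relation."""
    beta, alpha = np.polyfit(np.asarray(log_mstar) - 11.0, log_mbh, deg=1)
    return alpha, beta

# Applied snapshot by snapshot, the redshift evolution of (alpha, beta)
# can then be traced through the simulation outputs.
```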
The variability of strongly lensed quasar light curves is set by 3 main components: (1) the continuum flux of the source, (2) microlensing by stars in the lens galaxy, and (3) reverberation of the continuum by the Broad Line Region (BLR) (see Fig. 1 of the first reference paper). These light curves therefore carry information about the structure of the quasar and the content of the lens galaxy.
A method using the power spectrum of such light curves was designed to disentangle the different variability components and constrain the physical properties of the lens system.
The aim of this project is to refine this method and apply it to the large COSMOGRAIL sample of microlensing light curves.
- Power spectrum fitting of the microlensing light curve of QJ0158-4325: Paic et al.(2021)
- Microlensing light curves of lensed quasars sample: COSMOGRAIL XIX
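Since monitoring data are irregularly sampled, the power spectrum is typically estimated with something like a Lomb-Scargle periodogram; a minimal sketch on toy data (the real input being a COSMOGRAIL microlensing curve) might look like this.

```python
import numpy as np
from astropy.timeseries import LombScargle

# Toy irregularly sampled light curve; the real input would be a
# COSMOGRAIL difference (microlensing) curve.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 3000, 500))                       # epochs [days]
mag = 0.1 * np.sin(2 * np.pi * t / 800) + rng.normal(0, 0.02, t.size)

frequency, power = LombScargle(t, mag).autopower()
# The shape of `power` is what the power-spectrum fitting method models.
```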
Monitoring the luminosity of the multiple images of a strongly lensed quasar over several years allows us to study the structure of the quasar and the content of the lensing galaxy. Because of the Earth's revolution and weather conditions, the resulting light curves are not regularly sampled and contain large gaps. The aim of this project is to use a method called inpainting to predict the signal where it is missing. This method relies on wavelet transforms to decompose the light curves into multiple frequencies, in order to recover realistic and continuous curves that match the statistical properties of the observed ones. Ultimately, these regularly sampled light curves will give better constraints on the physical model of quasar structures.
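A minimal sketch of the iterative-thresholding idea behind inpainting, using the Fourier domain only as a simple stand-in for the starlet transform actually used in the references; `mask` is True where the curve was observed.

```python
import numpy as np

def inpaint(signal, mask, n_iter=100):
    """Iterative-thresholding inpainting sketch: enforce sparsity in a
    transform domain while keeping the observed samples fixed."""
    x = np.where(mask, signal, 0.0)
    for i in range(n_iter):
        coeffs = np.fft.rfft(x)
        # Decreasing threshold: keep few coefficients first, then more
        thresh = np.quantile(np.abs(coeffs), 1.0 - (i + 1) / n_iter)
        coeffs[np.abs(coeffs) < thresh] = 0.0
        x = np.fft.irfft(coeffs, n=signal.size)
        x[mask] = signal[mask]  # re-impose the observed samples
    return x
```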
Another application of inpainting, in two dimensions, is the prediction of the light distribution of elliptical galaxies when part of that light is hidden. This is typically the case when masking the light from a background source superimposed on the lens galaxy, in order to reconstruct solely the light of the lens galaxy. In this part of the project, the goal is to characterize the inpainting method on real observations of isolated galaxies, by artificially creating a mask that hides different portions of those galaxies. Depending on the progress throughout the project, an application to gravitational lens modeling will be envisioned.
- Light curves of lensed quasars: COSMOGRAIL XIX
- Application of wavelets and inpainting (but for 2D images): Lanusse et al. 2016
- Starlet (wavelet) transform : Starck et al. 2011
- Other techniques for modeling 1D or 2D signals:
In the last few decades, many cosmological probes have been proposed to study the Universe's expansion history and, in particular, dark energy. 3D mapping of the Universe measures the 2D structures in the sky at different redshifts (1D), and thus provides time-evolving information on the cosmic evolution history. 3D maps can be obtained either through a galaxy survey using optical telescopes such as SDSS [1], or through radio telescopes by detecting the 21cm emission line from neutral hydrogen in the sky [2].
Quasars, as extremely bright objects in the sky, are frequently observed by optical galaxy surveys to obtain their positions in the expanding Universe. However, due to their broad spectral lines and astrophysical effects, quasar measurements are often subject to redshift and position uncertainties [3]. Meanwhile, 21cm detections from radio telescopes provide accurate redshift information at the locations of quasars. Combining the measurements of optical surveys and 21cm radio surveys will help improve the accuracy of quasar measurements, thus improving the derived cosmological constraints.
During this project, the student will stack 21cm radio maps at the locations of quasars measured by optical surveys [4], in order to reduce quasar redshift uncertainties using radio information, with the ultimate goal of improving current cosmological constraints. The student will acquire knowledge of both optical and radio cosmology using 3D maps, and is expected to gain hands-on experience with Python and statistical data analysis by the end of the project.
References:
[2] https://arxiv.org/pdf/astro-ph/0401340.pdf
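The stacking step itself is conceptually simple; below is a sketch assuming a gridded 21cm intensity cube and quasar positions already converted to voxel indices (both hypothetical inputs).

```python
import numpy as np

def stack_cutouts(map3d, positions, half=8):
    """Average 21cm sub-cubes centred on quasar voxel positions.
    `map3d` is a gridded intensity cube and `positions` holds integer
    (x, y, z) indices from the optical catalogue, assumed to lie well
    inside the cube."""
    cutouts = []
    for x, y, z in positions:
        sub = map3d[x - half:x + half, y - half:y + half, z - half:z + half]
        if sub.shape == (2 * half, 2 * half, 2 * half):  # skip edge hits
            cutouts.append(sub)
    return np.mean(cutouts, axis=0)
```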
Cosmic voids are large regions of space in the Universe which have low matter densities. These structures are influenced by the expansion of the Universe and the clustering of matter. Thus, they can be used to better understand the cosmology and the large scale structure.
The purpose of this project is to test whether numerical models could be used to extract cosmological information from the clustering of voids and to perform an analysis on simulations.
The student will learn how to better code in python and how to extract information using Monte Carlo techniques and Bayesian inference methods.
- https://arxiv.org/abs/1904.01030
- https://arxiv.org/abs/1712.07575
- https://arxiv.org/abs/0712.3049
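To give a flavour of the Monte Carlo and Bayesian inference toolkit involved, below is a minimal emcee sketch fitting a toy model; the real likelihood would compare a void-clustering model to measurements from the simulations.

```python
import numpy as np
import emcee

def log_posterior(theta, x, y, yerr):
    """Flat priors plus a Gaussian likelihood for a toy linear model;
    the real analysis would replace `a * x + b` with a void-clustering
    model evaluated against simulation measurements."""
    a, b = theta
    if not (-10 < a < 10 and -10 < b < 10):
        return -np.inf
    return -0.5 * np.sum(((y - (a * x + b)) / yerr) ** 2)

x = np.linspace(0, 1, 20)
y = 2.0 * x + 1.0 + np.random.normal(0, 0.1, x.size)
sampler = emcee.EnsembleSampler(8, 2, log_posterior, args=(x, y, 0.1))
sampler.run_mcmc(np.random.rand(8, 2), 2000)
samples = sampler.get_chain(discard=500, flat=True)  # posterior samples
```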
Cosmic voids are large regions of space in the Universe which have low matter densities. These structures are influenced by the expansion of the Universe and the clustering of matter. Thus, they can be used to better understand the cosmology and the large scale structure.
The purpose of this project is to predict the constraining power of the size distribution of cosmic voids on cosmological parameters, using the Fisher matrix formalism.
The student will learn how to better code in python and how to estimate the amount of information that an observable carries about an unknown parameter (using Fisher Matrix).
- https://arxiv.org/pdf/1909.11107.pdf
- https://arxiv.org/pdf/2206.01709.pdf
- https://arxiv.org/pdf/2205.11525.pdf
- https://arxiv.org/abs/1511.04299
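A generic Fisher-matrix sketch for a Gaussian likelihood with fixed covariance, where `model(theta)` would return the binned void size function; the step size and toy setup are illustrative.

```python
import numpy as np

def fisher_matrix(model, theta0, cov, step=1e-4):
    """F_ij = (d mu / d theta_i)^T C^-1 (d mu / d theta_j) for a Gaussian
    likelihood with parameter-independent covariance; `model(theta)`
    returns the predicted observable in bins."""
    theta0 = np.asarray(theta0, dtype=float)
    derivs = []
    for i in range(theta0.size):
        dp = np.zeros(theta0.size)
        dp[i] = step
        derivs.append((model(theta0 + dp) - model(theta0 - dp)) / (2 * step))
    derivs = np.array(derivs)
    return derivs @ np.linalg.inv(cov) @ derivs.T

# Toy two-parameter model; forecast errors are sqrt(diag(F^-1))
F = fisher_matrix(lambda th: th[0] * np.arange(5.0) + th[1],
                  theta0=[1.0, 0.5], cov=np.eye(5) * 0.01)
errors = np.sqrt(np.diag(np.linalg.inv(F)))
```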
The purpose of this project is to check all possible sources of inhomogeneity and to create models that can account for them.
The student will learn how to better code in python, make comprehensive figures and analyse large data-sets.
- https://arxiv.org/abs/1704.00338
- https://arxiv.org/abs/1508.04478
- https://arxiv.org/pdf/2208.08515v1.pdf
- https://arxiv.org/pdf/2208.08513v1.pdf
- https://arxiv.org/pdf/1903.02474.pdf
Cosmic reionization is studied using Radiative Transfer (RT) codes to simulate the radiative feedback from the primordial galaxies and stars in the early Universe. Part of the simulation uses a series of differential equations to keep track of the evolution of the neutral hydrogen fraction and the gas thermal evolution in the intergalactic medium (IGM), based on the computed ionizing and heating radiation. These simulations constitute the theoretical groundwork for the upcoming Square Kilometre Array (SKA) radio telescope. Ideally, they require huge volumes (~1 Gpc) with many particles (~10^11) to reproduce the relevant cosmological scales. RT and heating simulations therefore require considerable computational power, as the number of operations needed to solve the differential equations grows steeply with the number of simulated particles and sources.
Here, we propose to upgrade the existing C2Ray code, broadly used for simulations of the cosmic Epoch of Reionisation. This code uses the short-characteristics ray tracing method to compute the propagation of UV radiation from the primordial galaxies and black holes into the IGM. It was written in the late '90s (and updated over the years) in Fortran and is CPU-parallelized with MPI and OpenMP. However, computational algorithms have evolved since then. New ray tracing methods (e.g., domain-decomposed RT, accelerated RT, etc.) allow calculating the radiation propagation in cosmological simulations with little numerical diffusion. Moreover, these new techniques can be GPU-accelerated, something that was impossible in the late '90s.
Furthermore, we intend to improve the current source model of C2Ray, a simple halo-based prescription that has stayed the same over the years and takes a rather simplistic approach. The idea is to implement a model that follows the halo number count in each simulation sub-region, calibrated against the luminosity function derived from observations such as JWST or WFIRST.
This project would involve 1) embedding some of the core Fortran subroutines into Python and implementing GPU acceleration, 2) implementing a modern ray tracing method, and 3) implementing a new source model based on the sub-grid dark matter halo number density. The final goal is to compare the computational accuracy and speedup of the new ray tracing code in a series of tests, together with the new source model, finalized with one or more publications.
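As a sketch of step 1), a Fortran kernel can be exposed to Python with f2py; the subroutine below is a hypothetical stand-in for a C2Ray routine, not actual C2Ray code.

```python
# A Fortran kernel exposed to Python with f2py. The subroutine is a
# hypothetical stand-in for a C2Ray routine, not actual C2Ray code.
#
#   ! file: raytrace.f90
#   subroutine column_density(ndens, path, n, coldens)
#     integer, intent(in) :: n
#     real(8), intent(in) :: ndens(n), path(n)
#     real(8), intent(out) :: coldens
#     coldens = sum(ndens * path)
#   end subroutine
#
# Built once with:  f2py -c raytrace.f90 -m raytrace
import numpy as np
import raytrace  # extension module produced by the f2py command above

ndens = np.random.rand(128)   # toy number densities along a ray
path = np.full(128, 0.5)      # toy path lengths per cell
print(raytrace.column_density(ndens, path))  # n is inferred by f2py
```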
The Epoch of Reionization (EoR) is a crucial period in our Universe’s history, which includes the birth of the first radiating sources that influenced the formation and evolution of latter-day structures. These luminous objects produced enough UV radiation (energy ≥ 13.6 eV) to propagate into the intergalactic medium (IGM), ultimately transitioning our Universe from a cold and neutral state to a hot and ionized state. This exciting period is one of the least understood epochs in the Universe’s evolution due to the lack of direct observations.
We can probe the reionization process by observing the redshifted 21-cm neutral hydrogen signal produced during cosmic reionization. The Square Kilometer Array telescope will be sensitive enough to detect 21-cm statistical quantities and produce images of the signal distribution in the sky. A sequence of such 21-cm images at different redshifts will constitute a three-dimensional data set known as a 3D tomographic dataset (or 21-cm lightcone).
Here, we propose to employ SKA tomographic data as a pathfinder to identify regions of interest for future and ongoing galaxy observations. By resolving scales down to 5 arcseconds in the plane of the sky, SKA can pinpoint the locations of individual ionized bubbles. These regions are of interest for infrared and near-infrared space telescopes (e.g., JWST, WFIRST, Euclid), which aim to observe galaxy cluster formation in the early Universe.
In recent years, Artificial Intelligence (AI) and deep learning techniques have been proven to be powerful tools. For this reason, this project would involve 1) developing a Bayesian Convolutional Neural Network (BCNN) that predicts the position and area of high-redshift galaxy clusters from SKA 21-cm images based on the morphology of ionized regions, 2) running dedicated 21-cm simulations and post-processing existing large N-body simulations to test this approach on different EoR source models, and 3) implementing extragalactic radio point sources and galactic synchrotron emission to test the approach against existing foreground contamination avoidance techniques.
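One simple way to make a CNN approximately Bayesian is Monte Carlo dropout, where dropout stays active at prediction time so that repeated forward passes sample a predictive distribution; below is a minimal sketch with toy shapes and outputs, not the project architecture.

```python
import numpy as np
import tensorflow as tf

# Monte Carlo dropout: dropout stays active at prediction time, so
# repeated forward passes sample an approximate predictive distribution.
inputs = tf.keras.Input(shape=(64, 64, 1))          # toy 21-cm image
x = tf.keras.layers.Conv2D(16, 3, activation="relu")(inputs)
x = tf.keras.layers.Dropout(0.2)(x, training=True)  # always on
x = tf.keras.layers.Conv2D(32, 3, activation="relu")(x)
x = tf.keras.layers.Flatten()(x)
outputs = tf.keras.layers.Dense(3)(x)   # e.g. (x, y, radius) of a bubble
model = tf.keras.Model(inputs, outputs)

img = np.random.rand(1, 64, 64, 1).astype("float32")
preds = np.stack([model(img).numpy() for _ in range(50)])
mean, std = preds.mean(axis=0), preds.std(axis=0)   # estimate + uncertainty
```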
CONTEXT
To characterize the physical properties of space debris, eSpace/LASTRO is currently expanding its observation capabilities. EPFL has access to the TELESTO telescope, located at the Sauverny observatory close to Geneva, which would represent a good observing facility. However, in its current state, the telescope is not well suited for the observation of space debris: there is no possibility to target or track objects in Earth orbit. To overcome these limitations and enable TELESTO for space debris observations, the telescope control software needs to be improved. A longer-term goal is the complete automation of the telescope, and the work done in this project represents an important step towards achieving this goal.
PROJECT SCOPE
During the project, you will familiarize yourself with the telescope and its software environment. You will learn about the requirements of passive optical observations of space debris. In order for the telescope to be used for space debris observations, the control of the various subsystems of the facility needs to be integrated into an easy-to-use interface. The resulting software module should provide a command-line or script-based interface through which observers can create observation plans that are automatically processed by the telescope. The basic requirements for the module are pointing and tracking based on two-line elements (TLEs), camera and filter control, as well as routines for the automated acquisition of calibration frames.
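For illustration, TLE-based pointing can be prototyped with an existing SGP4 wrapper such as skyfield; the TLE below is an illustrative example (taken from the skyfield documentation, not a debris object) and the site coordinates are approximate.

```python
from skyfield.api import load, wgs84, EarthSatellite

ts = load.timescale()
# Illustrative TLE, not a debris object
line1 = "1 25544U 98067A   14020.93268519  .00009878  00000-0  18200-3 0  5082"
line2 = "2 25544  51.6498 109.4756 0003572  55.9686 274.8005 15.49815350868473"
sat = EarthSatellite(line1, line2, "ISS (ZARYA)", ts)

telesto = wgs84.latlon(46.3, 6.1, elevation_m=500)  # approximate site location
alt, az, _ = (sat - telesto).at(ts.now()).altaz()
print(alt.degrees, az.degrees)  # would drive the mount's pointing loop
```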
Observational relation between the magnetic field and the star formation rate in dwarf galaxies: https://ui.adsabs.harvard.edu/abs/2011A%26A...529A..94C/abstract
Past projects
Observation of the partially polarized synchrotron radiation emitted by relativistic cosmic rays gyrating around magnetic field lines is one of the key tools for mapping the strength and topology of galactic magnetic fields. In particular, the Faraday rotation measure of this radiation constrains the strength and direction of the line-of-sight component of the magnetic field, while the plane-of-sky component's strength is inferred from the degree of polarization at various frequencies.
https://ui.adsabs.harvard.edu/abs/2015AN....336..991B/abstract
radio synchrotron emission:
https://ui.adsabs.harvard.edu/abs/2015A%26ARv..24....4B/abstract
The project will involve developing a signal model framework which will be used for a likelihood-ratio test. Performance of the model will be compared against null hypothesis likelihood tests and a standard source-finding package such as SoFiA. The algorithm will be developed on simulated data and tested on data from one of the SKA precursor experiments.
References:
https://arxiv.org/abs/1501.03906
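A minimal sketch of the likelihood-ratio idea under Gaussian noise, maximising analytically over a template amplitude; the project's signal model framework would replace the fixed template.

```python
import numpy as np

def lr_statistic(data, template, sigma):
    """Likelihood-ratio statistic for 'template present with amplitude A'
    versus the null A = 0, under Gaussian noise of fixed std `sigma`;
    the amplitude is maximised analytically."""
    a_hat = np.dot(data, template) / np.dot(template, template)
    chi2_null = np.sum((data / sigma) ** 2)
    chi2_alt = np.sum(((data - a_hat * template) / sigma) ** 2)
    return chi2_null - chi2_alt  # ~ chi2 with 1 dof under the null
```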
- Reduce the number of cosmological volumes required to obtain smooth templates used for cosmological fits (i.e. void BAO)
- Assess whether the denoising procedure is unbiased such that data SNR can be improved.
In this project the student will write and test different Deep Learning approaches to denoising the power spectra, starting by using convolutional autoencoders on the data available (though generating more data is possible). The student will address generalization issues and identify an appropriate training protocol (i.e. using clean target data vs. using noisy target data). If possible, the student may evaluate any bias the denoiser may add in terms of cosmological parameters.
References
Denoising: https://arxiv.org/pdf/1803.04189.pdf
Clustering measurements:
https://academic.oup.com/mnras/article/473/4/4773/4443205
https://arxiv.org/pdf/2007.09009.pdf
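A minimal convolutional-autoencoder sketch for binned power spectra; the length (128 bins), architecture, and training pairs are placeholder choices, not the project design.

```python
import tensorflow as tf

# Denoising autoencoder for binned power spectra (toy architecture)
model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 5, padding="same", activation="relu",
                           input_shape=(128, 1)),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, 5, padding="same", activation="relu"),
    tf.keras.layers.UpSampling1D(2),
    tf.keras.layers.Conv1D(1, 5, padding="same"),
])
model.compile(optimizer="adam", loss="mse")
# model.fit(noisy_spectra, clean_spectra, ...)   # or noisy targets,
# as in the noise2noise training protocol referenced above
```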
Gravitationally lensed quasars are valuable probes for studying astrophysics and cosmology. In particular, the magnification effect allows us to observe and study supermassive black holes only a few hundred million years after the Big Bang. The number and growth rates of these quasars have significant consequences for our understanding of the reionisation of the universe and of black hole formation and growth mechanisms; however, only a few systems are known.
This project will aim to find these elusive systems using a spectral energy distribution model-fitting approach, based on multi-wavelength broadband photometric data. Once candidates are identified, pixel modelling will be used to select the most promising systems for spectroscopic follow-up.
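At its core, SED model fitting compares broadband fluxes to template sets; below is a sketch with analytic marginalisation over the normalisation (templates and inputs hypothetical).

```python
import numpy as np

def template_chi2(fluxes, errs, templates):
    """Chi-squared of broadband fluxes against each template (e.g. quasar,
    galaxy, quasar + lens galaxy), minimised analytically over the template
    normalisation; the smallest value indicates the preferred model."""
    w = 1.0 / errs**2
    chi2 = []
    for t in templates:
        a = np.sum(w * fluxes * t) / np.sum(w * t * t)  # best amplitude
        chi2.append(np.sum(w * (fluxes - a * t) ** 2))
    return np.array(chi2)
```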
Gravitationally lensed quasars are of utmost importance for astrophysics and cosmology. However, these systems must first be located on the sky. Of particular interest are quadruply imaged quasars (quads); however, these are very rare, and often blended in ground-based datasets. This project proposes the use of catalogues from the space-based telescope Gaia to find new quads (only ~40 are currently known), which will then be spectroscopically followed up and used for a variety of science goals: microlensing, galaxy mass measurements, time-delay cosmography etc. The student will be allowed to propose their own investigations and selection methods, but will start with an investigation of differences in parameters between the multiple Gaia data releases, analysing the results for known quads and common contaminant systems. The final result should be a selection algorithm to recover known lenses and new potential candidates. Contact [email protected] for discussion/further details.
When analyzing strong gravitational lenses, we need to assume specific models for the light and mass distribution of the lens galaxy, as well as a light model for the source galaxy. In the past few years, some studies, and more recently the work by Cao et al. 2021, have demonstrated limitations of the models currently being used, arguing that they are too simplistic. Even though such simple models can reproduce observations, using them can lead to strong biases on the inferred properties of the lens galaxies and on the value of the Hubble constant. In their work, the authors simulated observations of gravitational lenses using spectroscopic data to obtain realistic galaxy mass profiles, and these simulations have been publicly released. The goal of this project is to apply a new modeling code (already developed) that supports more complex mass models to this dataset, and to compare the resulting quantities with those from previous, simpler models.
- The paper by Cao et al. 2021 with simulated lenses
- A technique for modelling deviations from a smooth gravitational potential: Vegetti & Koopmans 2009
- Slightly more flexible power-law potential : Stacey et al. 2021
Massive cosmological spectroscopic surveys provide a unique way to probe the 3D distribution of large-scale structures in the Universe, which can be used to measure the expansion history and growth of structures, thus constraining the Hubble parameter, as well as the properties of different cosmological components, such as the dark energy equation of state and neutrino mass.
As the product of spectroscopic surveys, the database of 3D positions of galaxies and quasars has been growing rapidly. In the past decade, the SDSS-III Baryon Oscillation Spectroscopic Survey (BOSS) and SDSS-IV extended BOSS (eBOSS) have measured millions of spectra. The ongoing Dark Energy Spectroscopic Instrument (DESI) is expected to take over 30 million spectra in 5 years. The next-generation experiments, such as the MegaMapper, the MUltiplexed Survey Telescope (MUST), and the Spectroscopic Survey Telescope (SpecTel), are foreseen to increase the number of galaxy/quasar positions by another order of magnitude. Thus, we expect a factor of 10 gain on the figure of merit of cosmological constraints in the next 10-15 years over the current results.
In this project, we aim at performing a Fisher information matrix analysis of future large-scale spectroscopic surveys, to provide forecasts on various cosmological analyses, including the standard baryonic acoustic oscillation (BAO) and redshift-space distortion (RSD) measurements in the flat-ΛCDM framework, as well as constraints on alternative model parameters, such as the primordial non-Gaussianity, the curvature, neutrino mass, and the dark energy equation of state. To this end, we shall accomplish realistic estimations of the number of targets, properties of the targets, and sky area and redshift range coverage, based on the expected specifications of future surveys.
We have come to an era of precision cosmology, where cosmic microwave background (CMB) measurements from the Planck satellite constrain the standard ΛCDM model to within <1% accuracy. However, for the study of the dark sector dominating the late-time Universe, observations at much lower redshift than the CMB are required. This can be achieved by measuring the large-scale structure of the Universe through HI intensity mapping (IM), which maps the large-scale HI intensity fluctuations with the ultimate objective of detecting the baryonic acoustic oscillations (BAO) and constraining Dark Energy. Alternatively, gravitational lensing provides an independent method to probe the dark sector through, e.g., strong-lensing time-delay measurements and weak-lensing shear measurements.
The latest constraint on H0 from time-delay lensed quasars by the H0LiCOW collaboration [i] is consistent with that from the independent type Ia supernovae measurement [ii] under the standard ΛCDM cosmology. However, both are in ~5σ disagreement with CMB and BAO measurements [iii][iv]. To understand the high statistical significance of the H0 tension, the student will combine HI IM with other probes, including time-delay lensed quasars, CMB, BAO from optical surveys, and type Ia supernovae, in order to jointly constrain H0 and explore possible new physics that could resolve the H0 tension between early- and late-Universe probes. The project will involve Fisher matrix projections of SKA-like IM experiments, with prior information from other probes. Depending on the progress, the work can potentially extend to end-to-end simulations as a secondary step.
References:
The Universe is permeated by magnetic fields, yet their origin is one of the greatest mysteries of cosmology. Could they have been generated already shortly after the Big Bang, or are they only a property of the modern Universe? How do they evolve in cosmic history and how do they influence the formation of cosmic structures?
To answer these questions, astrophysical plasmas are often described with magnetohydrodynamics (MHD). However, at high energies, such as less than a second after the Big Bang or in hot young neutron stars, MHD necessarily needs to be extended to include additional electric currents caused by quantum anomalies that are related to the chirality of fermions. Recently, such "chiral MHD" has been studied intensely with direct numerical simulations, which yield a very different evolution of magnetic fields as compared to the classical MHD case. One difference is the possible occurrence of chiral magnetic waves, a spatial propagation of the asymmetry between left- and right-handed fermions; the propagation of Alfvén waves is also modified in chiral MHD.
In this project, the student will perform direct numerical simulations with the Pencil Code to study the properties of chiral magnetic waves. A set-up for a wave scenario will be provided and the student will explore new regimes of the parameter space. He or she will analyze the spatial propagation of waves, time series, and energy spectra.
URL/References:
Simulations of chiral waves [see Sections 4.3 and 4.4]:
https://ui.adsabs.harvard.edu/abs/2018arXiv180806624S/abstract
Theory of chiral MHD [see Section 5 for a discussion of the dispersion relation in chiral MHD]:
https://ui.adsabs.harvard.edu/abs/2017ApJ...846..153R/abstract
Pencil Code: http://pencil-code.nordita.org/
Proposed by: Richard Anderson (new faculty)
Type of project: Master/CSE project
Flavour: Observation/Data science
Proposed by: Jean-Paul Kneib (Faculty)
Type of project: Master/CSE project
Flavour: Observation/Data science
Proposed by: Cameron Lemon (Postdoc), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: Observation/Data science
Gravitationally lensed quasars are of utmost importance for astrophysics and cosmology. However, these systems must first be located on the sky. One way to find such systems is to search the spectra of massive galaxies for signs of a background/higher-redshift quasar. If present, there is a strong possibility that the background quasar is multiply imaged. This project will aim to develop techniques to discover such systems in existing SDSS/BOSS spectra through model fitting and systematic testing on mocks, with forecasts for future spectroscopic surveys (such as 4MOST and Euclid). Promising candidates will be targeted for resolved spectroscopy. The project can also be extended to discovering massive-galaxy absorption lines in the spectra of high-redshift quasars as possible lens candidates.
URL/References:
https://arxiv.org/abs/1810.04480 (see figure 8)
https://arxiv.org/abs/1604.01842
https://arxiv.org/abs/1110.5514
Proposed by: Richard Anderson (new faculty)
Type of project: Master/CSE project
Flavour: Observation/Data science
Proposed by: Richard Anderson (new faculty)
Type of project: Master/CSE project (also adaptable for TP4b). Project flavour: Observations/Data Science
Proposed by: Martin Millon (PhD), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: Observation/Data science
Time-delay cosmology with strongly lensed supernovae is expected to provide a highly precise measurement of the expansion rate of the Universe, i.e. the Hubble constant. This technique is based on the measurement of time delays between the different images created by a strong gravitational lens. Although only two lensed supernovae have been discovered so far, future surveys such as LSST or ZTF are expected to discover dozens of such systems every year, improving the precision on the Hubble constant.
This project aims to adapt the current curve-shifting technique, developed for lensed quasars, to lensed supernovae and to assess its reliability. The first part of the project will consist of producing mock lensed supernova light curves before developing the method to analyze them. A precise and accurate measurement of the time delays is critical, as the errors propagate directly to the final estimate of the Hubble constant.
URL/References:
– https://arxiv.org/abs/1802.01584
– https://arxiv.org/abs/1805.04525
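A brute-force sketch of the curve-shifting idea: shift one image's light curve over trial delays and keep the best match. Real methods (such as those in the references) additionally model microlensing and intrinsic variability; everything below is a toy version.

```python
import numpy as np

def delay_estimate(t, curve_a, curve_b, delays):
    """Shift image B's curve over trial delays and keep the delay that
    minimises the mean squared difference with image A's curve
    (epochs `t` assumed sorted)."""
    costs = [np.mean((curve_a - np.interp(t, t + d, curve_b)) ** 2)
             for d in delays]
    return delays[int(np.argmin(costs))]

# e.g. delay_estimate(t, mag_a, mag_b, np.linspace(-50, 50, 501))
```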
Proposed by: Martin Millon (PhD) and Frédéric Courbin (Faculty)
Type of Project: TP4b or Master/CSE project
Project flavour: Data Science/Simulation
Abstract:
Microlensing is a unique tool to study the inner structure of the accretion disks of supermassive black holes (quasars). It is produced by stars in the lensing galaxy passing in front of a background lensed quasar, resulting in an extra amplification of its luminosity due to the gravitational lensing effect.
The aim of the project is to pursue the development of a Convolutional Neural Network (CNN) to analyze this signal. If successful on the training set, the method will be applied to real observations to obtain the first estimate of the size of a quasar accretion disk with a CNN.
URL/References:
Proposed by: Ginevra Favole (Postdoc), Jean-Paul Kneib (Faculty)
Type of project: Master project
Project flavour: simulations/galaxies
The sub-halo abundance matching (SHAM) and halo occupation distribution (HOD) models are the most used techniques to populate dark matter haloes in N-body simulations with observed galaxies. These prescriptions allow us to link the dark and visible sectors of our Universe, directly probing the stellar-to-halo-mass relation. By coupling SHAM and HOD with large-volume, high-resolution cosmological simulations, we are able to generate accurate galaxy mock catalogues, which we then use to model the galaxy clustering and weak lensing signals from galaxies. These are the two cosmological probes that ongoing and future surveys such as DESI, Euclid, 4MOST or LSST will provide.
This project aims to adapt existing SHAM and HOD software scripts to new N-body products and make them publicly available for future cosmological analyses on GitHub.
URL/References:
https://ui.adsabs.harvard.edu/abs/arXiv:astro-ph%2F0408564
https://ui.adsabs.harvard.edu/abs/2006ApJ...647..201C/abstract
https://ui.adsabs.harvard.edu/abs/2016MNRAS.461.3421F/abstract
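The core of zero-scatter SHAM fits in a few lines: rank-order haloes and galaxies and pair them rank by rank. Equal-length toy arrays are assumed; real pipelines add scatter and use (sub)halo proxies such as peak circular velocity.

```python
import numpy as np

def abundance_match(halo_mass, stellar_mass):
    """Zero-scatter SHAM sketch: the most massive halo gets the most
    massive galaxy, and so on down the ranked lists (equal-length
    arrays assumed)."""
    assert halo_mass.size == stellar_mass.size
    matched = np.empty_like(stellar_mass, dtype=float)
    matched[np.argsort(halo_mass)[::-1]] = np.sort(stellar_mass)[::-1]
    return matched  # stellar mass assigned to each halo
```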
Proposed by: Antonella Garzilli (Postdoc), Jean-Paul Kneib (Faculty)
Type of project: Master project
Project flavour: Numerical methods
Dark matter is a universal phenomenon that can be observed from the galactic to the largest observable scales. We intend to investigate the nature of dark matter from the observation of the filaments of baryons that constitute the intergalactic medium. The intergalactic medium is observed in absorption in quasar spectra. We intend to train a neural network with mock data generated from cosmological numerical simulations with different cosmologies, with cold and warm dark matter. We intend to investigate whether the classical estimators used in the Lyman-alpha forest are optimal estimators or whether other estimators could be considered.
URL/References:
https://arxiv.org/abs/0711.3358
https://arxiv.org/abs/1809.06585
Proposed by: Antonella Garzilli (Postdoc), Jean-Paul Kneib (Faculty)
Type of project: Master project
Project flavour: Numerical methods
The Universe has been filled since very high redshift with a cosmic web of baryonic filaments, called the intergalactic medium. The intergalactic medium is largely observed in absorption in the spectra of distant and bright objects such as quasars. It has been speculated that the width of the absorbing lines in the quasar spectra is due to the spatial extent of the filaments. We aim to directly investigate this statement with the use of numerical simulations. The student will analyze the outputs of existing cosmological numerical simulations and compare them with mock spectra extracted from the same simulations.
URL/References:
Proposed by: Aymeric Galan (PhD), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: data science, statistics, image processing, neural networks
Modelling strongly lensed systems is the cornerstone of a wide range of topics. There are three main components of a lens model: the mass distribution of the lensing galaxy, its surface brightness distribution, and the surface brightness distribution of the source galaxy. On top of a number of degeneracies affecting the modelling, it is sometimes hard to pinpoint which component of the model is at fault when looking at the residuals (the difference between model and data). Additionally, in the light of future large samples of data, it is important to design automated procedures to replace current visual inspection. The goal of this project is to simulate a suite of residual maps and to train a neural network to classify these maps depending on the missing component in the model.
References:
https://arxiv.org/pdf/0805.0201.pdf, https://arxiv.org/pdf/1807.09278.pdf, https://arxiv.org/pdf/1806.07897.pdf
Proposed by: Austin Peel (postdoc), Aymeric Galan (PhD), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: data science, statistics, differentiable programming, neural networks
Neural networks are increasingly employed in strong lens modelling. In this project, the goal is to train a variational autoencoder (VAE) to learn the surface brightness of typical lensed galaxies. A potential improvement might come from the recently established "disentangled VAE" architecture. With such networks, the goal would be to extract specific features of the source galaxy, such as size, orientation and ellipticity, directly from the abstract latent space defined by the VAE. Furthermore, using the differentiable programming framework JAX, the VAE could be included in a larger modelling pipeline in a modular way.
References:
https://jax.readthedocs.io/en/latest/
https://arxiv.org/pdf/1910.06157.pdf, https://arxiv.org/pdf/1709.05047.pdf, https://arxiv.org/pdf/2005.12039.pdf
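Two building blocks of a VAE written as pure JAX functions, which is what makes them easy to drop into a larger differentiable modelling pipeline; this is a sketch of the reparameterization trick and KL term, not the project's network.

```python
import jax
import jax.numpy as jnp

def reparameterize(key, mu, log_var):
    """VAE sampling step z = mu + sigma * eps, differentiable with respect
    to the encoder outputs thanks to the reparameterization trick."""
    eps = jax.random.normal(key, mu.shape)
    return mu + jnp.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    """KL(q(z|x) || N(0, I)) term of the VAE loss."""
    return -0.5 * jnp.sum(1.0 + log_var - mu**2 - jnp.exp(log_var))

# Pure functions like these compose with jax.grad and jax.jit, which is
# what allows plugging the VAE into a larger differentiable lens model.
z = reparameterize(jax.random.PRNGKey(0), jnp.zeros(8), jnp.zeros(8))
```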
Proposed by: Karina Rojas (Postdoc), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: Numerical modeling, statistics, image processing
We are currently finding strong gravitational lenses in the CFIS (Canada-France Imaging Survey) in an effort to prepare the ESA Euclid mission. These will be used as a test bench to perform massive and fast mass modeling of galaxies with the "lenstronomy" software, developed at Stanford and in part at EPFL. The work will be to implement a reliable modeling scheme and to provide a new sample of lenses with exquisite ground-based imaging. Eventually, some of these developments will be applicable to the Euclid space telescope to be launched in 2022.
Proposed by: Benjamin Clément (Postdoc), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: data science, statistics, image processing, cosmology
It is proposed to design a machine learning algorithm to find giant lensed arcs in galaxy clusters. If successful, the algorithm will set the basis for the arc-finding algorithm of the ESA Euclid space telescope to be launched in 2022. The present work will be based on existing sharp and deep Hubble images of galaxy clusters known as the "Hubble Frontier Fields". The work will involve the design of a simulation chain to help train neural networks, which will be applied to the Hubble images to validate its performance.
Proposed by: Cameron Lemon (Postdoc), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: data science, statistics, image processing, cosmology
Gravitationally lensed quasars are of utmost importance for cosmology. Measuring their time delays can indeed provide an independent measurement of the Hubble constant, provided enough systems are studied. These can be found using machine learning applied to massive ground-based surveys such as CFIS, DES, Pan-STARRS, etc. The present work will involve the construction of simulated training sets and the design of CNNs to look for new lensed quasars in existing ground-based surveys. Depending on success, this work may extend to the ESA Euclid space telescope.
URL: https://fr.wikipedia.org/wiki/Euclid_(télescope_spatial)
Proposed by: James Chan (Postdoc), Frédéric Courbin (Faculty), and Cameron Lemon (Postdoc)
Type of Project: TP4b or Master/CSE project
Project flavour: Observation/Data Science
Strong gravitationally lensed quasars provide a powerful means to study galaxy evolution and cosmology. There have been several undertakings to look for them in various surveys. In this project, we will apply an existing algorithm, CHITAH, to search for more lensed quasars in the CFIS field. CFIS is a legacy survey for the Canadian and French communities, providing excellent image quality over ~5,000 square degrees. We aim at finding new lensed quasar candidates in various surveys, in particular CFIS. This project requires coding skills and an interest in imaging analysis.
URL/References:
http://www.cfht.hawaii.edu/Science/CFIS/
Proposed by: Yves Revaz (Faculty) and Loic Hausammann (PhD)
Type of Project: Master
Project flavour: Simulation / Galaxies
The Milky Way is known to impact the evolution of its satellites through two main mechanisms: tidal stripping and ram-pressure stripping.
Both have been studied for a long time, but no one has looked at the impact of the radiation emitted by the Milky Way. This radiation (mostly in the UV band) can heat the cold gas and is expected to facilitate its stripping through ram pressure.
This project involves developing a model of the Milky Way's UV radiation based on observations, and running simulations of dwarf galaxies that include this UV through the moving-box technique.
This technique allows simulations of satellite dwarf galaxies to be run at very high resolution while still correctly taking the Milky Way into account.
URL / References:
This project takes place at the University of Geneva, at the Observatoire de Sauverny.
Proposed by Vincent Bourrier and David Ehrenreich (Unige) – Frédéric Courbin (Contact EPFL faculty).
Type of Project: Master/CSE project
Project flavour: Observation/Data Science
The Unige exoplanet group at the Geneva Observatory offers an internship in the frame of the Exoplanets on the Edge program, carried out with the HARPS-N spectrograph on a sample of transiting exoplanets at the border of the "Neptunian desert".
This mysterious feature is a lack of Neptune-size exoplanets orbiting very close to their star, which could be linked to the way such planets migrated from their original birthplace far from the star to their present location. This hypothesis can be tested by analyzing the properties of exoplanets observed today at the border of the desert, as their orbital architecture (the shape of their orbit and its orientation with respect to the star) contains the imprint of their past migration.
As a planet transits in front of its star, it leaves a specific spectroscopic signature (the so-called Rossiter-McLaughlin effect) whose properties depend on the planet’s orbital architecture. A high-resolution spectrograph like HARPS-N, devised in Geneva and installed on a 4-m class telescope, is an ideal tool to search for these signatures and get precious clues about the origins of the Neptunian desert.
The internship consists of the analysis of the HARPS-N data, the extraction of the Rossiter-McLaughlin signals, and when relevant their interpretation with a migration model currently in development within our group.