TP-IVa (autumn semester): These practical works are designed to illustrate a broad range of observational topics in astrophysics and to provide training in programming and scripting in Python. They include fitting theoretical models to data and processing astronomical images and spectra, and overall they prepare students for the research topics proposed for the following spring semester.
Registration opens on IS-Academia. Interested students should contact F. Courbin and register under their name on the platform.
TP-IVb (spring semester): EPFL’s Laboratory of Astrophysics offers students access to cutting-edge research in modern astrophysics and cosmology via practical work assignments, aiming to decipher existing observations, test established theories, and push forward innovative tools and methods to increase our understanding of the Universe.
The wide variety of these mini research projects reflects the broad range of topics and expertise covered by the Laboratory of Astrophysics.
Applications from other disciplines are welcome – if you wish to participate in a project, please contact the indicated faculty member directly or send an email to [email protected].
Note that some of the Master thesis projects can potentially be adapted into TP-IVb projects. Please check both pages!
Proposed projects
Available as 2022-2023 TP-IVb (or other 8-credit projects).
This project aims to quantify the star formation activity in a sample of galaxies located in the large-scale structure, filaments, and groups linked to the Virgo galaxy cluster. Images of the galaxies in the Halpha line, which is sensitive to the star formation rate, will be analyzed with wavelets in order to determine the number, size, and spatial distribution of their star-forming regions. The immediate and concrete goal of this project is to build and test the wavelet-analysis code. The results will then be put into an astrophysical context.
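As a minimal sketch of the wavelet-detection step, assuming the Halpha image is a 2-D NumPy array (synthetic here) and using PyWavelets and SciPy; the wavelet, decomposition level, and threshold are illustrative only:

```python
# Sketch: remove large-scale background with a wavelet decomposition, then
# threshold and label compact features as candidate star-forming regions.
import numpy as np
import pywt
from scipy import ndimage

rng = np.random.default_rng(0)
image = rng.normal(0.0, 1.0, (256, 256))      # noise background (placeholder)
image[100:104, 60:64] += 25.0                 # fake star-forming region
image[200:203, 180:183] += 30.0               # another one

# Multi-level 2-D wavelet decomposition of the image.
coeffs = pywt.wavedec2(image, wavelet="haar", level=4)

# Zero the coarse approximation so the large-scale background is removed,
# keeping only the detail (small-scale) coefficients.
coeffs[0] = np.zeros_like(coeffs[0])
detail_map = pywt.waverec2(coeffs, wavelet="haar")

# Threshold at k-sigma and label connected pixels as candidate regions.
sigma = np.std(detail_map)
mask = detail_map > 5.0 * sigma
labels, n_regions = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=np.arange(1, n_regions + 1))
print(f"{n_regions} candidate star-forming regions, pixel sizes: {sizes}")
```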
Proposed by: Jennifer Schober (SNF PRIMA Group Leader)
The infrared-radio correlation (IRRC) of star-forming galaxies can be used to estimate their star formation rate (SFR) based on the radio continuum luminosity at MHz-GHz frequencies. For application in future deep radio surveys, it is crucial to know whether the IRRC persists at high redshift z. This should be taken into account for SFR calibrations based on radio luminosity.
With a semi-analytical model of galaxies that depends on stellar mass and redshift, we can predict the radio emission along a given line of sight. We are interested in answering the following questions: How does the IRRC change at different radio frequencies? How does the spectral index evolve with redshift? What determines the non-linearity of the IRRC?
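For reference, the IRRC is commonly quantified by the parameter q_TIR = log10(L_TIR / 3.75e12 W) - log10(L_1.4GHz / W Hz^-1); the luminosity values below are made-up examples, purely to show the computation:

```python
# Sketch: compute q_TIR from a total-infrared and a 1.4 GHz luminosity.
import numpy as np

L_TIR = 3.0e37     # total infrared (8-1000 um) luminosity in W (placeholder)
L_1p4 = 1.8e22     # rest-frame 1.4 GHz luminosity in W/Hz (placeholder)

# 3.75e12 Hz normalises the bolometric IR luminosity to a monochromatic one.
q_TIR = np.log10(L_TIR / 3.75e12) - np.log10(L_1p4)
print(f"q_TIR = {q_TIR:.2f}")   # star-forming galaxies cluster around ~2.6
```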
Proposed by: Yoan Rappaz (PhD Student) & Jennifer Schober (SNF PRIMA Group Leader)
Magnetic fields are ubiquitous in the universe. They are observed on a very wide range of scales and have dynamical implications for many astrophysical processes. In particular, the origin of cosmic magnetic fields, as well as the processes that amplified them (dynamos) during the evolution of the universe, remains an open question. In the case of galaxy clusters (the largest gravitational structures in the cosmos), magnetic fields of the order of a few microGauss are observed. Various dynamo theories to explain the amplification of cosmic magnetic fields are currently under investigation. Although each of these theories studies a specific system, almost all of them take as their starting point the random and disordered motions of gas, called “turbulence”.
The study of the nature of a system’s turbulence is therefore necessary to understand the dynamics of its magnetic fields. In the case of galaxy clusters, turbulence is mainly created by dark matter halos accreting and merging during the cluster’s formation. Active Galactic Nuclei (AGNs) are also a source of turbulence and the subject of active research, as their role in the amplification of magnetic fields is difficult to establish.
The project proposed here is to study the nature, spectrum, and various characteristics of the turbulence generated by AGNs during the formation of a galaxy cluster. We will use a semi-analytical approach based on the use of Merger Tree algorithms (Modified Galform). The student will have the opportunity to study the physical theory of cosmic magnetic fields, dynamo, and turbulence, thus mixing cosmology, plasma physics, and hydrodynamics. A good knowledge of Python programming is required.
URL/References:
- arXiv:1511.04299
- arXiv:1511.04391
- arXiv:1511.04405
- arXiv:2106.11061
- arXiv:2212.06860
The EPFL Radio Waves [1] project has been constructing and developing a Small Radio Telescope (SRT) over the past year. This 1.9m parabolic dish antenna has been used to observe basic radio emission from the Milky Way and to track satellites, and will be installed on the roof of the EL-B building early this semester.
We would like to commission this SRT by measuring the Doppler spectrum of interstellar atomic hydrogen and the dynamics of galactic rotation. This project will involve observing the 21-cm hyperfine line of interstellar atomic hydrogen in various directions along the Milky Way, and designing the required software and analysis pipelines. From these observations one can measure the radial velocities of the HI clouds in the galactic disc and deduce the spiral-arm structure of the Galaxy.
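As a flavour of the analysis, a minimal sketch of the Doppler step, assuming a measured line-peak frequency (the observed value below is made up, not real SRT data):

```python
# Sketch: convert an observed 21-cm line frequency into a radial velocity.
from astropy import units as u
from astropy import constants as const

f_rest = 1420.405751768 * u.MHz     # HI hyperfine rest frequency
f_obs = 1420.15 * u.MHz             # hypothetical measured line peak

# Radio-convention Doppler velocity: v = c * (f_rest - f_obs) / f_rest
v_radial = const.c * (f_rest - f_obs) / f_rest
print(v_radial.to(u.km / u.s))      # positive = receding HI cloud
```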
[1] http://www.radiowavesepfl.ch/
Several projects in scientific computing for radio astronomy are available at: https://www.epfl.ch/research/facilities/scitas/interdisciplinary-student-projects/
Supervisor: Emma Tolley (LASTRO)
Type of Project: TP4b/Master/CSE project
Project flavor: Data Science/Software
Radio astronomy provides a powerful probe of phenomena within and beyond our solar system, allowing us to study the evolution of galaxies, search for evidence of complex molecules in space, observe radio sources associated with black holes, and map the hydrogen throughout our Universe, from nearby galaxies to the Cosmic Dawn epoch when the first stars and galaxies formed. Image synthesis in radio astronomy is done with interferometry, a powerful technique that allows antenna arrays to observe the sky at otherwise inaccessible angular resolutions and sensitivities. However, image formation is a complicated problem.
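To make the imaging problem concrete, here is a toy NumPy sketch of the conventional gridding-plus-FFT step that produces a "dirty image" (this illustrates standard interferometric imaging, not the Bluebild algorithm itself); the uv-coverage and source position are invented:

```python
# Sketch: visibilities of a single point source, gridded and inverse-FFT'd.
import numpy as np

n = 128                                    # image/grid size in pixels
rng = np.random.default_rng(1)
u_w = rng.uniform(-n / 4, n / 4, 2000)     # toy uv-coverage
v_w = rng.uniform(-n / 4, n / 4, 2000)

# Point source offset from phase centre: V(u,v) = exp(-2*pi*i*(u*l + v*m))
l0, m0 = 0.05, -0.02                       # source offset (image fractions)
vis = np.exp(-2j * np.pi * (u_w * l0 + v_w * m0))

# Nearest-neighbour gridding of the visibilities onto the uv-plane.
grid = np.zeros((n, n), dtype=complex)
iu = np.round(u_w).astype(int) % n
iv = np.round(v_w).astype(int) % n
np.add.at(grid, (iv, iu), vis)

# The inverse FFT of the gridded visibilities is the dirty image.
dirty = np.fft.fftshift(np.abs(np.fft.ifft2(grid)))
print("peak pixel:", np.unravel_index(dirty.argmax(), dirty.shape))
```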
The Bluebild algorithm [1] offers a novel approach to image synthesis, leveraging functional PCA (fPCA) to decompose the sky image into distinct energy eigenimages. The Bluebild software is under active development as part of a PASC project [2]: the algorithms are developed by the EPFL Imaging Center, the HPC tuning is done by SCITAS, and the radio-astronomy applications are tested and used by LASTRO.
In this project the student will work on the Python interface to Bluebild and its integration with other radio-astronomy data reduction pipelines. For example, an interface compatible with stimela2 will allow Bluebild to run as part of the CARACal pipeline [3], used for creating images for the MeerKAT telescope, and an interface compatible with RASCIL [4] will allow Bluebild to image mock observations of next-generation radio interferometers.
[1] https://infoscience.epfl.ch/record/269252?ln=en
[2] https://www.pasc-ch.org/projects/2021-2024/next-generation-radio-interferometry/
[3] https://github.com/caracal-pipeline/caracal
[4] https://developer.skao.int/projects/rascil/en/latest/index.html
Proposed by: Michele Bianco (Postdoc), Jean-Paul Kneib (Faculty)
Type of project: Practical works
Project flavor: Data Analysis
The Epoch of Reionization (EoR) and the Cosmic Dawn (CD) are essential periods in the history of our Universe, which include the birth of the first radiating sources that influenced the formation and evolution of all later structures. These luminous objects produced copious ultraviolet radiation (photon energies ≥ 13.6 eV) that propagated into the intergalactic medium (IGM), ultimately driving our Universe from a cold, neutral state to a hot, ionised state. This exciting period is one of the least understood epochs in the Universe’s evolution due to the lack of direct observations.
The spin-flip transition in neutral hydrogen produces a signal with a rest-frame wavelength of 21 cm, known as the 21-cm signal. The reionisation process can be probed by observing the 21-cm signal emitted during the EoR, which, due to the Universe’s expansion, is shifted to lower frequencies (redshifted). The Square Kilometre Array (SKA) experiment is an international effort to build the world’s largest radio telescope, with unprecedented sensitivity, that will provide the first direct observation of the EoR and produce high-resolution images of the 21-cm signal. A sequence of such 21-cm images observed at different frequencies constitutes a three-dimensional data set known as the multi-frequency tomographic dataset. However, the biggest challenge for the analysis of these upcoming observations is to separate the 21-cm signal from the undesired foreground and instrumental noise contamination, as these outshine the cosmological signal by several orders of magnitude.
In this project, we propose a complete study of the different approaches available for foreground subtraction in the case of the SKA-Low 21-cm tomographic dataset, including polynomial fitting, principal component analysis (PCA), independent component analysis (ICA), and wedge removal. The student will be provided with a state-of-the-art simulation of the evolving 21-cm signal as detected by the SKA-Low experiment, and will carry out a quantitative analysis of the effect of the subtraction process on the recovered 21-cm signal and power spectra.
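As a flavour of one of these methods, here is a minimal sketch of blind PCA foreground removal on a synthetic frequency-by-pixel cube (smooth power-law foregrounds plus a toy Gaussian "signal"); the band, amplitudes, and number of removed modes are illustrative only:

```python
# Sketch: PCA along the frequency axis absorbs spectrally smooth foregrounds.
import numpy as np

n_freq, n_pix = 64, 32 * 32
freqs = np.linspace(100.0, 150.0, n_freq)          # MHz, SKA-Low-like band
rng = np.random.default_rng(42)

# Foregrounds: smooth power laws, orders of magnitude brighter than signal.
amp = rng.lognormal(mean=5.0, sigma=0.3, size=n_pix)
fg = amp * (freqs[:, None] / 120.0) ** -2.7
signal = rng.normal(0.0, 1.0, (n_freq, n_pix))     # toy 21-cm fluctuations
cube = fg + signal

# Remove the dominant eigenmodes of the frequency-frequency covariance.
mean = cube.mean(axis=1, keepdims=True)
cov = np.cov(cube - mean)
eigvals, eigvecs = np.linalg.eigh(cov)
modes = eigvecs[:, -3:]                            # 3 dominant modes
cleaned = cube - mean - modes @ (modes.T @ (cube - mean))

print("residual rms:", cleaned.std(), "(input signal rms was 1.0)")
```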
Supervisor: eSpace/LASTRO (Prof. Jean-Paul Kneib/Stephan Hellmich)
Type of Project: Semester project (TP4)
Duration: 14 weeks (Official start/end date: February 20-June 2)
Submission of final report: June 19
Final Presentation: TBD
Recommended: This project is suitable for a student interested in the software design of autonomous observatories and in space surveillance and tracking, with a background in software engineering. Prior knowledge of JavaScript and Python is a plus.
CONTEXT
To characterize the physical properties of space debris, eSpace/LASTRO is currently expanding its observation capabilities. EPFL has access to the TELESTO telescope, located at the Sauverny Observatory near Geneva, which is a good observing facility. However, in its current state the telescope is not well suited to the observation of space debris: there is no possibility to target or track objects in Earth orbit. To overcome these limitations and enable TELESTO to observe space debris, the telescope control software needs to be improved. A longer-term goal is the complete automation of the telescope, and the work done in this project represents an important step towards that goal.
PROJECT SCOPE
During the project, you will familiarize yourself with the telescope and its software environment, and learn about the requirements of passive optical observations of space debris. For the telescope to be used for space debris observation, the control of the various subsystems of the facility needs to be integrated into an easy-to-use interface. The resulting software module should provide a command-line or script-based interface through which observers can create observation plans that are automatically processed by the telescope. The basic requirements for the module are pointing and tracking based on two-line elements (TLEs), camera and filter control, as well as routines for the automated acquisition of calibration frames.
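A minimal sketch of the TLE-based pointing computation, using the open-source Skyfield library (one option among others); the TLE below is a historical ISS element set used purely as a placeholder, and the observatory coordinates are approximate:

```python
# Sketch: compute the alt/az a mount would have to follow to track a TLE.
from skyfield.api import load, wgs84, EarthSatellite

ts = load.timescale()
line1 = "1 25544U 98067A   14020.93268519  .00009878  00000-0  18200-3 0  5082"
line2 = "2 25544  51.6498 109.4756 0003572  55.9686 274.8005 15.49815350868473"
sat = EarthSatellite(line1, line2, "ISS (placeholder)", ts)

observatory = wgs84.latlon(46.3, 6.1, elevation_m=465)   # approx. Sauverny
t = ts.now()

# Topocentric position of the satellite relative to the observatory.
alt, az, distance = (sat - observatory).at(t).altaz()
print(f"alt = {alt.degrees:.2f} deg, az = {az.degrees:.2f} deg, "
      f"range = {distance.km:.0f} km")    # negative alt = below the horizon
```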
Supervisor: eSpace/LASTRO (Prof. Jean-Paul Kneib/Stephan Hellmich)
Type of Project: Semester project (TP4)
Duration: 14 weeks (Official start/end date: February 20-June 2)
Submission of final report: June 19
Final Presentation: TBD
Recommended: This project is suitable for a student interested in orbital mechanics and numerical integration. Prior knowledge of Java and Python is a plus.
CONTEXT
As part of the newly established Space Sustainability Hub (SSH) at eSpace, we are exploring novel techniques for determining the rotational and physical properties of space debris. For this purpose, we are currently developing methods for the detection and extraction of space debris observations from large astronomical data archives. These archives contain observational data spanning a 10-year period and include a large number of serendipitous satellite and space debris observations. On the astronomical images, these objects appear as characteristic streaks, most of which cross the entire detector during the several minutes of exposure time. To identify the object that caused a streak, the observations are correlated with publicly available catalogs of satellites and space debris. However, due to uncertainties in the cataloged orbital elements, propagation errors, and the fact that the real orbits are constantly changing, the observations do not precisely match the cataloged orbits. In order to determine the precise observation time, the orbits from the catalogs need to be fitted to exactly match the observation.
PROJECT SCOPE
The goal of this project is to develop a method that does not rely on exact observation time stamps, but rather treats the observation time as a free parameter during the fitting process. Existing orbital mechanics libraries such as Orekit contain sophisticated orbit determination and fitting routines. These methods, however, account only for errors in the astrometry (the measured position of an orbital debris particle on the sky) and not for errors in the observation time. Due to the high resolution of the images, the astrometry is very accurate, but for objects that cross the whole field during the exposure, the precise observation time is unknown and can only be determined from the fitted orbit.
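To illustrate the core idea, here is a toy least-squares fit with SciPy in which the observation times themselves are free parameters; the idealised circular "orbit" stands in for a properly propagated one (e.g. via Orekit):

```python
# Sketch: accurate astrometry of a streak, uncertain time stamps; fit the
# times so the propagated orbit matches the measured streak endpoints.
import numpy as np
from scipy.optimize import least_squares

omega = 2 * np.pi / 5400.0            # toy orbital angular rate (rad/s)

def orbit(t):
    """Idealised circular orbit in the observation plane (arbitrary units)."""
    return np.array([np.cos(omega * t), np.sin(omega * t)])

# Measured streak endpoints, generated at times unknown to the fitter
# (t = 1000 s and 1300 s after the epoch).
measured = np.stack([orbit(1000.0), orbit(1300.0)], axis=1)

def residuals(params):
    t_start, t_end = params
    model = np.stack([orbit(t_start), orbit(t_end)], axis=1)
    return (model - measured).ravel()

# Start the time stamps 120 s off; least squares recovers the true times.
fit = least_squares(residuals, x0=[880.0, 1180.0])
print("fitted start/end times:", fit.x)   # ~ [1000, 1300]
```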
TASKS
- Familiarize yourself with existing orbit fitting techniques
- Investigate if these methods can be modified or extended to account for variable observation times
- Design and implement a method to fit orbits to streaks that cross the whole detector
Supervisor: LASTRO/eSpace (Prof. Jean-Paul Kneib/Stephan Hellmich)
Type of Project: Semester project TP-IV
Duration: 14 weeks (Official start/end date: September 19-December 22)
Submission of final report: January 15
Final Presentation: TBD
Prerequisites:
- Open to coding in Python
Context
The CHaracterising ExOPlanet Satellite (CHEOPS) is a partnership between the European Space Agency and Switzerland. It is the first S-class mission in the ESA Science Programme. CHEOPS has been flying on a Sun-synchronous low Earth orbit since December 2019, collecting millions of short-exposure science images in the visible domain to study exoplanet properties.
A small yet increasing fraction of CHEOPS images show linear streaks caused by satellites and orbital debris crossing the field of view. CHEOPS’ orbit is indeed particularly favorable for serendipitously detecting objects in its vicinity, as the spacecraft rarely enters the Earth’s shadow, sits at an altitude of 700 km, and observes within 60 degrees of the anti-Sun direction. Objects crossing the field of view are therefore illuminated nearly all the time, with small phase angles relative to the Sun, making them as bright as they can ever be. This observing configuration is quite powerful and complementary to optical observations from the ground.
In order to characterize the population of satellites and orbital debris observed by CHEOPS, every science image acquired over the past three years has been scanned with a Hough-transform algorithm to identify the characteristic linear features that these objects leave on the images. This led to the detection of numerous satellite and orbital debris observations in the CHEOPS data, which are currently being analyzed.
Project Scope
This project is intended to assess the completeness of the detections obtained with the current detection algorithm and to determine the limiting magnitude down to which CHEOPS is sensitive to space debris. To this end, the detection efficiency of the algorithm needs to be evaluated. This can be done by inserting synthetic streaks with defined brightness and random orientations into the raw data and analyzing how the detection rate drops with decreasing brightness of the synthetic streaks.
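A minimal sketch of one iteration of such a test, using scikit-image to draw a synthetic streak and a probabilistic Hough transform as a stand-in for the actual CHEOPS detection algorithm; all thresholds and amplitudes are illustrative:

```python
# Sketch: inject a synthetic streak into a noise image, then try to detect it.
import numpy as np
from skimage.draw import line
from skimage.transform import probabilistic_hough_line

rng = np.random.default_rng(7)
img = rng.normal(100.0, 5.0, (200, 200))       # toy background + read noise

# Insert a streak with random orientation and a chosen brightness.
r0, c0 = rng.integers(0, 200, 2)
r1, c1 = rng.integers(0, 200, 2)
rr, cc = line(r0, c0, r1, c1)
img[rr, cc] += 40.0                            # streak amplitude to vary

# Detect linear features on a thresholded (binary) image.
edges = img > np.median(img) + 5.0 * img.std()
segments = probabilistic_hough_line(edges, threshold=10,
                                    line_length=30, line_gap=3)
print(f"detected {len(segments)} line segment(s)")
```

Repeating this over many realisations, while stepping the streak amplitude down, traces out the detection rate versus brightness described above.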
Tasks
- Familiarize yourself with the CHEOPS data and the streak detection algorithm
- Implement a tool to insert synthetic streaks
- Evaluate the detection efficiency of the streak detection algorithm
Past projects
Supervisor: Adrien Saada (eSpace), [email protected]
Type of Project: Semester project, 1 student.
Recommended: This project is suitable for a student interested in programming for a space application, with a strong background in Python and preferably a basic knowledge of orbital mechanics.
Description: eSpace has been selected as the organization taking over and putting in place the Space Sustainability Rating (SSR), a system that will evaluate the level of sustainability of space missions. It has been developed over the last two years by a consortium of organizations including the WEF, ESA, and MIT. The objective of the SSR is to push forward sustainability in the space sector and reward operators whose missions comply with sustainability norms and guidelines.
For the moment, the Detectability, Identification and Trackability (DIT) module is computed by MIT. They created it using a functionality of the Systems ToolKit (STK) platform, a licensed software, to simulate ground stations in line of sight with satellites. This means eSpace has to send some operators’ data to the USA to obtain the DIT module score before aggregating it into the total SSR score. Some operators have strict export-control policies and do not want to share their data with eSpace if they are then exported elsewhere, so the MIT code needs to be re-implemented using open-source resources. The ultimate goal is for eSpace to compute this module in house.
You will:
● Understand the Space Sustainability Rating concept
● Understand the DIT module algorithm
● Meet the MIT student who developed the DIT code for a hand-over procedure
● Understand the MIT code that performs the computation
● Find available open-source solutions and adapt the module’s code accordingly
● Test the code and compare the result values with the MIT software
● Write user and developer documentation so eSpace can continue using the module
Proposed by: Romain Lucchesi (PhD), Nicolas Longeard (Postdoc) and Pascale Jablonka (Faculty)
Type of Project: TP4b/Master/CSE project
Project flavour: Observation/Data Science
The next generation of large spectroscopic surveys of the Milky Way (WEAVE, 4MOST, and DESI, to name a few), which aim at understanding how galaxies form and evolve, will deliver millions of spectra that need to be analyzed with methods developed in the framework of big-data tools. This project aims at developing and testing machine-learning codes capable of automatically delivering the stellar atmospheric parameters (effective temperature, gravity, chemical abundances) from high-resolution spectra.
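As an illustration of the intended setup, a minimal sketch using a small scikit-learn MLP as a stand-in for the machine-learning codes to be developed; the "spectra" and labels below are random placeholders, so the score is near zero by construction:

```python
# Sketch: multi-output regression of (Teff, logg, [Fe/H]) from pixel fluxes.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_spec, n_pix = 500, 500
spectra = rng.normal(1.0, 0.02, (n_spec, n_pix))   # placeholder fluxes
labels = np.column_stack([
    rng.uniform(4500, 6500, n_spec),               # Teff (K)
    rng.uniform(1.0, 5.0, n_spec),                 # log g
    rng.uniform(-4.0, 0.0, n_spec),                # [Fe/H]
])

X_train, X_test, y_train, y_test = train_test_split(spectra, labels,
                                                    random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300)
model.fit(X_train, y_train)
print("test R^2:", model.score(X_test, y_test))    # ~0 here: random data
```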
URL/References: https://arxiv.org/pdf/1501.07604.pdf
Proposed by: Jean-Paul Kneib (Faculty)
Type of Project: TP4b/Master/CSE project
Project flavour: Instrumentation/Observation
At the Sauverny Observatory near Versoix, we have a 70cm telescope (Telesto) hosted in an old dome that is controlled manually. The idea of this project is to investigate solutions to make the observatory more autonomous in its operation, ideally allowing remote or automatic observations.
URL/References:
- https://www.skyatnightmagazine.com/advice/automate-your-observatory/
- https://www.researchgate.net/publication/319118320_Low_cost_automated_astronomical_observatory_for_remote_operation_A_requisite_analysis
Proposed by: Romain Lucchesi (PhD) and Pascale Jablonka (Faculty)
Type of Project: TP4b/Master
Project flavour: Observation/Data reduction and analysis
The search for and the study of extremely metal-poor stars, which keep the chemical imprint of the very first generations of stars in the Universe, guide our understanding of the early build-up of galaxies and the epoch of reionization. However, these stars are exceedingly rare, and the search for them has been compared to looking for a needle in a haystack. We have designed a new and very efficient narrow-band photometric survey that preselects these extremely metal-poor candidates, which are then confirmed with high-resolution spectroscopy. This project deals with the data reduction of a new set of these outstanding systems and the analysis of their chemical patterns.
URL/References:
Proposed by: Andrei Variu (PhD) and Cheng Zhao (Postdoc)
Type of Project: TP4b
Project flavour: Numerical methods
Cosmic voids are large regions of the Universe with low matter density. These structures are influenced by the expansion of the Universe and by the clustering of matter; they can therefore be used to better understand cosmology and the large-scale structure.
The purpose of this project is to test whether numerical models could be used to extract cosmological information from the clustering of voids.
URL/References:
https://arxiv.org/abs/1904.01030
Proposed by: Martin Millon (PhD) and Frédéric Courbin (Faculty)
Type of Project: TP4b or Master/CSE project
Project flavour: Data Science/Simulation
Abstract:
Microlensing is a unique tool to study the inner structure of the accretion disks of supermassive black holes (quasars). It is produced by stars passing in front of a background lensed quasar, resulting in an extra amplification of its luminosity due to the gravitational lensing effect.
The aim of the project is to pursue the development of a Convolutional Neural Network (CNN) to analyze this signal. If successful on the training set, the method will be applied to real observations to obtain the first estimate of the size of a quasar accretion disk with a CNN.
URL/References:
Proposed by: Michele Bianco (Postdoc), Emma Tolley (Data Scientist), and Jean-Paul Kneib (Professor)
Type of Project: TP4b or Master
Project flavour: Data Science & Simulation
Cosmic reionization is studied using Radiative Transfer (RT) codes to simulate the radiative feedback from the primordial galaxies and stars in the early Universe. Part of the simulation uses a differential equation to keep track of the state and evolution of the neutral fraction of hydrogen in the IGM, based on the computed ionising radiation. These simulations are used to predict observables such as the 21-cm power spectrum, neutral hydrogen intensity mapping, the HII region size distribution, the identification of sources in the 21-cm images, etc. RT simulations ideally require huge volumes (~1 Gpc on a side) with many particles (~10^11) to reproduce the scales relevant for the 21-cm signal and capture the global reionisation history. These simulations therefore require a substantial computational effort, as the number of operations needed to solve the RT equation grows steeply with the number of simulated particles and sources.
In recent years, artificial intelligence (AI) and deep-learning techniques have proven to be powerful tools that can learn patterns and identify features in image data, or generate mock samples by learning the features of a dataset, substantially reducing the computational effort. A new class of Physics-Informed Neural Networks (PINNs) uses physics-based constraints from partial differential equations to accelerate network training and improve reliability.
We propose to develop a PINN that predicts the evolution of reionized hydrogen in our Universe. This would involve 1) studying the dynamical equations of reionization and the inputs available from simulations to create a suitable physics-constrained network, using the reionization rate as an input observable, and 2) predicting the reionization rate using PINNs or other neural network architectures.
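As a conceptual sketch, here is a PINN-style physics loss in JAX for a strongly simplified ionisation equation, dx/dt = Gamma*(1 - x) - alpha*n_H*x^2, where x is the ionised fraction; the rate coefficients, tiny network, and sampling are illustrative only, not the equations of the actual simulations:

```python
# Sketch: an MLP x(t) trained so that its derivative obeys a toy ionisation ODE.
import jax
import jax.numpy as jnp

GAMMA, ALPHA_NH = 2.0, 0.5          # toy photoionisation / recombination rates

def init_params(key):
    k1, k2 = jax.random.split(key)
    return {"w1": jax.random.normal(k1, (1, 32)) * 0.5, "b1": jnp.zeros(32),
            "w2": jax.random.normal(k2, (32, 1)) * 0.5, "b2": jnp.zeros(1)}

def x_net(params, t):
    """Small MLP mapping time t to an ionised fraction x in (0, 1)."""
    h = jnp.tanh(jnp.atleast_1d(t) @ params["w1"] + params["b1"])
    return jax.nn.sigmoid(h @ params["w2"] + params["b2"])[0]

def residual(params, t):
    """Physics residual: dx/dt minus the right-hand side of the ODE."""
    x = x_net(params, t)
    dxdt = jax.grad(lambda tt: x_net(params, tt))(t)
    return dxdt - (GAMMA * (1 - x) - ALPHA_NH * x ** 2)

def loss(params, ts):
    phys = jnp.mean(jax.vmap(lambda t: residual(params, t) ** 2)(ts))
    ic = (x_net(params, 0.0) - 1e-3) ** 2      # nearly neutral initially
    return phys + ic

params = init_params(jax.random.PRNGKey(0))
ts = jnp.linspace(0.0, 5.0, 64)
grads = jax.grad(loss)(params, ts)             # feed into any optimiser
print("initial loss:", float(loss(params, ts)))
```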
Proposed by: Jiaxi Yu (PhD), Cheng Zhao (Postdoc)
Type of Project: TP4b or Specialisation/Master
Project flavour: Observation, Data Analysis
Spectroscopic surveys measure the redshifts of millions of objects and decode cosmological information from their spatial two-point correlation function (2PCF). The redshift uncertainty is a kind of error that impacts the 2PCF on small scales. This uncertainty is inevitable because the emission and absorption lines used to determine the redshift have a finite width instead of being delta functions. The uncertainty model may differ from survey to survey.
The effect of the uncertainty on the 2PCF, especially on its quadrupole, will affect cosmological tests such as the Redshift-Space Distortion (RSD) measurement and modified-gravity tests. As spectroscopic surveys like the Baryon Oscillation Spectroscopic Survey (BOSS) and the Dark Energy Spectroscopic Instrument (DESI) aim at sub-percent-precision measurements of the cosmological parameters, the influence of this uncertainty must be taken into account in the models used for the cosmological analysis.
TP-IVb: look into the effect of the redshift uncertainty on the 2PCF, the power spectrum P(k), and higher-order statistics (3PCF and bispectrum) of galaxies and voids, as well as their cross-correlations.
Specialisation/Master project: illustrate the effect of the redshift uncertainty on galaxy clustering, and quantify its impact on the RSD measurement and modified-gravity tests.
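As a toy illustration of the effect in question, the sketch below counts pairs in a periodic box before and after scattering the line-of-sight coordinate with a Gaussian "redshift error"; a simple natural estimator stands in for the full 2PCF machinery, and all scales are arbitrary:

```python
# Sketch: small-scale clustering is damped when z-coordinates are smeared.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
box, n = 100.0, 4000

# Toy "clustered" catalogue: half the points scatter around the other half.
parents = rng.uniform(0, box, (n // 2, 3))
children = (parents + rng.normal(0, 2.0, parents.shape)) % box
data = np.vstack([parents, children])

def xi(points, r=3.0):
    """Natural estimator DD/RR - 1 for pairs closer than r (periodic box)."""
    tree = cKDTree(points, boxsize=box)
    dd = tree.count_neighbors(tree, r) - len(points)     # drop self-pairs
    rr = len(points) ** 2 * (4.0 / 3.0) * np.pi * r ** 3 / box ** 3
    return dd / rr - 1.0

# Gaussian "redshift error" applied to the line-of-sight (z) coordinate.
smeared = data.copy()
smeared[:, 2] = (smeared[:, 2] + rng.normal(0, 2.0, len(data))) % box

print("xi(<3) true:   ", xi(data))
print("xi(<3) smeared:", xi(smeared))    # noticeably lower amplitude
```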
URL/References:
The source and the measurement of the redshift uncertainty: Chapter 3.1 and Chapter 5.2 of https://iopscience.iop.org/article/10.1088/0004-6256/144/5/144
The effect of the redshift uncertainty on galaxy clustering: Chapter 2.1 and Figure 6 of https://academic.oup.com/mnras/article/499/1/269/5908387
The modified gravity tests using small-scale galaxy clustering: Chapter 2.2, 3.1 and 3.3 of https://iopscience.iop.org/article/10.1088/1475-7516/2021/11/050
Proposed by: Antonella Garzilli (Postdoc), Jean-Paul Kneib (Faculty)
Type of Project: TP4b or Specialization/Master
Project flavour: Numerical methods
Dark matter is a universal phenomenon that can be observed from galactic scales up to the largest observable scales. We intend to investigate the nature of dark matter from observations of the filaments that constitute the intergalactic medium, which is observed in absorption in quasar spectra. We will train a neural network with mock data generated from cosmological numerical simulations of different cosmologies with cold and warm dark matter, and investigate whether the classical estimators used in the Lyman-alpha forest are optimal or whether other estimators should be considered.
URL/References:
https://arxiv.org/abs/0711.3358
https://arxiv.org/abs/1809.06585
Proposed by: Antonella Garzilli (Postdoc), Jean-Paul Kneib (Faculty)
Type of Project: TP4b or Specialisation/Master
Project flavour: Numerical methods
Since very high redshift, the Universe has been filled with a cosmic web of baryonic filaments called the intergalactic medium. The intergalactic medium is largely observed in absorption in the spectra of distant, bright objects such as quasars. It has been speculated that the width of the absorption lines in quasar spectra is due to the spatial extent of the filaments. We aim to investigate this statement directly with numerical simulations. The student will analyze the outputs of existing cosmological numerical simulations and compare them with mock spectra extracted from the same simulations.
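As a quick numerical reference point, assuming a typical photoionised IGM temperature, the purely thermal Doppler width of a hydrogen line is easy to compute; line widths above this value may carry the imprint of the filaments' spatial extent and peculiar velocities:

```python
# Sketch: thermal Doppler parameter b = sqrt(2 k T / m_H) for IGM gas.
import numpy as np
from scipy.constants import k, m_p

T = 1.0e4                                   # K, typical photoionised IGM
b = np.sqrt(2 * k * T / m_p)
print(f"thermal b ~ {b / 1e3:.1f} km/s")    # ~12.9 km/s
```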
URL/References:
https://arxiv.org/abs/0711.3358
https://arxiv.org/abs/1502.05715
Proposed by: Aymeric Galan (PhD) and Frédéric Courbin (Faculty)
Type of Project: TP4b
Project flavour: Data analysis & Numerical methods
When modelling a gravitationally lensed system, many degeneracies can arise between components of the model. More specifically, it is challenging to disentangle complex structures in the source galaxy from complex structures in the mass distribution of the lens galaxy. Here we propose to use a novel source reconstruction technique to assess at which level we can trust our model, depending on the object being modelled and the observing conditions. Using a multi-scale approach based on wavelets, the goal is to explore model quality and the degeneracies arising at different spatial scales: 1) at which scales can we trust a source model? 2) at which scales can the source-mass degeneracy be mitigated?
URL/References:
https://arxiv.org/abs/2012.02802
Proposed by: Aymeric Galan (PhD) and Frédéric Courbin (Faculty)
Type of Project: TP4b
Project flavour: Data simulation & Numerical methods
To model a strongly lensed galaxy, we usually describe both the foreground and background galaxies by analytical profiles. Sometimes the foreground galaxy is large and its light overlaps with the lensed arcs from the background source. If not accounted for, the light in a given pixel might not be attributed to the correct object in the system. This can lead to biases in the model parameters, beyond the parameters of the light profiles alone. If such biases propagate to the parameters describing the mass of the foreground galaxy, systematic errors can arise in the final inference, depending on the application: effective radius, velocity dispersion, dark matter content, mass-to-light ratio, or even the Hubble constant. The goal of this project is to simulate images of strongly lensed galaxies with various degrees of light blending, and to study different modelling assumptions to assess biases in the recovered parameters.
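A minimal sketch of the simulation step, using astropy's Sersic2D models to blend a bright, extended foreground profile with a fainter offset source; all parameter values are illustrative:

```python
# Sketch: simulate light blending between a lens galaxy and a lensed source.
import numpy as np
from astropy.modeling.models import Sersic2D

y, x = np.mgrid[0:128, 0:128]

# Bright, extended foreground (lens) galaxy at the image centre.
lens_light = Sersic2D(amplitude=10.0, r_eff=20.0, n=4.0,
                      x_0=64.0, y_0=64.0, ellip=0.2, theta=0.5)
# Fainter background source, offset so its light overlaps the lens.
source_light = Sersic2D(amplitude=2.0, r_eff=5.0, n=1.0,
                        x_0=85.0, y_0=70.0)

image = lens_light(x, y) + source_light(x, y)
blend_fraction = source_light(x, y).sum() / image.sum()
print(f"source contributes {blend_fraction:.1%} of the total flux")
```

Varying the source offset and relative amplitudes gives the "various degrees of light blending" mentioned above.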
Proposed by: Gianluca Castignani (Postdoc) and Pascale Jablonka (Faculty)
Type of Project: TP4b or Master project
Project flavour: Observation/Numerical methods
Galaxy clusters are the most massive gravitationally bound structures in the Universe. They are essential laboratories to study galaxy evolution in extreme conditions and to constrain cosmological parameters and their evolution. They are among the primary targets of future missions such as Euclid and the LSST. This project will focus on the implementation and application of a newly developed method to search for distant clusters, the wavelet-based Poisson Probability Method. The method will be applied to search for distant clusters in ongoing spectro-photometric catalogs, and to existing cluster catalogs, in preparation for the exploitation of next-generation spectrographs such as MOONS, in which Switzerland and EPFL play a central role.
URL/References:
Proposed by: Martin Millon (PhD) and Frédéric Courbin (Faculty)
Type of Project: TP4b or Master/CSE project
Project flavour: Observation/Data Science
Quasar accretion disks are known to be the most luminous objects in the Universe. They are powered by matter falling onto a central black hole, releasing gravitational energy in the form of radiation.
As the high-energy photons emitted at the center travel across the disk, they trigger delayed emission at longer wavelengths. Reverberation mapping consists of measuring the time delays between different spectral bands, which correlate with the physical size of the accretion disk.
This project will involve the reduction of the high-cadence data currently being taken at the Euler 1.2m Swiss Telescope and the measurement of the time delays between the different filters. It will require applying existing techniques as well as developing new methods to measure time delays between distorted light curves.
URL/References:
https://cosmograil.epfl.ch/data
Proposed by: Martin Millon (PhD), Eric Paic (PhD) and Frédéric Courbin (Faculty)
Type of Project: TP4b or Master/CSE project
Project flavour: Data Science/Simulation
Microlensing is a unique tool to study the inner structure of a black hole’s accretion disk. It is produced by stars passing in front of a background quasar, resulting in an extra magnification due to the gravitational lensing effect.
The aim of the project is to build a neural network to extract the information contained in the microlensing signal, in order to measure the accretion disk’s size and the properties of the stars passing in front of the background quasar.
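As a starting point, a minimal Keras sketch of a 1-D convolutional regressor on light curves; the curves and the "disk size" labels below are random placeholders, and the architecture is only indicative:

```python
# Sketch: 1-D CNN mapping a light curve to a single regression target.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

rng = np.random.default_rng(0)
n_curves, n_epochs = 1000, 500
light_curves = rng.normal(0.0, 1.0, (n_curves, n_epochs, 1))  # placeholders
disk_size = rng.uniform(0.1, 10.0, n_curves)                  # toy targets

model = keras.Sequential([
    keras.Input(shape=(n_epochs, 1)),
    layers.Conv1D(16, 9, activation="relu"),
    layers.MaxPooling1D(4),
    layers.Conv1D(32, 9, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1),                        # regress the accretion-disk size
])
model.compile(optimizer="adam", loss="mse")
model.fit(light_curves, disk_size, epochs=2, batch_size=32, verbose=0)
```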
URL/References:
Proposed by: Martin Millon (PhD), Eric Paic (PhD) and Frédéric Courbin (Faculty)
Type of Project: TP4b
Project flavour: Data Science/Simulation
Time-delay cosmology with multiply-imaged quasars is a promising probe to achieve a precise measurement of the Hubble constant. This technique is based on the measurement of time-delays between the different images created by a strong gravitational lens.
This state-of-the-art method consists of monitoring lensed quasars over several years in order to produce long light curves, from which the optimal time shift between the different images can be found. A precise and accurate measurement of the time delays is critical, as the errors propagate directly to the final estimate of the Hubble constant.
This project aims to assess the reliability of the current curve-shifting technique, namely PyCS, on a simulated set of light curves and on real data taken at the Euler 1.2m Swiss telescope, located at the La Silla Observatory in Chile.
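To illustrate the basic curve-shifting idea (PyCS itself uses more sophisticated estimators), here is a toy sketch that slides one simulated light curve in time on an interpolated grid and minimises the mismatch; the true injected delay is 12 days:

```python
# Sketch: recover a time delay between two noisy, irregularly sampled curves.
import numpy as np

rng = np.random.default_rng(5)
t = np.sort(rng.uniform(0.0, 400.0, 150))           # irregular sampling
intrinsic = lambda tt: np.sin(tt / 30.0) + 0.5 * np.sin(tt / 11.0)
curve_a = intrinsic(t) + rng.normal(0.0, 0.05, t.size)
curve_b = intrinsic(t - 12.0) + rng.normal(0.0, 0.05, t.size)  # delayed copy

def cost(delay):
    """Mean squared mismatch after shifting curve B back by `delay`."""
    shifted = np.interp(t, t - delay, curve_b)      # B evaluated at t + delay
    return np.mean((curve_a - shifted) ** 2)

delays = np.linspace(0.0, 30.0, 301)
best = delays[np.argmin([cost(d) for d in delays])]
print(f"recovered delay: {best:.1f} days (true: 12)")
```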
URL/References:
Liao et al. (2014) : https://arxiv.org/abs/1409.1254
Bonvin et al. (2015) : https://arxiv.org/abs/1506.07524
Proposed by: Loic Hausammann (PhD) and Yves Revaz (Faculty)
Type of Project: TP4b
Project flavour: Simulation / Galaxies
The Milky Way is known to impact the evolution of its satellites through two main mechanisms: tidal stripping and ram-pressure stripping.
Both have been studied for a long time, but no one has looked at the impact of the radiation emitted by the Milky Way. This radiation (mostly in the UV) can heat the cold gas and is expected to facilitate its stripping through ram pressure.
This project involves developing a model of the Milky Way’s UV radiation based on observations, and running simulations of dwarf galaxies that include this UV field through the moving-box technique.
This technique allows simulations of satellite dwarf galaxies to be run at very high resolution while still taking the Milky Way correctly into account.
URL/References:
Proposed by: Cameron Lemon (Postdoc), Frédéric Courbin (Faculty), James Chan (Postdoc)
Type of Project: Masters/CSE project (also adaptable for TP4b)
Project flavour: Observation/Data Science
Gravitationally lensed quasars are valuable probes for studying astrophysics and cosmology. In particular, the magnification effect allows us to observe and study supermassive black holes only a few hundred million years after the start of the Universe. The number and growth rates of these quasars have significant consequences for our understanding of the reionisation of the Universe and of black hole formation and growth mechanisms; however, only a few such systems are known. This project will aim to find these elusive systems using a spectral energy distribution model-fitting approach based on multi-wavelength broadband photometric data. Once candidates are identified, pixel modelling will be used to select the most promising systems for spectroscopic follow-up.
URL/References:
https://ui.adsabs.harvard.edu/abs/2019ApJ...870L..12P/abstract
https://ui.adsabs.harvard.edu/abs/2019ApJ...870L..11F/abstract
https://ui.adsabs.harvard.edu/abs/2020ApJ...889...52P/abstract
Proposed by: Aymeric Galan (PhD), Austin Peel (Postdoc), Frédéric Courbin (Faculty)
Type of project: Practical works (TP4)
Project flavor: Python programming, automatic differentiation
Gravitational lensing offers a unique window on the Dark Matter and Dark Energy content of the Universe. The deflections of light rays from a distant source by an intervening galaxy, caused by both its luminous and dark mass content, allow us to “see” dark matter and study its properties. If the distant source is also variable, like a quasar, the resulting gravitational lens can be used to measure the expansion rate of the Universe, and thus to study Dark Energy.
We recently developed a new gravitational lens modeling code, fully written in Python, which uses automatic differentiation with JAX to improve both model quality and computation time. In this project, the goal is to add yet another feature to the code: the ability to model the light of distant quasars. This project is heavily programming-oriented, focused on an efficient implementation to improve the modeling of lensed quasars. The student should already be familiar with Python and git, and have some basic understanding of, or interest in, novel machine-learning techniques.
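As a taste of the approach (a toy sketch, not taken from the actual code), here is what automatic differentiation buys us: gradients of a lens-equation residual with respect to every model parameter in one call, shown for a singular isothermal sphere (SIS) deflector with synthetic data:

```python
# Sketch: differentiate a toy SIS lens model with JAX for gradient-based fits.
import jax
import jax.numpy as jnp

def sis_deflection(theta, theta_E):
    """SIS deflection: alpha = theta_E * theta / |theta|."""
    return theta_E * theta / jnp.linalg.norm(theta)

def image_to_source(theta, params):
    """Lens equation beta = theta - alpha(theta - center)."""
    return theta - sis_deflection(theta - params["center"], params["theta_E"])

def loss(params, theta_obs, beta_true):
    beta = jax.vmap(lambda th: image_to_source(th, params))(theta_obs)
    return jnp.mean(jnp.sum((beta - beta_true) ** 2, axis=-1))

params = {"theta_E": 1.2, "center": jnp.zeros(2)}
theta_obs = jnp.array([[1.5, 0.3], [-1.1, -0.9], [0.2, 1.8]])   # synthetic
beta_true = jnp.array([[0.2, 0.04], [-0.3, -0.2], [0.03, 0.25]])

# Gradients w.r.t. all parameters at once: the key ingredient for fast,
# gradient-based lens modeling.
grads = jax.grad(loss)(params, theta_obs, beta_true)
print(grads)
```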
Resources:
- Example of an implementation of quasar modeling: lenstronomy
- Lensed quasar modeling techniques: Ding et al. 2021
- Lensed quasar modeling for measuring the Hubble constant: Shajib et al. 2020