Master thesis subjects 2026


CMS: Exploring the Energy Frontier at the LHC 

The Large Hadron Collider (LHC) at CERN in Geneva is the largest particle accelerator in the world. It collides protons at a centre-of-mass energy of 13.6 TeV, probing physics at the highest energy scales ever achieved in collider experiments. The biggest success of the LHC so far was the discovery of the Higgs boson by the ATLAS and CMS experiments in 2012, an elementary particle whose study is crucial for the understanding of the origin of elementary particle masses. Since 2022, the LHC has resumed operations at even higher collision rates, which provides plenty of opportunities to improve precision measurements of elementary particles as well as to extend the reach of searches for new particles beyond the standard model (SM) of particle physics. 

The experimental particle physics group at UGent is involved in data analysis with the Compact Muon Solenoid (CMS) detector, covering a wide range of topics including precision measurements of SM processes, new physics searches, and calibration of detectors and reconstruction methods. All data analysis projects make use of the Python programming language and employ specific high-energy-physics software packages. Some basic knowledge of Python (or another programming language) is required, while more advanced programming skills will be taught during the project. After the completion of one of the data analysis projects, the students will have:

  • Gained insight into the design, technology, and software of one of the most advanced particle detectors in the world;
  • Improved their programming skills in the field of data analysis and visualization, possibly including machine-learning techniques;
  • Gained experience working in an international environment at the frontier of fundamental research.

We offer all master's students who join the CMS analysis group the possibility of staying at CERN in Geneva, Switzerland, for a summer project.

Toponium in four-top production: Searching for bound states at the high-energy frontier

The top quark is the only quark that typically decays before it can hadronize into bound states. However, recent CMS and ATLAS results have identified an excess near the top quark pair production threshold, providing the first evidence for toponium bound states. While four-top quark production has finally been observed at the LHC, it remains a rare and complex process. Currently, we lack a detailed understanding of how toponium formation manifests within these four-top events, which could serve as a unique laboratory for testing the Standard Model at extreme scales. We will investigate the signature of toponium within the four-top quark production process, specifically targeting multilepton final states (events with two same-sign, three, or four leptons). These channels offer a high signal-to-background ratio, making them the “golden channels” for four-top quark studies. You will analyze CMS Run 3 data to search for the toponium threshold enhancement while accounting for the complex combinatorial background of multiple top decays. This involves using Monte Carlo simulations to model spin correlations and developing dedicated selection criteria to isolate the toponium component from the four-top continuum.
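The channel definitions above can be made concrete with a short sketch in plain Python (an illustrative toy, not CMS analysis code; the event representation is a hypothetical simplification):

```python
# Illustrative toy (not CMS software): classify an event into the multilepton
# "golden channels" used in four-top analyses. Each lepton is a hypothetical
# (flavour, charge) tuple, e.g. ("mu", +1).

def classify_event(leptons):
    """Return the analysis channel for a list of (flavour, charge) leptons."""
    n = len(leptons)
    if n >= 4:
        return "4L"                      # four or more leptons
    if n == 3:
        return "3L"                      # trilepton channel
    if n == 2:
        if leptons[0][1] == leptons[1][1]:
            return "2LSS"                # two same-sign leptons
        return "2LOS"                    # opposite-sign: not a golden channel
    return "other"

print(classify_event([("mu", +1), ("e", +1)]))  # -> 2LSS
```

In the real analysis the same logic is applied only after lepton identification, isolation, and charge-measurement requirements.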

Figure: (left) Graphical representation of a recorded candidate event with four-top quark production at CMS. (right) Observation of a pseudoscalar excess near the top quark pair production threshold at CMS.

Skills obtained: advanced data analysis in high-energy physics, proficiency in Python-based analysis frameworks, expertise in multilepton event reconstruction, statistical interpretation of LHC data, active engagement within the international CMS collaboration.

References:
CMS Collaboration, Observation of four top quark production in proton-proton collisions at sqrt(s) = 13 TeV, Phys. Lett. B 847 (2023) 138290.
CMS Collaboration, Observation of a pseudoscalar excess at the top quark pair production threshold, Rep. Prog. Phys. 88 (2025) 087801.

Promoter: Prof. Didar Dobur
Supervisors: Dr. Kirill Skovpen, Amber Cauwels
Contact | Topic 50076 on PLATO

Exploring a unique and brand new dataset: forward physics in proton–oxygen collisions

In 2025, the Large Hadron Collider (LHC) performed a unique series of light-ion collision runs, including proton–oxygen (pO), oxygen–oxygen (OO), and neon–neon (NeNe) collisions. These datasets provide a rare opportunity to study hadronic interactions involving light nuclei at unprecedented energies. Such measurements are not only relevant for collider physics, but also play a crucial role in improving hadronic interaction models used to interpret extensive air showers produced by high-energy cosmic rays in the Earth’s atmosphere. The CMS experiment is particularly well suited for these studies thanks to a set of dedicated forward detectors: the Forward Shower Counters (FSC), the Zero Degree Calorimeter (ZDC), and the Precision Proton Spectrometer (PPS). Only a few analyses are able to use these detectors; thanks to the special configuration of the LHC during this data-taking period, these important systems could be installed. In this master’s thesis project, the student will focus on the analysis of proton–oxygen collision data recorded by CMS. The primary goal will be to use data from the forward detectors to study specific event classes, such as diffractive interactions or nuclear dissociation of the oxygen ion. This will allow the student to investigate how forward activity correlates with central particle production and to compare the measurements with state-of-the-art Monte Carlo models. The thesis will involve modern data analysis techniques used in experimental particle physics, including event selection, efficiency corrections, systematic uncertainty evaluation, and comparison with Monte Carlo simulations. The project offers hands-on experience with real LHC data and special detectors that few people can use for their analysis. This thesis will contribute to an active and brand new physics program within the CMS Collaboration, with high relevance for both the collider and cosmic-ray physics communities.

Figure 1: The special detectors installed within 220 m of CMS that are the main focus of this project, with the PPS in light blue, the FSC in dark blue, and the ZDC in green.
Figure 2: Left: The FSC during a test beam; this detector was specifically installed for the oxygen runs. Right: An overview of cosmic-ray results. As can be seen in the red highlighted area, analyses using different methods and different hadronic models obtain vastly different results, while we expect them to agree. Results of this master thesis can help to greatly improve this picture in the future.

Promoter: Prof. Didar Dobur
Supervisor: David Marckx
Contact | Topic 50077 on PLATO

Top quarks as proton probes: Enhancing PDF determinations with differential cross section measurements

What’s in a name, Juliet once asked. In the case of the LHC, it is Large Hadron Collider, referring to the protons it collides most of the time. One typically thinks of a proton as two up quarks and one down quark, but in reality the proton is more like a sea of many quarks and gluons (collectively called partons) continuously popping up and disappearing. This sea is described with parton distribution functions, or PDFs: distributions of each type of parton inside the proton as a function of the fraction of the total proton momentum they carry. PDFs are essential ingredients to predict cross sections in proton collisions, but unfortunately they are so far not calculable from first principles. Instead, they have to be determined from experiments. The LHC provides a large dataset of top quarks, which can provide valuable information about the PDFs at high energies.
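To make the idea tangible, here is a toy PDF-like shape in Python (an illustrative assumption, not any real fitted PDF; the parameters A, a, and b are arbitrary):

```python
# Toy parameterization of a parton distribution, x*f(x) = A * x**a * (1-x)**b,
# a functional form commonly used as a starting point in PDF fits.
# The default values of A, a, b are arbitrary illustrations, not fit results.

def xf(x, A=1.0, a=-0.1, b=3.0):
    """Evaluate the toy shape x*f(x) at momentum fraction 0 < x < 1."""
    return A * x**a * (1.0 - x)**b

# With a < 0 the distribution rises towards small momentum fractions,
# qualitatively like the gluon PDF:
print(xf(0.001) > xf(0.1))  # -> True
```

Global fits determine such shape parameters (in far more flexible parameterizations) from data; differential top quark measurements enter such fits as additional constraints.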

In this project we will study the production of single top quarks, which is interesting for PDF determinations. We will focus on differential measurements of kinematic quantities in single top quark production, since these are particularly sensitive to the momentum of the initial state partons. The main goal of this project is to study different distributions and how they can contribute to global PDF fits by running the PDF fitting tools ourselves. We will identify the most promising candidates for differential measurements and implement them for data analysis in an ongoing CMS measurement.

Figure: Cross section measurements used to determine the proton PDFs by the NNPDF Collaboration as a function of momentum fraction of the proton and collision energy. From: http://nnpdf.mi.infn.it.

Promoter: Prof. Didar Dobur
Supervisors: Maarten de Coen, Dr. Jan van der Linden
Contact | Topic 50078 on PLATO

Boosted-Top-Tagging with GloParT: A new window to four-top production 

The top quark is the heaviest known elementary particle with a mass of about 173 GeV; consequently, the simultaneous production of four top quarks (tt̄tt̄) is an extremely rare process at the LHC. Still, the UGent CMS analysis group achieved the first observation of tt̄tt̄ production by analysing events with two same-sign, three, or four electrons and/or muons (so-called “multilepton” events) [Phys. Lett. B 847 (2023) 138290]. In this project, we will build upon this great success and attempt to extend the phase space in which we measure tt̄tt̄. For this we will target events in which the top quarks can be boosted, which means that the decay products are closely collimated and produce a single particle jet instead of three resolved jets. We will explore whether this approach is promising and whether it can serve as a complementary approach to the other measurements of tt̄tt̄. This project includes a detailed study of how we can exploit the state-of-the-art jet-flavour-tagging machine learning algorithms provided by the CMS experiment. The most recently developed boosted-jet tagger, the Global Particle Transformer (GloParT), aims to tag hadronic and leptonic final states from the decays of top quarks and W, Z, and H bosons whose decay products are fully clustered into a large-radius jet. This is exactly what we need for this analysis; let’s find out if it works!
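Whether a top quark counts as "boosted" can be estimated with a standard back-of-the-envelope rule: its decay products spread over an angular region of roughly ΔR ≈ 2m/pT, so they fit inside a single large-radius jet once the transverse momentum is high enough. A minimal sketch (the jet radius of 0.8 is an assumption, matching a typical CMS large-radius jet):

```python
# Back-of-the-envelope estimate: the decay products of a top quark with
# transverse momentum pt spread over an angle of roughly dR ~ 2*m/pt.

M_TOP = 172.5  # approximate top quark mass in GeV

def approx_decay_cone(pt):
    """Approximate angular spread (dR) of the top decay products."""
    return 2.0 * M_TOP / pt

def is_boosted(pt, jet_radius=0.8):
    """True if the decay products are expected to fit in one large-radius jet."""
    return approx_decay_cone(pt) < jet_radius

print(is_boosted(300.0))  # -> False (dR ~ 1.15)
print(is_boosted(500.0))  # -> True  (dR ~ 0.69)
```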

Figures displaying a boosted top quark (left) and a resolved top quark (right).

Promoter: Prof. Didar Dobur
Supervisors: Amber Cauwels, Dr. Jan van der Linden
Contact | Topic 50079 on PLATO

How shiny are top quarks: Measuring the interplay between the heaviest particle and the massless photon with the highest-energy LHC data

The production of top quark-antiquark pairs (tt̄) is one of the most common processes at the LHC. Top quark production is so abundant that the LHC is even nicknamed “top quark factory”, and tt̄ measurements are among the most precise cross section measurements ever performed by the CMS Collaboration. To study the electroweak interactions of the top quark, measurements of the production in association with a vector boson are a crucial tool. In this project, the CMS data collected in 2024 will be analysed to identify tt̄ production events that contain one or more additional photons (tt̄+γ). This will include tackling the challenge of accurately reconstructing the tt̄ system in the presence of a photon. The goal is to perform the first cross section measurement of tt̄+γ and tt̄+γγ production at a center-of-mass energy of 13.6 TeV, and to constrain modifications introduced by physics beyond the standard model to the coupling between photons and top quarks.

Fig.1: Feynman diagrams for tt̄+γ production, where the photon originates either from a top quark (left), an incoming quark (middle), or a charged lepton in the top quark decay (right). From: JHEP 05 (2022) 091.

Fig. 2: The distribution of the photon transverse momentum in tt̄+γ events, shown for standard model (SM) couplings between photon and top quark (yellow filled area) and for different scenarios with modified couplings induced by new-physics at a higher energy scale (red and blue lines). From: JHEP 05 (2022) 091.

Promoter: Prof. Didar Dobur
Supervisors: Jules Vandenbroeck, Dr. Beatriz Ribeiro Lopes
Contact | Topic 55080 on PLATO

Larger jets catch more Higgs bosons: study of the Higgs-charm coupling using large radius jets and Data Scouting techniques 

A central part of the LHC physics program is the measurement of the Higgs boson couplings to other particles. Higgs couplings to second-generation quarks (e.g., the charm quark) have not yet been experimentally confirmed. The associated production of a Higgs boson with a Z or W boson (collectively VH) provides the most sensitive results on the Higgs-charm coupling so far. While final states with the vector boson decaying to leptons have already been studied, the hadronic final state remains unexplored at the CMS experiment. The main challenge of this final state is the large background from multi-jet production events. Additionally, due to constraints on the maximum event rate, the standard online selection filters only keep events with very high-momentum jets, rejecting most of the potential signal candidates. The project will focus on exploring the potential of “Data Scouting” to measure this process. Data Scouting enables the storage of interesting proton-proton collision events that would otherwise be rejected by the normal online selection filters, for example events with lower jet momentum. The signature of such events consists of two back-to-back jets of particles covering a wide solid angle (large-radius jets), each containing the decay products of the Higgs or the vector boson. Preliminary results using a fraction of the data collected during Run 3 will be extracted in this project.

Figure: (left) Couplings of the SM particles to the Higgs boson as a function of particle mass, with a circle indicating the expected coupling of the charm quark, which has yet to be observed. (right) Schematic diagram illustrating the topology of the VH fully hadronic events to be studied.

Promoter: Prof. Didar Dobur
Supervisors: David Kavtaradze, Dr. Beatriz Ribeiro Lopes
Contact | Topic 50081 on PLATO

Top future: Study of the top quark reconstruction at future-collider experiments at CERN

The study of the properties of the heaviest elementary particle in the Standard Model, the top quark, is a central objective of current experiments at the CERN Large Hadron Collider. Future collider projects aim to significantly improve the precision of these measurements and enhance the sensitivity to possible effects of physics beyond the Standard Model. In this context, the Future Circular Collider (FCC), a proposed next-generation collider at CERN, offers a particularly clean environment through high-energy electron–positron collisions. The goal of this project is to perform an Effective Field Theory (EFT) [1] study of the coupling of the top quark to the Z boson and the photon [2] at the FCC. Such interactions are sensitive probes of new physics and can reveal subtle deviations from Standard Model predictions. To achieve this, the student will investigate how EFT operators modify top-quark pair production in electron–positron collisions. A key component of the project will be the development of a new analysis methodology tailored to the FCC environment. The student will design techniques that improve sensitivity to EFT effects, including top-quark reconstruction optimized for FCC conditions. The student will generate and analyze simulated datasets [3], reconstruct the relevant physics observables, and evaluate the sensitivity of the FCC to anomalous top Z/photon couplings. The analysis will be performed using the ROOT data analysis framework with Python-based tools for data processing, visualization, and statistical interpretation. This work will contribute to assessing the potential of the FCC to probe new physics effects.

Figure: (left) Measured differential cross section of top quark pair production with a photon at ATLAS. (right) Representative Feynman diagram with the top quark pair production at FCC-ee.

Skills obtained: advanced data analysis in high-energy physics, proficiency in Python-based analysis frameworks, expertise in event reconstruction and analysis interpretation.

Promoter: Prof. Didar Dobur
Supervisors: Jules Vandenbroeck, Dr. Kirill Skovpen
Contact | Topic 50082 on PLATO

Heavy neutrino hunt: Optimizing the Search for Heavy Neutral Leptons in the Minimal Left–Right Symmetric Model 

Neutrinos are weakly interacting particles whose masses are not explained in the Standard Model of particle physics. A model that embeds the neutrino masses is the minimal Left–Right Symmetric Model (LRSM). In this model, parity symmetry is restored at high energies by extending the gauge group of the SM and introducing right-handed counterparts to the left-handed weak interactions. Consequently, new gauge bosons such as the heavy right-handed (WR) boson and heavy right-handed neutrinos, often called heavy neutral leptons (HNLs), appear in the spectrum. At the Large Hadron Collider (LHC), these particles could be produced through processes involving the heavy WR boson. A particularly interesting regime arises when the HNL is significantly lighter than the WR, where the HNL can be produced with a large Lorentz boost and acquire a macroscopic lifetime. As a result, it may travel a measurable distance before decaying inside the detector, giving rise to displaced signatures. Because the HNL is highly boosted, its decay products can be collimated, leading to a characteristic boosted jet containing a displaced lepton. The goal of this thesis is to study and optimize the search for heavy neutral leptons in the displaced regime within the minimal LRSM framework. To achieve this, reconstruction strategies will be developed to identify displaced leptons within boosted jets in simulated LHC events. Attention will be given to optimizing the event selection to maximize the sensitivity to the signal. In addition, the project will investigate data-driven techniques for estimating the relevant backgrounds, especially processes that can produce displaced or non-prompt leptons and therefore mimic the signal signature.
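The displaced signature follows directly from relativistic kinematics: the mean lab-frame decay length is L = βγcτ, so a light, energetic HNL travels a macroscopic distance before decaying. A hedged numerical sketch (the energy, mass, and lifetime below are illustrative assumptions, not model predictions):

```python
# Mean lab-frame decay length L = beta * gamma * c * tau for a long-lived
# particle. All numbers in the example are illustrative assumptions.
import math

C = 299792458.0  # speed of light in m/s

def decay_length(energy_gev, mass_gev, tau_s):
    """Mean decay length in metres for given energy, mass, and proper lifetime."""
    gamma = energy_gev / mass_gev
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return beta * gamma * C * tau_s

# A 10 GeV HNL carrying 1 TeV of energy (gamma = 100) with a 10 ps proper
# lifetime decays on average about 30 cm from its production point:
print(round(decay_length(1000.0, 10.0, 1e-11), 2))  # -> 0.3
```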

Figure: Representative Feynman diagram with the production of a WR boson and its decay to a heavy neutrino.

Figure: (left) Schematic representation of the heavy neutrino event in the detector. (right) LHC sensitivity to the signal in the M(WR)–m(N) plane. Extracted from Nemevšek et al., Phys. Rev. D 97, 115018 (2018).


Promoter: Prof. Didar Dobur
Supervisors: Adina Maria Tomaru, Dr. Kirill Skovpen
Contact | Topic 50083 on PLATO

SHiP: Searching for the Hidden Universe

The SPS is the second-largest accelerator at CERN. Beyond feeding the LHC, it will be used to conduct high-intensity searches for new physics unattainable at the LHC, leveraging a beam-dump configuration at the Search for Hidden Particles (SHiP) experiment. Starting in 2032, it will cover searches for extremely rare events such as interactions of dark-sector mediator particles, the origin of neutrino masses, the reasons for the baryon asymmetry, as well as signs of extra symmetries of nature lying beyond the reach of any other facility. SHiP will also allow the detailed study of the seldom-observed tau neutrino by increasing the number of available events from about 26 today to several tens of thousands.

The UGent experimental particle physics group is involved in the development and prototyping of innovative new detectors to enable the observation of new physics and tau neutrinos at SHiP, together with studies of both signal and background needed to ensure the experiment covers its full program; both sides of the program work together to optimise the overall experimental performance. The detector work is done in a laboratory using a variety of detector technologies such as photosensors, scintillators, or gaseous detectors. Analyses are conducted in either Python or C++, with some basic knowledge of programming required for analysis-focused projects. The completion of a detector project within SHiP will provide the students with:

  • Experience with advanced particle detection techniques 
  • Expertise with fast readout electronics and data acquisition 
  • Contributions to future detector systems within an international collaboration 

The completion of an analysis project within SHiP will benefit the students through: 

  • Insights into new physics modelling and searches 
  • Improved programming skills for data analysis, visualisation and simulation, with possibly machine learning components
  • Contributions to advanced analysis techniques in future experiments within an international collaboration

SHARP: Development of the SHiP SciFi High Accuracy Reconstruction Planes

The SHiP experiment at CERN will be the flagship experiment for high-intensity searches for new physics such as dark matter, portals to hidden-sector physics, and extra neutrinos. For the reconstruction of neutral final states, the experiment relies on a specialised and original calorimeter design: the SplitCal. This design comprises a sampling calorimeter that includes a longitudinal split as well as 2-3 high-precision layers that allow for the vertexing of neutral final states. These high-precision layers are to rely on a novel technology developed in Ghent: Scintillating fibre High Accuracy Reconstruction Planes (SHARP). These detector planes are based on scintillating fibres read out by silicon photomultipliers (SiPMs). They achieve excellent position and timing resolution in a calorimeter setup, allowing the SplitCal to reach unmatched vertexing resolution among calorimeters. The project covers the development of the SHARP technology, with an evaluation of the precisely achieved performance, optimisation of the detector, and prototyping. The project will include simulation and detector building as well as analysis of the produced simulated data to determine the precise physics reach in different Beyond the Standard Model scenarios. The built prototype is to be evaluated in a test beam at an accelerator facility, and the ensuing data is to be analysed to inform simulation studies. Progress and results are to be presented on a regular basis to the broader SHiP Collaboration.

Figure: (left) Illustration of the new particle decay and the corresponding experimental signature in the SHiP calorimeter. (right) Proposed detector design for the SHiP calorimeter high-precision layers (HPL) using scintillator fibers.

Skills obtained: High intensity and energy physics, particle detector development, simulation tools, data acquisition, data analysis, work and engagement in an international research collaboration.

Promoter: Prof. Didar Dobur
Supervisor: Dr. Matei Climescu
Contact | Topic 50084 on PLATO

Sailing for new physics: Search for feebly interacting particles with the SHiP experiment at CERN

Numerous particle physics experiments around the world have been extensively searching for new particles to construct a more complete theory of nature. The SHiP experiment at CERN aims at filling an important gap in the current phase space of sensitivities probed by various experiments by employing high-intensity Super Proton Synchrotron (SPS) beam-on-target collisions and a large decay volume to search for the decays of new light particles (N), such as heavy neutral leptons, predicted in many beyond-the-standard-model theories. A comprehensive characterization of the dominant production and decay channels of N particles produced on the target is needed to assess the sensitivity of the SHiP experiment. The project will explore simulation methods to generate N events, calculating expected cross sections and potential backgrounds. The simulation study will employ hard-process event generation as well as fully simulated events inside the SHiP detector to account for various detector-level effects.

Figure: (left) A schematic illustration of a process with the production of heavy neutral lepton (N) in the target area and its subsequent decay in the hidden sector decay vessel of the SHiP experiment. (right) The expected sensitivity of SHiP to the mass and interaction couplings of N, compared to the latest CMS results at the Large Hadron Collider. 

Skills obtained: physics analysis of the CERN SPS data, reconstruction techniques, identification of displaced vertices, new physics models, simulation tools, data analysis methods


Promoter: Prof. Didar Dobur
Supervisors: Dr. Kirill Skovpen, Dr. Matei Climescu
Contact | Topic 50085 on PLATO

Quenching the noise: Study of electromagnetic background at SHiP

The SHiP experiment at CERN will be the new flagship particle physics experiment for high-intensity accelerator searches for new physics, such as dark matter, portals to hidden sectors, and extra neutrinos. To ensure that all observations are genuine, the experiment relies on having no irreducible background. It employs to this effect a variety of filtering components and background taggers. These components, however, induce electromagnetic background through rare interactions of surviving muons. The project will evaluate the intensity of the electromagnetic background in simulation, determine its effects on new physics searches by ascertaining its impact on the SHiP detector systems and whether it mimics signals, and determine effective techniques to eliminate or compensate for it. Experimental checks of the simulated data should also be performed using test beam data. The project will be led in close coordination with the SHiP background taskforce, with regular updates on progress being expected.

Figure: (left) Schematic representation of the background events in the SHiP detectors. (right) Estimated background rate in tracking detector stations of SHiP.

Skills obtained: Physics at high intensity and energy, physics of rare processes, data analysis methods, simulation tools, work and engagement in an international research collaboration.

Promoter: Prof. Didar Dobur
Supervisor: Dr. Matei Climescu
Contact | Topic 50086 on PLATO

Plastic-fantastic: Calorimeter design using plastic scintillators for the SHiP experiment at CERN

Sampling calorimeters are used in many high-energy physics experiments for energy measurement. In such calorimeters, plastic scintillators often represent the active material that measures the energy released in electromagnetic and hadronic showers created by the incoming particle in the absorber layers. A new sampling calorimeter is currently being built at UGent for the Scattering and Neutrino Detector (SND), which is part of the SHiP experiment at CERN. The main building blocks of this calorimeter are plastic scintillator tiles with 2D optical fiber readout based on silicon photomultipliers (SiPMs). Multiple test beam studies for the sampling calorimeter are foreseen at CERN to measure the energy and time resolution of this new detector. For this purpose, a small-scale prototype will be built and commissioned at UGent using plastic scintillator tiles, optical fibers, and SiPMs. A student will be involved in the conceptualization of the calorimeter design and the commissioning of a cosmic muon test bench to measure its main characteristics. This work will also be complemented by simulation studies to derive the expected performance of the calorimeter and to further optimize its design towards the final implementation and integration in SHiP.

Figure: (left) Simulation of an interaction of an energetic electron in a prototype of the sampling calorimeter for SND@SHiP. (right) Plastic scintillator tiles with optical fiber grooves manufactured at UGent.

Skills obtained: particle detector development, data acquisition, simulation tools, data analysis, work in international research collaboration 


Promoter: Prof. Didar Dobur
Supervisor: Dr. Kirill Skovpen
Contact | Topic 50087 on PLATO

High precision calorimetry: SHiP@CERN Calorimeter High-Precision-Layer module development

The SHiP experiment at CERN will be the new flagship particle physics experiment for high-intensity accelerator searches for new physics, such as dark matter, portals to hidden sectors, and extra neutrinos. For its particle reconstruction, the experiment relies on high-precision detector layers located in the calorimeter. These detectors are encapsulated into modules that need to accommodate thousands of ~1 mm diameter scintillating optical fibres. The project covers the development of a module design and prototyping for use with the detector prototype, accounting for available manufacturing technologies, cooling requirements, and cost. The modules will be built of metal, preferably something light and sturdy such as aluminium, and the capability of the module to mechanically support the rest of the detector frame will be studied. The module integration and performance will be simulated in SolidWorks and/or ANSYS, whereas the prototype will be manufactured and tested in coordination with the physics team.

Figure: (left) Preliminary design of the High-Precision Layers of the SHiP calorimeter. (right) Scintillating fibers.


Promoter: Prof. Didar Dobur
Supervisor: Dr. Matei Climescu
Contact | Topic 50089 on PLATO

Detector R&D: Developing the Future of Particle Detection 

Advances in experimental physics are driven by the ability to see what was previously invisible. To push the boundaries of the known universe, we must continuously develop new technologies that offer higher precision, faster timing, and greater resilience to radiation. While many of these innovations eventually find their home in large-scale experiments, the core of Detector R&D (Research and Development) happens in the laboratory, where we explore the fundamental limits of sensor technology independent of any single collaboration. 

The experimental particle physics group at UGent is a leader in this creative “hardware-to-discovery” process. We work with a diverse portfolio of technologies that form the backbone of modern detection:  

  • Gaseous Detectors (GEMs & RPCs): We develop high-rate Resistive Plate Chambers (RPCs) and Gas Electron Multipliers (GEMs) that can handle the extreme particle fluxes of future colliders. 
  • Scintillators & SiPMs: We innovate with scintillating materials paired with Silicon Photomultipliers (SiPMs) to achieve ultra-fast timing and high-granularity calorimetry.  

While our research is often motivated by and applied within the CMS and SHiP experiments at CERN, our R&D program extends beyond these collaborations. We investigate novel materials, optimize readout electronics, and perform characterization studies that contribute to the broader field of instrumentation.

A project in Detector R&D will provide the student with:  

  • Technical mastery of laboratory instrumentation, including cleanroom assembly of GEM/RPC prototypes and high-precision characterization of SiPMs; 
  • Experience in the full R&D cycle, from hardware prototyping and test-beam preparation at CERN to the final performance analysis; 
  • Insight into detector physics, learning how to model complex signals and background effects using simulation tools like Geant4 or Garfield++; 
  • A unique skill set that bridges the gap between hardware engineering and high-energy physics analysis, valuable in both academia and high-tech industry. 

Looking through the glass: Glass RPC muon telescope for non-destructive material imaging

Non-destructive imaging techniques capable of probing the internal structure of dense or shielded objects are important in fields such as security, archaeological studies, and materials inspection. Conventional imaging methods (e.g., X-rays) can be limited when large volumes or very dense materials are involved. Cosmic muon tomography provides an alternative approach by exploiting naturally occurring atmospheric muons, which are highly penetrating and can traverse thick materials. The project aims to develop and study Glass Resistive Plate Chambers (GRPCs) operated in sealed mode for muon-based imaging. Several GRPC modules will be assembled into a telescope to detect cosmic muons and reconstruct their trajectories. After studying the detector performance and developing an algorithm to reconstruct muon tracks across multiple layers, the system will be used for muon tomography. The goal is to reconstruct the shape and density of objects placed inside the telescope in order to evaluate the imaging capabilities of GRPC-based systems for non-destructive material inspection.
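The track-reconstruction step can be illustrated with a minimal least-squares straight-line fit through the hits of the telescope layers (an idealised sketch assuming one hit per layer and no multiple scattering):

```python
# Idealised sketch: fit a straight muon track x = a + b*z through one hit per
# GRPC layer using the closed-form least-squares solution.

def fit_track(z_positions, x_hits):
    """Return (intercept a, slope b) of the least-squares line x = a + b*z."""
    n = len(z_positions)
    sz = sum(z_positions)
    sx = sum(x_hits)
    szz = sum(z * z for z in z_positions)
    szx = sum(z * x for z, x in zip(z_positions, x_hits))
    b = (n * szx - sz * sx) / (n * szz - sz * sz)
    a = (sx - b * sz) / n
    return a, b

# Four layers at z = 0, 10, 20, 30 cm with hits lying on the line x = 1 + 0.5*z:
a, b = fit_track([0.0, 10.0, 20.0, 30.0], [1.0, 6.0, 11.0, 16.0])
print(a, b)  # -> 1.0 0.5
```

Fitting the two projections (x-z and y-z) separately gives a 3D trajectory; in tomography, the deflection between the track above and below the imaged object then encodes the material it traversed.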

Figure: Application of muon tomography to (left) empty-cavity searches in pyramids and (right) container screening at border control.

Skills obtained: design and operation of particle detectors, data acquisition and analysis, track reconstruction and algorithm development, work and engagement in an international research collaboration.

Promoter: Prof. Didar Dobur
Supervisor: Karam Kaspar
Contact | Topic 50090 on PLATO

Fast analog: High-speed analog front-end design for particle detectors

The initial stage of any particle physics experiment is the analog front-end. Whether detecting scintillation light with Silicon Photomultipliers (SiPMs) or ionization tracks in gaseous detectors, the raw signals are often too weak or too fast for direct digitization. Developing a custom analog chain is critical to suppress electronic noise and prevent signal pile-up in high-luminosity environments, directly determining the detector’s energy and spatial resolution. The student will design and prototype a high-performance analog readout board. The project focuses on the development of a low-noise pre-amplifier and a pulse-shaping stage tailored for the specific discharge characteristics of both SiPMs and gaseous sensors. The performance will be validated through LTSpice simulations and lab measurements using cosmic muons.
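As a flavour of the pulse-shaping stage, here is a minimal numerical sketch of a CR-RC shaper, a classic front-end building block (this illustrates the general principle only, not the actual board design; all time constants are made up).

```python
import numpy as np

# Discrete-time CR-RC shaper applied to a toy SiPM-like pulse.
# The CR (high-pass) stage removes the slow tail, the RC (low-pass)
# stage limits bandwidth; the result is a short, smooth pulse that
# reduces pile-up at high rates.

def cr_rc_shape(signal, dt, tau):
    """Single CR differentiator followed by a single RC integrator."""
    a = tau / (tau + dt)          # high-pass coefficient
    b = dt / (tau + dt)           # low-pass coefficient
    hp = np.zeros_like(signal)
    lp = np.zeros_like(signal)
    for i in range(1, len(signal)):
        hp[i] = a * (hp[i - 1] + signal[i] - signal[i - 1])
        lp[i] = lp[i - 1] + b * (hp[i] - lp[i - 1])
    return lp

dt = 1e-9                                       # 1 ns sampling (assumed)
t = np.arange(0.0, 500e-9, dt)
pulse = (1 - np.exp(-t / 5e-9)) * np.exp(-t / 100e-9)   # toy SiPM pulse
shaped = cr_rc_shape(pulse, dt, tau=20e-9)

print(t[np.argmax(shaped)])   # peaking time of order the shaping constant
```

In practice this filtering is done in analog hardware; an LTSpice model of the same transfer function would be the natural next step.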

Figure: (left) Fast signals produced on a SiPM by incoming photons. (right) An example of the analog circuitry for detector signal amplification and shaping.

Skills obtained: analog circuit simulation (SPICE), PCB layout design for high-frequency signals, signal integrity analysis, and experimental characterization of particle detectors

References:  

  • H. Spieler, Semiconductor Detector Systems, Oxford University Press, 2005.
  • G. F. Knoll, Radiation Detection and Measurement, 4th Edition, John Wiley & Sons, 2010. 
Promoter: Prof. Didar Dobur
Supervisor: Dr. Kirill Skovpen
Contact | Topic 50091 on PLATO

Fast digital: FPGA-based data acquisition and time measurement

Converting ultra-fast analog pulses in high-energy particle experiments into precise digital timestamps requires sophisticated digital logic. Field Programmable Gate Arrays (FPGAs) are the industry standard for this task, but they require highly optimized firmware to achieve picosecond-level precision without massive power consumption. This project is devoted to the digital side of the readout chain. The student will implement a Time-to-Digital Converter (TDC) on a GateMate FPGA. The work involves developing the logic for high-speed hit detection, data buffering, and transmission protocols. The project will specifically explore the unique architecture of the GateMate FPGA to push the limits of timing resolution for high-rate particle detection.
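The coarse/fine interpolation principle behind such TDCs can be sketched in a few lines (toy numbers, not GateMate-specific: the clock period and per-tap delay below are assumptions).

```python
# Coarse/fine time interpolation as used in FPGA-based TDCs: a
# free-running clock counter gives the coarse timestamp, and a tapped
# delay line records how far the hit propagated within one clock
# period, giving the fine timestamp.

CLOCK_PERIOD_PS = 5000      # assumed 200 MHz clock -> 5 ns coarse bins
TAP_DELAY_PS = 250          # assumed per-tap carry-chain delay
N_TAPS = CLOCK_PERIOD_PS // TAP_DELAY_PS

def encode_hit(true_time_ps):
    """What the FPGA records: a counter value plus a thermometer code."""
    coarse = true_time_ps // CLOCK_PERIOD_PS
    fine_taps = (true_time_ps % CLOCK_PERIOD_PS) // TAP_DELAY_PS
    thermometer = [1] * fine_taps + [0] * (N_TAPS - fine_taps)
    return coarse, thermometer

def decode_hit(coarse, thermometer):
    """Reconstruct the timestamp from counter + decoded thermometer code."""
    fine_taps = sum(thermometer)
    return coarse * CLOCK_PERIOD_PS + fine_taps * TAP_DELAY_PS

coarse, therm = encode_hit(12625)
print(decode_hit(coarse, therm))   # 12500 ps: quantized to one tap (250 ps)
```

The resolution is set by the tap delay, which is why characterizing and calibrating the carry-chain delays on the actual FPGA is a central part of the project.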

Figure: (left) Evaluation board for GateMate FPGA developments. (right) Schematic diagram of the time measurement using the clock signal on the FPGA.

Skills obtained: FPGA programming (HDL), implementation of TDCs (tapped delay lines), high-speed digital communication, and hardware-software interfacing for data acquisition (DAQ) 

References:  

  • U. Meyer-Baese, Digital Signal Processing with Field Programmable Gate Arrays, 4th Edition, Springer, 2014. 
  • M. J. Kwiatkowski and R. Szplet, “Low-Resource Time-to-Digital Converters for Field Programmable Gate Arrays: A Review,” Electronics, vol. 13, no. 17, 2024. 
Promoter: Prof. Didar Dobur
Supervisor: Dr. Kirill Skovpen
Contact | Topic 50092 on PLATO

Mighty holes: Design of micropattern gaseous detectors

Gaseous detectors are essential for large-scale tracking and muon systems in high-energy physics due to their cost-effectiveness and high spatial resolution. However, traditional designs like Multi-Wire Proportional Chambers (MWPCs) are limited by high occupancy and aging effects under extreme radiation environments. Micro-Pattern Gaseous Detectors (MPGDs), such as ThickGEMs and Micromegas, have emerged as the standard solution to overcome these bottlenecks. The challenge lies in optimizing the detector geometry—such as hole pitch, amplification gap, and induction fields—to ensure high gain, stability against discharges, and excellent timing properties. This project focuses on the design and optimization of ThickGEM and Micromegas detectors through comprehensive simulation studies. Using tools like Garfield++ and Magboltz, the student will model electron transport and gas amplification to determine the ideal geometric parameters. These findings will serve as the baseline for developing a detector prototype. The project aims to bridge the gap between theoretical modeling and physical hardware, to provide a detailed production strategy for a functional MPGD prototype.
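As a back-of-the-envelope illustration of why the geometry matters so much: in a uniform field the avalanche size grows exponentially with the amplification gap, G = exp(α·d), with α the first Townsend coefficient. The numbers below are toy values; a real study would obtain α from Magboltz and the field map from Garfield++.

```python
import math

# Exponential avalanche growth over the amplification gap (toy numbers).
# Small changes in gap or field change the gain dramatically, which is
# why the hole geometry must be optimized carefully.

def gas_gain(alpha_per_mm, gap_mm):
    """Gain from exponential avalanche growth: G = exp(alpha * d)."""
    return math.exp(alpha_per_mm * gap_mm)

# Assumed Townsend coefficient of 20/mm: a 0.1 mm change in the gap
# changes the gain by a factor exp(2) ~ 7.4
g1 = gas_gain(20.0, 0.4)
g2 = gas_gain(20.0, 0.5)
print(g2 / g1)
```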

Figure: (left) Simulation of an avalanche created by an electron in the gaseous amplification region in a GEM foil. (right) A macro photograph of a fabricated GEM foil.

Skills obtained: simulation of particle-matter interactions, MPGD hardware design, specialized software proficiency (Garfield++/GEANT4), R&D methodology, experience in high-energy physics instrumentation

References:  

Promoter: Prof. Didar Dobur
Supervisor: Dr. Kirill Skovpen
Contact | Topic 50093 on PLATO

Cosmic dew: Measuring soil moisture with cosmic rays

The scarcity of water resources is naturally connected to the process of climate change. The project explores the feasibility of measuring the volumetric water content in soil through application of the innovative technology of cosmic-ray neutron sensing (CRNS) detectors. Cosmic-ray thermal neutron fluxes, which are inversely correlated with the presence of water above the ground, can be measured using radiation detectors capable of thermal neutron identification. A prototype of the soil moisture sensor will be developed to measure cosmic-ray neutrons above the ground with wireless data transmission and powered by solar panels. This probe will be built using plastic scintillators and lithium-6-enriched materials for neutron identification. The performance of the prototype will be compared to the state-of-art sensor that is currently collecting data at Proeftuin campus. The study will also make use of various simulation techniques to model the atmospheric flux of neutrons, their interactions in the soil, and detection efficiencies. The neutron fluxes will be studied as a function of altitude, soil type, and environmental conditions.
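The inverse relation between neutron counts and soil moisture is commonly parametrized with the calibration function of Desilets et al.; a minimal sketch, with an assumed site constant N0, is:

```python
# Standard CRNS calibration function (Desilets et al., 2010) relating
# the measured neutron count rate N to volumetric water content.
# The shape parameters are the commonly quoted literature values; the
# site calibration constant N0 is a made-up number for illustration.

A0, A1, A2 = 0.0808, 0.372, 0.115
N0 = 1000.0   # count rate over dry soil (assumed)

def soil_moisture(neutron_counts):
    """Volumetric water content (m^3/m^3) from a neutron count rate."""
    return A0 / (neutron_counts / N0 - A1) - A2

# Fewer detected neutrons -> wetter soil (water moderates the neutrons)
print(soil_moisture(800.0))   # wetter
print(soil_moisture(950.0))   # drier
```

In the project, N0 would be fixed by comparing the prototype against the reference sensor at Proeftuin campus.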

Figure: (left) Illustration of the main concept of probing soil moisture through detection of cosmic-ray neutrons. A fraction of thermal neutrons is absorbed by the water present in the soil, while the rest is detected by the cosmic ray neutron sensor (CRNS). (right) A CRNS probe installed at Proeftuin campus.

Skills obtained: particle detector development, design of readout electronics, application of IoT technology, simulation tools, data analysis

References:  

  • S. Gianessi et al., Testing a novel sensor design to jointly measure cosmic-ray neutrons, muons and gamma rays for non-invasive soil moisture estimation, Geosci. Instrum. Method. Data Syst. 13 (2024) 9, https://doi.org/10.5194/gi-13-9-2024
  • L. Stevanato et al., A Novel Cosmic-Ray Neutron Sensor for Soil Moisture Estimation over Large Areas, Agriculture 9 (2019) 202, https://doi.org/10.3390/agriculture9090202
Promoter: Prof. Didar Dobur
Supervisor: Dr. Kirill Skovpen
Contact | Topic 50094 on PLATO

Bike collider: Accelerating particles by cycling

The infinitesimally small scales of particle physics are intrinsically linked to natural phenomena occurring at vast distances in our universe. In order to recreate the conditions of the very early stages of the Big Bang, particles are accelerated to speeds close to that of light before they collide. High-energy particle collisions are at the heart of the Large Hadron Collider currently operating at CERN. The rich physics program and diverse technological challenges of such a major international project require an innovative approach to communicating scientific results and engaging the general public in fundamental research, for the best integration of science in society. Science communication and outreach lay educational foundations and build support from society for future research.

The project illustrates the concept of a collider by implementing a visual acceleration of particles in the collider rings through actual cycling, for the particle beams to eventually collide and create new massive particles. Two static bikes are placed facing each other to emulate the acceleration of the two beams in the collider rings. In order to reach high energies, both participants will have to correlate their cycling speeds for the beams to meet at one point and at a given energy, touching upon the sociological aspects of working in tandem. The dynamos of the bikes will be connected to a series of LEDs to indicate the current position of the particles in the rings. The electronics for the control system will use low-power modules, such as an Arduino. Additionally, dedicated software running on a Raspberry Pi can be used to project a visualization of the collider on the wall. The installation will be used as part of science fairs and exhibitions in Belgium and at CERN.
The project naturally finds its place at the intersection of particle physics, engineering, fitness and exercising, sustainability aspects, science communication, and education.

Figure: (left) A conceptual visualization of a particle collider controlled by two cycling participants. (right) A schematic representation of colliding beams at the Large Hadron Collider at CERN.

Skills obtained: embedded programming, STEM, science communication, educational methods, fundamentals of particle physics

References:  

Promoter: Prof. Didar Dobur
Supervisor: Dr. Kirill Skovpen
Contact | Topic 50097 on PLATO

Gravitational waves data analysis topics

Active Learning for Numerical Relativity

Gravitational waves (GWs) provide us with a unique method to probe spacetime at its very extremes. To analyse the signals we detect from coalescing binaries of compact objects (black holes, neutron stars), we need theoretical models for these sources. These so-called waveforms are a cornerstone of GW analysis. Constructing them, however, is no easy task. Ideally we would use high-precision numerical relativity (NR) simulations as the waveform itself, but this is computationally infeasible. Instead, we can rely on approximate phenomenological models calibrated against a set of NR simulations. Due to the extremely high cost of NR simulations, it is important to minimise the number that need to be run. An interesting avenue to pursue is an active learning scheme in which new simulations are performed iteratively, depending on an uncertainty measure related to the waveform model [1,2]. At each iteration a new point in phase space is selected for an NR simulation, based on the current waveform uncertainty, and the waveform is recalibrated against the updated set of simulations. This may lead to an efficient selection of NR simulations to perform.

In this proof-of-concept project you will not be performing NR simulations, but rather using the waveforms themselves as a ground truth for recalibration. You will use state-of-the art waveforms implemented in JAX, a Python package designed for parallelisation on GPU-hardware. Its autodifferentiation capabilities allow for efficient recalibration of waveforms [3], making it perfectly suited for the task at hand. The Ghent Gravity Group is highly involved in the development of JAX waveforms, bringing you near the source of cutting-edge GW research. You will also be part of a wider international team of researchers developing a JAX-based GW data analysis ecosystem.
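The iterative scheme described above can be sketched with a toy 1D example, where evaluating a cheap function stands in for running an NR simulation and the distance to existing simulations serves as a crude uncertainty proxy (a real scheme would use a model-based uncertainty, e.g. from the waveform calibration itself).

```python
import numpy as np

# Toy active-learning loop (illustrative only). A 1D "parameter space"
# stands in for the NR phase space; evaluating f stands in for running
# an expensive NR simulation.

def f(q):
    """Stand-in for an expensive NR simulation output at parameter q."""
    return np.sin(3.0 * q)

def pick_next(train_q, candidates):
    """Uncertainty proxy: choose the candidate farthest from all
    existing simulations (where the surrogate is least constrained)."""
    dists = [min(abs(c - t) for t in train_q) for c in candidates]
    return candidates[int(np.argmax(dists))]

train_q = [0.0, 1.0]                       # initial simulation campaign
candidates = list(np.linspace(0.0, 1.0, 101))
for _ in range(5):                         # 5 active-learning iterations
    q_new = pick_next(train_q, candidates)
    train_q.append(q_new)                  # "run" the new simulation
    # here the waveform model would be recalibrated against train_q

print(sorted(train_q))                     # points spread over [0, 1]
```

The project would replace the distance proxy with a genuine uncertainty measure and the toy function with JAX waveforms recalibrated at each step.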

[1] Andrade et al., Actively Learning Numerical Relativity
[2] Doctor et al., Statistical Gravitational Waveform Models: What to Simulate Next?
[3] Lam et al., Recalibrating Gravitational Wave Phenomenological Waveform Model

Promoter: Prof. Archisman Ghosh, Robin Chan
Contact | Topic 50006 on PLATO

Measuring the Hubble constant with LIGO-Virgo-KAGRA data

The fourth observing run (O4) of the LIGO-Virgo-KAGRA (LVK) gravitational-wave (GW) detector network ended in 2025 and a six-month IR1 observing run is planned later this year.

We have more than 200 announced detections and several more are expected to be announced soon. After interesting candidate events have been identified and their parameters have been measured, it will be time to obtain science results from the observations. Researchers at UGent play a central role in the Cosmology working group of the LVK. A short-term goal of this group is to measure the Hubble constant, H0, the local expansion rate of the universe. One uses the GW distance measurement together with complementary redshift information (from possible electromagnetic counterparts, host galaxies, or galaxy clusters) to infer cosmological parameters such as H0. Due to a tantalizing discrepancy between the local and early-universe measurements of H0, now dubbed the “Hubble tension,” such an independent measurement of H0 from the GW sector can prove to be invaluable.
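The essence of the bright-siren method can be seen in a one-line toy estimate: at low redshift the Hubble law gives H0 ≈ c·z/d_L. The full gwcosmo+ analysis is a Bayesian inference with selection effects and galaxy catalogues; the numbers below are purely illustrative.

```python
# Toy bright-siren estimate of the Hubble constant (greatly simplified).

C_KM_S = 299792.458   # speed of light in km/s

def h0_estimate(redshift, lum_dist_mpc):
    """Low-redshift Hubble law: H0 = c*z / d_L, in km/s/Mpc."""
    return C_KM_S * redshift / lum_dist_mpc

# GW170817-like toy numbers: z ~ 0.01 from the host galaxy,
# d_L ~ 40 Mpc from the GW amplitude
print(h0_estimate(0.01, 40.0))   # ~75 km/s/Mpc
```

The real measurement must marginalize over the GW distance posterior, peculiar velocities, and (for dark sirens) all candidate host galaxies, which is what gwcosmo+ implements.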

Figure: The latest measurement of the Hubble constant by the LIGO-Virgo-KAGRA Collaboration; figure from [1].

In this project you will work with researchers who have developed the LVK codebase gwcosmo+ for cosmology inference and have put together the galaxy catalogues to go along with it. You will be a part of the team that carries out the first cosmology analyses on O4/IR1 data. If we are lucky, we may observe a multimessenger signal in IR1, and you will get to analyze the data from it. Until now, we have seen only one such multimessenger signal, namely GW170817.

The work involved will be computational. You will learn about statistical methods and get exposed to state-of-the-art data analysis techniques. You will be involved in a large international collaboration in the forefront field of GW science.

[1] B. P. Abbott et al., “GWTC-4.0: Constraints on the Cosmic Expansion Rate and Modified Gravitational-wave Propagation,” https://arxiv.org/abs/2509.04348

Promoter: Prof. Archisman Ghosh
Contact | Topic 50027 on PLATO

Are the Einstein Field Equations Correct? Testing General Relativity with the Einstein Telescope

Gravitational waves (GWs) provide us with a unique way to probe strong-field gravity, making them exceedingly well suited for testing general relativity (GR). One method to perform tests of GR is called TIGER [1] and is well established in the GW community. However, it is computationally expensive, making it difficult to run for the long signals we expect in next-generation detectors such as the Einstein Telescope.
To tackle the analysis of long-duration signals, one can exploit the speedup enabled by GPU hardware. There is currently an international effort being undertaken to create a GPU-compatible GW analysis framework, which Ghent University is actively contributing to. In this project you will learn how to work with JAX, a user-friendly Python package for GPU acceleration, and add TIGER to the recently implemented IMRPhenomX family of models [2,3]. After validating, you are free to explore different applications of the test to Einstein Telescope signals. You will be working in a small-scale international collaboration where you will have the opportunity to work closely with both the people who developed the TIGER pipeline and the developers of the JAX-based GW analysis toolset. This project will be computational and theoretical and provide an invaluable contribution to the wider JAX-GW effort.

[1] Agathos et al. TIGER: A data analysis pipeline for testing the strong-field dynamics of general relativity with gravitational wave signals from coalescing compact binaries
[2] Pratten et al. Computationally efficient models for the dominant and sub-dominant harmonic modes of precessing binary black holes
[3] Roy et al. An improved parametrized test of general relativity using the IMRPhenomX waveform family: Including higher harmonics and precession

Promoter: Prof. Archisman Ghosh, Robin Chan
Contact | Topic 50015 on PLATO

Neutrino-triggered GW searches

In the past 10 years since the first detection of gravitational waves from a binary black hole merger, we have detected hundreds of additional mergers. However, so far we have had only one multi-messenger detection: the binary neutron star merger GW170817. Additional multi-messenger detections would be very valuable, since they allow us to study sources in more detail: gravitational waves (GW) measure the movement of matter and source properties, while electromagnetic (EM) and high-energy neutrino (HEN) emission provide insights into the interaction of matter in the source environment. Traditional searches for multi-messenger emission start from gravitational-wave detections made by all-sky searches, and then look for associated EM or HEN emission. However, if we have a confirmed source from EM or HEN observations, we can instead perform a more sensitive, targeted search for gravitational waves. Such searches can be about 20% more sensitive than all-sky searches, which means almost a doubling of the sensitive volume (1.2^3 ≈ 1.7).

Figure: Gamma-ray burst as a typical candidate for associated high-energy neutrino and gravitational-wave emission. A single high-energy neutrino can be used as a trigger for a targeted search for gravitational waves.

In this project, you will perform targeted searches for gravitational waves triggered by high-energy neutrinos detected by IceCube. Such neutrinos have a high probability of being of astrophysical origin, due to their declination and energy. These neutrinos can be produced by active galactic nuclei, tidal disruption events, gamma-ray bursts, galactic sources,… If the source is a gamma-ray burst, produced by either the merger of two neutron stars or the core collapse of a massive star, we also expect detectable gravitational-wave emission. You will help carry out the analysis of data from the fourth Observing run of the LIGO-Virgo-KAGRA collaborations using two targeted search pipelines that can detect a vast range of gravitational wave sources. Your work will be computational, and you will be involved in a large international collaboration in the forefront field of GW science.

Promoter: Dr. Matthias Vereecken, Prof. Archisman Ghosh
Contact | Topic 50023 on PLATO

Gravitational waves instrumentation topics

The direct detection of gravitational waves (GWs) is a breakthrough discovery of recent years. The many GW detections by the Advanced LIGO and Virgo detectors since the first discovery in 2015 have opened up a new window on the observable universe. The current “second generation” detectors are Michelson interferometers with km-scale arms. The detection principle is that a passing GW stretches and squeezes the arms of the interferometer in opposite ways in the two directions.

Effect of a gravitational wave (propagating in the direction perpendicular to the plane of the paper) on the arms of a Michelson interferometer.

The current detectors will eventually be replaced by the next generation of GW detectors, with significantly higher sensitivity and distance reach. One of these “third generation” detectors is the Einstein Telescope (ET), which is planned to be built in Europe in the 2030s. Fundamentally new technology will be required to go beyond the limitations of the current detectors, and extensive work and study is required for each of the new techniques to be implemented. ETpathfinder is an R&D facility located in Maastricht, the Netherlands, with the aim of testing these new techniques and giving important input for the design and construction of third-generation GW detectors like ET. ETpathfinder is a Fabry-Perot Michelson interferometer with 10 m arm cavities operating at cryogenic temperatures. It will mainly focus on developing prototypes and testing cryogenic operation (120 K, 15 K), a new mirror material (silicon), new laser wavelengths (1550 nm, 2090 nm), and advanced quantum noise reduction techniques.

Calibrating GW detectors using scattered light

With the increase in sensitivity, gravitational-wave detectors will require calibration with better accuracy and precision. Currently, gravitational-wave detectors are calibrated using two independent methods, in order to be able to cross-check the results: the Photon Calibrator (PCal) and the Newtonian calibration method (NCal). In both, the calibration is done by inducing a displacement of the mirrors.

A new independent technique which can use scattered light as a signal for the detector calibration was recently proposed [M. Wąs et al., 2021, Class. Quantum Grav. 38 075020]. The scattered light is injected in the interferometer from the back of the end mirrors using a scattering element that can be modulated in amplitude. Unlike the other two, this technique does not involve the movement of the mirrors and thus it can be very useful to cross-check the results and be sure that the mirror suspensions do not add any error in the model.

The Ghent Gravity Group, in collaboration with the University of Antwerp, is planning to develop this new calibration technique for the Advanced Virgo detector. This is very innovative work, since the technique has not yet been used in any GW detector. The work involved will be hands-on instrumentation, with a focus on optics and mechanics. The project will be under the joint supervision of Dr. Daniela Pascucci (UGent) and Prof. Hans Van Haevermaet (UAntwerpen). For some parts of your research, you may need to work at the lab in UAntwerpen. You will also get a chance to work at the ETpathfinder facility in Maastricht and be a part of the exciting activity in the Belgian-Dutch region centred around the Einstein Telescope.

Layout of Advanced Virgo. The scattered light used for the calibration will come from the Suspended West and North End Benches (SWEB and SNEB) behind the West End (WE) and North End (NE) mirrors.
Promoters: Prof. Archisman Ghosh, Prof. Hans Van Haevermaet (UAntwerpen)
Supervisor: Dr. Daniela Pascucci
Contact | Topic 49996 on PLATO

Development of the Output Mode Cleaner for ETpathfinder

A mode cleaner is an optical device that selects a preferred mode for the light beam and removes any other unwanted modes. It is basically an optical cavity whose resonance frequency equals that of the selected mode, which is then enhanced while the others are suppressed. An output mode cleaner (OMC) is fundamental to improving the readout signal.

In GW detectors the input beam is an (almost) purely Gaussian beam, thanks to the presence of an input mode cleaner. However, due to optical aberrations and thermal effects, the output usually also has some higher-order mode components. Furthermore, GW detectors are set to have a dark port at the output of the interferometer, but optical imperfections between the two arms lead to an incomplete cancellation of the main beam and the control sidebands. When this “junk light” reaches the photodetectors at the output port of the interferometer, it increases the shot noise and, if it is time-dependent, it also creates additional noise. Therefore it needs to be filtered out. The figure below shows the layout of the OMC of Advanced LIGO.

Layout of the Advanced LIGO OMC. All the optical components are bonded on a single breadboard made of fused silica [1].

The first step of the project will be to define the design of the OMC; a complete simulation analysis will then be carried out to fix the final configuration and requirements.

The OMC is a very refined optical device, and its performance strongly depends on the quality and accuracy with which its components and the full assembly are manufactured. To characterise the OMC, a number of tests need to be done. The characterisation phase of the project will start with tests on the single components, both optical and electronic, and end with tests on the performance of the full device. These tests will include:

  • measurements of the mirrors’ radii of curvature, centres of curvature, transmissivity, and scattering;
  • characterisation of the photodiode in terms of response and dark noise;
  • measurements of the piezo length noise;
  • measurements of the alignment, backscattering, and cavity length of the assembled OMC.

The measurements will be done in collaboration with Nikhef, Maastricht University, and VUB.

References:

[1] K. Arai et al., Output Mode Cleaner Design, LIGO Technical Note LIGO-T1000276-v5 (2013)

Supervisors: Dr. Daniela Pascucci, Prof. Archisman Ghosh
Contact | Topic 49983 on PLATO

Second harmonic generators for 2µm wavelength

Second harmonic generation (SHG), also known as frequency doubling, is a non-linear optical process that occurs when a light beam passes through a material with a non-linear dielectric coefficient.

Using non-linear crystals as second harmonic generators to create green light from an infrared source has long been exploited in a wide range of applications, such as display technology and biomedicine. In GW instrumentation the technique is not new either: in Advanced Virgo it is used to generate, from a 1064 nm input source, the green light of the auxiliary lasers needed for the cavity control and squeezing systems. It has, however, never been used for a 2 μm wavelength. Furthermore, one of the biggest challenges for next-generation detectors is the development of high-efficiency photodetectors, such as cameras and photodiodes; this study can be a first step towards overcoming that problem by changing the wavelength of the laser beam before it reaches the photodetector. Currently the most used materials (from IR to green) are periodically-poled lithium niobate (LiNbO3, PPLN) and periodically-poled potassium titanyl phosphate (KTiOPO4, PPKTP). Other materials will also be studied and taken into account.
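Schematically, the second-order polarization induced in the crystal radiates at twice the input frequency, i.e. at half the wavelength; for a 2090 nm input this would bring the beam to about 1045 nm, where efficient photodetectors already exist:

```latex
P^{(2)}(2\omega) \;\propto\; \chi^{(2)}\, E(\omega)^{2},
\qquad
\lambda_{2\omega} = \frac{\lambda_{\omega}}{2}
\quad\Rightarrow\quad
2090\,\mathrm{nm} \;\rightarrow\; 1045\,\mathrm{nm}.
```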

The main objective of the proposed research is to study possible SHG materials for 2 μm wavelength, analysing if they are suitable for GW detectors.

Schematic representation of how a non-linear optical medium acts as a second harmonic generator.
Supervisors: Dr. Daniela Pascucci, Prof. Archisman Ghosh
Contact | Topic 49990 on PLATO

Modelling the effect of Newtonian noise for the Einstein Telescope

Newtonian noise or gravity gradient noise is an effect of the change of the Newtonian gravitational potential due to fluctuations of displacements on the surface of the Earth. Although it is a small effect for current GW detectors, it can become a major nuisance for future detectors such as the ET. Newtonian noise couples to a test mass in a manner very similar to GWs, and it is not possible, even in principle, to remove this noise source (except via active subtraction). It is therefore important to understand and quantify the impact of Newtonian noise. While estimation of Newtonian noise from homogeneous media has been well explored [1], a lot remains to be known in cases of complex, heterogeneous media.

(a) A schematic of Newtonian-noise coupling to the test-mass. (b) Contribution of different fundamental noises to ET’s sensitivity with Newtonian noise dominating the low-frequency contributions.

In this project, you will analyze the generation mechanism of Newtonian noise and simulate scenarios that will best represent the contribution of Newtonian noise to ET’s low-frequency sensitivity. Simulations of elastic waves in a heterogeneous medium will be performed using a spectral finite element solver SPECFEM3D. A software library will be developed to compute the gravitational acceleration on the interferometer’s test-masses, based on the simulated displacement of the medium’s elements. In particular, focus will be given to the understanding of contributions from different parts of the medium like interfaces, and cavern walls that host the vertices of ET.
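The post-processing step described above can be sketched as a direct sum over discretized medium elements: the change in gravitational acceleration at the test mass is the difference between the displaced and undisplaced configurations. The positions, masses, and displacements below are toy numbers; the real inputs would come from the SPECFEM3D output.

```python
import numpy as np

# Newtonian-noise acceleration on a test mass from displaced medium
# elements, computed as a direct sum over point masses (toy inputs).

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def newtonian_accel(test_mass_pos, elem_pos, elem_disp, elem_mass):
    """Perturbation of the acceleration at the test mass when each
    medium element moves from elem_pos to elem_pos + elem_disp."""
    def accel(points):
        r = points - test_mass_pos              # vectors to each element
        d = np.linalg.norm(r, axis=1)[:, None]  # distances
        return G * np.sum(elem_mass[:, None] * r / d**3, axis=0)
    return accel(elem_pos + elem_disp) - accel(elem_pos)

# Two toy elements 100 m from the test mass, displaced by 1 micron
pos = np.array([[100.0, 0.0, 0.0], [0.0, 100.0, 0.0]])
disp = np.full((2, 3), 1e-6)
mass = np.array([1e6, 1e6])                     # kg per element

da = newtonian_accel(np.zeros(3), pos, disp, mass)
print(da)   # tiny acceleration perturbation, m/s^2
```

The project's software library would do essentially this, but over the full finite-element mesh and per time step, with special care for contributions from interfaces and cavern walls.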

You will work in collaboration with researchers in the University of Liège, including Dr. Soumen Koley, who is one of the leading experts in modeling Newtonian noise for GW detectors.

Your results may have a significant impact in bringing the ET to Belgium!

Supervisor: Prof. Archisman Ghosh
Contact | Topic 50028 on PLATO

Neutrino experiment topics

Development of a synthetic electron source for neutrino mass experiments

The neutrino mass remains one of the major unknowns in both particle physics and cosmology. The neutrino mass is not zero, which we know from the observation of neutrino oscillations. But it is tiny, so tiny that we have not yet been able to measure it. The present-day experiment KATRIN has just finished data taking and produced an upper limit of 0.45 eV on the electron neutrino mass, with about half of the data analysed [1]. The next-generation neutrino mass experiment requires new technology, which the Project 8 collaboration [2] is developing. At UGent, we are developing large-volume resonant cavity detectors that will very precisely measure the energies of electrons released in tritium beta decay alongside neutrinos, and designing an experiment to reach a sensitivity of 0.04 eV. The technique used to measure electron energies is called Cyclotron Radiation Emission Spectroscopy (CRES). To characterize the cavities we design and build, we need a calibration source. Known sources would require a vacuum environment, a magnetic field, and cryogenics, making a calibration too expensive and involved. In this Master project, you will explore the desired qualities of a synthetic CRES source. This has been previously achieved in a non-cavity environment [3], but needs to be adapted to the resonant cavity environment. The cavity signal phenomenology is currently being further developed within the group, and the cavity is being designed. This Master project has components of both phenomenology / modeling and design / engineering, and is quite exploratory. You will be embedded in the local neutrino mass group, as well as the international Project 8 collaboration.
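The principle behind CRES can be illustrated numerically: an electron in a magnetic field radiates at the relativistic cyclotron frequency f = eB/(2πγm), so measuring f measures the Lorentz factor γ and hence the kinetic energy. The constants below are textbook values; the 1 T field is an illustrative choice, not the experiment's design value.

```python
import math

# Relativistic cyclotron frequency of an electron: f = eB / (2*pi*gamma*m).
# Measuring the emitted microwave frequency gives the kinetic energy.

E_CHARGE = 1.602176634e-19      # elementary charge, C
M_E = 9.1093837015e-31          # electron mass, kg
M_E_KEV = 510.99895             # electron rest energy, keV

def cyclotron_freq(kinetic_kev, b_tesla):
    """Cyclotron frequency in Hz for an electron of given kinetic energy."""
    gamma = 1.0 + kinetic_kev / M_E_KEV
    return E_CHARGE * b_tesla / (2.0 * math.pi * gamma * M_E)

# Tritium endpoint electrons (~18.6 keV) in an assumed 1 T field radiate
# near 27 GHz, i.e. at microwave frequencies a resonant cavity can match.
print(cyclotron_freq(18.6, 1.0) / 1e9)
```

A synthetic source therefore has to mimic a weak, chirping microwave tone at these frequencies inside the cavity.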

Figure 1: Schematic view of a mid-scale Project 8 neutrino-mass experiment. Cold atomic tritium is guided into a large, evacuated cavity where it decays. To prototype the cavity, we need to develop an alternative source.

References: 

[1] M. Aker et al. [the KATRIN collaboration], Direct neutrino-mass measurement based on 259 days of KATRIN data, Science 388, 180-185 (2025), https://arxiv.org/abs/2406.13516  

[2] The Project 8 experiment: https://www.project8.org/  

[3] A. Ashtari Esfahani et al. [the Project 8 collaboration], SYNCA: A Synthetic Cyclotron Antenna for the Project 8 Collaboration, JINST 18 (2023) P01034, https://arxiv.org/abs/2212.08026 

Promoter: Prof. Juliana Stachurska
Contact | Topic 50117 on PLATO

Towards the neutrino mass measurement with Project 8

The neutrino mass remains one of the major unknowns in both particle physics and cosmology. The neutrino mass is not zero, which we know from the observation of neutrino oscillations. But it is tiny, so tiny that we have not yet been able to measure it. The present-day experiment KATRIN has just finished data taking and produced an upper limit of 0.45 eV on the electron neutrino mass, with about half of the data analysed [1]. Project 8 [2] is a next-generation direct neutrino mass experiment using the tritium endpoint method. To reach the target sensitivity of 40 meV, new technologies are required. Among them is Cyclotron Radiation Emission Spectroscopy (CRES), a non-destructive technique to measure electron energies via the frequency of their cyclotron radiation. CRES has been successfully demonstrated on a small scale in a waveguide section. To be able to scale to large volumes of tritium gas, we are exploring resonant cavities as detectors, which is a focus of the UGent group. The first demonstrator of CRES in cavities, the Cavity CRES Apparatus, is being commissioned at the University of Washington in Seattle and is expected to start data taking in Summer 2026. In this project, you will assist with data taking remotely (such as by monitoring data quality and performing initial reconstructions), refine reconstruction algorithms, and study the electron energy resolution in the cavity. Experience with Python and basic knowledge of C++ are helpful. This is a software / data analysis project. You will be embedded in the local neutrino mass group, as well as the international Project 8 collaboration.

Figure 1: The elements of the Cavity CRES Apparatus, which is being commissioned at the University of Washington. 
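The CRES principle described above can be illustrated with the basic relation it relies on: an electron with kinetic energy E in a magnetic field B radiates at the relativistic cyclotron frequency f = eB / (2πγm_e), so measuring the frequency measures the energy. A minimal sketch (the 1 T field value is purely illustrative, not a Project 8 specification):

```python
import math

# Physical constants (SI units unless noted)
E_CHARGE = 1.602176634e-19   # elementary charge [C]
M_E = 9.1093837015e-31       # electron mass [kg]
M_E_KEV = 510.99895          # electron rest energy [keV]

def cyclotron_frequency(kinetic_energy_kev, b_field_tesla):
    """Relativistic cyclotron frequency f = e*B / (2*pi*gamma*m_e), in Hz."""
    gamma = 1.0 + kinetic_energy_kev / M_E_KEV
    return E_CHARGE * b_field_tesla / (2.0 * math.pi * gamma * M_E)

# An electron near the tritium beta-decay endpoint (~18.6 keV) in a 1 T field
# radiates at roughly 27 GHz; electrons with lower kinetic energy radiate at a
# slightly higher frequency, which is how CRES turns an energy measurement
# into a frequency measurement.
f_endpoint = cyclotron_frequency(18.6, 1.0)
print(f"f = {f_endpoint / 1e9:.2f} GHz")
```

Note that the energy dependence enters only through the Lorentz factor γ, so the fractional frequency shift per unit energy is small; this is why CRES demands very precise frequency reconstruction.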

References: 

[1] M. Aker et al. [the KATRIN collaboration], Direct neutrino-mass measurement based on 259 days of KATRIN data, Science 388, 180-185 (2025), https://arxiv.org/abs/2406.13516 

[2] The Project 8 experiment: https://www.project8.org/  

Promoter: Prof. Juliana Stachurska
Contact | Topic 50120 on PLATO

Assessment of the “double pulse” feature for tau neutrino identification in IceCube and IceCube-Gen2

IceCube, located at the South Pole, is the world’s largest neutrino detector, with an instrumented volume of 1 km^3. It measures locally produced atmospheric neutrinos as well as astrophysical neutrinos arriving from the depths of the cosmos with energies far beyond those reachable with human-made accelerators [1]. The UGent group focuses on the analysis of the astrophysical flavor composition, i.e. the ratio of electron, muon, and tau neutrinos. This composition relates to (a) the production mechanisms at cosmic particle accelerators and (b) potential New Physics affecting the neutrino sector. Cosmic sources are expected to produce only electron and muon flavors and no tau neutrinos, but the latter appear en route to Earth through neutrino oscillations. The atmospheric flux at energies of tens of TeV to PeV contains virtually no tau neutrinos. The identification of tau neutrinos is therefore not only necessary to measure the flavor composition, but also a smoking-gun signature of astrophysical neutrinos. However, tau neutrinos are notoriously difficult to identify, and in 12 years of analyzed data only a dozen or so candidates were found.

In this Master project, you will look at one signature indicative of tau neutrinos: a double pulse, created when there is a delay between light from the tau neutrino interaction (and tau creation) and light from the subsequent tau decay reaching a detector module. The project builds on a three-year analysis [2], whose code you are expected to first adapt to the modern software environment. You will then use simulated data to study how the distance between the interaction vertex and the nearest module influences the double-pulse detection efficiency. The simulated data exist and are also used within the group for work on other aspects of tau neutrinos. Finally, you will study the sensitivity of the proposed extension of IceCube, IceCube-Gen2, to this tau-neutrino detection method. This is a software / data analysis project.

Figure 1: Schematic view of the IceCube detector. Over 5000 Digital Optical Modules are arranged on 86 strings at depths between 1450 m and 2450 m in the ice at the South Pole. 
Figure 2: Schematic of a double pulse induced by a tau neutrino interaction. Light from the tau interaction reaches the detector module first, light from the tau decay later.
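The scale of the double-pulse separation sketched in Figure 2 follows from the tau decay length L = γcτ. A rough estimate, under the simplifying assumption that the pulse delay is of order L/c (the real delay at a given module depends on the event geometry):

```python
# Order-of-magnitude estimate of the tau decay length and the resulting
# double-pulse separation. Cherenkov geometry and module position are
# ignored, so the delay is a rough scale, not a reconstruction formula.

TAU_MASS_GEV = 1.77686       # tau lepton mass [GeV]
TAU_LIFETIME_S = 2.903e-13   # tau mean lifetime [s]
C_M_PER_S = 2.99792458e8     # speed of light [m/s]

def tau_decay_length_m(energy_gev):
    """Mean decay length L = gamma * c * tau, with beta ~ 1 at these energies."""
    gamma = energy_gev / TAU_MASS_GEV
    return gamma * C_M_PER_S * TAU_LIFETIME_S

def double_pulse_delay_ns(energy_gev):
    """Rough time between the two light pulses, of order L / c."""
    return tau_decay_length_m(energy_gev) / C_M_PER_S * 1e9

# A 1 PeV (1e6 GeV) tau travels ~50 m on average before decaying, so the two
# pulses are separated by at most a few hundred nanoseconds.
print(f"L ~ {tau_decay_length_m(1e6):.0f} m, delay ~ {double_pulse_delay_ns(1e6):.0f} ns")
```

Since the decay length grows linearly with energy, the double-pulse signature only becomes resolvable in the hundreds-of-TeV to PeV range, consistent with where the analysis in [2] searched.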

References:  

[1] M. Aartsen et al. [the IceCube collaboration], Evidence for High-Energy Extraterrestrial Neutrinos at the IceCube Detector, Science 342, 1242856 (2013), https://arxiv.org/abs/1311.5238  

[2] M. Aartsen et al. [the IceCube collaboration], Search for Astrophysical Tau Neutrinos in Three Years of IceCube Data, Phys. Rev. D 93, 022001 (2016), https://arxiv.org/abs/1509.06212 

Promoter: Prof. Juliana Stachurska
Contact | Topic 50121 on PLATO