Meet the ESRs: Vasyl Hafych

Hello everybody!

My name is Vasyl, and it has been more than a year since I started this amazing Ph.D. journey as an Early Stage Researcher in the INSIGHTS Innovative Training Network. One could say that my introduction is a year overdue, but during this year my research trajectory managed to converge, which means that I can now introduce myself with more clarity.

I was raised in Ivano-Frankivsk, a small city in western Ukraine. For those who are not familiar with eastern Europe, the city lies close to the geographical center of Europe. This part of Ukraine is famous for the special role that national traditions and culture play in people’s lives. Traditional yet modern, the region attracts many tourists who come to visit the Carpathian mountains, go hiking or skiing, or eat the best Ukrainian food.

The Ratusha, the town hall of Ivano-Frankivsk, on the city’s Market Square.

I left Ivano-Frankivsk and moved to Kyiv, the capital of Ukraine, when I turned 17, to pursue a bachelor’s degree in the department of Radiophysics, Electronics, and Computer Systems. It was my dad introducing me to electrical engineering when I was a child that first sparked my interest in physics. What I have always found so appealing about physics is its fundamental way of understanding how our world works. The creativity inherent in crafting and applying simple concepts to explain complicated processes is what inspired me to study physics and to pursue a bachelor’s degree in the subject. It turned out much later that Artem and Vitaliy, who are now my INSIGHTS colleagues, started studying there at the same time as me!

In addition to my interest in physics, I have always been fascinated by programming, especially in the context of physics. Many examples of this have captivated me, such as seeing how the Boltzmann or Vlasov equations can be solved numerically for kinetic simulations of plasma, or seeing the wonders of protein folding obtained from computationally expensive Markov chain Monte Carlo simulations. This interest never left me, only growing more profound and passionate with every new subject I took at university.

The Old Town Hall in Munich. The building was first mentioned in city records in 1310, and it was the seat of the Munich city council until 1847.

Inspired by my great love for physics and programming, my next step was completing a master’s degree in a program called Atomic Scale Modelling of Physical, Chemical, and Biomolecular Systems, organized by the European Commission. My classmates and I lived and studied in the Netherlands, Italy, and France. This was a remarkable experience that broadened our cultural and research horizons tremendously. I completed my final project at the European Center for Atomic and Molecular Calculations in Lausanne, where I worked on quantum free-energy reconstruction using Langevin-guided Monte Carlo. This stimulated my current interest in Markov chain Monte Carlo methods, which, together with knowledge of all the benefits of EU-funded scholarships, led me to apply for my Ph.D. at the INSIGHTS Innovative Training Network.

One might wonder: what is my Ph.D. project about? Broadly speaking, it is a mixture of physics, statistics, and programming. I have the privilege of working on these topics at the Max Planck Institute for Physics in Munich, under the supervision of Prof. Allen Caldwell and Dr. Oliver Schulz. Working in these conditions has allowed me to develop my research skills tremendously. From spending an overnight shift in the AWAKE control room at CERN collecting experimental data, to using hundreds of CPUs for massively parallel computing, the past year has brought a lot of new experiences to my life. I could write much more about them, but instead I encourage you to have a look at a paper on the parallelization of the Markov chain Monte Carlo technique that we are planning to publish in the near future. There, you will find a more detailed explanation of all the interesting things that we do.
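To give a flavour of what running Markov chains in parallel means, here is a toy sketch in Python: several independent Metropolis chains, each with its own random seed, sample the same target distribution, and their draws are pooled afterwards. This is only an illustration of the general idea; the target, step size, and chain counts are invented, and it is not the scheme from our paper.

```python
import numpy as np

def log_target(x):
    """Toy target density: a standard normal, up to a constant."""
    return -0.5 * x * x

def metropolis_chain(n_steps, seed, step_size=1.0):
    """One independent Metropolis chain; chains with different seeds are
    embarrassingly parallel and could each run on its own CPU."""
    rng = np.random.default_rng(seed)
    x = rng.normal()
    samples = np.empty(n_steps)
    for i in range(n_steps):
        proposal = x + step_size * rng.normal()
        # Accept with probability min(1, target(proposal) / target(x))
        if np.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

# Run four chains (serially here; in practice, one per worker) and pool them
chains = [metropolis_chain(20000, seed=s) for s in range(4)]
pooled = np.concatenate(chains)
print(pooled.mean(), pooled.std())  # close to 0 and 1 for a standard normal
```

Real parallel MCMC has to worry about burn-in, convergence diagnostics, and combining chains correctly, which is where the interesting research lies.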

Stay tuned!

Swans seem to be pretty common near Lake Geneva. This picture was taken during my secondment at CERN.

Meet the ESRs: Rahul Balasubramanian

Duck’s-eye view of the city of Amsterdam from the Amstel river. The city has more than 100 kilometres (60 miles) of canals, which divide it into about 90 islands linked by more than 1,200 bridges.

Hi, I’m Rahul from Chennai, India. I work at Nikhef in Amsterdam on the ATLAS experiment with Wouter Verkerke and Pamela Ferrari, who gave me the opportunity to join the INSIGHTS program. I would like to use this post to describe how I got the chance to be part of this network.

IIT Madras is spread over 620 acres, a large chunk of which is a protected forest. Running into spotted deer, blackbucks, and other wildlife is a daily occurrence and provides a unique experience while living on campus.

Rewinding to the summer of 2012, when the discovery of the Higgs boson was announced to the world, I vividly remember taking my first steps towards my undergraduate institute in India. I was immediately attracted to the Engineering Physics program at the Indian Institute of Technology Madras, a fascinating amalgam of physics and electrical engineering. I was able to obtain a good foundation in these two fields thanks to the excellent and challenging courses offered by the two departments. From learning how fundamental spin-1/2 particles are described by the Dirac equation, to understanding how transistors are used to design analog and digital circuits, I had been exposed to more than I could ask for!

Bird’s-eye view of the French Riviera. The region is rich in natural and cultural diversity.

My first project in physics came in the summer of my pre-final year of undergrad, when I worked at the Laboratoire Côte d’Azur in Nice on a project to measure the temperature gradient in the solar photosphere. My enthusiasm for physics grew by leaps and bounds after this, and I pursued a couple of other projects in solar physics, including my bachelor’s thesis, in which we built radio telescopes from off-the-shelf components to study solar radio emissions.

The CERN summer student program offers a great opportunity to work at CERN and meet people from all around the world.

I had my first particle physics course in my final year, and I found it exciting that we can answer fundamental questions about our universe using physics and state-of-the-art technology. I had the privilege of being chosen as a CERN summer student to work on the LHCb experiment. I was energised by the environment at CERN, where I was able to meet people from all around the world working together on a common scientific pursuit.

I learned a lot about particle physics during my master’s at Université Paris-Saclay. The university is associated with a large number of research laboratories in the region, giving access to many researchers in the field. This not only benefits the learning experience but also provides the opportunity to work with them during summer internships. I had the chance to work on an experimental analysis project for the ALICE experiment in the first year and on a theoretical project on heavy-strange mesons in lattice QCD in the second year.

I learnt a lot about physics and research from my mentors, and the INSIGHTS project gives me the opportunity to continue this journey. It has been a busy first year, and I hope I can be more active on the blog front in the future. Thanks!

INFN School of Statistics

The lecture programme of the INFN School of Statistics began today, June 3rd 2019 in Paestum, Salerno, Italy.

This edition has close to 100 participants, a record for this series of schools, now at its 4th edition. Many ESRs from the INSIGHTS project are in Paestum and will participate in the school.

The programme includes introductions to probability, parameter estimation, hypothesis testing, multivariate analysis, and machine learning, with hands-on exercise sessions.

The social programme includes a visit to the Greek temples and the museum of Paestum, kindly offered by the Archaeological Park of Paestum.

Here follows the list of lecturers:

  • Glen Cowan, Royal Holloway, University of London
  • Sergei Gleyzer, University of Florida
  • Eilam Gross, Weizmann Institute of Science
  • Mario Pelliccioni, INFN Torino
  • Harrison Prosper, Florida State University, Tallahassee
  • Aldo Solari, University of Milano-Bicocca


Generating events with GANs

In the coming years, the LHC experiments will need to produce an incredible number of Monte Carlo events to match the expected increase in luminosity. This challenge, together with the need to use more sophisticated generators, will stretch computing resources and may limit the physics reach of the experiments.

In our paper, we propose a way to overcome these problems using machine learning techniques, specifically Generative Adversarial Networks (GANs). In this first attempt, we focussed on relatively simple di-jet events, but we also prepared the tools to produce more complex events, such as top quark pair events and multi-boson events, which are produced in large quantities at the LHC.

We actually trained two networks: one using the output of the generators (commonly called particle level) and another after the detector simulation and reconstruction (also known as reconstruction or detector level). The good agreement of the first network with the training sample demonstrates that a GAN can be trained on a relatively small number of events and subsequently used to significantly increase the event count: we are able to generate one million events in less than a minute. The good agreement of the second network demonstrates that the detector response, and even the reconstruction steps, can be reproduced by a GAN.
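To make the setup concrete, here is a deliberately tiny GAN sketch in Python with Keras. The event representation, network sizes, and training data below are invented placeholders, not our actual networks or samples; the point is only the adversarial training pattern and the cheap generation step at the end.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

latent_dim = 8   # size of the random noise vector
n_features = 4   # toy event: e.g. pT, eta, phi, mass of a jet (placeholder)

# Generator: noise -> event features
generator = keras.Sequential([
    keras.Input(shape=(latent_dim,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(n_features),
])

# Discriminator: event features -> probability of being a "real" event
discriminator = keras.Sequential([
    keras.Input(shape=(n_features,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# Stacked model used to train the generator; the discriminator is frozen here
# (compiled before being frozen: the classic Keras GAN pattern)
discriminator.trainable = False
gan = keras.Sequential([generator, discriminator])
gan.compile(optimizer="adam", loss="binary_crossentropy")

rng = np.random.default_rng(0)
real = rng.normal(size=(10000, n_features))  # stand-in for Monte Carlo events

for step in range(100):
    noise = rng.normal(size=(64, latent_dim))
    fake = generator.predict(noise, verbose=0)
    batch = real[rng.integers(0, len(real), 64)]
    # Train the discriminator: real events -> 1, generated events -> 0
    discriminator.train_on_batch(
        np.vstack([batch, fake]),
        np.concatenate([np.ones(64), np.zeros(64)]),
    )
    # Train the generator to fool the discriminator into outputting 1
    gan.train_on_batch(rng.normal(size=(64, latent_dim)), np.ones(64))

# Once trained, generating events is a single cheap forward pass
events = generator.predict(rng.normal(size=(1000, latent_dim)), verbose=0)
```

The speed advantage comes entirely from that last line: after the (comparatively expensive) training, producing new events is just a forward pass through the generator.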

The results we obtained are a clear indication that GANs can have a high impact on LHC experiments in several areas. You can find more details in the paper, but we would also like to share a video, which could not be included in the paper, showing the GAN learning as it is trained.

Let us know what you think of this new application of the GAN in the comments below or by contacting the authors of the paper.

Michele and Serena



INSIGHTS visits DESY: the first Terascale School of Machine Learning

Written by Sitong An, Artem Golovatiuk, Nathan Simpson, and Hevjin Yarar. Edited by Sitong An.


A small squad of INSIGHTS ESRs (Sitong An @ CERN, Artem Golovatiuk @ Università di Napoli, Nathan Simpson @ University of Lund, and Hevjin Yarar @ INFN Padova) visited DESY for the 1st Terascale School of Machine Learning from 22 to 26 October 2018. This is our long-overdue account of the school and the competition event that followed (spoiler alert: we won!).

P.S. Nathan, our newly-anointed team vlogger, has made a wonderful video about the event. Check it out [here]!

A bird’s-eye view of DESY and the Machine Learning School

A bird’s-eye view, literally

DESY (Deutsches Elektronen-Synchrotron) is a national centre for Particle Physics, Accelerator Physics and Photon Science on the outskirts of Hamburg, Germany. It used to host important particle physics facilities like HERA, a lepton-proton collider built to probe the internal structure of protons and the properties of quarks (“is there anything smaller hidden inside the quarks?”). Nowadays, the focus of the on-site facilities has gradually shifted towards Photon and Accelerator Science, with sizeable groups of researchers working on data from ATLAS and CMS at CERN. DESY is one of the research partners of INSIGHTS, with Dr. Olaf Behnke from DESY as a member of the network.

Main entrance to DESY. Photo credit: Sitong An

The 1st Terascale School of Machine Learning covered an introduction to Deep Learning and hands-on tutorials on the usual tools of the trade: PyTorch, TensorFlow, and Keras. It also went beyond the basics to include several talks from experts in the field on advanced topics like GANs (Generative Adversarial Networks) and semi-supervised/unsupervised learning.

Highlight of the Expert Talks

When using machine learning methods in high energy physics (HEP), the usual paradigm is to train on simulated data, while validation and testing are done on real data collected by the detector. In reality, we are unable to model real data perfectly, so there will always be discrepancies between our simulation and the real world. One of the talks, given by Benjamin Nachman on ‘Machine Learning with Less or no Simulation Dependence’, tackled this problem with weakly supervised machine learning. Directly training on data is not possible, since we do not have labels. However, when two classes (such as q and g for quark vs. gluon jets in data) are well defined, i.e. q in one mixed sample is statistically identical to q in other mixed samples, two methods become available: training using the class proportions of mixed samples (ref) and training directly on data using mixed samples (ref). This talk was a great opportunity for us to learn about new, simulation-independent approaches to searching for new physics with machine learning.
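The second method, training directly on mixed samples, can be illustrated with a toy example. The feature, the Gaussian shapes, and the sample fractions below are invented for illustration and are not from the talk; the point is that a classifier trained to separate a quark-enriched sample from a gluon-enriched sample ends up separating quarks from gluons, without ever seeing a true label.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000

def mixed_sample(f_q):
    """Mixture of toy 'quark' and 'gluon' jets with quark fraction f_q."""
    nq = int(f_q * n)
    quarks = rng.normal(1.0, 1.0, nq)       # toy quark-jet feature
    gluons = rng.normal(-1.0, 1.0, n - nq)  # toy gluon-jet feature
    return np.concatenate([quarks, gluons]).reshape(-1, 1)

m1 = mixed_sample(0.8)  # quark-enriched mixed sample
m2 = mixed_sample(0.2)  # gluon-enriched mixed sample

# Train with the *sample* identity as label; no true quark/gluon labels used
X = np.vstack([m1, m2])
y = np.concatenate([np.ones(n), np.zeros(n)])
clf = LogisticRegression().fit(X, y)

# Evaluate on pure samples: the mixed-sample classifier separates q from g
q_test = rng.normal(1.0, 1.0, (1000, 1))
g_test = rng.normal(-1.0, 1.0, (1000, 1))
acc = (np.mean(clf.predict(q_test) == 1) + np.mean(clf.predict(g_test) == 0)) / 2
print(f"quark/gluon accuracy without true labels: {acc:.2f}")
```

The classifier learns roughly the same decision boundary it would have learned from fully labelled data, because the optimal boundary between the two mixtures is the same as the one between the pure classes.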

On the last day of the school, Gilles Louppe gave a talk on ‘Likelihood-free Inference’. When discriminating between a null hypothesis and an alternative, the likelihood ratio is the most powerful test statistic. In the likelihood-free setup, a ratio of approximate likelihoods is used, constructed by projecting the observables onto a 1D summary statistic and running the simulations for different parameters of interest. Reducing the problem to 1D is not ideal, since we then lose the correlations between the variables. One idea introduced to address this is to use supervised classification to estimate the likelihood ratio: one then does not need to evaluate individual likelihoods and can use the estimated ratio directly for inference. For details, here is a link to check out.
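A toy version of the classifier-based ratio estimate (not code from the talk; the two 1D Gaussian hypotheses are invented for illustration): train a classifier to separate samples drawn under the two hypotheses, then convert its score s(x) into an estimated likelihood ratio via s / (1 - s).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Toy 1D samples from the two hypotheses (these Gaussians are invented)
x0 = rng.normal(0.0, 1.0, size=(5000, 1))  # simulator under the null
x1 = rng.normal(1.0, 1.0, size=(5000, 1))  # simulator under the alternative

X = np.vstack([x0, x1])
y = np.concatenate([np.zeros(5000), np.ones(5000)])
clf = LogisticRegression().fit(X, y)

def likelihood_ratio(x):
    """Estimate p1(x)/p0(x) from the classifier score s via s / (1 - s)."""
    s = clf.predict_proba(np.atleast_2d(x))[:, 1]
    return s / (1.0 - s)

# The estimated ratio rises with x, as the true Gaussian ratio does
r = likelihood_ratio(np.array([[-2.0], [0.5], [3.0]]))
print(r)
```

No likelihood is ever evaluated: the classifier alone, trained only on simulated samples, yields a usable estimate of the ratio.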

The Machine Learning Challenge

As part of the school, a machine learning challenge was held to let students test their newly acquired skills on a problem and a data set from particle physics. Specifically, this involved the tagging of heavy resonances, i.e. distinguishing heavy objects like the top quark, the W and Z bosons, or the Higgs from light-quark and gluon jets. These jets leave energy deposits in the calorimeters of the detector, which can then be mapped to images that look a bit like this:

Overlay of 100k particle jet images, taken from

Using these images and the data from the detector, such as transverse momentum, pseudorapidity, and combinations of different variables, we were tasked with building a machine learning solution to classify whether a jet comes from a top quark or not. The challenge was organised by Gregor Kasieczka, who recently authored a nice summary paper on this very topic (machine learning for top tagging) – check it out at

So what did we come up with, and how well did it perform?

Our INSIGHTS team had several major advantages compared to the other participants. First of all, we were a team of four people working together, which led to many fruitful discussions. It also allowed us to try different approaches at the same time and to distribute parts of the task (data preprocessing, trying out different hyperparameters or architectures of the model, etc.). What’s more, we had access to a GPU machine at the University of Naples, which gave us a great boost in computational power and the possibility to play around with relatively large models.

The winning model was jokingly named “A bit Tricky Beast”, because it was an epic Frankenstein’s monster composed of two neural networks trained separately, brought together by a third neural network. And there was a little trick in the way we trained the model. The first network was a CNN (Convolutional Neural Network) taking jet images as input. It was already pretty big, with about 1.7 million parameters. The second network was an RNN (Recurrent Neural Network) taking the preprocessed constituents. We used the particles’ 4-momenta together with physically motivated high-level features such as the invariant mass m², the transverse momentum pT, and the pseudorapidity η. Finally, as a cherry on top, we used several fully connected layers to combine the outputs of the CNN and RNN and produce one number: the probability of the jet coming from a top quark.
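A schematic Keras version of such a two-branch design might look like the following. The layer sizes, image dimensions, and constituent counts are illustrative only, not the actual 1.7-million-parameter model.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# CNN branch: jet images (here 40x40 single-channel calorimeter pixels)
img_in = keras.Input(shape=(40, 40, 1))
x = layers.Conv2D(16, 3, activation="relu")(img_in)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(32, 3, activation="relu")(x)
x = layers.GlobalAveragePooling2D()(x)

# RNN branch: a sequence of jet constituents (here up to 30 particles,
# each described by its 4-momentum plus a few high-level features)
seq_in = keras.Input(shape=(30, 7))
y = layers.GRU(32)(seq_in)

# Combiner: fully connected layers merging both branches into P(top)
z = layers.concatenate([x, y])
z = layers.Dense(32, activation="relu")(z)
out = layers.Dense(1, activation="sigmoid")(z)

model = keras.Model(inputs=[img_in, seq_in], outputs=out)
model.compile(optimizer="adam", loss="binary_crossentropy")

# One forward pass on dummy data: one probability per jet
p = model.predict([np.zeros((2, 40, 40, 1)), np.zeros((2, 30, 7))], verbose=0)
```

The two branches can be pre-trained separately and then fine-tuned jointly through the combiner, which is what made the divide-and-conquer approach of a four-person team practical.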

The trick was in the way we handled the data. To mimic the effect of data-Monte Carlo disagreement, the data used for scoring the solutions differed from the training data by some small fluctuations. However, the part of the test data provided to us and the part the organisers used for the final scoring had the same fluctuations. Therefore, after thoroughly training our network on the provided training set, we trained it for a bit on the provided test data. This let our network learn some features of the fluctuations applied to the test data and slightly boosted the performance.

A schematic representation of the model’s architecture.

The network itself, together with a Jupyter notebook, can be found at

After 9 hours of continuous coding, collaborating, and coffee drinking, we produced several networks (with very slight differences among them) that took the top six scores in the challenge!

Our winning photo with the challenge organiser Gregor Kasieczka and our prize – a bottle of nice Austrian wine 🙂


Overall, this school was a wonderful and fruitful experience for us. The breadth of the introduction allowed us to learn about and compare different deep learning tools, and the talks on advanced topics offered a glimpse into the kinds of problems the experts are working on at the frontier of the field. And, fairly obviously, we thoroughly enjoyed the hospitality of the school organisers, the tranquil campus of DESY, and the city of Hamburg!