
rewrite abstract to include CernVM-FS
boegel committed Dec 17, 2024
1 parent d95608a commit 522e2f9
Showing 1 changed file with 22 additions and 11 deletions: isc25/EESSI/abstract.tex
@@ -1,21 +1,32 @@
-What if there was a way to avoid having to install a broad range of scientific software from scratch on every HPC
-cluster or cloud instance you use or maintain, without compromising on performance?
+What if there was a way to avoid having to install a broad range of scientific software from scratch on every
+supercomputer, cloud instance, or laptop you use or maintain, without compromising on performance?
 
 Installing scientific software for supercomputers is known to be a tedious and time-consuming task. The application
 software stack continues to deepen as the
-HPC user community becomes more diverse, computational science expands rapidly, and the diversity of system architectures
+High-Performance Computing (HPC) user community becomes more diverse, computational science expands rapidly, and the diversity of system architectures
 increases. Simultaneously, we see a surge in interest in public cloud
 infrastructures for scientific computing. Delivering optimised software installations and providing access to these
 installations in a reliable, user-friendly, and reproducible way is a highly non-trivial task that affects application
 developers, HPC user support teams, and the users themselves.
 
-This tutorial aims to address these challenges by providing the attendees with the tools to \emph{stream} the optimised
-scientific software they need. The tutorial introduces European Environment for Scientific Software Installations
-(\emph{EESSI}), a collaboration between various European HPC sites \& industry partners, with the common goal of
-creating a shared repository of scientific software installations (\emph{not} recipes) that can be used on a variety of
-systems, regardless
-of which flavor/version of Linux distribution or processor architecture is used, or whether it's a full size HPC
+Although scientific research on supercomputers is fundamentally software-driven,
+setting up and managing a software stack remains challenging and time-consuming.
+In addition, parallel filesystems like GPFS and Lustre are known to be ill-suited for hosting software installations
+that typically consist of a large number of small files. This can lead to surprisingly slow startup performance of
+software, and may even negatively impact the overall performance of the system.
+While workarounds for these issues such as using container images are prevalent, they come with caveats,
+such as the significant size of these images, the required compatibility with the system MPI for distributed computing,
+and complications with accessing specialized hardware resources like GPUs.
+
+This tutorial aims to address these challenges by introducing the attendees to a way to \emph{stream}
+software installations via \emph{CernVM-FS}, a distributed read-only filesystem specifically designed
+to efficiently distribute software across large-scale computing infrastructures.
+The tutorial introduces the \emph{European Environment for Scientific Software Installations (EESSI)},
+a collaboration between various European HPC sites \& industry partners, with the common goal of
+creating a shared repository of optimised scientific software installations (\emph{not} recipes) that can be used on a variety of
+systems, regardless of which flavor/version of Linux distribution or processor architecture is used, or whether it's a full size HPC
 cluster, a cloud environment or a personal workstation.
 
-We cover the usage of EESSI, different ways to accessing EESSI, how to add software to EESSI, and highlight some more
-advanced features. We will also show attendees how to engage with the community and contribute to the project.
+We cover the installation and configuration of CernVM-FS to access EESSI, the usage of EESSI, how to add software
+installations to EESSI, how to install software on top of EESSI, and advanced topics like GPU support and performance
+tuning.
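The "installation and configuration of CernVM-FS to access EESSI" that the new abstract promises can be sketched roughly as follows for a Debian-style client. This is a minimal sketch based on the publicly documented EESSI setup; the package URLs, the `software.eessi.io` repository path, the `2023.06` version, and the `GROMACS` module are assumptions that may differ per system and EESSI release.

```shell
# Install the CernVM-FS client (package source assumed from the CernVM-FS docs;
# adjust for your distribution).
wget https://ecsft.cern.ch/dist/cvmfs/cvmfs-release/cvmfs-release-latest_all.deb
sudo dpkg -i cvmfs-release-latest_all.deb
sudo apt-get update && sudo apt-get install -y cvmfs

# Install the EESSI CernVM-FS configuration package (URL assumed from the
# EESSI filesystem-layer releases).
wget https://github.com/EESSI/filesystem-layer/releases/download/latest/cvmfs-config-eessi_latest_all.deb
sudo dpkg -i cvmfs-config-eessi_latest_all.deb

# Minimal client configuration: standalone client profile, local cache quota in MB.
echo 'CVMFS_CLIENT_PROFILE="single"' | sudo tee -a /etc/cvmfs/default.local
echo 'CVMFS_QUOTA_LIMIT=10000'       | sudo tee -a /etc/cvmfs/default.local
sudo cvmfs_config setup

# Initialise the EESSI environment; software is then streamed on demand.
source /cvmfs/software.eessi.io/versions/2023.06/init/bash
module load GROMACS   # example module name, assumed to be available
```

After sourcing the init script, `module avail` lists the installations built for the detected CPU architecture, which is what lets the same repository serve clusters, cloud instances, and laptops alike.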
