Enabling technologies
The instrumentation and enabling technology-related activities aim to provide state-of-the-art instruments and infrastructure to support the ESRF’s forefront science programme. When designing a beamline, one of the primary objectives is to preserve the photon beam properties along the beamline and, in particular, to ensure that the full brightness of the source reaches the sample. However, following a more holistic approach, one should aim to achieve an optimum conversion of brightness into high-quality data and ultimately to extract the most significant scientific output from the recorded data. In this context, the design specifications should span from optical design to online data analysis. Synchrotron beamline projects worldwide follow two distinct trends in the way beamlines are designed. On the one hand, there are beamlines that perform challenging but routine measurements. This class of beamlines targets high throughput, reliability and standardisation, and requires an increasing level of automation, intelligent decision algorithms and state-of-the-art data acquisition chains. On the other hand, a second class of beamlines performs experiments that require instrumentation with a high level of flexibility and versatility, where the scientist is constantly involved in the decision process. In this case, the use of tailored tools to support the decision process helps to minimise dead-time and to maximise the use of beam time.
Both classes of beamlines rely increasingly on modelling tools and on an optimised acquisition chain, from the high-throughput detectors down to the strategy for data transfer, processing and storage. In 2012, many new projects were launched while others were consolidated. Most of them concerned hardware (optics, mechanics, detection and control systems, etc.). The long shutdown offered a unique opportunity to update several beamline control software applications and to strengthen our initial efforts in developing a comprehensive strategy for data processing and online data analysis. By definition, computing infrastructure strives for invisibility. Our users are understandably interested in the applications that perform useful services for their scientific endeavour, rather than in the layers of system abstractions or the physical components that make these services possible. Whether it takes place at the level of operating systems, data centres, communication protocols, file formats, standardisation or policy work, the design, planning and operation of the computing infrastructure are major professional activities in constant evolution. In 2012, this invisibility was challenged by the need to transfer massive amounts of data while analysing them simultaneously. With finite financial resources at our disposal, delicate trade-offs had to be made, and we decided to put the emphasis on enhancing the data communication network and disk-based data storage.
This year, the chapter on enabling technologies reflects this effort and thus comprises a collection of four articles, focussing on software and computing infrastructure issues.
The first article illustrates the importance of optical simulations when dealing with coherent beams. Indeed, the dramatic improvement of synchrotron source properties has driven the development of new simulation tools, no longer based on a purely ray-tracing approach but on an accurate wave-propagation model that takes into account all physical properties of both the incident radiation and the optical system involved in the propagation of the beam. Among the many such projects developed worldwide, the selected article treats the particular case of X-ray multilayers for nanofocusing applications.
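To give a flavour of the wave-propagation approach, as opposed to ray tracing, the short Python sketch below propagates a one-dimensional scalar wavefront through free space using the Fresnel transfer function evaluated with FFTs. The wavelength, grid and aperture are arbitrary toy values; this is a didactic illustration, not the simulation code discussed in the article.

```python
# Didactic 1-D Fresnel (transfer-function) propagation of a wavefront.
# All numbers are arbitrary toy values, not taken from the article.
import numpy as np

wavelength = 1e-10        # 0.1 nm X-rays, assumed
dx = 50e-9                # sampling step of the wavefront grid [m]
n = 4096                  # number of grid points
z = 0.5                   # propagation distance [m], assumed

x = (np.arange(n) - n / 2) * dx
# Incident field: plane wave truncated by a 50 micrometre slit
field = np.where(np.abs(x) < 25e-6, 1.0, 0.0).astype(complex)

# Fresnel propagation: FFT, multiply by the free-space transfer
# function, inverse FFT
fx = np.fft.fftfreq(n, d=dx)                       # spatial frequencies
k = 2.0 * np.pi / wavelength
transfer = np.exp(1j * k * z - 1j * np.pi * wavelength * z * fx**2)
propagated = np.fft.ifft(np.fft.fft(field) * transfer)

intensity = np.abs(propagated)**2                  # diffracted intensity at z
print(intensity.max())
```

Such a wave-optical treatment keeps track of the phase of the field, which is precisely the information a pure ray-tracing model discards and which becomes essential for coherent, diffraction-limited beams.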
The second article exemplifies the ESRF’s efforts in developing online data analysis packages that optimise the coupling of feedback from data analysis to the experimental control system for macromolecular crystallography experiments. It reports on a new, intuitive graphical user interface (GUI) called the Data Analysis WorkbeNch (DAWN), which provides a flexible framework for building complex processing workflows used in macromolecular crystallography and other domains.
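The workflow idea behind such a workbench can be conveyed in a few lines of Python: a pipeline is an ordered list of named steps, each of which transforms a shared context. The sketch below is purely illustrative; the step names and the chaining mechanism are hypothetical and do not represent DAWN’s actual API.

```python
# Hypothetical workflow sketch (not DAWN's API): a pipeline is a list of
# named steps, each a function transforming a shared context dictionary.
from typing import Callable, Dict, List, Tuple

Step = Tuple[str, Callable[[Dict], Dict]]

def run_pipeline(steps: List[Step], ctx: Dict) -> Dict:
    """Run each step in order, passing the evolving context along."""
    for name, func in steps:
        ctx = func(ctx)
        print(f"step '{name}' done; context keys: {sorted(ctx)}")
    return ctx

# Toy steps standing in for real processing stages
def load(ctx: Dict) -> Dict:
    ctx["data"] = [4.0, 9.0, 16.0]             # pretend detector readout
    return ctx

def normalise(ctx: Dict) -> Dict:
    total = sum(ctx["data"])
    ctx["data"] = [v / total for v in ctx["data"]]
    return ctx

def summarise(ctx: Dict) -> Dict:
    ctx["summary"] = max(ctx["data"])          # value fed back to the control system
    return ctx

result = run_pipeline([("load", load), ("normalise", normalise),
                       ("summarise", summarise)], {})
print(result["summary"])
```

In a real online-analysis setting, the final step would hand its result back to the beamline control system so that the next acquisition can be adapted accordingly.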
The third article deals with high-resolution diffraction experiments using multi-million-pixel area detectors that produce huge volumes of complex data. For this class of experiments, data processing increasingly requires innovative computing approaches to reduce the collected diffraction data into exploitable diffraction patterns in a way that does not slow down the overall acquisition process. The article introduces pyFAI, a new software package designed to reduce SAXS and WAXS patterns by azimuthal integration using a new approach (a minimal usage sketch is given below).

The final article discusses the current state of the art in disk-based data storage, describing the trade-offs required to find the right balance between performance, functionality, compatibility with client operating systems, ease of operation, reliability and, last but not least, cost.
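As a concrete illustration of the azimuthal integration performed by pyFAI (the third article), the minimal Python sketch below reduces a single two-dimensional detector frame to a one-dimensional intensity-versus-q curve. The geometry and image file names are placeholders, and this basic call does not reflect the high-throughput scheme presented in the article.

```python
# Minimal pyFAI usage sketch: azimuthal integration of one detector image.
# "geometry.poni" and "frame.edf" are placeholder file names.
import pyFAI    # azimuthal integration library discussed in the article
import fabio    # ESRF library for reading detector image formats

ai = pyFAI.load("geometry.poni")            # detector geometry / calibration
image = fabio.open("frame.edf").data        # 2-D detector frame as an array

# Reduce the 2-D pattern to intensity versus scattering vector q
q, intensity = ai.integrate1d(image, 1000, unit="q_nm^-1")
print(q[:5], intensity[:5])
```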
R. Dimper and J. Susini