SRS Computing from Zenith to Closure – 1987 to 2008

by

Mark Enderby and Glenys McBain

As work gathered pace on designing the next-generation synchrotron light source (Diamond) during the 1990s, thoughts began to turn to next-generation data acquisition and analysis.

Since the SRS started, development had been somewhat evolutionary, driven both by the needs of particular science areas and by the availability of computing technology, which was developing at an ever-increasing rate. As the expansion of the SRS plateaued and funding became more restricted, with attention diverted to Diamond, it became increasingly apparent that separate resources for each science area were no longer sustainable.

As the SRS had developed, Computing Group members (as with other engineering support) had been devolved to individual science groups as part of more integrated development teams. As resources became tighter and the SRS and its staffing levels began to wind down, the emphasis turned towards maintenance rather than development, and the SRS Computing Group gradually regrouped accordingly.

In the mid-1990s, data acquisition and analysis was at its peak, with each science area having its own “computing village” closely focussed on its needs. Each village had its own compute server, data storage, printing and archiving services, all connected by a virtual LAN (Local Area Network). This ensured that new requirements, such as remote automated processing for PX, could be delivered without compromising other science areas. The downside, however, was that resourcing became increasingly difficult.

The position in the mid-1990s is described in Ackroyd et al., “Computing for Synchrotron Radiation Experiments”, J. Synchrotron Rad. (1994) 1, 63–68.

Over the first decade, the computing needs of synchrotron experiments had become better understood, yet the landscape was now incredibly diverse, ranging from bespoke experiments taking many months to routine characterisations taking minutes. The number of experiments and the throughput of users had grown considerably, and it was no longer tenable to write custom software for each application. Standardisation, particularly of user interfaces, was essential. However, it was clear that hardware diversity would remain in order to address experimental needs, and users needed to be insulated from it.

Fortunately, the increasing demand for data rates and storage was matched by accelerating technological development and falling prices. The demands of new 2D detectors, time-resolved experiments and remote access were all addressed by the computing group during the decade.

This period proved to be a fertile proving ground for new technologies and methodologies (e.g. fourth-generation languages, 4GLs). In particular, the move to object-oriented programming (e.g. C++) presented great opportunities for modularisation and for creating layered architectures that separated the user interface from the complexities below. This made it easier to change the underlying hardware without disturbing the user experience, eased the adoption of commodity hardware and software (UNIX, Windows), and made it easier to track the accelerating pace of development. Finally, some experience was gained with commercial data acquisition software – LabVIEW (a graphical programming system) – which was gaining traction in universities and therefore appearing on experiments arriving at the laboratory.
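
The kind of layering this enabled can be sketched in a few lines of C++. The sketch is purely illustrative (the class and function names are invented for the example, not taken from SRS code): the user-facing scan loop depends only on an abstract hardware interface, so the detector below it can be replaced without touching the layers above.

    #include <iostream>
    #include <memory>

    // Hardware abstraction layer: everything above depends only on this
    // interface, never on a specific detector technology.
    class Detector {
    public:
        virtual ~Detector() = default;
        virtual void arm() = 0;
        virtual double readCounts() = 0;
    };

    // A concrete driver lives below the interface; swapping the hardware
    // means writing a new subclass, with no change to the layers above.
    class ScalerCardDetector : public Detector {
    public:
        void arm() override { /* configure the counter card here */ }
        double readCounts() override { return 42.0; /* placeholder reading */ }
    };

    // User-facing layer: a simple scan written once against the interface.
    void runScan(Detector& det, int points) {
        for (int i = 0; i < points; ++i) {
            det.arm();
            std::cout << "point " << i << ": " << det.readCounts() << '\n';
        }
    }

    int main() {
        std::unique_ptr<Detector> det = std::make_unique<ScalerCardDetector>();
        runScan(*det, 3);
    }

It is this same separation that later made commodity operating systems, and eventually remote interfaces, easier to adopt.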

While there was still a need for generic analysis software for visualisation, such as charting and plotting (e.g. PLOTEK and OTOKO), the line between acquisition and analysis was increasingly blurred. There was a growing need to perform significant analysis in real time, either to reduce the increasing volume of data to manageable levels or to steer the experiment and maximise the use of valuable beam time. Hence the SRS Computing Group became more involved in this area.

In parallel with these developments, data acquisition was also being driven forward by the CCPs (Collaborative Computational Projects, based at Daresbury), in particular CCP3 (Surface Science), CCP4 (Protein Crystallography), CCP13 (Fibre Diffraction) and CCP14 (Powder Diffraction). See the informal blog by T N Bhat for a detailed account of the early development of CCP4.

NOBUGS

During this period new synchrotrons were being constructed across the world and Daresbury staff were often called on in an advisory capacity. In addition, scientists started to work across multiple facilities, either because of specific machine characteristics or because of beamtime availability. Data acquisition software was usually provided by each laboratory's own computing group, and inevitably this led to a large number of development efforts producing essentially similar software. The peripatetic user was thus exposed to a variety of different approaches, and hence there were requests to adopt various laboratories' software on a wider basis.

As contacts between software engineers across laboratories increased, the idea of a loose association of computing professionals with similar interests was born. From a few initial ad-hoc meetings between DL and ESRF staff and a PX software workshop at BNL, NOBUGS (New Opportunities for Better User Group Software) emerged, and the first of its biennial conferences was held at the ESRF. These have continued over the years and have promoted the sharing and standardisation of software not only across the synchrotron community but also across the neutron community, where developments in detectors had started to present problems similar to those faced by SR. In addition, much energy has been spent discussing best practice in software development.

Above – NOBUGS 2000, held at Arley Hall, near Daresbury.


Now (2016) in its 19th year, NOBUGS has held ten conferences on four continents, with Daresbury hosting the third in 2000.

Generic Data Acquisition

The software engineers at Daresbury benefited significantly from exposure to these new developments, many of which were greenfield and used the latest technologies, in contrast to some of the legacy development on the SRS. In addition, the differences between science areas and techniques were becoming more blurred, as with the combination of spectroscopy and scattering techniques employed on station MPW6.2. There was considerable will to make a clean start and break with the past so that users could benefit from these new technologies and ideas. However, at the time there was considerable uncertainty due to the impending decision on the replacement of the SRS by Diamond.

Finally the news broke that Diamond was to be built at Harwell, which had a considerable adverse effect on staff morale. However, it quickly became clear that this presented the ideal opportunity to act on the developing vision. To be successful, Diamond needed to hit the ground running, and it was in Daresbury's gift (with over 200 man-years of combined experience) to provide the basis for tried, tested and state-of-the-art data acquisition which would then have a life following the SRS closure. Thus the idea of a Generic Data Acquisition framework (GDA) was born, capable not only of accommodating the existing needs of the SRS, but also of growing and developing with new sources, technologies and experimental needs.

One realisation was that GDA would need to accommodate the EPICS control system (see https://en.wikipedia.org/wiki/EPICS) for beamline control; although EPICS was not in use on the SRS, test rigs were obtained to ensure integration.
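
As a flavour of what such integration involves, below is a minimal sketch of reading a single process variable using the EPICS Channel Access C client API (callable from C++). The PV name is invented for the example, and this shows only the generic client pattern, not GDA's actual integration code.

    // Build against EPICS Base, e.g.:  g++ ca_demo.cpp -lca -lCom
    #include <cadef.h>
    #include <cstdio>

    int main() {
        // Create a Channel Access client context.
        SEVCHK(ca_context_create(ca_disable_preemptive_callback), "context");

        // Connect to a process variable (the name here is illustrative).
        chid channel;
        SEVCHK(ca_create_channel("TEST:MOTOR1.RBV", nullptr, nullptr,
                                 CA_PRIORITY_DEFAULT, &channel), "create");
        SEVCHK(ca_pend_io(5.0), "connect");   // wait up to 5 s to connect

        // Read the current value as a double.
        double value = 0.0;
        SEVCHK(ca_get(DBR_DOUBLE, channel, &value), "get");
        SEVCHK(ca_pend_io(5.0), "read");      // wait for the read to finish

        std::printf("readback = %f\n", value);

        ca_clear_channel(channel);
        ca_context_destroy();
        return 0;
    }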

Following initial planning by the DL group and discussions with Diamond, the GDA plan was launched at NOBUGS 2002.

GDA's benefits included the agility and flexibility to keep track of scientific and technological developments (using plug-and-play components) as well as delivering economies of scale. Incremental development would avoid future big-bang rewrites, allow user expectations to be tracked effectively and avoid potential lock-in.

GDA goals included:

•    Future proofing via Hardware/operating system independence and a plug-and-play architecture
•    Standard user interfaces (faster to learn, with less potential for error)
•    Improved software reliability and maintenance
•    Support for remote access

and a framework was developed to support this.


A layered structure was adopted for maximum flexibility, providing modularity and hardware independence. One major benefit was that remote access was intrinsic, since all interfaces were via the network. However, this introduced an issue of its own: remote access is not always desirable (particularly for safety reasons), and hence software interlocks and security had to be built into GDA.
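
A hypothetical sketch of the consequence: because every interface is network-facing, a server-side layer must check a software interlock before any motion command reaches the hardware. The names below are invented for illustration and do not come from GDA.

    #include <iostream>
    #include <stdexcept>
    #include <string>

    struct Request {
        std::string client;   // e.g. "control-room" or "remote"
        std::string command;  // e.g. "move"
        double argument;
    };

    class InterlockedMotorServer {
    public:
        explicit InterlockedMotorServer(bool remoteMovesAllowed)
            : remoteMovesAllowed_(remoteMovesAllowed) {}

        void handle(const Request& req) {
            // Software interlock: refuse motion requested by remote clients
            // unless explicitly enabled (e.g. after the hutch is searched).
            if (req.command == "move" && req.client == "remote"
                    && !remoteMovesAllowed_) {
                throw std::runtime_error("interlock: remote motion disabled");
            }
            std::cout << req.client << " -> " << req.command
                      << "(" << req.argument << ") accepted\n";
            // ... forward the request to the hardware layer here ...
        }

    private:
        bool remoteMovesAllowed_;
    };

    int main() {
        InterlockedMotorServer server(/*remoteMovesAllowed=*/false);
        server.handle({"control-room", "move", 1.5});   // accepted
        try {
            server.handle({"remote", "move", 2.0});     // blocked
        } catch (const std::exception& e) {
            std::cout << e.what() << '\n';
        }
    }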

A proof of concept and a common toolkit were developed from code on MPW6.2 and the PX stations and applied to the new PX station 10.1. Subsequently, the rest of the PX station code was merged, and the code was also implemented on IR11 and CD12. This was followed by the first low-energy beamline, 5U1.

Delivering this change required changing how the SRS Computing Group operated. New tools and methodologies were adopted; most important was whole-project planning, with a planning/resourcing/review cycle to ensure that priorities and ongoing support commitments were met. This resulted in a continuous flow of incremental releases, which ensured that all GDA stations benefited from new developments promptly. A considerable amount of training, both formal and through workshops, was required to achieve this.

This did not come without considerable challenges, such as working across geographically (and organisationally) separate teams and meeting the externally driven deadlines of Diamond.

As the Diamond Data Acquisition Group built up, it became increasingly involved in the GDA development started by the SRS Computing Group. The interchange of staff between facilities helped ensure focus on the new and developing needs of the new synchrotron, and DL staff spent significant time seconded to Diamond.

The ultimate aim was to get GDA adopted at other synchrotrons and, indeed, at other kinds of facility, e.g. neutron sources. However, it became clear that the difficulties of making clean breaks in software and hardware technologies were hard to overcome, even given the potential advantages on offer. The plan to open-source GDA, in order to gain wider acceptance, was frustrated by bureaucracy but was, fortunately, achieved by Diamond following the transfer of development leadership to them on the closure of the SRS.

In conclusion, GDA was an idea that arrived at the right time and helped to ensure the success of Diamond.

Data Acquisition Milestones

The major milestones of data acquisition at the SRS up to its 25th anniversary in 2005 are summarised in the picture below.