




CERN–2006–004 8 May 2006




Fifty years of research at CERN, from past to future

Academic Training Lectures 13–16 September 2004, CERN

Editor: G. Giudice

Geneva 2006

CERN–200 copies printed–May 2006

This report contains the text of the four presentations given as Academic Training Lectures on the occasion of the fiftieth anniversary of CERN. Fifty years of work in the fields of accelerators, theory, experiment, and computing are briefly recounted.



Preface

The Academic Training Committee, which I am honoured to chair, has traditionally played a very active role in offering, to all members of the CERN community, opportunities to deepen and broaden their scientific knowledge. Among its activities, it organizes about 20 lecture series a year, for a total of 80–100 hours of lessons on topics in high-energy physics and related fields of applied physics, engineering, and computing.

In 2004, on the occasion of CERN's 50th anniversary, the Academic Training Committee wanted to organize some special lectures to mark this important event for our laboratory. We had the idea of inviting four speakers who could present the most important scientific developments that occurred at CERN during the last 50 years in the four main research fields: accelerators, theory, experiment, and computing. We were not aiming at a precise account of the history of CERN, but rather at personal recollections of the scientific atmosphere during those years of great research progress and discoveries, as told by people who actively participated in those events. For this reason, we encouraged the four speakers to present their personal experience and to express their particular point of view on the past and the future of their respective field of research.

Between 13 and 16 September 2004, in the main CERN Auditorium, Kurt Hübner, Gabriele Veneziano, Daniel Treille, and David Williams gave superb and fascinating lectures on research at CERN during these 50 years. The lectures were very well received, with nostalgic feelings by the older generations, and as a source of inspiration for the younger component of the large attendance. Many people responded by asking the Committee to publish the write-ups of the talks to preserve the presented material. As a result, we hope that this document will remain as a witness of the exciting atmosphere that led to the great intellectual achievements of research in high-energy physics at CERN, and as a stimulus to pursue the exploration of the particle world in the future with the same enthusiastic spirit.

Gian Giudice


Contents

Preface (G. Giudice) ................................................................. v
Fifty years of research at CERN, from past to future: The Accelerators (K. Hübner) ... 1
Fifty years of research at CERN, from past to future: Theory (G. Veneziano) ......... 27
Fifty years of research at CERN, from past to future: Experiment (D. Treille) ....... 37
Fifty years of research at CERN, from past to future: Computing (D. O. Williams) .... 77


Fifty years of research at CERN, from past to future: The accelerators

K. Hübner
CERN, Geneva, Switzerland

Abstract

The evolution of the CERN accelerator complex since its inception is summarized. Emphasis is put on the salient features and highlights of the different facilities and on what has been learnt at each stage in terms of accelerator physics and technology. Possible future accelerator options for CERN are discussed at the end.

1 Introduction

This report, based on lectures given in the Academic Training Programme on the occasion of CERN's fiftieth anniversary, summarizes the evolution of the CERN accelerator complex with an emphasis on the highlights in accelerator physics and technology. The first part treats the early accelerators, the Synchro-Cyclotron (SC) and the PS, and their evolution. The description of the early part of the development is based on the available documents [1–3] and the reminiscences of senior colleagues who have been my teachers and friends. The second part deals with the accelerators built by CERN after its expansion into France, the ISR and the SPS. The later evolution is presented with a more personal touch, based on my own observations made after I joined the legendary CERN Accelerator Research Division in 1964, when the studies for the upgrading of the Proton Synchrotron (PS) and the construction of the Intersecting Storage Rings (ISR) and the Super Proton Synchrotron (SPS) were in full swing [2–4]. The third part is dedicated to the period when CERN had to choose from a number of options which finally led to a family of powerful particle colliders, starting with the conversion of the SPS into a proton–antiproton collider, followed by the Large Electron–Positron collider (LEP) and then by the Large Hadron Collider (LHC), which will be housed in the tunnel drilled for LEP. The fourth part sketches the accelerator options for the future, which have already been under study for a number of years since, as the accelerators become more and more complex, the lead times for a project become very long. I have given only a few references; a selection of references for further reading on the future options for CERN is presented. Finally, a very personal account is given of what we have learnt in terms of management in all these large projects. The achievements presented here have been made mainly by CERN staff, but the continuous support and the significant contributions from our colleagues outside CERN must be gratefully acknowledged.

The lectures and this paper are dedicated to Mervyn Hine, distinguished accelerator physicist and man of vision, who made eminent and varied contributions to the build-up of the accelerator complex at CERN. He joined CERN in 1952 and became one of the leading figures for the construction of the PS. As Director for Applied Physics, responsible for planning under J. B. Adams and V. Weisskopf, he laid the foundations for the expansion of CERN beyond the SC and PS. He died in April 2004.

2 Early times

2.1 Starting conditions

In May 1952 the provisional CERN Council decided on the creation of two Study Groups: the first Group to study a proton Synchro-Cyclotron (SC), led by Cornelis Bakker of Amsterdam, and the second Group to explore a weak-focusing proton synchrotron similar to the 3 GeV Cosmotron being commissioned at Brookhaven National Laboratory. The leader of the second Group was Odd Dahl of Bergen.

¨ K. which held the promise that a higher energy could be reached for the same cost.1 GeV are plotted as a function of the time of the proposal in comparison with the energy of the US accelerators as a function of their start-up. The audacity of Council to make a large step in energy of the synchrotron using a scheme which had never been tested can be appreciated from Fig. PS and AGS proposal (large squares). e. Although a design of a Cosmotron-like synchrotron had been worked out. as a function of the time when the proposal was made. which meant abandoning the scaled-up weak-focusing Cosmotron-type 10 GeV synchrotron and studying instead a 30 GeV proton synchrotron based on the untried AG principle. 2 2 . 1: (a) The energies of the proposed PS and AGS in units of 0.synchr. In October 1952 the still provisional Council fortunately accepted this bold proposal. It also gives the energy of the two first AG synchrotrons. Synchro-cyclotrons (diamonds).cycl. p synchr. the PS at CERN and the AGS at BNL. A group led by Odd Dahl visited BNL in summer 1952 and learnt about the alternating-gradient (AG) focusing principle [5]. 1 46 48 49 50 51 52 53 54 1000 100 10 1 46 48 49 50 51 52 53 54 Fig. electron synchrotrons (small squares). (The Stanford Mark III 1.2 GeV electron linear accelerator (linac) which was completed by 1950 is not shown. H UBNER of these accelerators were discussed in a meeting in June 1952 in Copenhagen and the recommendation reported by Heisenberg to aim for 600 MeV for the SC and 10 to 20 GeV for the synchrotron was accepted at the second Council session in June 1952. non-AG proton synchrotrons (triangles). 10 PS/AGS prop. 1000 100 syn. it was decided after this visit to forthwith design a synchrotron embodying the new principle. 1(a) where the energy of various types of accelerators in the US is plotted as a function of the year of start-up. (b) the same plot for European accelerators.) 
Figure 1(b) shows these two proposals with the accelerators in Europe.

Obviously, this decision immediately put Europe in the forefront of accelerator physics. However, it was considered an "adventurous high-risk high-gain course of action" according to J. B. Adams, or, as R. Peierls put it more poetically, "for awful gamble stands AG, but if it works or not we'll see". Others did not trust the new AG principle, and weak-focusing proton synchrotrons such as the 10 GeV Synchro-Phasotron (start of construction in 1952) at JINR/Dubna, the 7 GeV Nimrod (1957) at RAL in the UK, and the 12 GeV ZGS (1959) at ANL in the USA were still constructed. In the long term, this choice turned out to be decisive for the development of the CERN accelerator complex, as only a synchrotron with AG focusing could provide the high-brilliance beams required for colliders with a high luminosity. CERN started on an equal footing with the US and ahead of other European countries, even though the UK was the leader in accelerator technology in Europe at that time. This helped the UK to decide to join CERN.

2.2 The CERN Synchro-Cyclotron

In October 1952 the provisional CERN Council also decided to go ahead immediately with the construction of the 600 MeV SC, in order to secure an early start in meson physics and as a training ground for European experimental physicists and accelerator technology. The design was almost complete in 1953, fifteen months after the setting-up of the Group, which then moved to Geneva in 1954. Construction started in 1955 and the first beam was obtained in 1957, immediately at maximum energy, well before the start-up of the PS, providing the physicists with an opportunity to gain experience. Figure 2 shows this accelerator.

Fig. 2: The CERN 600 MeV Synchro-Cyclotron (SC), with the red iron yoke, the coils, and the cylindrical rotating condenser for frequency modulation in the foreground.

The physics programme started in 1958 and a number of excellent results were obtained, such as the discovery of the rare decay π− → e−ν and of the pion 'beta decay', π+ → π0 e+ ν, the g − 2 measurement of the µ, and many other contributions to µ physics. Until the start-up of the PS, the SC was the centre of scientific life at CERN. However, this attraction of the SC had an adverse impact on the early scientific programme at the PS. The SC was later taken over by nuclear physics in 1964, in particular for experiments with the isotope separator on line, ISOLDE.

A number of improvements were made during the long faithful service of the SC. The most notable was the work done in 1973/74, which included a new ion source, a rotary condenser for raising the dee voltage and the repetition rate, and a new extraction channel. This led to an increase of the internal beam intensity from 1 µA to 4 µA, and the extraction efficiency went up by a factor of 10, to 70%, as expected from the design. Further modifications were made later for the acceleration of ions with a charge-to-mass ratio of 1/3 to 1/4, up to carbon. The SC was stopped at the end of 1990 after the decision to move ISOLDE to the Booster Synchrotron of the PS (PSB), which had a higher proton-beam power than the SC.

2.3 The CERN Proton Synchrotron (PS)

The PS Design Group was initially spread out over Europe, but it quickly became obvious that such a project needed a centralized team. The first members of the PS group moved to Geneva in autumn 1953 and started to seriously review the design after the approval by Council in October 1953, with a reduction of the energy to 25 GeV in order to contain the cost. The advantage of the AG principle was that the size and, therefore, the cost of the vacuum chamber, the magnets and their foundations, and the tunnel cross-section could be reduced dramatically. However, it made the synchrotron much more sensitive to errors in the magnetic field and alignment. To find the right balance required a number of iterations, generating a sizeable fluctuation of the parameters during the design until the start of construction in 1954. This is illustrated in Table 1, which gives the total weight of the magnets at the different design stages. The last row gives the final weight, which is still rather reasonable. In comparison, the weight of the magnets of the 10 GeV Synchro-Phasotron at JINR/Dubna is 36 000 t.

Table 1: The evolution of the weight of the PS magnets during design

    Date              Magnet weight (t)
    January 1953            800
    April 1953           10 000
    March 1954            3 300
    As constructed        3 800

Figure 3 shows the PS combined-function magnets, which provide a dipole field for bending with an alternating gradient for beam focusing.

Fig. 3: The combined-function magnets of the PS
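As a rough consistency check on these magnet parameters, the dipole field needed to guide the beam can be estimated from the magnetic rigidity relation B·ρ [T m] ≈ p [GeV/c] / 0.2998. The bending radius used below is an assumption (dipoles filling roughly two-thirds of the 628 m circumference), not a figure from the report; this is only a back-of-the-envelope sketch in plain Python:

```python
import math

M_P = 0.938272  # proton rest mass [GeV/c^2]

def momentum(t_kin_gev):
    """Momentum [GeV/c] of a proton with kinetic energy t_kin_gev [GeV]."""
    e_total = t_kin_gev + M_P
    return math.sqrt(e_total ** 2 - M_P ** 2)

def dipole_field(p_gev, rho_m):
    """Field [T] from the rigidity relation B*rho [T m] = p [GeV/c] / 0.2998."""
    return p_gev / (0.2998 * rho_m)

p = momentum(28.0)                           # 28 GeV (kinetic), reached in 1959
rho = (2.0 / 3.0) * 628.0 / (2.0 * math.pi)  # assumed bending radius, ~67 m
B = dipole_field(p, rho)
print(f"p = {p:.1f} GeV/c, B = {B:.2f} T")
```

The result, a field of order 1.5 T, is about the saturation limit of conventional iron-dominated magnets, which is why raising the energy meant a larger ring rather than a stronger field.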

The ground breaking took place in May 1954. The construction of the ring with a circumference of 628 m was completed in 1959, and the proton beam energy reached 28 GeV (kinetic) in December 1959, one of the prestigious accomplishments of CERN. The PS group had learnt beam physics in a circular accelerator with AG focusing, how to produce magnets with tight tolerances, how to design stable supports for them, how to align them, how to properly control the beam and the accelerating radio-frequency system, and, last but not least, how to manage a large project: invaluable know-how for the future of CERN.

The experimental programme started in spring 1960 but was hampered by the lack of secondary-beam equipment, of adequate particle detectors, and of experienced physicists. This was partly a consequence of the SC programme having cornered substantial resources which were then not available for the work with the front-line accelerator. Hence CERN could not exploit the advantage of a six-month head start of the PS over the AGS at BNL.

2.4 Upgrading the PS

In order to be better armed for the fierce, but friendly, competition, the PS has been continuously improved and adapted to serve as injector for a number of accelerators. In a first vigorous step, as early as 1960, the efficiency of secondary-beam production was improved in comparison with that of an internal target by extracting the primary proton beam onto an external target:

– A fast extracted beam was produced for the first time in 1963, by a technique allowing the extraction of the totality or part of the beam bunches from the ring within one turn. This was accomplished by a novel kicker magnet with a very short rise-time, moved over the beam at the end of the acceleration cycle when the beam was small. The kicker magnet was replaced in 1969 by a full-aperture kicker in order to remove the constraining kicker aperture. This led to a spectacular increase in the neutrino flux, which, in addition, was enhanced by the magnetic horn, another device pioneered at CERN and installed in 1961. Figure 4 shows the extracted proton beam passing through a row of scintillators, with a PS ring magnet at the right.

– A slow extraction system was brought into operation in 1963, providing a long spill (about 1 s) to the experiments and increasing the duty cycle; this was especially appreciated by the experiments with electronic detectors and would have been impossible with the internal targets (the latter were contaminating the accelerator, heated too much, and were eradicated completely only in 1980).

Since these two new extraction methods worked very well, it became conceivable to increase the PS intensity per pulse. In order to raise the mean proton current, a new power supply and rf system were installed to increase the repetition rate by a factor 2 in the mid 1960s. The number of particles per pulse N was limited by space-charge effects at injection. Since Nmax ∝ βγ² for a given invariant emittance of the beam, where β, γ are the usual relativistic parameters, the PS Booster (PSB), a synchrotron consisting of four superposed rings, which was constructed in 1968–1972, was inserted between the 50 MeV Linear Accelerator (Linac1) and the PS in order to increase the injection energy of the latter to 800 MeV. A theoretical gain of a factor 8 in intensity was expected from this new accelerator. The PSB came on-line just in time for the experiments with the Gargamelle bubble chamber, which led to the discovery of neutral currents. The third step was the construction of a new 50 MeV linac (Linac2), equipped with a new proton source, in 1973–1978; 2 × 10^13 protons per pulse were reached. The steady evolution of the record number of protons accelerated in the PS and PSB is shown in Fig. 5. A particular effort was demanded from these two accelerators during the last neutrino experiments in the CERN West Hall (CHORUS and NOMAD) in 1999, where the PS provided on average 2 × 10^13 protons per pulse.
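The expected factor 8 from the PSB follows directly from the Nmax ∝ βγ² space-charge scaling when the injection energy is raised from the 50 MeV of Linac1 to the 800 MeV of the PSB. A quick numerical check (plain Python, not from the report):

```python
import math

M_P = 938.272  # proton rest mass [MeV/c^2]

def beta_gamma_sq(t_kin_mev):
    """beta * gamma^2 for a proton with kinetic energy t_kin_mev [MeV]."""
    gamma = 1.0 + t_kin_mev / M_P
    beta = math.sqrt(1.0 - 1.0 / gamma ** 2)
    return beta * gamma ** 2

# space-charge limit scales as beta*gamma^2 at the injection energy
gain = beta_gamma_sq(800.0) / beta_gamma_sq(50.0)
print(f"expected intensity gain: {gain:.1f}")  # close to the quoted factor 8
```

The ratio comes out at about 8.3, in line with the theoretical factor 8 quoted for the new accelerator.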

Fig. 4: The first extracted proton beam at the PS in 1963

Fig. 5: Evolution of the PS and PSB peak intensity

Since the PS was originally designed for 5–8 × 10^9 protons per pulse, a good factor of 2000 has been gained over the years through continuous upgrading. The PSB impressed not only with a spectacular growth in intensity over the years but also with the step-wise increase in proton energy by an amazing 75%, from 0.8 to 1.4 GeV (kinetic). It was possible to raise the field of the magnets, which, however, no longer operate in the long-term cost minimum.

Linac1 served the PS as its first proton injector and as the injector for a very successful ion programme, conducted in a collaboration with other laboratories. The first phase started in 1976 with deuterons for the ISR, which were also served with alpha particles in 1980. The second phase was the production of S16+ for the SPS from 1990 to 1992. In 1990, construction began of a new Pb linac, Linac3.

Linac3 became operational in 1994, and the PS started providing the SPS with 208Pb ions, which are fully stripped after extraction from the PS. Linac1 continued to provide p and H− to the storage ring LEAR (see Section 4.3) for tests until the end of 1996, and was dismantled in 1997.

The PS was also modified to permit the acceleration of positrons and electrons to 3.5 GeV for LEP (see Section 4.4), by adding a pre-injector and by introducing a damping wiggler and an additional rf system in the late 1980s. A new control system allowed fast switching between cycles within a supercycle. An example of such a supercycle is shown in Fig. 6, with the interleaved acceleration of Pb ions, protons, positrons and electrons and, in between, the deceleration of antiprotons for LEAR.

Fig. 6: Example of a PS supercycle [6]

The highlights of the work of the accelerator groups during the upgrading of the PS were the design and running-in of the novel fast and slow extractions, the mastering of the high-intensity proton beams and of the ion beams, the merging of the proton bunches for antiproton production by using more than one rf system, the deceleration of antiprotons for LEAR, and the refining of the computer control system, allowing the creation and manipulation of supercycles without time-consuming commissioning.

3 Expansion into France with the ISR and SPS

3.1 The Intersecting Storage Rings (ISR)

While the SC and the PS were being built, the study for the second generation of CERN accelerators started at the end of 1956 and, from 1961 onwards, gradually swung towards a proton–proton collider. In parallel, a study of a 300 GeV proton synchrotron was conducted. In 1965, at the end of a lengthy and intense debate, it was decided to first construct the ISR, though a number of physicists were not in favour, as the ISR would not provide secondary beams and were considered to be too much of a shot into the dark.

The ISR were constructed from 1966 to 1970 on land in France, adjacent to the original CERN site in Switzerland, and were housed in a big tunnel constructed by the cut-and-fill method. The circumference of the orbits was 943 m, exactly 1.5 times the circumference of the PS. The combined-function magnet lattice, providing AG focusing, formed two independent, interleaved rings, intersecting in eight points; five of these points were used for experiments. The maximum beam momentum was 31.4 GeV/c. The ISR operated for physics from 1971 to 1983. With dc proton currents up to 40 A (single beam up to 57 A!), they reached a luminosity of 1.4 × 10^32 cm−2 s−1 in the superconducting low-beta insertion, a factor 35 above the design luminosity. A view of the ISR at intersection point 5 is shown in Fig. 7.

The ISR were a pioneer of accelerator technology:

– ultra-high vacuum and ion clearing (residual pressure 10−13 Torr in the experimental areas);
– low-impedance vacuum envelope;
– high-stability power supplies (only 0.1 ppm ripple in the dipole current);
– superconducting low-beta insertion to squeeze the beam in the intersection point (increasing the luminosity by a factor of 6.5).

The excellent vacuum allowed physics runs with the beam coasting for 60 h, with beam lifetimes in excess of many months, which rendered excellent background conditions. Figure 8 shows as an example the evolution of the pressure in the vacuum chamber averaged over the circumference; the design figure was 10−9 Torr.

Fig. 7: View of the ISR at intersection point 5

Fig. 8: The evolution of the average vacuum pressure in the ISR [7]

Since it was the first hadron collider, the ISR provided a unique opportunity to study effects which were predicted by theory, such as beam–beam effects, space-charge detuning, and intra-beam scattering, and to discover unexpected phenomena such as pressure rise due to multipacting and beam–equipment interaction. The invention of non-destructive beam diagnostics for coasting beams with Schottky noise was of enormous impact. Two prominent examples are given. Figure 9 shows the beam noise picked up by an electrode.

Fig. 9: Longitudinal Schottky scans of coasting proton beams of 10, 15, and 19 A [8]

The signal is proportional to the square of the proton density as a function of beam momentum. The scan with 19 A beam current (bottom trace) shows a dent in the distribution due to beam loss at a non-linear resonance. This Schottky noise signal became an indispensable tool for monitoring the average momentum, the momentum spread, and the density evolution of the coasting beam without disturbing it. Online correction of the space-charge tune-shift became possible because the betatron tunes of the stack edges or of a resonance could be determined with high precision from the fast and slow transverse-wave Schottky signals.

This discovery of the transverse Schottky signals led to another unique accomplishment: the resurrection of the idea, and the first experimental test, of stochastic cooling, invented by S. van der Meer in 1968. Figure 10 shows the result of the first test. The beam height blows up on account of multiple scattering on the rest gas when the cooling is off.

Fig. 10: Measurement of relative beam height as a function of time with stochastic cooling on and off [9]

Later, physics runs with colliding proton–antiproton beams took place with a more refined cooling system, so that an antiproton beam (see Section 4.2) could circulate for up to 345 h without any significant deterioration.
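The principle behind van der Meer's scheme can be illustrated with a deliberately crude toy model (nothing here models the actual ISR hardware): a pickup measures the mean offset of a small sample of particles, a kicker subtracts a fraction of that mean, and good mixing presents fresh samples every turn, so the incoherent spread shrinks even though no particle is ever measured individually.

```python
import random
import statistics

def cool(positions, gain, sample_size, turns, rng):
    """Toy stochastic cooling: every turn, correct random samples of
    particles by 'gain' times the sample mean seen by the pickup."""
    x = list(positions)
    idx = list(range(len(x)))
    for _ in range(turns):
        rng.shuffle(idx)  # good mixing: a fresh sampling every turn
        for start in range(0, len(idx), sample_size):
            sample = idx[start:start + sample_size]
            mean = sum(x[i] for i in sample) / len(sample)
            for i in sample:
                x[i] -= gain * mean  # kicker correction
    return x

rng = random.Random(1)
beam = [rng.gauss(0.0, 1.0) for _ in range(2000)]
cooled = cool(beam, gain=0.5, sample_size=20, turns=200, rng=rng)
print(statistics.pstdev(beam), "->", statistics.pstdev(cooled))
```

In this idealized model the spread falls roughly exponentially, at a rate set by gain/sample_size: the toy analogue of the bandwidth-to-intensity ratio that governs real cooling times.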

The ISR were decommissioned in 1984 in order to free resources for LEP. Very interesting physics results were obtained with this machine, but the main discoveries in this period, the J/ψ and the Υ, were made elsewhere, though they could have been made at the ISR, very likely with some of the experiments that had been proposed but unfortunately were degraded or turned down.

3.2 The Super Proton Synchrotron (SPS)

Work on the concept of a 300 GeV proton synchrotron started in the early 1960s at CERN. After the decision in favour of the ISR, the discussions resumed but were soon dead-locked over the choice of a site elsewhere in Europe, for which 22 offers had been received. The issue was settled in favour of a site in France near CERN, with the argument of using the PS as injector and of other savings being made from synergies if the SPS were built close to CERN. The project was approved in 1970. After some fluctuations between an initial energy of 150 GeV with missing magnets and 600 GeV with superconducting magnets, the decision was made in 1973 to aim for 400 GeV with conventional magnets.

A magnet lattice with a circumference of 6912 m, eleven times the PS circumference, provides AG focusing. In contrast to the ISR, the SPS is of the separated-function type, i.e., the bending is provided by pure dipole magnets and the focusing by separate quadrupole magnets (Fig. 11). This allows the attainment of a magnetic field in the dipoles about 50% higher than with combined-function magnets.

Fig. 11: View of the SPS tunnel with dipole magnets (red) and quadrupoles (blue)

The first beam circulated at top energy in mid 1976, and the design intensity of 10^13 protons per pulse was reached in the same year. Later the top energy was increased to 450 GeV and the intensity reached 4.5 × 10^13 protons per pulse. Two large experimental halls (West, North) provide the required floor space for experimentation with secondary beams of highest energy, including the neutrino beams. The West Area, avoiding the mistake made with the PS, was operational from January 1977, a few months after the start-up of the SPS.

In the period 1982 to 1991, the SPS played a key role in CERN's antiproton programme, discussed in Section 4.2. The SPS has accelerated ions from 1990 onwards (S from 1990 to 1992 and Pb from 1994), underlining the flexibility of this synchrotron, which served also as LEP injector, accelerating electrons and positrons from 3.5 GeV to 22 GeV. This required the installation of an additional powerful rf system, careful shielding against synchrotron radiation, and new injection and extraction systems. At present, the SPS is being modified and improved to become the LHC injector.

The neutrino programme in the West Area was terminated in 1998, but a new neutrino beam (CNGS) is under construction in order to send neutrinos to the Gran Sasso Laboratory of INFN, 730 km away, continuing the search for neutrino oscillations in Europe which commenced with the West Hall beams. CNGS will start operation in 2006.

New expertise was acquired and new techniques were learnt during SPS construction and commissioning: deep tunnelling with a precision of 2 cm over 1.2 km; direct powering from the grid with reactive power compensation (peak active power 150 MW); very refined, high-efficiency beam extraction techniques (fast, slow, fast–slow) to serve users requiring very different spill times and intensities, often within one and the same acceleration cycle; and computer control from the start (both had been done already on the PSB, albeit on a smaller scale).

4 The quest for world leadership

4.1 Search for the next steps

Knowing the long lead-times of a project, the search for the next steps started after completion of the ISR in 1974, while the SPS was still under construction. A variety of options was examined up to 1978:

– CHEEP: 27 GeV electrons in an additional ring, colliding with 270 GeV protons in the SPS;
– MISR: 60 GeV protons in a storage ring built from ISR magnets, to collide with SPS protons;
– SCISR: two new superconducting rings for pp collisions with 120 GeV per beam in the ISR;
– LSR/SISR: pp collider with 400 GeV per beam;
– proton–antiproton collisions in the SPS, requiring the construction of an antiproton source;
– Large Electron–Positron (LEP) ring in a new tunnel.

In June 1978, the decision was made to perform the proton–antiproton experiment in the SPS as the medium-term project, and to concentrate on LEP, with the hope that it would be powerful enough to discover the Higgs, as the future long-term flagship. Hence CERN had to take a quick and economic approach to be first to discover the predicted intermediary W and Z bosons.

It is also useful to recall the international context of these decisions: the study of ISABELLE, a pp collider with 200 GeV per beam at BNL, was in full swing, while the study of the proton–antiproton option at FNAL had been stopped in 1978, to a large extent because the vacuum system of the FNAL ring could not be upgraded with reasonable effort to reach the required beam lifetime. Incidentally, ISABELLE was stopped at the construction stage in 1983 in favour of the superconducting pp collider (SSC), with 40 TeV in the centre of mass, which in turn was abandoned in 1993.

4.2 The antiproton programme in the SPS

In order to achieve a reasonable luminosity in the proton–antiproton collisions in the SPS, the phase-space density of the secondary antiproton beam, produced by protons hitting a target, had to be increased by many orders of magnitude by stochastic cooling. However, only transverse stochastic cooling had been demonstrated at the ISR. Hence it was decided in February 1977 to experimentally prove momentum cooling and simultaneous stochastic cooling in all three dimensions in a new test ring, the Initial Cooling Experiment (ICE). The latter was constructed and commissioned in record time and successfully completed its mission in May 1978. Figure 12 shows an example of momentum cooling in ICE. ICE also established a new lower limit for the antiproton lifetime (32 h at rest), increasing it by nine orders of magnitude; this was fortunately about three times the minimum required beam lifetime in the upgraded SPS.

Thereafter, the Antiproton Accumulator (AA), with a circumference of 157 m, was built very quickly from 1979 to 1980, downstream of the target hit by 26 GeV/c protons from the PS.
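For scale, an improvement of nine orders of magnitude over ICE's 32 h limit means the previous antiproton lifetime limit was of order a hundred microseconds (a back-of-the-envelope check, not a figure from the report):

```python
ice_limit_s = 32 * 3600          # ICE lower limit: 32 h at rest, in seconds
previous_s = ice_limit_s / 1e9   # nine orders of magnitude below
print(f"previous limit ~ {previous_s * 1e6:.0f} microseconds")
```

That is far shorter than the hours-long storage times needed in the SPS, which is why establishing the 32 h limit mattered for the collider programme.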

Fig. 12: Distribution of protons as a function of momentum before (wide rectangle) and after cooling (narrow peak) in ICE, operating at 3.5 GeV/c

Although 10^12 antiprotons per day were accumulated, enough for about two fills of the SPS, this left little margin, and the whole operation was a continuous cliff-hanger because there was always the risk of losing the stack in the AA, or of losing the beam during acceleration in either the PS or the SPS, with the resulting loss of a whole day. Hence, in order to shorten the cooling times and to obtain a higher antiproton density, the Antiproton Collector (AC) ring, with a circumference of 182 m, was added around the AA in 1987. Its purpose was to capture more antiprotons, thanks to its much larger acceptance. It was operated also at 3.5 GeV/c and equipped with powerful stochastic cooling devices. After reduction of the momentum spread by bunch rotation of the injected beam, the stack was subjected to fast three-dimensional pre-cooling and then transferred to the AA. The combined action of the AC and AA eventually raised the daily antiproton production rate by a factor 6, and the six-dimensional antiproton phase-space density by up to a factor 4 × 10^9.

Figure 13 shows the layout of the accelerators. New transfer lines to and from the AA and from the PS to the SPS (TT70) were built. Later, transfer line TT6 was added to channel antiprotons to the ISR. In the SPS, where the beam lifetime was 10 h, the vacuum system had to be upgraded from the 200 nTorr (design) to better than 2 nTorr, and the rf system was upgraded. In addition, low-beta insertions were installed in straight sections 4 and 5 for the UA2 and UA1 experiments, and electrostatic deflectors were produced to separate the beams, each consisting of 6 bunches, in 9 of the 12 crossing points.

Fig. 13: Accelerators and beam lines for the antiproton–proton experiment in the SPS

The energy of the colliding beams was 273 GeV between 1982 and 1985. It was raised to 315 GeV from 1987 to 1991, when the programme was terminated. The bunches collided in three crossing points: in the two experiments and in the point mid-way in between. In order to avoid beam collisions at the other nine out of the twelve possible crossing points when operating with six bunches per beam, the orbits of the protons and antiprotons were separated in these nine points by distorting the orbits to a 'Pretzel' shape by means of electrostatic separators, a technique which turned out to be extremely useful for LEP and the LHC.

Figure 14 shows the performance of the SPS as collider in terms of peak luminosity and integrated luminosity per year.

Fig. 14: Performance of the SPS antiproton–proton collider [10]

The luminosity was rather low before the advent of the AC, but still sufficient to firmly establish the existence of the W and Z in the runs of 1982 and 1983, which won Carlo Rubbia and Simon van der Meer the Nobel Prize in 1984. The large 4π detectors, which had been denied to the ISR, had been decisive. The rewards for CERN were extremely important not only in particle physics but also in accelerator physics and technology. For the first time the beam–beam effect with bunched hadron beams could be studied, including long-range beam–beam forces and the effect of very unequal beam sizes in the interaction points: invaluable experience for the LHC. New civil engineering know-how was acquired with the digging of the large caverns in the molasse bedrock for the two experiments UA1 and UA2, providing important experience for LEP.

4.3 The low-energy antiproton programme

The existence of a powerful antiproton source motivated the construction of the Low-Energy Antiproton Ring (LEAR). It was built between 1980 and 1982 and operated until the end of 1996, producing antiprotons with a kinetic energy of 5 to 1270 MeV in long spills for the physics community interested in low-energy physics with antiprotons. In order to counteract the blow-up of the beam during deceleration below the kinetic injection energy of 180 MeV, stochastic cooling and electron cooling had to be used. A highlight was the ultraslow ejection with a noise signal feeding the particles into a non-linear resonance, so that smooth spills of up to 10 h became possible. The ring also had an internal target which served to produce the first anti-hydrogen atoms ever seen. Unfortunately, only a handful of these atoms were produced, and not at rest as required for precise tests of the charge-parity-time (CPT) conservation theorem. Hence, in order to pursue this line of research seriously and, it was hoped, at last with sufficient quantities of anti-hydrogen at rest, the AC ring was modified between 1998 and 2000 to become the Antiproton Decelerator (AD), a buffer and decelerator ring providing a very stable antiproton beam at 5 MeV after deceleration from 2.7 GeV. Again, the beam blow-up inherent to deceleration is compensated by stochastic and electron cooling. A highlight is a special linear Radio-Frequency Quadrupole Decelerator (RFQD).

The RFQD lowers the kinetic energy of the extracted beam further, to a value which can be chosen at will between 120 and 10 keV; this increases the intensity by a factor 50 compared to beams having their energy lowered by degraders.

4.4 The Large Electron–Positron Ring (LEP)

The design of the future CERN flagship started in 1975. It went through a number of iterations, each the result of a compromise between the desiderata of the physics community and feasibility, as shown in Table 2. The last column gives the final design parameters.

Table 2: LEP design iterations

                          1977   1978   1979   1984
  Beam energy (GeV)        100     70     86     55
  Circumference (km)        52     22     31     27
  Number of experiments      8      8      8      4
  RF power (MW)            109     74     96     16

The discussions about a possible site were cut short by proposing the PS/SPS as injector chain, complemented with a new Lepton Pre-Injector (LPI) feeding the PS with either electrons or positrons at 600 MeV. The LPI was constructed with the help of and in close collaboration with LAL/Orsay. After approval at the end of 1981, construction started immediately, and the first beams collided in LEP in 1989 at 46 GeV per beam, i.e. at the Z0 resonance (LEP1).

A number of technical challenges had to be met by innovative designs. Some examples follow. The cheap and rigid steel–concrete dipole magnets consisted of spaced iron laminations with cement mortar in between (27% steel filling), kept together by four pre-stressing rods running over the whole length of 6 m. The concentration of the magnetic field in the laminations avoided reproducibility problems at injection energy, where the average dipole field was only 240 G, while the field in the laminations stayed below 5 kG at maximum energy, avoiding any saturation effect. Since the bending field at injection was below the threshold of distributed sputter-ion pumps, a non-evaporable getter (NEG) pumping system was used for the first time in the LEP dipoles and dealt effectively with the large gas load induced by synchrotron radiation at high energy. A cross-section of the vacuum chamber in the dipoles is shown in Fig. 15. In order to avoid disturbance of the solenoid field in the detectors, superconducting, iron-free, warm-bore low-beta quadrupoles had to be developed.

In order to minimize power consumption, the 352 MHz Cu rf cavities operating at ambient temperature were coupled to single-cell spherical storage cavities operating with a mode having vanishing fields on the walls and, therefore, very low losses. The coupling was chosen so that the electromagnetic energy was stored in the storage cavities in the intervals between the bunches.

Since the required rf voltage scales with γ^4, it had to be increased massively beyond the 0.4 GV required at the Z0 resonance in order to reach the W threshold and beyond. Thus superconducting cavities were required to keep the power bill within reasonable bounds. The management had the vision and courage to start R&D for these novel and complex components as early as 1979, long before the approval of LEP1. In the framework of the LEP2 programme, the beam energy was increased in steps from 1995 onwards, depending on the installation schedule of the superconducting cavities. In 1997 the threshold for W-pair production was reached, and the final energy of 104.5 GeV per beam was attained by LEP2 in 2000, when all available spares were mobilized in a dramatic final spurt to find the elusive Higgs particle. The accelerator was dismantled in 2001 to make space for the LHC.

At first, 20 Nb bulk cavities operating at 352 MHz and at 4.5 K could be ordered from industry; for the next batch of 160 cavities, CERN switched to the Nb-film technology.
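The γ^4 dependence quoted above can be made concrete with the standard formula for the synchrotron energy loss per turn of an electron, U0 [GeV] ≈ 8.85 × 10^-5 E^4/ρ (E in GeV, ρ the bending radius in metres). The following is a rough sketch, in which the LEP bending radius of about 3100 m is an assumed value, not a figure from the text:

```python
def energy_loss_per_turn(E_gev, rho_m):
    """Synchrotron energy loss per turn for electrons: U0 [GeV] = 8.85e-5 * E^4 / rho."""
    return 8.85e-5 * E_gev ** 4 / rho_m

RHO_LEP = 3100.0  # assumed LEP bending radius [m]

u_z = energy_loss_per_turn(45.6, RHO_LEP)     # at the Z0 resonance (LEP1)
u_max = energy_loss_per_turn(104.5, RHO_LEP)  # at the 2000 LEP2 maximum
print(u_z, u_max, u_max / u_z)
```

With these assumptions the loss per turn grows from roughly 0.12 GeV at the Z0 to about 3.4 GeV at 104.5 GeV, a factor (104.5/45.6)^4 ≈ 28. Since the rf system must at least replace the loss per turn, the few hundred MV sufficient at the Z0 become several GV at LEP2 energies, which is why superconducting cavities were indispensable.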

The Nb film is obtained by sputtering Nb on Cu sheets. This technology has inherent advantages, such as better thermal stability against quenching, insensitivity to stray magnetic fields, savings on Nb material, and a higher quality factor. The cavity production was an example of successful transfer to industry of technology developed at CERN. The final rf configuration comprised 272 Nb-film cavities operating on average at 7.5 MV/m (nominal 6 MV/m) and 16 Nb bulk cavities (4.5 MV/m), which together with the Cu system (56 Cu cavities at 1 MV/m, fed by 43 klystrons of the 1 MW class) eventually provided an rf circumferential voltage of 3.6 GV. It was the largest superconducting rf system ever, with 490 m total active length of superconducting cavities.

Fig. 15: Vacuum chamber in the dipole magnet made of extruded Al profile (1) with the elliptic beam channel, three water cooling ducts (2), and surrounded by a lead shield (3). The NEG pump (4) is connected by longitudinal slots (5) [11]

Apart from the upgrading of the rf system, LEP2 required a number of modifications and improvements: two new klystron galleries had to be dug and equipped, and a cryogenic system with long transfer lines and four refrigerators, each providing initially 6 kW and later 12 kW, was required for cooling all the superconducting cavities and the increased number of superconducting quadrupoles. Figure 16 shows the evolution of the available rf voltage and the beam energy of LEP2 over the years.

Fig. 16: Evolution of beam energy, available and nominal rf voltage of LEP2 [12]

LEP1 reached an integrated luminosity of 206 pb^-1 at Z0 energy, and 784 pb^-1 at and above the W threshold were collected with LEP2. LEP underwent a practically continuous improvement, which pushed the integrated luminosity per year from 9 pb^-1 in 1990 to the record of 254 pb^-1 in 1999.

New challenges had to be mastered during LEP construction and operation. The civil engineering know-how acquired with the SPS had to be applied on a large scale for the construction of the 27 km circumference tunnel and the big experimental halls.

One of the largest rf and cryogenic systems for accelerators had to be designed, installed, and operated. Refined methods for beam energy calibration had to be developed, reducing the contribution to the uncertainty in the W mass to only 10 MeV. The operations group learnt to deal with perturbations from earth currents caused by near-by dc-operated trains, which produced a relative variation of the magnetic dipole field of 2 × 10^-4; with changes in the circumference of LEP caused by tides, and by changes in the underground water table and rain, leading to variations of 1 mm in circumference and resulting in deviations of 10 MeV in beam energy; and with beer bottles found in the vacuum chamber after a shutdown. Operation with strong synchrotron radiation damping of the beams and very strong beam–beam effects provided new insights for accelerator physics.

Although a large number of unique physics results were obtained from LEP, solidly confirming the Standard Model, the LEP machine energy was not enough to produce the elusive Higgs particle or the postulated supersymmetric particles. Whether it was a wrong decision in 1996 to discontinue the industrial production of the superconducting cavities and, therefore, not to exploit LEP fully with 384 Nb-film superconducting cavities providing 4.8 GV to reach 220 GeV in the centre of mass, will only be known from the results of the Tevatron and LHC in a few years' time.

4.5 The Large Hadron Collider (LHC)

The option to complement or to replace LEP later with a large hadron collider had been in the minds of European particle physicists since the mid-1970s. The design of the LHC started in 1983, and the project was approved in two steps: a two-stage approach was first proposed in 1994, with a missing-magnet scheme providing 5 TeV per proton beam which could be upgraded to 7 TeV by adding the missing complement of magnets; then, in 1996, when substantial contributions from non-member states had been secured, a single-stage construction with 7 TeV top energy was accepted. The start of operation is scheduled for 2007. The nominal luminosity is 1 × 10^34 cm^-2 s^-1 with protons of 7 TeV per beam, and 1 × 10^27 cm^-2 s^-1 when operating with Pb ions of 2.8 TeV/u.

The project team has to master a number of challenges in technology and accelerator physics. The most important components are the 1232 dipole magnets, with a mass of 28 t and a length of 14.5 m (iron yoke), bending the orbits of the counter-rotating particles, which are separated by only 194 mm. The magnetic field is 8.3 T. A cross-section of the very compact 'two-in-one' magnet is shown in Fig. 17. The superconducting coils are wound with cables made from 6–7 µm Nb-Ti alloy filaments embedded in a Cu matrix, which has to absorb the heat dissipation in case the magnet quenches. The cable carries 12 kA. The nonmagnetic steel collars keep the coil under compression over the whole operating range, in order to avoid any movement of the cables or the coils despite the very strong electromagnetic forces (2 MN/m acting transversely per coil quadrant). An elaborate quench protection is required, as the stored electromagnetic energy is 7 MJ per magnet and the heat capacity of the cable is very much reduced at this low temperature. The magnets are cooled by superfluid liquid He at 1.9 K, generated by the modified and upgraded LEP cryogenic plants.

The counter-rotating beams are horizontally separated in the arcs and go from the inner arc to the outer arc, and vice versa, in the four crossing points, where the beams interact in the detectors at a small crossing angle; around these points the two beams 'see' each other in a common vacuum pipe before the actual interaction points.

The vacuum required for a beam lifetime of about 100 h is provided by the pumping action of the vacuum tube at 1.9 K. However, the built-up gas layer on this cold tube has to be protected against the direct synchrotron radiation (0.2 W/m) emitted by the protons, making the LHC the first proton storage ring where synchrotron radiation has an impact on the hardware design. In addition, the proton bunches produce primary electrons by ionization of the rest gas; these are transversely accelerated by the electric field of the bunches and hit the walls, where the number of electrons gets amplified as a result of secondary emission. This beam-induced multi-pacting, an effect discovered at the ISR, will produce a substantial heat load.

Fig. 17: Cross-section of the LHC dipole magnet [13]

In order to avoid desorption of the adsorbed gas by direct synchrotron radiation emitted by the proton beam or by photoelectrons, and to absorb the heat load, an elaborate beam screen (Fig. 18) cooled to 5–20 K protects the cold vacuum chamber and its gas layer. A slight saw tooth on the surface of the screen reduces photon reflectivity, and coating the warm vacuum chambers with a special getter lowers the secondary emission yield. Scrubbing the walls with synchrotron radiation and electrons for a while during running-in will further reduce the secondary emission yield; this will be performed at lower beam energy and/or with increased bunch spacing to limit the heat load.

Fig. 18: Model of the LHC beam screen [13]

Important beam dynamics issues have to be tackled: i) the long-range beam–beam forces in the 120 parasitic crossings near the interaction points, which arise because the beams contain 2808 bunches each and, therefore, encounter each other repeatedly in the common vacuum pipe; and ii) multibunch instabilities, especially those induced by the electron clouds, which cannot escape to the walls because the spacing of the bunches is only 25 ns.

Figure 19 shows the CERN accelerator complex, with the LHC in the LEP tunnel, the SPS, and the PS complex with all the transfer lines, as well as the experimental areas of the PSB (ISOLDE), of the PS (East Area, neutron Time-of-Flight facility (n-ToF)), and of the SPS (West and North Area).

Fig. 19: The CERN accelerator complex

The new LEIR ring is also shown as an important element in the ion injection scheme of the LHC. It is nothing but LEAR modified to become a buffer accumulation ring for ions, inserted between the fast-cycling Pb linac, Linac 3, and the slow-cycling PS. It will convert four to five long linac pulses of low density into two bunches of high density at 4.2 MeV/u by means of electron cooling, more efficient than stochastic cooling at this low energy, before accelerating these bunches to 72 MeV/u and transferring them to the PS. LEIR is a good example of the CERN tradition of fully exploiting past investment, as done, for example, with the PS and SPS, both of which, in parallel to the fixed-target programme, served as LEP injectors and will be so used for the LHC. The CERN Neutrino beam to Gran Sasso (CNGS) is under construction.

5 Future accelerator-based options for CERN

5.1 Overview

Since there is increasing awareness of the long lead-times of large projects, it is timely that the discussion of this issue has already started in the physics community and in the CERN Council, probably continuing for a number of years even when the first results of the LHC become available. This situation will be very similar to the one we experienced earlier during the passionate, lengthy debates and scrutiny of options in the early 1960s after the construction of the PS, and in the middle of the 1970s after the completion of the ISR, although this time the discussion will be more global and will include our colleagues from other regions of the world.

There are three basic directions to take: hadron colliders, lepton colliders, and advanced neutrino beams. Each can lead either to exploratory experiments or to precision physics. The most straightforward option for CERN seems to be the upgrade of the luminosity of the LHC by an order of magnitude, to 10^35 cm^-2 s^-1.

The upgrade of the LHC energy by a factor of two, yielding a centre-of-mass energy of 28 TeV, has also been studied, although the step in energy is small and the length of the required shutdown to install new magnets, and probably also to upgrade the injectors, is somewhat deterrent. A Very Large Hadron Collider (VLHC) was also considered but turns out to be excluded by the geology around CERN: according to the study by our colleagues in the USA, led by FNAL, it would require a circumference of more than 200 km to reach 40 TeV and, eventually, 200 TeV in the centre of mass. The VLHC is therefore apparently less attractive.

Among lepton colliders, there seems to be a vast consensus in the international physics community that a linear electron–positron collider in the centre-of-mass range between 0.5 TeV and 1 TeV is the best choice for the next large new facility, provided its luminosity is high enough to perform precision measurements. One could imagine CERN joining an international consortium to construct this International Linear Collider (ILC), whose R&D has been conducted in particular by DESY, KEK, and SLAC. Although the energy reach of the latter is rather limited, it is considered to be complementary to the LHC and could provide hints beyond LHC energies. In parallel with these studies for such a TeV-class collider based on the extension of known technology, a collaboration led by CERN has for a number of years been studying a technology which would allow the exploration of the high-energy frontier beyond the LHC with a Compact LInear Collider (CLIC), also operating with positrons and electrons. This frontier could also be reached with µ–µ colliders in the TeV class, but the required technologies for the production of intense and dense µ beams appear so challenging that some disenchantment has replaced the initial enthusiasm.

Advanced neutrino beams also appear to be an interesting option, in particular since neutrino physics is rather decoupled from the LHC results, and a number of synergies with nuclear physics and with the condensed-matter physics community promoting a European spallation source can be imagined. In a first instance, a more powerful but still conventional neutrino beam ('Superbeam'), based on a powerful superconducting linac (4 MW beam power) using LEP components and feeding a new accumulation and compressor ring in the ISR, has been considered. The next stage, briefly sketched below, could be a 'Neutrino Factory' based on a µ-decay ring providing pure neutrino beams, using the same linac as proton driver. A competing idea is to add a decay ring to the SPS, in which an intense beam consisting of β emitters would coast, producing a pure electron–neutrino beam. In both of these decay rings, charge-conjugate beams can also be produced: in the first case by using negative muons instead of positive ones, in the second case by choosing a positron emitter instead of an electron emitter.

5.2 LHC upgrade

A staged approach has been suggested [14]. First, the luminosity would be raised in steps. Examination of the formula for the luminosity shows the parameters which can be influenced:

    L = nb frev Nb^2 F / (4π σ*^2)    (1)

where nb is the number of bunches per beam, frev the revolution frequency, Nb the number of protons per bunch, σ* the r.m.s. beam radius at the interaction points (round beams), and F = 1/(1 + (θσz/(2σ*))^2)^(1/2) a form factor of order 1 depending on σ*, the beam crossing angle θ, and the bunch length σz. For the LHC, its value is 0.9 for θ = 0.3 mrad.

In Phase 0, the maximum bunch intensity, limited by the beam–beam effects, is raised by colliding the beams in interaction points 1 (ATLAS) and 5 (CMS) only, and the crossing angle is decreased by modifying the layout of magnetic elements at these crossing points. This results in an increase of a factor 2 to 3 in luminosity. Phase 1 requires new insertion quadrupoles to increase the beam focusing (β* = 0.5 → 0.25 m), resulting in a σ* decreased by √2. Increasing further nb and Nb results in a luminosity between 5 and 7 × 10^34 cm^-2 s^-1. Very likely these quadrupoles would have superconducting coils made from Nb3Sn, an alloy with a higher critical field but very brittle (VLHC technology).
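Equation (1) can be checked numerically. The sketch below evaluates it in Python; only θ = 0.3 mrad is quoted in the text, so the other values (2808 bunches, 11.245 kHz revolution frequency, 1.15 × 10^11 protons per bunch, σ* ≈ 16.7 µm, σz ≈ 7.55 cm) are assumed standard nominal LHC parameters:

```python
import math

def form_factor(theta, sigma_z, sigma_star):
    """Geometric reduction factor F = 1/sqrt(1 + (theta*sigma_z/(2*sigma_star))^2)."""
    return 1.0 / math.sqrt(1.0 + (theta * sigma_z / (2.0 * sigma_star)) ** 2)

def luminosity(n_b, f_rev, N_b, sigma_star, F):
    """Eq. (1): L = n_b * f_rev * N_b^2 * F / (4*pi*sigma_star^2), SI units [m^-2 s^-1]."""
    return n_b * f_rev * N_b ** 2 * F / (4.0 * math.pi * sigma_star ** 2)

# Assumed nominal LHC parameters (not all given in the text)
n_b = 2808            # bunches per beam
f_rev = 11245.0       # revolution frequency [Hz]
N_b = 1.15e11         # protons per bunch
sigma_star = 16.7e-6  # r.m.s. beam radius at the interaction point [m]
sigma_z = 0.0755      # r.m.s. bunch length [m]
theta = 0.3e-3        # crossing angle [rad], as quoted in the text

F = form_factor(theta, sigma_z, sigma_star)
L = luminosity(n_b, f_rev, N_b, sigma_star, F)  # [m^-2 s^-1]
print(F, L / 1e4)  # L converted to cm^-2 s^-1
```

With these assumptions the form factor comes out near 0.83 rather than the quoted 0.9 (F is quite sensitive to the assumed bunch length), while L lands close to the nominal 1 × 10^34 cm^-2 s^-1.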

Phase 2 is an energy upgrade. In order to reach 14 TeV per beam, compared to the present nominal 7 TeV, new dipoles with a magnetic field of 15 T, compared to the present 8 T, are required amongst many other components; in particular, the injection energy has to be raised to 1 TeV by equipping the SPS with pulsed superconducting magnets, a major upheaval which cannot be envisaged before the next decade. Both these modifications demand a vigorous R&D programme in superconducting materials and magnet design. It is not likely that this energy upgrade could be accomplished before 2020, which diminishes the attractiveness of the proposition.

5.3 Electron–positron linear collider

Schemes for a linear collider with a centre-of-mass energy between 0.5 and 1 TeV and a luminosity around 3 to 6 × 10^34 cm^-2 s^-1 have been the subject of an intense R&D programme led by DESY, KEK, and SLAC [15]. Recently, the International Technology Review Panel (ITRP) has recommended concentrating on the linac technology based on superconducting bulk-Nb accelerating structures operating at 1.3 GHz with an accelerating gradient of 25–35 MV/m, as pioneered by the TESLA collaboration based at DESY, though the optimum frequency may be somewhat lower. In the TESLA proposal, the facility had a length of 33 km, providing 0.8 TeV in the centre of mass. Figure 20 shows a schematic layout of TESLA with its X-ray free-electron laser (FEL) option, which will not be adopted for the ILC.

Fig. 20: Schematic layout of TESLA [16]

After the decision on technology, the next step towards the ILC is the setting-up of a central team and regional teams. They will review the TESLA design and its challenges, e.g. the non-conventional positron production through laser light generated by the electron beam at top energy, which tightly couples the positron operation to the electron one; the very long dog-bone damping rings (2 × 17 km); the focusing of the beams to σ*x/y = 400 nm/3 nm; and the safe tackling of 18 MW beam power per beam. Test facilities are under construction to test the basic components and to demonstrate the engineering margins required for such a big project.

In order to reach a very high energy in a reasonable length, CLIC [17] would use a very high accelerating gradient (150 MV/m) generated in normal-conducting accelerating structures operating at 30 GHz. The choice of gradient and frequency makes the linac very short, such that 3 TeV in the centre of mass can be reached within a comparatively small overall length.

That overall length is about 33 km for 3 TeV in the centre of mass (10 km for 0.5 TeV). The linac is also very compact in transverse dimension, as the outer diameter of the accelerating structures is of the order of 10 cm. The novel power generation scheme of CLIC does not use a vast array of klystrons but a number of drive beams which run along a considerable length (≈640 m) parallel to the main beam. The intense electron drive-beam pulses of 150 A in 130 ns are decelerated from 2 to 0.2 GeV, generating in the decelerating structures a 30 GHz rf power pulse which powers the structures accelerating the main beam pulse of low intensity (1 A in 102 ns) up to 1.5 TeV. Figure 21 shows a cross-section of the CLIC tunnel, of the same diameter as the LEP/LHC tunnel.

Fig. 21: Cross-section of the CLIC tunnel. The position of the drive linac and main linac are indicated by arrows. The linacs are linked by a horizontal waveguide [17]

The most important challenges for CLIC are the high accelerating gradient (150 MV/m) and the generation and control of the high-power drive beam. A disadvantage of CLIC is that its basic rf unit is relatively large, requiring a substantial investment for a full-scale test. Although CLIC holds the promise of a high potential, the linac technology of CLIC is certainly less mature than that of the ILC, whose feasibility is fairly well established thanks to the TESLA Test Facility at DESY.

5.4 Advanced neutrino beams

There are basically two production mechanisms of neutrino beams: i) based on the decay of π and K mesons; ii) based on the decay of beta emitters.

5.4.1 Neutrinos from mesons or muons

The first process yields neutrinos either directly through the π and K decay, or through the subsequent µ decay:

    p + target → π+ (K+) → µ+ + νµ    (2a)
    µ+ → e+ + νe + ν̄µ                (2b)

Process (2a) has been the traditional way of producing neutrino beams, still used at present (KEK/K2K) and in the medium-term future (FNAL/NuMI, CERN/CNGS). The neutrino 'superbeams' under study are based on it, using very powerful proton accelerators providing a beam power in the MW range at the target. Note that this neutrino beam is always contaminated with neutrinos from the π and K of the opposite polarity, which cannot be eliminated completely by the focusing of the secondary particles. Further sources of contamination are the neutrinos from µ decay and the neutrinos from the decay of Kµ3, Ke3 and neutral kaons. Selecting by focusing negative π and K yields the charge-conjugate particles.

Process (2b) uses muons decaying in a suitable storage ring, resulting in a very pure neutrino beam of only two species. Since one hopes to obtain very intense beams from such a facility, it was baptized 'Neutrino Factory'. Figure 22 shows the schematic layout of such a facility.

Fig. 22: Schematic layout of the European Neutrino Factory Complex [18]

The driver is a superconducting linac based on LEP components, feeding negative H ions to an accumulation ring where they are stripped during the multiturn injection process. After compression in the twin ring, the 4 MW beam hits the target. The resulting π and K decay to muons, which subsequently have their phase-space density increased in a cooling device. The hitherto untried technology of ionization cooling of the µ beam has to be tested thoroughly. A Muon Ionization Cooling Experiment (MICE) has been worked out in detail, and it is hoped that it will be conducted fairly soon using a µ beam either at PSI or at ISIS/RAL.

A further challenge is the rapid acceleration of the muons, which have a very limited lifetime (cτµ = 658 m at rest), requiring high accelerating gradients at low energy to increase their lifetime, the latter being proportional to γ, as quickly as possible. Acceleration to 50 GeV has to be very rapid on account of this limited lifetime. It takes place in a linac followed by two recirculating linacs before the muons are injected into the decay ring. The total rf voltage required is 15 GV, about four times the voltage installed in LEP2: this is not a small facility.

The critical design issues related to the high beam power are the proton driver, the target, and the downstream µ collection system. The target has to withstand the shock of the proton beam, and the wide-band π collection system the intense radiation. The hardware must be designed to minimize induced radioactivity and to enable correct maintenance.
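The pressure for rapid acceleration can be quantified from the quoted cτµ = 658 m: the mean lab-frame decay length of a muon is γβ·cτµ. A minimal sketch follows; the muon mass of 105.7 MeV/c^2 is a standard value, and the 600 m path length is a purely hypothetical illustration, not a figure from the text:

```python
import math

C_TAU_MU = 658.0  # muon c*tau at rest [m], as quoted in the text
M_MU = 0.1057     # muon mass [GeV/c^2], standard value (assumed)

def decay_length(E_gev):
    """Mean lab-frame decay length gamma*beta*c*tau of a muon with total energy E [GeV]."""
    gamma = E_gev / M_MU
    beta = math.sqrt(1.0 - 1.0 / gamma ** 2)
    return gamma * beta * C_TAU_MU

# Survival probability over a hypothetical 600 m flight path: exp(-s / decay_length)
survival_200mev = math.exp(-600.0 / decay_length(0.2))  # early in the chain, ~0.2 GeV
survival_50gev = math.exp(-600.0 / decay_length(50.0))  # at the 50 GeV top energy
print(decay_length(0.2), decay_length(50.0), survival_200mev, survival_50gev)
```

At 0.2 GeV the decay length is only about 1 km, so a substantial fraction of the muons is lost over even a few hundred metres, whereas at 50 GeV the decay length exceeds 300 km and losses become negligible; hence the emphasis on high gradients at low energy.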

5.4.2 Neutrinos from beta emitters

The idea is to generate suitable beta emitters by the ISOL technique, developed at CERN and currently used at ISOLDE, to accelerate them, fully ionized, using the existing PS and SPS, and to let them decay in a new storage ring. There they would produce an intense, pure neutrino beam of only one species, with a narrow opening angle due to the high Lorentz factor, which is γ ≈ 150 at SPS top energy. Examples of suitable beta decays are

    6He → 6Li + e- + ν̄e
    18Ne → 18F + e+ + νe    (3)

The He isotope has a lifetime of 0.8 s at rest, the Ne isotope one of 1.6 s. At SPS top energy, where the Lorentz factors are γ = 150 for 6He and γ = 60 for 18Ne, this gives comfortable lifetimes of 120 s and 96 s, respectively. The He isotope is obtained from the reaction 9Be(n, α)6He, induced by spallation neutrons produced by protons hitting a heavy metal target; the Ne isotope is produced by protons hitting a MgO target. The ions diffusing out of the target are collected, fully ionized and bunched in an Electron Cyclotron Resonance (ECR) device and, subsequently, accelerated in a linac to about 100 MeV/c before being injected into a bunching ring which precedes a Rapid Cycling Synchrotron (RCS) [19]. A new injection system for the PS is required, with a Superconducting Proton Linac (SPL) as front-end providing protons at 2.2 GeV. A possible schematic layout of such a neutrino source (called 'beta-beam') using the CERN infrastructure is shown in Fig. 23.

Fig. 23: Schematic layout of a neutrino source based on beta emitters [19] (SPL proton driver; ISOL target and ion source; ECR beam preparation; linac, bunching ring and RCS; PS and pulsed SPS for acceleration to final energy; decay ring with Bρ = 1500 Tm, B = 5 T, C = 7000 m, Lss = 2500 m; 6He at γ = 150, 18Ne at γ = 60)

The accelerator issues are the generation of an ion beam of sufficient intensity, the control of the inevitable losses during acceleration of the decaying ions in order to avoid contamination of the accelerators, the detailed understanding of the transmission efficiency of the many accelerators in this chain, and the accumulation of the ions in a very few bunches in the storage ring. Common issues with all the other neutrino sources are the handling of the hot target and of the activated components downstream of the target, and the politically delicate and not easily controllable procedures to obtain and to retain the authorizations for the operation of a MW-class proton driver and target.

Since the front-end linac is virtually the same as the one needed for the superbeam, for which it has to provide 4 MW beam power, and since only 200 kW on target are required for the neutrino source based on beta emitters, the idea has been aired of having both facilities built at CERN, with the two different neutrino beams pointing to the same detector.
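The quoted laboratory-frame lifetimes follow directly from time dilation, τ_lab = γ·τ_rest. A minimal check (the 1/γ opening-angle estimate in the last line is a standard rule of thumb, not a number from the text):

```python
def dilated_lifetime(tau_rest_s, gamma):
    """Lab-frame lifetime of a relativistic ion: tau_lab = gamma * tau_rest."""
    return gamma * tau_rest_s

tau_he6 = dilated_lifetime(0.8, 150)   # 6He at gamma = 150 -> 120 s
tau_ne18 = dilated_lifetime(1.6, 60)   # 18Ne at gamma = 60 -> 96 s
opening_angle_mrad = 1.0 / 150 * 1e3   # characteristic ~1/gamma beam opening angle
print(tau_he6, tau_ne18, opening_angle_mrad)
```

The dilated lifetimes of 120 s and 96 s match the figures in the text, and the characteristic opening angle of a few milliradians illustrates why the high Lorentz factor yields such a well-collimated neutrino beam.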

The advantage would be that data on νµ → νe and νe → νµ oscillations and, in a later run, on the oscillations of the charge conjugates, could be collected simultaneously, significantly speeding up the data taking. As the nuclear physics community is also very interested in spallation neutrons to generate radioactive ion beams, possible synergies are under investigation in a study sponsored by the European Union.

6 Conclusions, or what we have learnt in the past 50 years

It was the continuous stream of good and attractive projects that created a growing user community giving unfailing support to CERN. There are a number of lessons from this success story.

First, CERN realized very soon that large projects have long lead-times and that, therefore, one has to think ahead, which nowadays means thinking in terms of decades. CERN has always gone for projects and has carefully avoided losing time and reputation with uncommitted R&D. The strategy has included the full exploitation of existing facilities, either by upgrading or by re-use after modification, and the timely stopping of facilities when they were no longer on the leading edge. This has led to the evolution of the accelerator park given in Fig. 24.

Fig. 24: Evolution of CERN's accelerator park

Examination of the graph indicates that colliders have a short lifetime compared to that of accelerators. The latter have turned out to be the real work-horses of CERN: serving as particle sources for the fixed-target programme, providing the indispensable test beams, and faithfully playing the role of injectors for leading-edge colliders. The SPS was even turned into a very successful collider for a while.

Work on operation and upgrading of existing accelerators and, simultaneously, either conducting R&D or constructing new facilities has turned out to be extremely beneficial for CERN. Operation immediately gains from the new insights obtained and techniques learned in R&D, which keeps the project team on the floor of reality, and the design of the new facility is based on the experience and know-how acquired in operation. Obviously, key technologies must be mastered, preferably by CERN itself, before inviting tenders from industry. Industrial production has to be monitored meticulously, and CERN must have the resources to intervene immediately with advice and active help in case a problem arises.

When drawing up a project it is imperative to work as closely as possible with the users, but the parameters should be chosen based on existing know-how and, whenever possible, on full-scale tests of all critical components, and not on the desiderata of the users, which do however provide the indispensable guide lines. The question of how much flexibility to build into a machine is obviously a matter of judgement, and sometimes the machine designers are better judges than the physicists, who are anxious to start their research as soon as possible. But whatever compromise is reached about flexibility, one should certainly avoid taking risks with the reliability of the machine, because then all its users suffer for as long as it is in service. Sufficient engineering margins guarantee long-term reliability and flexibility and should not be sacrificed to exaggerated competition leading to rush decisions which are bitterly regretted afterwards. All this is best underlined by citing from J. Adams's farewell talk to Council in December 1980: the worst thing of all is to launch accelerator projects irrespective of whether or not one knows how to overcome the technical problems; that is the surest way of ending up with an expensive machine of doubtful reliability, later than was promised, and a physicist community which is thoroughly dissatisfied.

Although the infrastructure of CERN is one of its well-known assets, the most important one is the carefully selected young staff who have been hired in recent years. It is extremely important that they are taught not only in academic lectures like this one, but that they work with and are encouraged by experienced colleagues, as I was myself when I came as a young Fellow to CERN and the late Mervyn Hine, though then Director of Planning, took a personal interest in my work, for which to this day I am still extremely grateful.

CERN will have to adapt to the large size and complexity of future facilities at the high-energy frontier; this will increase the lead-times and require a global approach involving not only the European particle physics laboratories but also those of other regions, an approach already timidly started with LEP but then adopted for the LHC.

Acknowledgements

It is a pleasure to thank G. Fernqvist, W. Herr, R. Hohbach, S. Myers, K. Schindl, M. Weiss and C. Zilverschoon for discussions and advice. H. Koziol and B. de Raad carefully read the manuscript and made numerous suggestions, for which I am very grateful.

References

[1] A. Hermann, J. Krige, U. Mersits and D. Pestre, History of CERN, Vol. I, Launching the European Organization for Nuclear Research (North-Holland, Amsterdam, 1987).
[2] A. Hermann, J. Krige, U. Mersits and D. Pestre, History of CERN, Vol. II, Building and Running the Laboratory 1954–1965 (North-Holland, Amsterdam, 1990).
[3] CERN Annual Reports 1955–2003; see http://library.cern.ch/cern_publications/annual_report.html
[4] J. Krige (Ed.), History of CERN, Vol. III (Elsevier, Amsterdam, 1996).
[5] N. Christofilos, unpublished manuscript (1950); E. Courant, M.S. Livingston and H. Snyder, Phys. Rev. 88 (1952) 1190.
[6] D. Simon, Proc. 5th EPAC, Sitges, 1996 (IOP, Bristol, 1996), p. 295.
[7] K. Johnsen, Eur. Phys. J. C 34 (2004) 15.
[8] J. Borer et al., Proc. 9th Int. Conf. on High-Energy Accelerators, Stanford, 1974 (AEC, Washington, D.C., 1974), p. 53.
[9] P. Bramham et al., Nucl. Instrum. Methods 125 (1975) 201.
[10] G. Brianti, Report CERN 84-13 (1984).
[11] O. Gröbner, Vacuum 43 (1992) 27.

[12] R. Assmann, Proc. LHC Workshop, Chamonix, 2001, Report CERN-SL-2001-003 DI (2001), p. 323.
[13] O. Brüning, J. Poole et al. (Eds.), LHC Design Report, Vol. I, Report CERN-2004-003 (2004).
[14] O. Brüning et al., CERN LHC Project Report 626 (2002).
[15] International Linear Collider Technical Review Committee, Second Report, Report SLAC-R-606 (2003).
[16] J. Andruszkow et al., Report DESY 2001-011 (2001), R. Brinkmann et al. (Eds.).
[17] The CLIC Study Team (G. Guignard, Ed.), Report CERN 2000-008 (2000).
[18] M. Aleksa et al. (P. Gruber, Ed.), Report CERN/PS/2002-080 (PP).
[19] B. Autin et al., J. Phys. G: Nucl. Part. Phys. 29 (2003) 1785.

Fifty years of research at CERN, from past to future: Theory

G. Veneziano
CERN, Geneva, Switzerland and Collège de France, 75005 Paris, France

Abstract

A personal account is given of how work at CERN-TH has evolved during the past 50 years, together with some personal thoughts on how it may develop in the near future.

1 The past

The past 50 years were ones of remarkable achievements in our understanding of nature at its deepest level. It was, in many respects, a two-fold revolution:
– in our understanding of the physical world, and
– in our research tools and work style.
I shall discuss these two aspects in turn.

1.1 Revolutions in our understanding

In terms of novelty, this half-century was second only to its predecessor, the one that shook the world of physics at the beginning of the last century through two profound conceptual revolutions: Quantum Mechanics and Special Relativity. These revolutions came, in many respects, as two conceptually unrelated developments, almost simultaneously (as exemplified by two of Einstein's famous annus mirabilis papers). For the high-energy theorist at CERN — and elsewhere — much of what has happened since had to do with finding a consistent framework where these two new paradigms could happily coexist.

I was lucky enough to be active — and at the beginning of my career — by the mid-1960s, the start of a 'golden decade' during which our theoretical ideas truly underwent a 'phase transition'. It was a truly revolutionary epoch in theoretical physics. A combination of ingenious experiments (see Daniel Treille's talk), technical developments (accelerators, computing; see the talks given by Kurt Hübner and David Williams), and theoretically sound ideas that can possibly explain the data made all this an incredibly successful story. I shall insert this crucial decade inside a broader picture and also mention other remarkable theoretical (if not yet experimental) revolutions that took place in the more recent past.

The world of high-energy physics, and particularly of theory, has no geographical or political barriers. This makes it impossible to isolate CERN-TH activities from what happened world-wide. For this reason I shall occasionally mention developments and episodes that took place elsewhere. No serious attempt is made towards achieving completeness, objectivity, or historical precision. I shall try to retrace the developments of our understanding of elementary particles and fundamental forces as seen through the eyes of a theorist at CERN1. I shall spend comparatively little time on that recent history, since on the one hand it is better known to most of you and, on the other,

1 Actually, I joined CERN only in 1976. Even during the previous decade, I visited TH at regular intervals in or around the summer (basically every year from 1967 to 1974). This allowed me to take the 'temperature' of CERN-TH at regular intervals and to feel its evolution from year to year. For a more systematic — and somewhat complementary — study of CERN-TH (at age 40), the account by J. Iliopoulos is highly recommended.

in spite of its mathematical beauty and experimental successes, it has not survived (yet!) the test of time. I shall then come to other aspects of the way CERN-TH has evolved (tools, style) and conclude with some thoughts — and worries — about where we may be heading.

In Fig. 1 I have tried to represent how our understanding of the four known basic interactions (electromagnetic, weak, strong, gravitational) has evolved with time. At the beginning of the 20th century only the two long-range forces, electromagnetism and gravity, were known. Later on, the weak and strong forces, both characterized by their short range, were discovered. A characteristic feature of this sketch is that the horizontal dividing lines have been progressively removed, partly because of the theorists' quest for simplicity, but even more so under the push of experimental discoveries. As early as the 19th century, Maxwell erased the separation between electricity and magnetism: in spite of appearances, electric and magnetic phenomena are different manifestations of one and the same physical principle as seen in different reference frames. This 'theory merging' process continued in the 20th century. In spite of the above range-wise distinction, forces whose macroscopic manifestation is completely different (strong and weak, for instance) turned out to be described, at the microscopic level, by very similar theoretical concepts, those at the basis of gauge field theories. On the other hand, during many decades, theoretical progress has proceeded independently for gravity and for the other three forces.

I shall divide this part of my account into four periods:
– 1900–1954: B.C. (meaning 'Before CERN');
– 1954–1974: Genesis of the Standard Model;
– 1974–1984: Consolidation of the Standard Model and first steps beyond it;
– 1984–2004: Leaps forward.

I shall start my journey by 'getting rid' of the bottom part of Fig. 1, only to come back to it later.

1.1.1 Gravitation and cosmology (1915–1974)

Ten years after introducing special relativity, Einstein proposed his theory of General Relativity, in which gravity is seen as a manifestation of space–time curvature, starting from the Galilean universality of free fall. At the beginning his proposal caused much scepticism, but just a few years later some striking tests (deflection of light, precession of Mercury's perihelion) convinced the physics community.

The first cosmological applications of General Relativity were made by Alexandre Friedmann (1922). These had their confirmation with the discovery, in 1929, of the red shift and Hubble's law and with their interpretation in terms of an expanding Universe. This eventually led (1935) to the standard (Friedmann–Lemaître–Robertson–Walker) hot big bang model, itself confirmed by the discovery of the cosmic microwave background and applied successfully to the production of light elements (big bang nucleosynthesis).

During all those years, General Relativity attracted very little interest inside the high-energy physics community and at CERN-TH: after all, everybody knew that gravity is irrelevant for elementary-particle physics! Furthermore, not much effort was made to go beyond the classical theory: for the macroscopic world it was aiming at, classical General Relativity looked entirely sufficient and satisfactory.

1.1.2 B.C.: Before CERN! (1900–1954)

These were very important years for the advancement of theoretical and experimental physics. On the one hand, quantum mechanics was formulated in precise mathematical terms by Heisenberg, Schroedinger, Dirac and others (even if its interpretation remained puzzling2); on the other, the weak and strong

2 The most significant breakthrough in this area was the work done by John Bell at CERN (1964–1966), the celebrated Bell inequalities, which, after the experiments were carried out, would have probably given CERN-TH its first Nobel prize.

interactions were discovered and their theory started to be developed. Fermi's theory of the weak interactions dates back to 1934, based on the current–current (JJ) interaction, while Yukawa's model of the strong interactions, with his postulate of the existence of a meson, later called the pion, came just one year later. It is interesting that the theories of the latter two interactions, in spite of their vastly different phenomenology, have followed parallel histories from the beginning till they found their final place within the Standard Model.

Quantum Field Theory (QFT) had its first realization with QED, the first example of a quantum-relativistic theory with precise predictions. Right after World War II (Shelter Island, June 1947), its striking successes in explaining the Lamb shift and (g − 2)e swiftly followed and consecrated QED as the quantum theory of electromagnetism. However, they had a moment of crisis. Two problems indeed mitigated the enthusiasm generated by QED's smashing successes:
– the apparent divergence of perturbation theory, starting within QED itself;
– the renormalization group evolution of the fine-structure constant and the associated so-called zero-charge problem.
L. Landau has been quoted as saying, in 1955: "The Lagrangian is dead. It should be buried, with all due honours of course." These problems, together with those encountered in providing a full quantum version of either the Fermi or Yukawa theory, caused a certain uneasiness with QFT within the HEP community: S-matrix theory gained in popularity, particularly for strong interactions under the flag of G. Chew's bootstrap program, before theorists gave their full confidence to QFT once again.

1.1.3 1954–1974: the golden decades

These were the years during which the Standard Model of particle physics was conceived and developed. Indeed, developments in the weak and strong interactions went hand in hand. But, before discussing those, let us blow up the central part of Fig. 1 (as shown in Fig. 2) in order to summarize the highly non-trivial 'road map' that, through many false starts and partially successful attempts, eventually led to the Standard Model as we know it today.

In the weak interactions the discoveries of strange particles (1953–1955) and of parity non-conservation (1956) paved the way to the first improvement of Fermi's theory, leading to maximal parity violation (only one helicity carrying the weak interaction). The two currents were soon identified (1957) as being just a precise combination of vector and axial-vector currents. The vector and axial-vector currents were understood to be conserved (CVC, 1958) and partially conserved (PCAC, early 1960s), respectively, and to be 'rotated', in what we now call flavour space, by the Cabibbo angle (a CERN-TH paper, 1963). These hadronic currents satisfied the so-called current algebra, and theorists got busy trying to 'saturate' current algebra relations through families of hadronic states (see the program of S. Fubini and collaborators).

Interestingly, the hadronic pieces of these same currents were also playing an important role in strong interactions. They were at the basis of the isospin and (flavour) SU(3) symmetries that successfully described the hadronic spectra (see Gell-Mann–Ne'eman's 'eightfold way', 1961). The hypothesis of spontaneous breaking of chiral symmetry led, through PCAC, to a successful theory of the masses and interactions of the pions (and of other light pseudoscalars) as pseudo-Nambu–Goldstone bosons.

Meanwhile, within that S-matrix climate, new ideas on how to handle the strong interaction sprang out. The growing number of (almost daily!) discovered resonances got neatly classified through T. Regge's theory (1959), which, thanks to the work of G. Chew and S. Mandelstam, became a theory of high-energy scattering. The description of multiparticle production at high energy through the multiperipheral model (Amati–Fubini–Stanghellini, 1961) was another important achievement in that same direction.

In the next decade (1964–1974) both areas blossomed. In the weak interactions we may recall the discovery of CP violation (1964) and the seminal papers of Glashow, Salam and Weinberg, which did not receive a lot of attention at the time. Before 't Hooft's paper on the renormalizability of spontaneously broken gauge theories, Weinberg would rarely talk about his 1967 model. In 1971 renormalizability of the GSW theory was proved by Gerard 't Hooft, and this quickly led to the establishment of the present electroweak theory, while the GIM mechanism (1970) was successfully confirmed by the J/Ψ discovery in 1974.

Let me briefly recall how I lived that crucial moment. Weinberg had just joined the MIT faculty in the new Center for Theoretical Physics led by V. Weisskopf. He was working mainly on chiral Lagrangians and even got interested in the dual-resonance model (see a paper written with M. Ademollo and myself). When 't Hooft's article appeared, Steve immediately grasped its importance, invited Gerard to MIT for a seminar (and tough private discussions), and then he himself gave one of the traditional lunch (journal club) seminars. After filling the blackboard with the electroweak Lagrangian, he added something like: "This is not a particularly pretty Lagrangian, neither is it clear whether it has anything to do with the real world, but it seems to define a consistent quantum field theory of the electroweak interactions from which definite predictions can be drawn". It marked the beginning of a new era in particle physics, that of electroweak unification.

Parallel striking developments were going on in the strong-interaction domain. Current algebra and flavour symmetries were nicely described, mathematically, in terms of quark fields. Quarks could also explain some puzzle with Fermi statistics, at the price of attributing to them, besides the usual 'flavour' label, one called 'colour'. Gradually, these mathematical entities acquired respectability as truly important degrees of freedom in hadronic physics, and colour was transformed from a book-keeping device into a dynamical concept, the source of a field that is responsible for the unusual behaviour of quarks. Processes like e+ e− → hadrons (at Cambridge's electron accelerator), deep-inelastic lepton–hadron scattering (at SLAC), and hard hadron–hadron scattering (at CERN's ISR) were all showing that quarks behaved as almost free particles, and yet nobody, in spite of many searches, had ever seen a free quark!

Before I go more into that story let me mention that important developments also occurred in the S-matrix approach to strong interactions, such as those that gave, out of the CA programme and Regge theory, superconvergence (1966), duality (1967), finite-energy sum rules, dual-resonance models and, eventually, string theory. I can add a couple of personal stories on that (strong-interaction) side.

In the summer of 1967 I was attending the Erice summer school: there were plenty of interesting talks, but two sentences by M. Gell-Mann particularly stuck in my mind:
– "Nature only reads books in free field theory";
– thanks to the newly discovered Dolen–Horn–Schmit duality, the possibility of a 'cheap bootstrap' program (as opposed to Chew's expensive one!) had emerged.

From the autumn of 1967 to the summer of 1968 a Harvard (M. Ademollo)–Weizmann Institute (H. Rubinstein, M. Virasoro and myself) collaboration applied the 'cheap bootstrap' idea to the reaction ππ → πω, with very encouraging results. In the summer of 1968, while at CERN on the way to my first post-doc at MIT, I completed a paper I had begun just before leaving Israel. After discussing with several people at CERN-TH3 I submitted it to Nuovo Cimento. By the time of the Vienna Conference (August–September) it had already received much attention and was soon named the dual-resonance model: no one knew it at the time, but string theory was born!

3 I wish to stress here how important CERN-TH was, and still is, particularly for European theorists: a place where they can find experts in all branches of high-energy theory, check their ideas, learn new things, and get inspiration, besides its role in supporting and guiding CERN's experimental programme.
The important Bell–Jackiw anomaly paper came out of CERN-TH in 1969. I can add a couple of personal stories on that (strong-interaction) side. besides its role in supporting and guiding CERN’s experimental programme. 4 30 . Virasoro and myself) collaboration applied the ‘cheap bootstrap’ idea to the reaction ππ → πω. neither is it clear whether it has anything to do with the real world. Let me briefly recall how I lived that crucial moment. Indeed. Gell-Mann particularly stuck in my mind: – “Nature only reads books in free field theory”. of course.

However, this development had a hard time being accepted by the establishment: it was neither quantum field theory nor a pure S-matrix bootstrap à la Chew. At the same time, the string reinterpretation of the dual-resonance model did not make its magic properties less mysterious, and the community's attitude towards it was very polarized: love it or hate it! A sentence by Fubini (around 1970) expressed well the feelings of the day: "A piece of 21st century physics that fell accidentally in this century".

A third, crucial episode came for me at a seminar at CERN-TH by Sid Coleman in 1973. I remember its title very well, The price of asymptotic freedom, and its abstract: non-Abelian gauge theories could explain the absence of free quarks by the highly non-trivial behaviour of the strong force at large distances and could solve Landau's zero-charge problem. One finally did have an answer to Gell-Mann's puzzle: Nature is reading books in asymptotically free quantum field theory! In just a few years, quantum field theory had made a spectacular comeback and it was the turn of S-matrix theory to fall into disgrace.

What about string theory? Well, a couple of years later it had to concede defeat: those interested in strong interactions (sadly?) turned a page and switched to working on QCD. However, for many of us it was difficult to give up such an elegant construction. I still felt that there was something deep and unique in the way string theory was reorganizing perturbation theory diagrams, something very unlike what any normal quantum field theory does. QCD naturally provided its own way of achieving all that. This came from 't Hooft's 1974 CERN paper, where he showed that, in a suitable large-N limit, QCD diagrams get reorganized precisely as to mimic a string theory! That was enough for me (and, I guess, for most string theorists of the time): I kept working on a 'planar bootstrap' and a topological expansion. Various expansions of the 1/N kind were proposed, which at least made the whole game more 'respectable'. Soon after, in a 1974 paper that went practically unnoticed, J. Scherk and J. Schwarz proposed reinterpreting string theory as a theory of quantum gravity.

Meanwhile, in 1974 Ken Wilson proposed lattice gauge theory, and not much later the first numerical proofs of confinement in QCD came about. Instantons were discovered, offering an elegant solution to the U(1) problem, while resurrecting the one of a strong CP violation (for which a solution within the Standard Model is still lacking). Perturbative QCD was greatly developed, justifying a QCD-corrected version of Feynman's parton model, which bridged different approaches to hard and soft strong-interaction physics.

1.1.4 1974–1984: Standard Model confirmations and attempts to go beyond

The following decade can be characterized as a period of consolidation and experimental verification of the theoretical ideas that had just blossomed. On the experimental side (see Daniel Treille's talk for details) let me recall, besides the already mentioned J/Ψ discovery, the discovery of neutral currents and that of the W and Z bosons, two of CERN's most glorious achievements. Quark and gluon jets, as well as their shapes, were predicted and found. On the theoretical side, Cabibbo's theory was extended to the CKM matrix, naturally incorporating a CP-violating parameter in the Standard Model, and the Standard Model was now complete! Another pillar had been added to our understanding of elementary particles and their non-gravitational interactions.

This decade also witnessed the first bold attempts to go beyond the Standard Model with the first Grand Unified Theories (GUTs), the prediction of a long but finite proton lifetime, and the unsuccessful search for its decay in non-accelerator experiments.
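The asymptotic freedom mentioned above can be illustrated numerically; as a sketch (not from the talk), the one-loop running of the strong coupling, assuming a fixed flavour number n_f = 5 (thresholds ignored) and α_s(M_Z) ≈ 0.118 as input:

```python
import math

M_Z = 91.19          # Z-boson mass in GeV, used as reference scale
ALPHA_S_MZ = 0.118   # assumed value of the strong coupling at M_Z
N_F = 5              # fixed number of quark flavours (thresholds ignored)

def alpha_s(Q):
    """One-loop running: 1/a(Q) = 1/a(M_Z) + b0/(2*pi) * ln(Q/M_Z)."""
    b0 = 11.0 - 2.0 * N_F / 3.0          # one-loop beta-function coefficient
    inv = 1.0 / ALPHA_S_MZ + b0 / (2.0 * math.pi) * math.log(Q / M_Z)
    return 1.0 / inv

# The coupling decreases logarithmically with the momentum scale Q,
# from roughly 0.17 at 10 GeV down to roughly 0.09 at 1 TeV:
for Q in (10.0, M_Z, 1000.0):
    print(Q, round(alpha_s(Q), 3))
```

This is the 'price of asymptotic freedom' in miniature: at large Q the coupling weakens (almost-free quarks), while extrapolating to small Q it grows, hinting at confinement.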

1.1.5 1984–2004: the leap forward

Another personal story. In the autumn of 1984, Tom Banks arrived from the US for a short visit to CERN. As soon as we met he asked: "Have you heard about SO(32)?" I probably answered: "What do you mean?" A few days later he gave a talk about a recent paper by M. Green and J. Schwarz which provided the first example of an anomaly-free superstring theory with chiral fermions. The paper by Green and Schwarz became known as the first string revolution. These developments reconciled much of the theory community — until then sharply divided into enthusiasts and violent critics — with string theory, and string theory came back into the spotlight as nothing less than a 'Theory of Everything' (TOE), i.e. a theory that could in principle contain both the Standard Model of particle physics and quantum gravity. Some beautiful early mathematical developments led to over-optimistic claims that not only would one soon arrive at an experimentally viable TOE, but also that the choice would be uniquely determined by theoretical consistency. As time went by, however, those hopes started to fade away.

Supersymmetry (SUSY) was invented and developed4. It was then generalized into a new theory of gravity, called supergravity, with the hope that some suitable version of it (e.g. D = 11 supergravity) could be consistently quantized. Theory was quickly moving to energy scales (GUT scale, gravity's Planck scale) never considered before to be of relevance to particle physics. In this optic, the conceptual barrier between gravitational and gauge interactions was gone for ever. Also the attitude towards QFT changed: in the sixties and seventies QFT had been regarded as a complete framework; in the eighties theorists started to view it as an effective low-energy theory that cannot be extrapolated above a certain energy scale without an 'ultraviolet completion', string theory being just one example of such a completion.

I can recall at this point a brief encounter I had at CERN with A. Sakharov in or around 1987. He had become very interested in string theory (see his Memoirs) and came to ask me about several issues that interested (or bothered?) him. He had a particularly burning question: Is the induced-gravity idea (Sakharov, 1968) borne out in string theory? I gave him a straight answer: No! In string theory, gravity is already present at tree level and loops just renormalize Newton's constant by a finite amount (in Sakharov's induced gravity, instead, the Newton constant is entirely determined by radiative corrections, provided the ultraviolet cut-off is kept finite); moreover, in string theory Landau's zero-charge problem disappears. Years later, unfortunately after his death, I came back to the issue and concluded that the induced-gravity idea is also possible in string theory and, actually, could be a very interesting option in any case.

I shall go quickly to the last decade (1994–2004) during which the so-called second string revolution took place. It marked the beginning of an exciting decade in theoretical physics that we may characterize as 'The (S)-matrix reloaded'. It brought many new theoretical tools and ideas through the concept of (mem)branes and of new dualities interconnecting different string theories in various regimes. The number of consistent string theories went down to a single one, called M-theory, but the number of solutions went up by many orders of magnitude, making the dream of uniqueness move further and further away.

The new 'stringy' techniques offered powerful tools for studying gauge theories in various regimes, as wonderfully exemplified by the Seiberg–Witten 1994 solution of N = 2 super-Yang–Mills theory. Not only that: these developments paved the way to the possibility that the extra dimensions of space envisaged by string theory may be 'large' or that our own Universe is just a membrane immersed in this higher-dimensional space (brane-world scenarios). In both cases, new physical phenomena may be expected to show up at accessible energies (such as those to be reached at the Large Hadron Collider) and/or at short distances (e.g. in submillimetric deviations from Newton's law).

4 First discovered (in the West) in dual resonance models, it became a bona fide quantum field theory after the work of J. Wess and B. Zumino (CERN, 1974).

Non-accelerator experiments gave another striking piece of news: neutrino masses and oscillations, discovered in both solar- and atmospheric-neutrino experiments. Although a slight extension of the Standard Model can accommodate these phenomena, and theorists had largely anticipated this possibility, they were caught by surprise as far as the neutrino-mixing pattern appears to be realized. At the same time, experiments in astrophysics and cosmology made a quantum jump in precision and led to striking discoveries, some of which were totally unexpected (dark matter, dark energy, ultra-high-energy cosmic rays). Thus, this decade brought the fields of particle physics and of astro-cosmo-particle physics closer than ever before. Fruitful interactions between particle theory and cosmology had actually started much earlier, in particular with theories of baryogenesis based on realizing, within the Standard Model or its extensions, Sakharov's famous three criteria, as well as with the theory of inflation. There have also been quite spectacular achievements in explaining, through string theory, the microscopic origin of Black Hole entropy and in establishing a holographic bridge between gravity and gauge theories. Hope is growing that, one day, we shall finally 'close the circle' by understanding what kind of strings describes the long-distance behaviour of QCD. Astrophysical and cosmological data appear to offer the much-sought-after crises that theorists need to make progress (quoting S. Weinberg's "physics thrives in crises").

1.2 Revolutions in our work's tools and style

The tools theorists use have greatly evolved during the last 50 years. In particular:

– Mathematical tools: The level of mathematical sophistication needed in order to take part in (or just to follow) some theoretical developments has reached unprecedented levels. This is particularly true of string theory, an area of research that not only exploits the most advanced mathematics but that has also contributed to its development (see E. Witten's Fields Medal). This is reflected in the mathematical background of the average young theorist today (no comment on his/her average physics background!).

– Computing tools: I still remember the computing facilities I had at my disposal when I did my Master's thesis in Florence in 1965: punching cards to compute a one-dimensional integral on a big IBM machine took a substantial chunk of my time. Nowadays I can do that integral in a fraction of a second on my laptop! Lattice calculations, developed by particle theorists in the eighties, could not have been conceived of at that time. They still rely today on the exponential growth of computing power.

As with our tools, our style of research has changed in many respects:

– Specialization: In the early days of my career I could basically follow all that was happening in high-energy theory: there were fewer areas, they were closer to each other, and the number of new papers per month was manageable. Today, every field is very specialized and trying to follow what is going on (or even looking at what is new on the archives) requires a big time-consuming effort. Language barriers across different areas of research are often another obstacle, and people do not seem to try to make the effort to be understood beyond their inner circle. Perhaps as a result, I find discussions at TH seminars less frequent and interesting than they used to be 30 years ago.

– Communication, search tools: Again a huge revolution has taken place. I remember when, during the Weizmann–Harvard collaboration, we were sending long letters full of calculations over the Atlantic to get an answer, at best, a couple of weeks later (very exceptionally, we would dare to make a short phone call).
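To make the computing-tools anecdote concrete: a one-dimensional integral of the punched-card kind is now a sub-millisecond affair on any laptop. A minimal sketch (purely illustrative, with an arbitrary integrand chosen here, not one from the talk) using the composite Simpson rule:

```python
import math

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule on [a, b] with n subintervals (n made even)."""
    if n % 2:
        n += 1                      # Simpson's rule needs an even interval count
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3.0

# Example: the integral of sin(x) over [0, pi] is exactly 2.
print(simpson(math.sin, 0.0, math.pi))   # ~2.0
```

With n = 1000 the error for smooth integrands is far below anything a 1965 card deck could resolve, which is the point of the anecdote.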

The reply would arrive a couple of weeks later (very exceptionally, we would dare to make a short phone call). Since then, of course, we have learned to take advantage of fax, e-mail, electronic archives, and the Web and its search engines.

Unfortunately all these goodies come with some side-effects:

– Mathematics. Warning to theorists: "Excessive use of mathematics can be dangerous for your health". There can be a danger of becoming so addicted to mathematics that one forgets about the original physics goals.

– Computers. There can be a tendency to lean too much on numerical computations. There is a too widespread attitude to use the computer as an all-encompassing black box, without trying each time to understand what exactly one has put in and what exactly the output means.

– Electronic archives. These are very useful indeed. However, they appear to encourage quick writing and even quicker reading of papers. The flood of papers is such that most theorists use an 'on-line trigger' (like our experimental friends) to select what to read (or at least to glance at).

2 The future

2.1 New theoretical ideas

The past 30 years have not been stingy in terms of new theoretical ideas, with supersymmetry, superstrings and the like. Quite the contrary: theorists have proved their great imagination again and again. The problem, of course, was elsewhere: the lack of experimental checks on whether any of those ideas had much to do with real life! I would say that, in general, the relative status of Theory and Experiment determines to a large extent the way we make progress in physics. We can indeed consider, roughly, three cases:

– Experiment (much) ahead of Theory. Examples are the whole of particle physics of the 1950s and 1960s, with the exception of QED, and, today, the nature and origin of the CKM matrix. I would characterize this situation as being tough (for theorists), but healthy and challenging.

– Experiment and Theory at a similar level. Examples are QED, HEP in the 1970s and 1980s, and the Standard Model facing LEP, HERA, and Tevatron data. I would call this situation a perfect fit, although it may become too perfect, and a bit boring, when experiments keep finding no disagreement with theory. In other words: very precise experimental data mean little without a theory matching that precision.

– Theory much ahead of Experiment. Also here there is no lack of examples: the beginning of General Relativity and, possibly, much of high-energy theory in the last two decades. I would characterize this case as being easy but dangerous: without the guide of experiments, theoretical research, in general, tends to make random walks.

More recently, while waiting for great news from the LHC, theory should appeal more to astro- and cosmo-particle data, where the situation looks much like that of particle physics in the 1950s and 1960s (always, for the theorists, the best one): lots of good data and very little understanding (see dark matter, dark energy, the cosmic microwave background, gamma-ray bursts, ultra-high-energy cosmic rays, ...).

2.2 New tools, new styles

Computing power is going to grow, allowing us to tackle problems we have not been able to deal with in the past (e.g. in lattice gauge theories and numerical relativity). Means of communication will also improve, offering easier and easier access to information. But we have to watch the side-effects, particularly on the youngest generation! What could be at stake?

– Free choice of research. The job market constrains more and more the choices young people have on the kind of research they want to carry out.

– Enlarging and deepening one's knowledge. The pressure to produce is so high that little time is left for broadening one's culture beyond a very limited range of topics.

– Embarking on risky projects. The system penalizes projects that go out of the mainstream. Even if one achieves some nice results there, chances are that the community will realize it too late.

– Interactions between different areas of theory and with experiments. As people concentrate more and more on a narrow area, interactions between the various groups of theorists, and with experimentalists, become harder and harder to maintain.

Dangers exist but, if we are aware of them, they can be overcome. Another golden era, I am pretty sure, is just around the corner.

Bibliography

J. Iliopoulos, Physics in the CERN TH Division, report in the CERN History Series, CHS-39 (December 1993).

[Fig. 1: a timeline chart of theoretical developments from 1900 to 2004, tracing the electromagnetic (Maxwell, QED), weak (Fermi 1934, V–A, CVC, GIM, GWS), strong (Yukawa 1935, S-matrix, QCD, colour, asymptotic freedom) and gravitational (Newton, Einstein 1916, strings, SUGRA, superstrings) threads, with CERN's '54, '74, '84 and '04 anniversaries marked.]

[Fig. 2: the 1954–1974 period in more detail: strangeness ('53–'55), parity violation ('56), V–A ('57), CVC ('58), Regge ('59), SSB and pseudoscalar mesons as quasi-Nambu–Goldstone bosons ('61), PCAC ('61), Cabibbo ('63), SU(3)F → SU(6), CP violation ('64), sum rules, superconvergence and FESR, GSW ('67), duality and the Dual Resonance Model, ABJ ('69), GIM ('70) leading to the J/ψ ('74), renormalizability of GSW ('71), the naïve quark model, colour, asymptotic freedom, QCD, lattice ('74) and 1/N ('74).]

Fifty years of research at CERN, from past to future: Experiment

D. Treille
CERN, Geneva, Switzerland

Abstract
An overview of fifty years of physics at CERN is given.

1 Introduction

I was asked to give a personal, selective, and informal review of half a century of physics at CERN. I shall be informal: by this I mean that I shall evoke the shadows as well as the lights. Actually I shall try to be as impersonal and objective as I can, and to cover as many of the important topics as possible. Nevertheless, there is no way to do justice to every actor and I apologize in advance for omissions and a minimal selectivity in my choices of topics. In the preparation of this talk, I borrowed a lot from several colleagues, in particular the authors of the 25 and 50 years Physics Reports [1, 2], and from various historical accounts already published [3].

My own vision of CERN was that of an outsider before 1968, of an insider afterwards. I started my career in LAL Orsay, working at an electron linac on nucleon form factors, which had been discovered in 1956 in Stanford (Nobel Prize awarded to R. Hofstadter in 1961), and on pion photoproduction. Then the ACO e+e– ring allowed a detailed study of the vector bosons ρ, ω and φ. Orsay exploited the first e+e– ring, AdA, coming from Frascati. At the time ACO had to run flat out to reach the φ. A further ring of higher energy, COPPELIA, was considered, but the project did not go through, and the DCI ring was built instead. From these years of apprenticeship, I had learned that electron scattering and e+e– physics could be of high cleanliness and quality, and that important discoveries may be made elsewhere than in your own laboratory. It was also clear that a cultural gap seemed to exist between electron and hadron communities. I am pleased to see that this seems to be over. By that time, a group from LAL, led by P. Lehmann, decided to start an experimental programme at CERN and I joined it.

The usual statement that a hadron collider is a discovery machine and an electron collider a measurement machine is a loose one. An e+e– machine is ideal for accurate measurements and detection of difficult final states (invisible decays, for example), provided it is equipped with the right detectors. Electron–proton or e+e– collisions have revealed the structure of the proton, deep inelastic scattering (DIS), the J/Ψ, the τ, charm, jets. It has now been realized, however, that an e+e– collider is a threshold machine, and that an e+e– ring has never enough centre-of-mass energy. The series of 'just missed' is long: the ADONE trauma (J/Ψ, heavy lepton search), TRISTAN (Z0), DORIS (Υ), maybe LEP200 (h0). If the available energy √s is below the interesting one you miss the possibility of a direct observation, and obtaining relevant indirect information is not guaranteed either. For instance, SUSY at LEP: in spite of all the accurate electroweak measurements performed, we still do not know whether SUSY, even light, is a reality or not. One can, within some limits, trade energy against luminosity. On the contrary, a hadron machine is not a threshold one but offers broadband beams of partons with tails at high energy, as Fig. 1 illustrates. A hadron collider offers a high potential for discovery,

provided one thinks in the proper terms: for instance, after 1964 in terms of quarks and after 1973 in terms of the Standard Model (SM).

Fig. 1: The scenery of hadronic and leptonic collisions (courtesy U. Amaldi)

Let me add some personal remarks. In CERN scientific life, there were several very intense and euphoric periods: the era of bubble chamber physics, culminating with the discovery of Neutral Currents; the harvest of results from the SPS; the W and Z discovery; LEP1 and LEP200. No doubt the LHC will be another one. But we also went through some periods which, although they may have been intellectually stimulating, were quite depressing as seen from CERN: for instance, during the '1974 Revolution' which followed the discovery of the J/Ψ in the United States, one had the feeling that 'life was elsewhere'. Personally I also consider LEP200 as an unfinished symphony.

The 'ideal' physicist is the one who, at a given time, goes to the most promising machine to perform experiments with the most efficient detectors, as I shall illustrate later. For long, CERN had a diversified physics programme, and most physicists had several 'irons in the fire': analysing an experiment, running another one, preparing a third. Do the trends for the future keep that possibility? Currently, because of much longer time scales and in spite of the immensity of the task, there is a risk of monoculture and boredom. Could one conceive that young physicists share (unequally) their activity between the main line and a 'small' stimulating one?

A comment: it was always a great pleasure and help for me to be able to visualize the physical event: scanning of bubble chamber pictures, the Ω optical spectrometer, the UA1 Megatek, the DELPHI or ALEPH event displays, etc.

2 Synoptic view of fifty years of particle physics

Table 1 below presents in chronological order some of the most important steps and innovations in four different sectors: theoretical physics, experimental physics, machines, and detectors.

Table 1: Some of the most important steps and innovations in fifty years of particle physics

[The table, garbled in this copy, lists milestones year by year from 1945 to 1990 in four columns. Theory: e.g. Gamow's Big Bang (1946); Feynman–Schwinger QED (1948–49); the CPT theorem (1954); Regge (1959); Cabibbo theory (1963); quarks (Gell-Mann, Zweig) and the Higgs mechanism (1964); the Weinberg and Salam electroweak Lagrangians (1967–68); Veneziano's model (1968); Bjorken scaling and partons (1969); the GIM mechanism (1970); 't Hooft's renormalization (1971–72); asymptotic freedom, Kobayashi–Maskawa, QCD+SM and quark counting (1973); Wess–Zumino (1973–74); Altarelli–Parisi and Peccei–Quinn (1977); SUSY phenomenology and supergravity (c. 1980). Physics: e.g. the π and the Lamb shift (1947); strange particles (1947–53); the antiproton (1955); P violation (1956); CP violation (1964); MWPC (1968); DIS (1969); hard collisions at the ISR (1972); Neutral Currents and the rising σpp (1973); the J/Ψ (1974); the τ (1975–77); jets in e+e– (1975); charm (1976); the Υ (1977); parity violation at SLAC (1978); the gluon at PETRA (1979); beauty at CLEO (1981); jets in hard collisions (1982); the W and Z discovery and the EMC effect (1983); Bs mixing (1986–87, Argus); the proton spin crisis (1988); three neutrino species at LEP (1989). Machines: e.g. the synchrotron (McMillan, 1945); GeV electrons at Stanford (1955); the SC (1957); the PS (1959); the AGS (1961); Budker's electron cooling (1967); the ISR (1971); stochastic cooling proven at the ISR (1974); the start of the SPS and the pp̄ idea (1976); ICE (1977); the pp̄ collider (1981); the LHC feasibility study (1982); LEAR (1983); the AC (1986); LEP (1989) and low-β (1990). Detectors: e.g. scintillators (1947–50); the bubble chamber (Glaser, 1953); the 30 cm, 81 cm and 2 m BCs (1959–64); the first muon ring (1963) and second muon ring (1969); ISOLDE (1964); Gargamelle (1970); OMEGA (1972); the Split Field Magnet and BEBC (1973); the ν beam-dump (1978); the end of Gargamelle (1978); silicon microstrips (1981); rapid-cycling BCs and ν-oscillation searches (1984); the end of BCs (1985); the UA2 upgrade and LEP detectors (1989); the LHC Aachen Workshop (1990).]

3 The mutations of the detector methods

One should remember that the ancestors of modern detectors are not so far away. While the Wilson chamber was born in 1911 and the Geiger–Müller counter in 1928, one had to wait until the years 1947–50 to see the first scintillators, and the Bubble Chamber (BC) was invented in 1953. The first two decades of CERN life were the reign of bubble chambers. While the first was partly apprenticeship, the second decade saw the successes of the liquid hydrogen (LH) and heavy liquid (HL) large chambers. By the end of the life of this technique, rapid cycling BCs were successfully used for charm physics. Hybrid systems, from BEBC to the European Hybrid Spectrometer, have played an important role, in particular for charm physics. The methods of analysis of the BC pictures were evolving in parallel and would deserve a paper by themselves [4].

Triggered detectors then entered the game. From 1949 to 1959 spark chambers (SC) were developed as tracking devices. The first massive use of SCs was at BNL in 1962, when evidence for the νμ was found. The various readout methods changed from film to filmless ones: acoustic, wires, vidicon. In 1964 appeared the first online computers. 1968 saw the 'revolution' of G. Charpak, inventor of the MultiWire Proportional Chamber (MWPC), which then led to the drift chamber and the multistep chambers, a series culminating with the time projection chamber. The first large-scale use of MWPC occurred in 1971: the CERN–Heidelberg experiment at the PS on CP violation, suggested by J. Steinberger, and the Split Field Magnet (Innocenti et al.). The OMEGA optical spectrometer (A. Minten et al.) was operated in 1972, at a cost of 23 MSF; unfortunately the detector was blind around 90º.

Cherenkov detectors have played a key role in particle identification, from threshold devices and focusing ones (DISC) to the Ring Imaging technique (RICH). The RICH of T. Ypsilantis and J. Seguinot, after being used during the LEP era (DELPHI, SLD), seems to have a promising future ahead, an example being LHCb.

Currently the ongoing revolution is the rise of silicon detectors. While the goal was flavour tag in the LEP era, one is now planning to use them as main tracker, as for the CMS tracker at the LHC. This represents a big step, from the square metre typical of a LEP microvertex to hundreds of square metres. For an optimal tracking one needs real 3D information, obtained from pixels, micro or macro, rather than from stereo microstrips, and maybe vectorial information (position and direction). Clearly this progress was made possible by various breakthroughs occurring in microelectronics, in particular the advent of submicron processes, intrinsically radiation hard, a very welcome feature for LHC experiments.

As for calorimetry, important breakthroughs were the understanding and exploitation of the compensation mechanism and the move towards pointing geometries, from the 'Spaghetti' to the ATLAS 'Accordeon'. Liquid argon and scintillating crystals are still quite fashionable. Concerning hadronic calorimetry, one is still far from a completely satisfactory design regarding its granularity and the integration of the electromagnetic and hadronic parts.

In all these mutations the crucial role of detector R&D (in particular the strong DRD programme undertaken to learn how to exploit the full luminosity of LHC) is obvious. It is clear as well that, if we want to progress further, some R&D has to continue. Whatever the physics goal may be (SuperLHC, CLIC, ILC, future fixed targets, heavy ions), some of the requirements are similar: thinner, faster, harder and cheaper detectors.

Figures 2 and 3 below give an idea of the evolution with time of the various classes of detectors. What is plotted is simply the number of entries in the search engine Spires whose title refers to a given technique. The fall of the curves does not imply that the technique is no longer used, but means that, having reached its maturity, it no longer deserves a publication by itself.

Fig. 2: Evolution with time of the various classes of detectors
Fig. 3: More details on silicon detectors

4 The bubble chamber era

After the Second World War, the situation in the US and in Europe was quite different. Europe started with a period of 'every man for himself', which lasted until 1960. After 1960, the various countries and CERN had to learn the techniques required and how to collaborate, politically and technically.

In 1953 the Bubble Chamber (BC) was invented by Glaser. L. Alvarez, in Berkeley, soon started to play a leading role in its development. In 1955–56 J. Steinberger was developing propane chambers at Columbia University. CERN's interest in this technique started in 1955. In 1956 many realizations and projects already existed around the world and the LHBC (liquid hydrogen) programme was adopted at CERN. From 1957 to 1960 CERN built the 30 cm LHBC. In 1958 the HBC of 10 cm was in use at the synchrocyclotron. 1959 saw the commissioning of the 72 inch chamber in Berkeley, telling that times were ripe to build very large chambers. In 1960 the BP3 HLBC (heavy liquid) from Ecole Polytechnique in France was operated at CERN. In 1961 the Track Chamber Division was created under C. Peyrou, its mission being to run the 30 cm BC, then in activity at the PS from 1960 to 1962, to build the 2 m one and to get hydrogen liquefiers. In 1961 the 1 m HLBC from Ecole Polytechnique in Paris came to CERN, and the 81 cm LHBC from CEA started operating at CERN. The magnetic horn, invented by S. van der Meer and completed in 1962, led to a very welcome increase in neutrino fluxes. By the end of 1963, a 2 m BC was being operated in Berkeley and Brookhaven. At the end of 1965 came the decision to build Gargamelle, recognized as the "most courageous of scientific decisions". The construction of BEBC lasted from 1967 to 1972.

Thus the European bubble chamber community was able to react rapidly to new challenges and to take the right decisions which led to its notorious successes. In January 1964 appeared R. Shutt's proposal. The reaction in Europe was fast.

In February–April 1964 Gargamelle was defined (A. Lagarrigue). In December 1964 the realization of a large LHBC was recommended and the Tripartite Technical Group was created; it is from them that Europe and CERN found the way to collaborate. By mid-1964 the 152 cm British National HBC was put in service at CERN. 1965 saw the start of operation of the 2 m HBC (cost: 3 MSF; 40 million pictures taken), operated until 1977. Gargamelle was commissioned in December 1970; by 1975 BEBC was fully operational. Later came the great successes of the big chambers, crowned by the discovery of Neutral Currents. Moreover, an important role in charm physics was played by the last generation of small rapid-cycling chambers.

Given the initial situation described above, it is not surprising that the first important discoveries of the BC era were made elsewhere than at CERN. Let us quote:
– At the LBL cyclotron in 1950: π0 → 2γ (Panofsky, Steinberger).
– At other machines:
1953: BNL Cosmotron, a 3 GeV proton synchrotron, exploited until 1966: V0 events (1953), KL (1956), Σ0 (1957), ρ (1961).
1955: LBL Bevatron, 6 GeV protons: discovery of the antiproton (1955), anti-Λ (1958), Ξ0 (1959), anti-Σ0 (1960), K*(892), ω, η (1961), η' (1964); 1968: Nobel prize awarded to L. Alvarez.
July 1960: AGS, 33 GeV: discovery of the Ω– and of CP violation (1964).

Nevertheless one must recognize many important contributions to hadron spectroscopy from CERN in the first years: for instance, in 1962, using K– at rest, the 81 cm HBC allowed one to measure the relative Σ–Λ parity, found to be > 0, the Λ(1400), or the observation of the anti-Ξ (Fig. 4).

Fig. 4: A Σ–Λ event (left); the first Ξ–anti-Ξ event (right)

The team was composed of physicists from X-Orsay and of former members of the NPA 1 m BC. It included seven European laboratories and one guest laboratory. Leprince-Ringuet called it Gargamelle.

Fig. 5: C. Peyrou and BEBC

4.1 Early non-BC physics at the Synchrocyclotron [5]

Considering the π→eν channel: some evidence of its existence had been reported, but in 1955 J. Steinberger at Nevis Lab had set an upper limit on its branching ratio Re (< 0.6·10^–4) and this started to be puzzling. In 1958, G. Fidecaro et al. got evidence for its existence and found Re = (1.22 ± 0.30)·10^–4. This was in agreement with the expected e–μ universality of the axial coupling and a cornerstone of our understanding of V–A interactions. For the process μ→eγ, it was the reverse: experiments (M. Conversi, L. di Lella et al.) showed, in 1962, that the process was forbidden, both for real and virtual γ. The idea of lepton-flavour conservation was introduced. Let us quote also the first observation of the pion β-decay π+→π0e+ν, and the determination of the helicity of the muon from π→μν at the PS. Experiments on the g – 2 of the muon started at the SC in 1958.

4.2 Neutral Currents (NC)

I borrowed much from several excellent chronicles about Neutral Currents [6–8]. After the Siena 1963 conference, A. Lagarrigue, A. Rousset and P. Musset worked out a proposal of a neutrino detector aiming at increasing the event rate by an order of magnitude. From the 1.2 m HL chamber, there was an upper limit on the ratio of elastic NC to Charged Current (CC) events (< 0.03). This was simply a mistake, uncovered by M. Paty in 1965. However, this problem had low priority and Deep Inelastic Scattering (DIS) was more exciting then. Even in November 1968, at a Gargamelle meeting in Milan, the word NC was not pronounced.

In 1971–72 a viable theory of weak interactions existed and the question was clearly posed: are there NCs, yes or no? The competitors in the field were Gargamelle and the HPWF counter experiment at NAL (Rubbia et al.).

But in 1971 everything was ready. NC events looked like neutron star events without a muon; the goal was to separate neutrino-induced from neutron-induced stars. A careful classification of event types was devised. In December 1972 one electron event was found at Aachen. Candidates were observed, with a flat spatial distribution, and caused much excitement. However, counter arguments were put forward and one realized that the only way to conclude was to prove that the number of neutron-induced events was small. Monte Carlo simulation then showed its strength: a neutron background program with no free parameter was used, complemented by other approaches, and one more year of work was needed to conclude. It allowed the conclusion that the NC sample was not dominated by neutron stars (J. Fry and D. Haidt, CERN Report 75-01).

The search for hadronic NC candidates went on. At the Bonn 1973 conference, C.N. Yang announced the discovery of Neutral Currents. However, by that time the HPWF signal had disappeared, and this threw dismay and distrust on the whole matter. Finally, at the London 1974 conference, evidence for NC was confirmed by Gargamelle and HPWF. This major discovery established the similarity in structure of the weak and electromagnetic interactions; the term 'electroweak' came into being. The MW was then predicted to be around 70 GeV. This was the experimental beginning of the SM.

Fig. 6: André Lagarrigue and visitors
Fig. 7: André Lagarrigue
Fig. 8: Gargamelle
Fig. 9: A neutral current event on electron
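The logic of that neutron-background argument can be sketched with a toy simulation (all numbers here are hypothetical; this is not Gargamelle's actual program or its parameters): neutrino interaction lengths far exceed the chamber, so genuine NC vertices are flat along the beam axis, while neutrons entering from upstream are exponentially attenuated and pile up near the front.

```python
import random

CHAMBER_LENGTH = 4.8   # metres (hypothetical)
NEUTRON_LAMBDA = 0.8   # neutron interaction length, metres (hypothetical)

def neutrino_vertices(n, rng):
    # Neutrino mean free path >> chamber: vertices uniform in x.
    return [rng.uniform(0.0, CHAMBER_LENGTH) for _ in range(n)]

def neutron_vertices(n, rng):
    # Neutrons entering at x = 0 interact after an exponential path length;
    # keep only interactions occurring inside the chamber.
    xs = (rng.expovariate(1.0 / NEUTRON_LAMBDA) for _ in range(n))
    return [x for x in xs if x < CHAMBER_LENGTH]

rng = random.Random(1)
nu = neutrino_vertices(10_000, rng)
nn = neutron_vertices(10_000, rng)
mean_nu = sum(nu) / len(nu)   # ~ L/2: flat distribution
mean_n = sum(nn) / len(nn)    # ~ lambda: clustered upstream
```

A flat measured distribution of muonless stars, compared against a no-free-parameter background shape of this kind, is what allowed the conclusion that the NC sample was not dominated by neutron stars.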

Fig. 10: S. van der Meer
Fig. 11: A. Salam with P. Musset (right)
Fig. 12: A. Salam receiving the Nobel Prize

4.3 Other important results from the bubble chambers

Besides Neutral Currents, the large bubble chambers brought other most interesting results. By comparing the structure functions of the nucleon derived from neutrino and muon scattering, Gargamelle proved that quarks have indeed a fractional electric charge (Fig. 13). Gargamelle also demonstrated that the neutrino–nucleon cross-section rises linearly with energy (Fig. 14).
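The linear rise is just what pointlike scattering off quasi-free constituents predicts: for a fixed nucleon target the squared centre-of-mass energy grows linearly with the beam energy, and a contact (Fermi-like) interaction gives a cross-section proportional to it (a dimensional-analysis sketch, not a formula from the text):

```latex
\sigma_{\rm tot}^{\nu N} \;\propto\; G_F^2\, s\,, \qquad s \simeq 2 M_N E_\nu
\quad\Longrightarrow\quad \sigma_{\rm tot}^{\nu N} \propto E_\nu\,.
```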

Finally, by combining their results, BEBC and Gargamelle got the first evidence for scaling violation (Fig. 15), namely the proof that the nucleon structure functions actually evolve with the resolution power of the observation.

An amusing fact is illustrated by Fig. 16. At that time, the value found for the Weinberg angle was in agreement with non-supersymmetric Grand Unification. This predicted a rather low GU scale and a proton lifetime of 10^30 years or so, and prompted programmes aimed at detecting proton decay, like Kamiokande. They did not find proton decay, because the value of the Weinberg angle was actually incorrect. But they later discovered the non-zero neutrino mass via ν oscillations!

Fig. 13: The charges of quarks are fractional
Fig. 14: The rise of the neutrino cross-section
Fig. 15: First evidence for scaling violation
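Why a slightly wrong Weinberg angle ruined the lifetime prediction: sin²θW fixes, through the running of the couplings, the unification scale MX at which the decay-mediating bosons live, and the lifetime scales as its fourth power (a standard dimensional estimate, not an equation from the text):

```latex
\tau_p \;\sim\; \frac{M_X^4}{\alpha_{\rm GU}^2\, m_p^5}\,,
```

so even a modest shift in the inferred MX moves the predicted τp by orders of magnitude.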

Fig. 16: Ad augusta per angusta: the status of the mixing angle in 1981

5 The ISR

The Intersecting Storage Rings (ISR), first operated in 1971, was the first hadron collider ever built. Following M. Jacob we distinguish:

– "A brilliant start": 1971–74. An important early discovery at the ISR was the rising total proton–proton cross-section. The measurements led us to conclude that, while the proton size increases with energy, its profile stays the same. Another major finding was the discovery of the large pT phenomena at the ISR in 1973 (Fig. 17): it demonstrated that partons were behaving as pointlike objects for the strong interaction as well as for the electromagnetic one. The study of Diffractive Dissociation told how energy hadronizes, independently of the nature and energy of the colliding particles.

– "A somewhat difficult period": 1975–77. The reason, obviously, is that several important discoveries (J/ψ, charm, beauty) within reach of the ISR were actually made in the US (see the synoptic table), in part because the ISR detectors were not adequate. Following several earlier indications from the measurements of single electrons and lepton pairs, the charm cross-section determination turned out to be troublesome; charm and beauty semileptonic decays were ignored in these data analyses.

– "A very active and interesting programme": 1978–83. Among other topics one can mention the studies of lepton pair production: J/ψ, Υ, dilepton masses up to 20 GeV, and the properties of the Drell–Yan process. Concerning the production of heavy quark bound states, like pp → χC, the first clear mass peaks of charmed hadron production in hadronic interactions were observed in 1979. The production of a beauty baryon at the ISR was announced but was for a long time a controversial matter. However, the results obtained at that time are still competitive.

The production of direct photons was studied, both in fixed-target experiments and at the ISR, from their discovery in 1979 to the end of the ISR (Fig. 18), although different x-regions were probed in the two sets of measurements. Gluon Compton scattering was found to be dominant, the sub-leading process being quark–antiquark annihilation. This fact allowed the determination of the gluon distribution inside the proton. These measurements played an important role in testing QCD. The study of the two-photon final states was also performed. A comparison of the yields with antiproton–proton collisions would have been interesting, but unfortunately the luminosity was insufficient.

Concerning the evidence for jets in hadronic collisions, it started with a rather confusing period. By 1982 the AFS and R110 experiments at the ISR had an appropriate acceptance to see jets. But the results of the proton–antiproton collider came at about the same time, and these 'stole the limelight'.

Whatever be the area of the physics programme, the machine worked magnificently: the ISR were a most efficient R&D laboratory for accelerator physics, key of later successes. In particular, it allowed the testing of the idea of stochastic cooling.

By the end of the ISR in 1983, the morale at CERN was low. M. Jacob, quoting Mark Antony in Shakespeare's Julius Caesar, said: "I come to bury Caesar not to praise him", while V. Weisskopf argued that it does not matter where discoveries are made. In one of the pictures, V. Weisskopf seems to exhort us to turn to superior instances. That is what we did in a quite oecumenical way.

Fig. 17: The high pt signal at the ISR
Fig. 18: Direct photons and dominant diagram

Fig. 19: L. Lederman discovering beauty
Fig. 20: An exhortation of V. Weisskopf
Fig. 21: Visit of the Pope
Fig. 22: Visit of the Dalai Lama

6 The proton–antiproton collider

Actually the real cure came from the success of the proton–antiproton collider. A look at the synoptic tables shows how fast were the decision-making and the realization of this programme. The collider was a great machine, largely due to the use of stochastic cooling. It fed two outstanding and quite innovative detectors: for the first time, concepts like hermeticity of a detector, redundancy of the procedures, etc., were fully taken into account. As we all know, "la physique était au rendez-vous". Besides the most celebrated discovery of the W and Z bosons, close to the predicted masses, the collider brought a clear picture of hard hadronic scattering [9]. The figures below show the UA1 detector and a sketch of its powerful tracker.

Fig. 23: The UA1 detector
Fig. 24: A sketch of the UA1 tracker
Fig. 25: First Z event in UA1
Fig. 26: Lego plots of a W and Z event

Fig. 27: From the discovery to the final UA2 results
Fig. 28: Evolution of the Z mass, from its prediction to LEP final results

Let me tell you more about hard hadronic scattering. The situation in hadronic scattering was very different from the one of e+e– scattering, in which a clear two-jet structure had been observed since 1975. In 1981 the fixed-target NA5 experiment, with full azimuthal calorimetric coverage, did not see any jet structure. Even after the first run of the collider the results were not decisive. However, in 1982, first UA2, then UA1, obtained clear, uncontroversial evidence for jets. UA2, for instance, had a full azimuthal and a 40° to 140° polar angle coverage.

D. – the determination of the proton structure functions. In 1983 the EMC effect (Fig. T REILLE Many results then followed. found to agree with NLO QCD. 29) came as a surprise: the structure functions of the nucleon depend on the nucleus in which it is immersed. 30). From its start in 1976 the SPS narrowband beam. is impressive as well. These two effects are far from being explained 52 . Doble) first fed the NA2 (EMC) and NA4 (BCDMS) experiments. of αS and of the nucleon’s gluon content. of heavy flavour production (beauty). Scaling violations. up to 200 GeV. contrary to expectations. in order to guarantee a proper search programme at the LHC. Clifft. Afterwards. Much progress has been obtained since then. in addition to the Bubble Chambers BEBC and Gargamelle (ending in 1978). first seen in BC neutrino data. The programme is currently continuing with COMPASS. The collider thus proved the physical reality of partons inside the proton and opened the door to quantitative tests of QCD. N. In conjunction with electron and muon DIS. EMC became New (or Nuclear) MC then SpinMC. 7. showing the role of gluons at small x. among them: – the determination of the angular distribution of parton–parton scattering. and later CHORUS and NOMAD. deduced from the study of spin-dependent structure functions. were studied in great detail using the Altarelli– Parisi formalism (1977). – the measurement of the total pt of two-jet systems. the giant spectrometers. in particular at the ECFA Tirrenia Meetings. as we saw. all behaving as expected. and finally moved to the search for oscillations. which took data until 1985. as we know now. Since parity violation allows neutrinos to distinguish quarks from antiquarks. 102 times more intense but peaked at low energy. agreeing also with QCD. The most efficient and clean SPS muon beam (R. – the studies of multijet final states. The total quark spin. CDHS and CHARM. unfortunately in a domain of parameters in which. 
the fractional electric charge of u and d quarks (F2eN = 5/18 F2νN). What remains to be done. 7. of the W transverse momentum.1 Neutrinos This programme has been one of the flagships of CERN activities.2 Muons This programme has been active from 1978 to now. in particular by the Tevatron proton–antiproton and HERA electron–proton colliders. Various tests of perturbative QCD and a precise measurement of the strong coupling constant were performed. and wideband beam. Opposite-sign dimuons were exploited to get the structure function of the sea of strange quarks in the nucleon. 7 The SPS Programme The programme was carefully prepared in 1970–74 by a prospective activity on the SPS beam lines and on its physics programme. is only a small fraction of the proton spin. Later the neutrino programme included a beam-dump experiment. their scattering provided the structure functions of the ‘sea’ quarks. concerning QCD. they could not manifest themselves. – the production of direct photons. fed. they confirmed. These experiments performed more and more refined measurements of scaling violation. In 1988 another surprise was the proton spin crisis (Fig.

COMPASS, HERMES, HERA polarized, RHIC p–p polarized, and other programmes will continue to shed light.

Fig. 29: The EMC effect
Fig. 30: The spin crisis

7.3 Hadron and photon beams
This very rich programme includes about 70 experiments extending over 25 years. The main themes of physics can be classified as follows:
– Hard scattering
– Heavy flavours
– Hadron spectroscopy
– Soft processes.
Hadron spectroscopy covers in particular the spectroscopy of light quarks, glueball searches, baryonium searches, etc. Soft processes include studies of particle production, elastic scattering, low-mass μ+μ− pairs, soft photons, etc. Forced to be selective, I have chosen to illustrate the first two items.

7.4 Hard scattering
The main activities were the study of high-mass lepton pairs or the Drell–Yan mechanism. A highlight was the discovery of the K factor in NA3 (Fig. 31) and WA11 in 1982, namely direct evidence for higher-order QCD corrections. Being essentially a multiplicative factor of the cross-section, constant with Bjorken-x, this K factor did not preclude the study of structure functions, in particular those of unstable hadrons like pions and kaons. Among other experiments, one can quote the Omega Beam Dump in 1978, measuring J/ψ and Drell–Yan dimuon production by a variety of projectiles, and which provided the very first results obtained from the SPS (Fig. 32).

Fig. 31: Evidence for the K factor from NA3
Fig. 32: The Omega beam dump, set-up (a) and results (b)

Another active area was the study of prompt single photons and di-photons produced in hadronic reactions. The comparison of experimental data with the existing theoretical QCD estimates, computed at the relevant order of perturbation, contributed strongly to validating the theory.

Photoproduction, obviously a more difficult physics topic requiring higher energies, was a very active sector as well (about 20 CERN experiments), bringing some first-hand information on the hard processes. NA14 provided the first high-energy measurement of the Compton effects on quarks in 1985–86: the QED Compton effect, i.e., elastic scattering of the gamma on a quark (Fig. 33), and the QCD Compton effect, a process in which the incident photon is changed into a gluon. It contributed mostly to charm physics, thanks to its CCD detectors, and, besides charm studies (see later), provided also some results on beauty.

Fig. 33: Evidence for QED Compton from NA14

7.5 Heavy flavours
I shall insist a bit more on heavy flavour physics, not only because of its impact on QCD testing, but also because it was a most innovative sector in terms of high-spatial-resolution silicon detectors, like microstrips, CCDs and active targets. It used also 'old' techniques, like emulsions coupled with large spectrometers, and rapid cycling bubble chambers. All types of beams were exploited: hadrons, photons (which offer a fraction of charm ten times higher than hadrons), and hyperons. Figure 34 illustrates a few remarkable results: the associated production of charm in emulsions (WA58), the very pure signal of Λc → pKπ from NA32, the signal of beauty in the microstrips of WA92, and the signal of the Ω0 from WA89, the shortest lifetime ever measured that way (5 × 10⁻¹⁴ s). Figure 35 recalls two beautiful experiments performed at CERN.

Fig. 34: A few results from heavy-flavour physics (see text)
Fig. 35: Two remarkable experiments: the π0 lifetime (CERN SPS, 1985) and the pixel set-up of NA57

8 Accurate tests of gauge theories
8.1 g-ology
The g – 2 programmes have extended over 40 years, experimentally and theoretically. Indeed some of the initiators are still currently involved. As John Adams once said: "g – 2 is not an experiment: it is a way of life". From my point of view it represents one of the most beautiful achievements in particle physics. The g factor is the constant that determines how much magnetic moment μ results from a given amount of charge, mass and spin s: μ = g (e/2m) s. For the electron or muon the Dirac equation gives g = 2. Actually a slight difference arises because of radiative phenomena (Fig. 36), and the g – 2 quantity turns out to be an amazing testing ground of theory, since all interactions contribute to it. The present measurements of g are
– electron: 2.0023193043718 ± 0.0000000000075, i.e., a few parts per trillion;
– muon: 2.0023318416 ± 0.0000000012, 'only' a few parts per billion, but with the potentially interesting effects boosted by the mass squared factor (40 000).
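The first-order (Schwinger) QED value and the anomalies a = (g − 2)/2 implied by the g values above can be cross-checked in a few lines. This is only a sketch; the fine-structure constant used below is a standard value, not taken from the lectures.

```python
# Back-of-the-envelope check of the first-order (Schwinger) numbers:
# at one loop, a = (g - 2)/2 = alpha / (2 * pi).
ALPHA = 1 / 137.035999  # fine-structure constant (standard value, an assumption here)
PI = 3.141592653589793

a_one_loop = ALPHA / (2 * PI)
g_one_loop = 2 * (1 + a_one_loop)

print(f"alpha/2pi  = {a_one_loop:.5f}")   # ~0.00116
print(f"g (1 loop) = {g_one_loop:.5f}")   # ~2.00232

# Anomalies implied by the measured g values quoted in the text
a_e  = (2.0023193043718 - 2) / 2
a_mu = (2.0023318416   - 2) / 2
print(f"a_e  = {a_e:.10f}")
print(f"a_mu = {a_mu:.10f}")
```

The one-loop value already agrees with both measured anomalies at the per-mille level; the remaining difference is what the multi-loop programme described below accounts for.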

The confrontation between experiment and theory has been a "tennis game with well-matched players on either side of the net", actually a 40-year-long experimental programme matching a 40-year-long programme of more and more refined calculations. At first order, α/2π ~ 0.00116, giving g = 2.00232. There are 7 two-loop diagrams, 72 three-loop diagrams, computed in 1995 by T. Kinoshita at Cornell at better than one part per million, 891 four-loop diagrams, and 12672 five-loop ones, for which numerical computations have been under way since the early 1980s. The role of hadronic light-by-light scattering is important and actually leads to the present residual ambiguity in the interpretation.

The value of g − 2 is obtained by measuring the beat between the rotation of muons around a ring and the rotation of their spin. The movement of the spin is described by the Bargmann–Michel–Telegdi equation. In 1958 a first experiment was done at the SC; the agreement with theory was excellent. In 1963 came the idea of a new experiment at the PS with 1.2 GeV muons involving F. Farley, E. Picasso, S. van der Meer, F. Krienen et al. The signal was observed for ~130 microseconds. A disagreement of 1.7σ between experiment and theory led to a correction of the latter. A better experiment started in 1969, using the so-called magic γ-factor at 3.1 GeV, the energy at which the electric field does not affect the g−2 precession, so that one can use electric quadrupoles with a uniform magnetic field. The signal was observed for 534 microseconds. In his tribute to E. Picasso, F. Farley [10] underlined the excellent atmosphere in the group: "no wars, no clashes". The actors are shown below. Figures 37 to 43 illustrate various aspects of the g – 2 programme.

Fig. 36: (a) The one-loop contribution to g – 2; (b) a key higher-order contribution
Fig. 37: The evolution of the uncertainty on aμ (in 10⁻⁹) in the CERN experiments
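The 'magic' γ quoted above can be recovered numerically: the electric-field term in the spin precession carries a coefficient a − 1/(γ² − 1), which vanishes at γ = √(1 + 1/a). The sketch below assumes standard values of the muon anomaly, mass and lifetime (they are not given in the text).

```python
import math

# The 'magic' gamma at which the electric-field term drops out of the
# muon spin precession: coefficient a - 1/(gamma^2 - 1) vanishes.
# Standard muon parameters (assumed values, not from the lectures):
A_MU   = 0.00116592   # muon anomaly a = (g - 2)/2
M_MU   = 0.1056584    # muon mass in GeV
TAU_MU = 2.19698e-6   # muon lifetime at rest, in seconds

gamma_magic = math.sqrt(1 + 1 / A_MU)
energy      = gamma_magic * M_MU        # muon energy in the ring, GeV
lifetime    = gamma_magic * TAU_MU      # time-dilated lifetime

print(f"magic gamma      = {gamma_magic:.1f}")   # ~29.3
print(f"muon energy      = {energy:.2f} GeV")    # ~3.1 GeV, as in the text
print(f"dilated lifetime = {lifetime*1e6:.0f} microseconds")
```

The dilated lifetime of roughly 64 microseconds also makes the quoted 534-microsecond observation window plausible: it corresponds to following the muons for several lifetimes.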

Fig. 38: The signal in the Brookhaven experiment
Fig. 39: The present status of the confrontation between experiment and theory
Fig. 40: The team of the first g – 2 experiment
Fig. 41: The first ring

Fig. 42: The second ring
Fig. 43: The Brookhaven ring

9 LEP and ADLO
LEP is a well celebrated success story. The machine, planned very accurately 15 years in advance, worked magnificently, surpassing all expectations (see Table 2). The superconducting radiofrequency cavities, on account of a careful work of conditioning and balancing, reached an accelerating field substantially higher than expected. The vacuum, thanks to the large-scale use of getter pumping, was outstanding, and this contributed to very favourable experimental conditions. LEP was exploited by four high-quality multipurpose detectors, ALEPH, DELPHI, L3 and OPAL; ADLO is the name of the four experiments, which combined their results. The programme led to clean and subtle physics results: in particular it demonstrated the validity of the SM at the loop level. Through their effect as virtual objects, it gave results on particles too heavy to be 'really' produced, the top quark and the Higgs boson.

Fig. 44: The LEP scenery

It showed that the coupling 'constants' of the three forces, electromagnetic, weak and strong, are actually running and converge exactly at a very high scale (~10¹⁶ GeV) in the supersymmetric version of the SM. In all respects LEP experiments did better, sometimes much better, than expected, as shown by Table 3 below. Among the instrumental breakthroughs which contributed to these successes, let us quote:
– the use of high-performance luminometers and the most accurate theoretical knowledge of the relevant Bhabha cross-section (Figs. 50, 51), a key ingredient of neutrino counting (Fig. 52);
– the exploitation for beauty tagging of high-quality silicon microvertex detectors, allowing excellent purity for a still reasonable efficiency (Fig. 53).
We briefly quote below the main LEP messages.

We recall that, besides the main programme, two additional variants of LEP were studied. Even if they were not realized, these studies had interesting consequences. The first was an attempt to exploit the longitudinal polarization of e± at LEP. It turned out to be complicated and, since it would have delayed the LEP energy upgrade, it was not implemented. It led to the invention of a clever scheme, called the Blondel scheme, which may play an important role in the future at a Z factory. Transverse polarization was, however, exploited, and was key to the precise measurement of the Z mass. The second variant aimed at a much increased luminosity by devising multibunch schemes (bunch trains later replacing the equidistant bunches of the Pretzel scheme); actually up to 16 bunches were used. LEP can be considered as a first approximation of a Z and therefore of a heavy-flavour factory.

Table 2: LEP performance: foreseen and achieved

                               Foreseen (55/95 GeV)   Achieved (46/98 GeV)
Current per bunch              0.75 mA                1.00 mA
Total current                  6 mA                   8.4 mA/6.2 mA
Beam–beam vertical parameter   0.03                   0.045/0.083
Ratio of emittances            4.0%                   0.4%
Maximal luminosity (10³⁰)      16/27                  34/100
βx*                            1.75 m                 1.25 m
βy*                            7 cm                   4 cm

Fig. 45: The RF voltage around LEP
Fig. 46: The vacuum chamber and its getter pumping

Fig. 47: The LEP inauguration
Fig. 48: The ALEPH experiment
Fig. 49: Some important actors of LEP: H. Schopper and E. Picasso (left); A. Hofmann and S. Myers with M. Bourquin, Rector of the University of Geneva and President of CERN Council (June 2001) (centre and right)

Table 3: Expected and final accuracies of various measurements at LEP

Quantity         Expected                      Final
M_Z (MeV)        ±50                           ±2.1
Γ_Z (MeV)        ±10–15 (stat), ±17 (syst)     ±2.3
Normalization    3%                            < 1‰
M_W (MeV)        50–60 (stat), > 100 (syst)    ADLO: 42
AFB(μμ)                                        Twice better
τ polarization                                 2.5 times better
Rb                                             3 to 6 times better
AFB(b)                                         3 times better

Table 4: The absolute normalization at LEP (%)

          1992    1995
ALEPH     0.41    0.068
DELPHI    0.50    0.080
L3        0.38    0.090
OPAL      0.15    0.034

Fig. 50: Evolution of the luminosity theoretical error at LEP1
Fig. 51: The ALEPH luminometer
Fig. 52: Final result on neutrino counting
Fig. 53: Some aspects of LEP microvertex detectors, a comparison with SLD CCD, and performances

Figures 54 to 56 illustrate well-known results from LEP. An important but trivial quantum effect is the running of the fine structure constant, Δα, from low energy to the Z mass. Beyond it, do we feel other effects due in particular to the top and Higgs? The answer is a clear yes [11]. The effects are summarized in the Δρ and Δr factors, whose numerical values give indirect information on these particles, as shown below.
– From the Z leptonic width: one finds gAl = –0.50123 ± 0.00026, 4.7σ away from the canonical value of –1/2. One deduces Δρ = 0.005 ± 0.002.
– From the W mass and its relation with Gμ: one finds Δr = 0.033 ± 0.002, or ΔrW = Δr – Δα = –0.026 ± 0.002, 13σ away from 0!
– From the Z branching ratio into beauty, Rb: the predicted value, neglecting radiative corrections, is Rb0 = 0.2183; the measured one is Rexp = 0.21644 ± 0.00065. From the 2.8σ discrepancy one derives Mt = 155 ± 20 GeV.
In addition, Figure 57 shows that, within the strict SM frame, the Higgs boson is felt to be light, say < 250 GeV.

Fig. 54: The Z lineshape and the progress brought by LEP for Nν, MZ and sin2θW
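The significances quoted in this discussion follow from simple arithmetic on the numbers given in the text; a minimal check:

```python
# Arithmetic behind two of the 'pulls' quoted in the text:
# the Z leptonic axial coupling gAl and the Z -> bb ratio Rb.
g_al, g_al_err = -0.50123, 0.00026   # measured value
g_al_sm        = -0.5                # tree-level SM value

rb_pred         = 0.2183             # Rb neglecting radiative corrections
rb_meas, rb_err = 0.21644, 0.00065   # measured value

pull_gal = abs(g_al - g_al_sm) / g_al_err
pull_rb  = abs(rb_pred - rb_meas) / rb_err

print(f"gAl pull: {pull_gal:.1f} sigma")  # ~4.7, as quoted
print(f"Rb  pull: {pull_rb:.2f} sigma")   # ~2.86, the '2.8 sigma' of the text
```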

Fig. 55: The W mass measurement, direct and indirect
Fig. 56: The top mass measurements, indirect and direct
Fig. 57: The indications on the Higgs mass, valid in the frame of the Standard Model

Figure 58 shows the running of the coupling constants of the three forces, due also to the role, as virtual objects, of the particles existing in the model considered. As I have said already, it leads to a perfect convergence at about 10¹⁶ GeV in the supersymmetric version of the SM with light superpartners.

Fig. 58: Evidence for the convergence of couplings, exact in the SUSY frame

Figures 59 to 62 illustrate a less bright topic: the search for the SM-like Higgs boson at LEP200. A lower limit was set at 115 GeV. Eighty more superconducting RF cavities at LEP would have allowed us to answer, yes or no, about the existence of the light boson (< 130 GeV) predicted by minimal SUSY. The future, at the LHC or the Tevatron, will tell whether a weak signal at that mass was real or not.

Fig. 59: Higgs search at LEP200: diagram, cross-sections, the final mass spectrum of ADLO

Fig. 60: The potential impact that a modest energy rise of LEP200 with 80 more superconducting RF cavities would have had
Fig. 61: The potential of the Tevatron concerning the Higgs search; left: the foreseen integrated luminosity; right: Higgs sensitivity study
Fig. 62: Peter Higgs: 'l'art d'être grand-père'

10 CERN's contribution to the validation of QCD
Let us recall the main steps that led to the establishment of the validity of QCD. On the experimental side the important dates (all three having led to Nobel prizes) are
– 1956: the discovery of proton structure at Stanford;
– 1969: the first evidence of high-energy, deep-inelastic e–p scattering (DIS) in Stanford;
– 1974: the 'November revolution', namely the discovery of the J/ψ at SLAC and Brookhaven.
On the theoretical side let us quote:
– 1966: Greenberg invents the colour quantum number;
– 1969: interpretation of DIS: Bjorken scaling, Feynman's parton model; Bjorken proposes to study deep inelastic Compton scattering;
– 1972: Gribov–Lipatov predict scaling violation;
– 1973: discovery of asymptotic freedom (Fig. 63 shows the actors); formulation of QCD; quark counting rules for hard hadron scattering.

QCD is a very remarkable theory. One can consider for instance the 'QCD lite' version, obtained by setting mu and md to zero and considering only the first generation of constituents. In that case a single parameter is left: αS(MZ) or ΛQCD. It is not a big change to our vision of things. Indeed the nucleon mass, and thus the mass of the baryonic universe, is due not to the mass of the constituents (mgluon = 0, mup, mdown = a few MeV) but to their dynamics. See "Mass without mass", by F. Wilczek [12].

A mystery left in QCD concerns the zero or very small value of a potential CP-violating term, θQCD. The axion hypothesis was formulated as a possible explanation. CERN is involved with the question through the CAST experiment, which looks for such a particle emitted from the Sun. About the axion problem, I recommend a delicious paper by P. Sikivie, "The pooltable analogy of the axion" [14]. For the consequences, see F. Wilczek again [13].

As we saw already, here too all CERN programmes, from bubble chamber to LEP, have brought important contributions to the build-up and accurate checking of QCD. We can summarize the situation by Figs. 64 and 65. Figure 64 shows the very coherent set of determinations of the strong coupling constant (given at the MZ scale): the impact of CERN programmes in this respect is obvious. Figure 65 demonstrates that this coupling constant is indeed running.

Fig. 63: D. Gross, F. Wilczek and D. Politzer (Nobel Laureates 2004)
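The running and (non-)convergence of the couplings discussed here and around Fig. 58 can be sketched with one-loop renormalization-group evolution of the inverse couplings. The starting values and beta-function coefficients below are rough textbook numbers (thresholds and higher loops ignored), not taken from the lectures.

```python
import math

# One-loop running of the three (GUT-normalized) inverse couplings:
# 1/alpha_i(mu) = 1/alpha_i(MZ) - b_i/(2*pi) * ln(mu/MZ).
# Rough textbook inputs (assumed values, not from the text):
INV_ALPHA_MZ = (59.0, 29.6, 8.4)   # U(1)_Y (GUT norm.), SU(2), SU(3) at MZ
B_SM   = (41/10, -19/6, -7)        # Standard Model coefficients
B_MSSM = (33/5, 1, -3)             # MSSM coefficients

def run(inv_alpha_mz, b, mu_over_mz):
    t = math.log(mu_over_mz)
    return [a0 - bi / (2 * math.pi) * t for a0, bi in zip(inv_alpha_mz, b)]

MU = 1e16 / 91.2                   # run from MZ (~91.2 GeV) up to 10^16 GeV
sm   = run(INV_ALPHA_MZ, B_SM, MU)
mssm = run(INV_ALPHA_MZ, B_MSSM, MU)

def spread(values):
    return max(values) - min(values)

print("SM   at 1e16 GeV:", [round(x, 1) for x in sm],   "spread", round(spread(sm), 1))
print("MSSM at 1e16 GeV:", [round(x, 1) for x in mssm], "spread", round(spread(mssm), 1))
```

Even in this crude sketch the three MSSM couplings nearly meet at ~10¹⁶ GeV, while the SM ones miss each other by a wide margin, which is the qualitative content of Fig. 58.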

Fig. 64: The set of αS(MZ) measurements
Fig. 65: The running of αS

Another domain concerning QCD is heavy-ion (HI) collisions, a very active sector of CERN physics at the SPS. The idea is that these collisions may provide conditions such that nuclear matter undergoes a phase transition, in which, crudely speaking, individual nucleons 'melt' into a quark–gluon plasma. This is the reverse of what happened a few microseconds after the Big Bang (Fig. 66), when quarks got confined into nucleons; under ordinary conditions it is inconceivable to get quarks free again. On the other hand, the CERN experiments have provided striking signals that could result from such a phase transition, like the 'melting' of the J/ψ in HI collisions as shown by Fig. 67. The HI collider RHIC, in Brookhaven, and the future experiment ALICE at the LHC colliding lead ions will continue these studies.

Fig. 66: Location of LEP and LHC physics in the diagram describing the evolution of the Universe
Fig. 67: The 'melting' of the J/ψ as seen by NA50



11 CP violation and the origin of matter





Fig. 68: Parity violation (above left); CP conservation (above right); CP violation in the K0 system (below)

In our present Universe the number of baryons divided by the number of photons is 6 × 10⁻¹⁰, a very small but definitely (and fortunately for our existence) non-zero number. A symmetry principle operates for electric charge, guaranteeing an exact neutrality of the Universe, but here an asymmetry seems to exist. In 1954, the CPT theorem (Lüders) was demonstrated: in the theories we consider currently, the triple mirror CPT must be a good symmetry. In 1956, the discovery of parity (P) violation [Fig. 68(a)] in the weak interaction was a major revolution. One then trusted the validity of the CP symmetry [Fig. 68(b)], product of P and charge conjugation C. However, in 1964 CP violation was discovered through the observation of the 'forbidden' mode K0L → 2π (Fitch, Cronin, Christenson and Turlay). We show in Fig. 68(c) evidence of CP violation. In 1967, Sakharov gave his famous three conditions needed to guarantee the present matter–antimatter asymmetry; CP violation is indeed one of them. But in the SM the level of CP violation is insufficient to fulfil the goal. SUSY could be an answer. It is impossible to do justice to all CERN experiments which dealt with CP.



Briefly: in 1966 an η decay experiment at CERN showed that there is no evidence for C violation in the electromagnetic decay η → 3π, as had been suggested. The year 1970 saw the measurements of the relative phase of the K0L → π0π0 and K0S → π0π0 amplitudes. It was shown that CP violation was accompanied by a violation of T to an amount compatible with CPT invariance, whereas T invariance with CPT violation was ruled out. In 1974 a new determination of the K0 → π+π– decay parameters was performed. By that time, there was full agreement with the Superweak Model (Wolfenstein, 1964). Its mixing parameter ε is equal to the ratio of CP-violating to CP-conserving decay amplitudes of KL and KS. CP violation was seen as a small effect showing in K0 decay because of a small admixture of K01 (the CP = 1 eigenstate, allowed to decay into 2π) in the K0L: K0L = K02 + εK01. But Kobayashi and Maskawa realized that, with three generations of elementary particles, CP violation can be incorporated in a natural way in the SM. Then it becomes possible that the CP-odd eigenstate K2 can decay directly into 2π; the corresponding parameter ε′ is generally not zero. Let us resume the history at that stage.


12 Direct CP violation: NA31, NA48

12.1 NA31
The proposal was submitted at the end of 1981; the data were taken in 1986–89, and the first results were available in the summer of 1987. This was an elegant detector, with the KS beam mounted on a train. 4 × 10⁵ KL → 2π0 decays were recorded. It measured the double ratio R of the KS and KL decay rates to 2π0 and π+π–. The result was R = 0.980 ± 0.004 ± 0.005, i.e., a non-zero Re ε′/ε ratio at 3σ. However, this result was not confirmed by the Fermilab experiment E731. The Superweak Model was only 'wounded'.

12.2 NA48

Approved in 1991, it took data from 1997 to 2001. Remarkable features of NA48 were the krypton calorimeter and the magnet spectrometer. The experiment exploited cancellations of systematics occurring in the double ratio. The statistics was increased by an order of magnitude. New results also came from KTeV. Direct CP violation was finally established beyond doubt at the 9σ level (Fig. 69), with a CL for consistency of 10% between all results. This implies an asymmetry in the decay rates of neutral kaons and their antiparticles into π+π– (2π0) of 5.5 × 10⁻⁶ (11 × 10⁻⁶), respectively.
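Both NA31 and NA48 extract Re(ε′/ε) from the double ratio R via the standard first-order relation R ≈ 1 − 6 Re(ε′/ε). With NA31's numbers quoted above, this relation reproduces the 3σ effect (a sketch using only values given in the text):

```python
import math

# Double ratio of KL and KS decay rates to 2pi0 and pi+pi-:
# R ~ 1 - 6 * Re(eps'/eps) at first order.
R, stat, syst = 0.980, 0.004, 0.005   # NA31 result quoted in the text

re_epsp_eps = (1 - R) / 6
err = math.hypot(stat, syst) / 6      # statistical and systematic in quadrature

print(f"Re(eps'/eps) = {re_epsp_eps:.4f} +- {err:.4f}")
print(f"significance ~ {re_epsp_eps/err:.1f} sigma")   # ~3 sigma, as in the text
```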

Fig. 69: The evolution of the ε′/ε measurement with time



12.3 CPLear

In this experiment particles and antiparticles are concurrently produced from antiproton–proton annihilations. Thanks to the very good performance of the detector and the excellent quality of the beam, a wide range of studies was made which provided high-precision tests of T violation, CP violation, CPT invariance, and possible extensions of Quantum Mechanics (QM). One labels the K0 by its strangeness, obtained from the accompanying charged K, and observes its subsequent evolution in time under the weak interaction, which does not conserve strangeness. So one follows the oscillations of strangeness plus the decay process. CPLear has provided the first and only direct observation of non-invariance under time reversal: this requires one to know the strangeness of the neutral K at two moments of its life, production and decay. The semileptonic mode eπν was used for the latter. The violation was established at the 5σ level. A direct measurement of the CPT parameter was performed. It showed, 50 times more accurately than previously, that CPT was conserved. The K0–anti-K0 mass and decay width differences are found to be compatible with 0, as they should be according to CPT invariance, at a sensitivity of a few 10⁻¹⁸ GeV. Considering the tests of QM, CPLear has measured special asymmetries whose values are compatible with the non-separability of the wave function of the K0–anti-K0 system, as predicted by QM, and rule out the separability hypothesis.
12.4 CP violation in B physics

It is usually admitted that this was discovered at the B factories. However, it is clear that LEP, and other programmes, also had their say ('leur mot à dire') on this topic. Showing that CP is violated amounts to showing that a triangle, the Unitarity Triangle, built from quantities measured in beauty and heavy-flavour physics, is not flat. As one can see from Fig. 70(a), the LEP era brought clear evidence that it is so, by providing an accurate determination of the tip of the triangle. Then B factories measured its angle β [Fig. 70(b)] directly. The fair agreement between the two methods is another great success of the SM. However, the confrontation is continuing at an increased level of sensitivity.



Fig. 70: (a) The indirect determination of the tip of the Unitarity Triangle during the LEP era; (b) the measurement of the angle β of the Unitarity Triangle at B factories


One can thus draw, a bit like in the case of the top mass, Fig. 71, showing both the indirect determinations of the angle β, through the knowledge of the tip of the UT, and the direct measurements.

Fig. 71: Indirect (points) and direct (stars) determinations of the angle β of the Unitarity Triangle

12.5 CPT and antihydrogen
As we said before, the CPT theorem states in particular that particle and antiparticle have exactly the same mass and total width. The validity of CPT symmetry has been tested at the 10⁻¹² level for leptons, 10⁻¹⁰ for baryons, etc. However, one may still question the validity of the assumptions which led to this theorem, for instance in the frame of superstrings. Since the difference of energies ΔE/E between the 1S–2S levels in hydrogen is measured at ~10⁻¹⁴, there is potentially room for improvement through spectroscopic methods. Another reason to be interested in antiatoms is that there is no direct experimental proof of the equality of gravitational effects on matter and antimatter.

In 1993 came the idea that antiprotons turning in the LEAR ring could create e+e– pairs in the Coulomb field of a gas-jet target and eventually capture the e+. Indeed in 1995 the PS210 experiment at LEAR observed nine antihydrogen atoms moving at a speed of 0.9c; this created a great media stir. In 1996, TRAP at LEAR was able to trap both e+ and antiprotons, but this occurred just before the machine was stopped. One had therefore to wait for the Antiproton Decelerator (AD) to pursue the programme. In 2002, the experiments ATHENA and ATRAP at AD observed signals from ~50 000 and 17 000 antihydrogen atoms, respectively. However, they are still rather fast movers, while they should be below 0.06 meV to be trapped, in order to envisage spectroscopic studies. The R&D is ongoing.

12.6 The study of rare decays
Among the roads that deserve to be followed in the future, I think that the study of rare decays, not only of beauty, but also of charm, strange particles, and the muon, is a most promising one. Among the future plans for fixed-target experiments at CERN, the study of rare modes of kaons is certainly a highlight. The NA48 experiment at CERN is paving the way, for instance by measuring, as a preliminary step, the KS decay mode shown by the right-hand side of Fig. 72.
One may recall that a Unitarity Triangle can be built from kaon physics only provided one has access to very rare modes (Fig. the CPT theorem states in particular that particle and antiparticle have exactly the same mass and total width. The validity of CPT symmetry has been tested at the 10–-12 level for leptons. T REILLE One can thus draw. Another reason to be interested in antiatoms is that there is no direct experimental proof of the equality of gravitational effects on matter and antimatter. strange particles.

Fig. 72: The Unitarity Triangle from kaon physics and a beautiful measurement by NA48

During the last few years one might have had a doubt about the unitarity of the first row of the CKM matrix, as shown by Fig. 73. However, new measurements, at CERN (NA48) and elsewhere, of K and hyperon decays led to a slight revision of the numerical value of the second element Vus, which is simply the famous Cabibbo angle. The situation is now much better and this potentially embarrassing (and exciting) anomaly is on the way to disappearing.

Fig. 73: A problem with the unitarity of the CKM matrix? (Courtesy Rainer Wanke)

13 The future
It is clear that worldwide planning and distribution of large programmes is mandatory. Currently Europe has its share, the LHC, the flagship of collider physics for the next decades. The first goal of particle physics is to go up in energy and observe (directly!) the particles and effects we think we 'feel' (a light Higgs?) or suspect (SUSY?). The LHC with its huge mass reach is the right machine to do so. The LHC programme has to be a success. We all know that it will be very demanding. More than 80% of CERN resources are being devoted to it. Moreover, the physics results from the LHC are mandatory to guide the next steps.

What else should one foresee at CERN? One should guarantee a minimal, well-targeted programme of non-LHC physics: fixed-target (I alluded above to rare decays), low-energy antiprotons, etc. One should ensure a boost of the CLIC R&D programme to get answers about its feasibility at the time when LHC results will indicate the road to be followed. This is the goal of the CTF3 facility.
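The first-row unitarity test behind the CKM discussion is simple to reproduce: |Vud|² + |Vus|² + |Vub|² should equal 1. The values below are illustrative PDG-era numbers (assumptions, not taken from the text), chosen to show how a revised Vus moves the sum back towards unity:

```python
# First-row CKM unitarity check: |Vud|^2 + |Vus|^2 + |Vub|^2 = 1.
# Illustrative PDG-era values (assumed, not from the lectures):
V_UD = 0.9740
V_UB = 0.0037
V_US_OLD = 0.2196   # pre-revision Cabibbo element
V_US_NEW = 0.2257   # after the revised K and hyperon decay measurements

def first_row(v_us):
    return V_UD**2 + v_us**2 + V_UB**2

print(f"old Vus: sum = {first_row(V_US_OLD):.4f}")  # noticeably below 1
print(f"new Vus: sum = {first_row(V_US_NEW):.4f}")  # compatible with 1
```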

One should guarantee, from ISOLDE to the LHC and an eventual neutrino programme, the possibility to have more protons available at CERN. One should have some limited level of involvement in astroparticle physics and astrophysics. Since we are confronted with cosmological problems, CERN cannot (and does not) ignore the news 'from the sky'. However, neither can CERN invest massively in such programmes; the scheme of 'recognized experiments' is a first step. One clearly needs a European and even a worldwide co-ordination of astroparticle physics: CERN, through its Council, may have a key role to play in this respect. There exists a clear synergy between particle physics and astroparticle physics/cosmology; we should naturally develop and exploit it. Which policy should one adopt to confront their findings in due course?

13.1 A few personal remarks concerning the LHC
The initial LHC energy is not a big concern. If that helps the machine commissioning one can compromise on it and run slightly below the nominal one. However, it is likely that the rise in luminosity will be rather slow, by proceeding in carefully controlled steps, to minimize the risks taken for the machine components and the experiments.

Concerning its physics, the LHC will have only two (big) experiments. An army of physicists will construct a large number of distributions extending over a vast energy domain. This may lead to many seemingly anomalous features and statistical fluctuations. The former implies a huge effort in the field of QCD, the latter much work of calibration by exploiting simple well-known physics channels, for which the SM background is well mastered and the experimental acceptance is good and well known.

Since, with few exceptions (Higgs, missing energy), one does not really know what could be the signals to look for, we should favour a topological approach of the search programme: namely explore ALL final-state topologies that are accessible. One should diversify the phenomenologies under study, and in particular escape the tyranny of the MSSM. Right or wrong, large extra dimensions or Little Higgs models are most welcome because they suggest new topologies to be studied! Besides 'narrow-bump hunting' one should consider cases where a more elaborate measurement is needed and focus on problems such as normalization, control of systematics in the determination of distributions, and methods of data treatment and analysis.

13.2 Miscellaneous and personal comments
There are quite obvious spin-offs from our discipline: detectors (crystals, pixels, etc.), accelerators, magnets, the Web, the GRID, etc., which can benefit several areas, from which other fields, like medicine, also profit. It is important to develop and illustrate them. However, we must first 'sell' our science for its scientific content. From my personal experience, I think it is possible to convey its messages, even to the general public. This needs work and practice. A first condition is to show our own real enthusiasm about our research: "Enjoy, and share your enjoyment." Listeners whom we have a chance to interest prefer to be 'pulled upwards' ('tirés vers le haut') rather than treated as halfwits. In particular, we should not fear 'philosophical' questions and should be prepared to react intelligently and honestly to them. The impact of a well-explained visit of CERN can be decisive.

Some ethical rules have to be respected and promoted: we should avoid precipitation, in particular rushing from one project to the next, and we must exploit fully our machines and programmes. We have to maintain and improve our peer-review procedures, check the quality of our publications (in particular on the Web), and find ways to ensure the visibility of talented young individuals inside huge collaborations.

Phys. Today 49 No. ultracold neutrons. We must accept and contribute to a worldwide planning and sharing of activities. Krige (1987). Eur. Int. U. CERN-OPEN-2002-006 (1992). Mod. Rep. Phys. CERN CHS-36. FROM PAST TO FUTURE : E XPERIMENT Given the size and complexity of future programmes. Hübner. Mass without mass. 5 (2003) Suppl. As we did in the past. J.CTP. Phys. condensed-matter physics. is more enterprising than ever. C34 (2004) 25–31. CERN Courier 43 No. Phys. Proc. A20 (2005) 5174–83. [10] F. D. [13] F. 225 (1993) 45. Pauss for their careful reading of the manuscript and their comments.cern. [6] D. [5] L. we must ensure the continuation of a strong and co-ordinated R&D in accelerators and detectors. Pape. di Lella. G minus 2 plus Emilio. It is obviously a big chance for CERN to have the LHC starting in a couple of years. methods) like atomic physics. Aknowledgements I would like to thank warmly R. Weiss (1988). Rep. “Par la force des choses” it will happen for the large accelerators and colliders. [14] P. We should try to get closer to other physics fields with which we can have fruitful reciprocal relations (ideas. Let us hope that it will soon shed light on which physics is hiding behind the SM. 75 . LEP EW Group.ch/LEPEWWG/ . MIT. I would also like to thank the STP service for their excellent work. but it has still to be established in astroparticle physics. [2] CERN–The second 25 years. QCD and natural philosophy. 62 (1980). Suppl. 403–404 (2004). Denegri. [3] Reports CERN CHS-21 and CHS-23. 14 Conclusion In spite of many omissions. Archive: physics/0212025. Pestre (1992). Today 53 No. Farley. K. Mersits (1987).F IFTY YEARS OF RESEARCH AT CERN. 15–19.12 (1996) 22–27. CERN CHS-26. L. we cannot try to do everything everywhere. [8] D. http://lepewwg. Rep. [11] F. Nucl. Phys. I think I have given a balanced account of half a century of physics at CERN. What is most remarkable is that CERN. L. Phys. [4] Report CERN CHS-20. J. [12] F. [7] A. Phys. Phys. 
36 (1994) 339–361. Rousset. References [1] Highlights of 25 years of physics at CERN.web. Phys. 8 (2004) 21–24. Wilczek. J. Teubert. [9] D. Rep. which is the key to our future.com/main/article/43/5/22 . Di Lella. and F. Sikivie. CERN Courier 44 No. DAPNIA-04-247 (2004).3328 (2002). Perkins. even being over fifty years old and a victim of abusive slimming treatments. 403–404 (2004). Wilczek. Haidt. L. Barate.1 (2000) 13–14. http://cerncourier. CERN–The second 25 years.


Fifty years of research at CERN, from past to future: Computing

D. O. Williams
CERN, Geneva, Switzerland

Abstract
Computing in the broadest sense has a long history: Babbage (1791–1871), Hollerith (1860–1929), Zuse (1910–1995), the wartime code-breakers, and many other early pioneers all made important breakthroughs. CERN was founded as the first valve-based digital computers were coming onto the market. Computing at CERN will be considered from various viewpoints, including the various 'wares' (hardware, software, netware, peopleware and more recently middleware) that it incorporates, and the impact which it has had at CERN, on particle physics in general, and on other sciences.

For medical reasons David Williams has been unable to complete the section on fifty years of CERN computing in time for publication with the other material. In the meanwhile his PowerPoint slides are included here. He hopes to complete this work for publication towards the end of 2006.

CERN-50: 50 years of Research at CERN, from past to future (Computing)
Personal Reflections and Photographs
David Williams
CERN Academic Training Programme, 16 September 2004
See cern.ch/David.Williams/public/CERN50computingPub.ppt

Caveats
- I knew that this job would not be easy – it was even harder than I thought
- I stopped as Division Leader at the end of 1996, and I have had very little to do with physics computing since ~end 1998, which is two Internet generations
- If you want to prepare a proper historical review (book) you should identify the themes and have them treated and analysed by specialists, and it would take a couple of years. I might even try to do that one day
- Today I merely try to paint a picture, making some technical comments. Surely a personal picture, with all the biases that implies
- I know that I will fail to mention some important aspects of the puzzle, and many people who have made important contributions. And in no way am I trying to assign credit to individuals
- I concentrate on the early years – you are living the later ones and don't need to be told about those!
- But I have not talked to two people that I should have talked to about the very early days (George Erskine and Henk Slettenhaar)
- The lighting of many of the photos is not wonderful, and many of the early ones are very "posed". We have tried, and not always succeeded, to assemble "collages" of many of our colleagues

The nature of the problem
- Until the mid-1980s HEP's "computing problem" was often thought to be about obtaining enough processor power
- Then we worried about storage capacity
- The real problem has always been, in my opinion, getting people to collaborate on a solution

People and their interactions
- I suspect that the LHC experiments are running at the limit of what is feasible…
- Not the amount of funds that can be assembled, or the complexity of the detectors
- But the possibility of keeping such a large number of very smart people working enthusiastically and actively towards a common scientific goal

Acknowledgements
- Miguel Marquina – who helped me to prepare this, and who has scanned a lot of photos for me (more than I am using)
- CERN Photolab
- Many others who have provided information and photos, and answered my questions
- All of the people – CERN people, CERN users, outside labs and computer suppliers' staff – who did the real work behind fifty years of "Computing at CERN"
- The mistakes – and there will surely be several – are mine alone

Outline
- Setting the scene – the world in 1954 and 2004
- The early days – roughly to 1967
- Some early technology – punched cards
- Later Computer Centre machines
- Software
- Measuring Machines
- Online computing and DAQ
- Onsite Networking
- Offsite Networking
- Controlling Accelerators
- Various other things (Databases and other special applications, Emulators, EU projects, CERN School of Computing, HEP-CCC)
- Data Handling
- Information Handling
- Prospects for the Future

Photo: Wim Klein

First thoughts
- Computing at CERN has little to do with Research into Computing
- It is the huge challenge of using leading-edge technologies to provide top-quality services for the CERN community, so that they can prepare, run and process their experiments as well (and competitively) as possible
- Over time the physics challenges mainly stay conceptually the same, but the available technology – and its cost – keeps evolving
- Online feedback AND worldwide access to data and information, and worldwide participation in all phases of an experiment, are the basic (enduring) challenges
- What did not work or was too complex – or was impossibly expensive – last year may well work – or be affordable – next year

SETTING THE SCENE: THE WORLD IN 1954 AND 2004

Where did we come from? 1954
- Europe was still recovering from World War II
- Historically computing had been driven by the need for accurate compilation of tables – especially for navigation – (Babbage's "mathematical engine") and for census purposes (Hollerith)
- More recently (~the previous 20 years) it had been largely driven by military needs – code breaking and bomb simulation

1954 world timeline – English Literature
- Lord of the Flies (Golding)
- Lord of the Rings, Vol 1 – The Fellowship of the Ring – and Vol 2 – The Two Towers (Tolkien)
- Lucky Jim (Kingsley Amis)
- Under Milk Wood (Dylan Thomas)
- Under the Net (Murdoch)

1954 world timeline – General events
- First polio vaccination
- First kidney transplant
- First four-minute mile (Bannister)
- Battle of Dien Bien Phu
- Algerian War of Independence starts
- General Nasser becomes prime minister of Egypt
- Bikini Atoll hydrogen bomb test
- First Russian hydrogen bomb test
- USS Nautilus launched
- Sen. McCarthy active (investigating Army etc.) but year ends with his condemnation by Senate vote
- US Supreme Court decision in Oliver Brown v Board of Education of Topeka KA (and others)

1954 world timeline – More

Nobel Prizes
- Physics: Born (quantum mechanics) and Bothe (coincidence circuit)
- Chemistry: Pauling (chemical bond …)
- Medicine: Enders, Weller and Robbins (for the cultivation of polio virus – leading to vaccines)
- Literature: Hemingway
- Peace: UNHCR

Born: Cherie Blair, Condoleezza Rice
Died: Alan Turing, Enrico Fermi

NATIONAL LEADERS – "Who wants to be a millionaire": name the 1954 political leaders of Russia, USA, UK, France, Germany??

First Secretary of the CP of the Soviet Union

Photo: Karsh
Joseph Laniel (PM until June 18)
René Coty (President from 16 Jan)
Pierre Mendès-France (PM from June 18)


Photo: married 14 Jan, divorced 5 Oct

1954 computing timeline
- Computers were valve-based
- I was 10 years old and had never seen a computer; in ~1956 I started taking out the ~3 books in the local library about computing
- I first saw one, a Stantec Zebra
- I saw EDSAC 1 in Cambridge in ~1958 as it was being dismantled

Stantec Zebra
- Built in Newport by STC from the original concept of van der Poel (Delft)
- 8k 33-bit words; valve-based
- Cost 23 kGBP (then ~280 kCHF) and was one of the cheapest general-purpose machines of the time
- Note: fantastic paper-tape equipment and telegraph-style typewriter

Photo: EDSAC 1. Courtesy: Cambridge Computer Laboratory Archive

More 50s and 60s computing timeline
- 1947 (Dec): Point contact transistor invented
- 1951 (Sep): Major Bell Labs symposium on working junction transistors
- 1955: Wilkes invents microprogramming – programming the instruction set
- 1956: First magnetic disk system sold (IBM RAMAC)
- ~1956: FORTRAN under development
- 1959: IBM 1401 first shipped. Transistorised, punched card input; over 12,000 sold over 12 years
- 1960: PDP-1 launched (18-bit words)
- 1964: PDP-8 launched (12-bit words). Transistorised
- 1964: System/360 launched (4*8-bit-byte words)

IBM RAMAC disk – 1956
- 7-bit storage; capacity ~4.4 Mbytes
- Held on 50 24" platters with data recorded on both sides on 100 tracks
- A computer in its own right
- RAMAC == Random Access Method of Accounting and Control
- Leased for 40 k$ per year
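As a quick sketch, the "~4.4 Mbytes" figure for the RAMAC is consistent with its commonly quoted capacity of 5 million characters at the 7 bits per character mentioned above; the 5-million-character figure is an assumption drawn from the usual RAMAC spec, not from the slide itself.

```python
# Cross-check of the RAMAC capacity figure quoted above.
# Assumption (not on the slide): RAMAC stored 5 million characters,
# the commonly quoted specification for the machine.
characters = 5_000_000
bits_per_char = 7                                 # 7-bit storage, per the slide

capacity_bytes = characters * bits_per_char / 8   # expressed in 8-bit bytes
capacity_mb = capacity_bytes / 1e6                # ~4.4 decimal megabytes
```

Five million 7-bit characters come to about 4.375 MB in 8-bit bytes, matching the slide's "~4.4 Mbytes".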
Personal timeline
- August 1966: Williams to CERN
- At that time I had programmed the Titan (prototype Ferranti Atlas-2) in Autocode, and a DEC (Digital Equipment) PDP-7 – a follow-on machine to the PDP-1 – equipped with an interactive display, in assembler
- A multi-user service, and a stand-alone "mini-computer" which you could work on alone in the evening

Photo: DEC 340 Display (attached to PDP-7) with light-pen. Courtesy: Martin Pool. Note: chair!
Photo: Teletype (KSR-33), PTR and DECtape (~256 kB/reel)

2004: the longest period of peace in (much of) Europe ever
Not intended as an exhaustive list:
- Napoleonic Wars 1803–1815
- Belgian War of Independence 1830–1832
- Italian Wars of Independence 1848–1866
- Crimean War 1853–1856
- Danish-Prussian War 1864
- Austro-Prussian War 1866
- Franco-Prussian War 1870–1871
- World War I 1914–1918
- World War II 1939–1945

Photo: van Hove

END OF SCENE SETTING – DOWN TO WORK!

THE EARLY DAYS, ROUGHLY TO 1967
From CERN annual reports; mainly verbatim quotes (but I did some paraphrasing)

1956/57
1956 – Following in the steps of Harwell and Saclay … CERN ordered an electronic computer of a new type (Ferranti Mercury) in May 1956. Because of unforeseen difficulties experienced by the producers of this machine, there will undoubtedly be some considerable delay in its delivery … An experienced mathematician-physicist has been recruited to run the future Computer Section. Further staff will eventually be recruited, but the section will not be very large since …
1957 – The Mercury computer will probably not be installed until the summer of 1958. … prepare programmes for use on the English Electric Deuce and the IBM 704 in Paris. By the end of 1957 there were 2 staff plus 1 fellow in the Computer Section.

1958-59
1958 – The Ferranti Mercury computer arrived in the summer and its acceptance tests were completed by mid-October. … Autocode is fairly easily learnt by scientists who have had no experience of computers. (Referring to the IBM 704 in Paris) in each case the programming was done in Fortran. The speeds at which the available paper-tape devices can transmit data to and from the computer are in no way comparable with the computing speed. Leading personnel … took part in the "Symposium on the Mechanisation of Thought" at NPL Teddington in November.
1959 – In October regular two-shift working was introduced on the Mercury … Nevertheless there is already a backlog of computing work. The staff of the Computing Group, numbering about 10 at the end of 1958, will have to be doubled in the course of 1959. … to supplement and eventually replace the present Mercury computer by hiring an IBM 704 from the latter part of 1960. It is due to start operation in the autumn of 1960. The installation will be equipped with an 8'000-word core store (32 kB), 8 tape units and various other ancillary equipment. The computer service has been used by about 40 customers, who wrote more than a hundred Autocode programmes.

Photos: Ferranti Mercury – note paper-tape equipment and real online typewriter; Auffret and Slettenhaar

1960-61
1960 – In early November a much larger and faster computer, the IBM 709, was delivered to the site, and in 1961 seven analysing machines should be in operation using the IBM 709 for data processing. To house it a prefabricated building has been put up with all necessary air-conditioning and stabilized power supplies. Full-time operation of the Mercury was introduced in April 1960. At week-ends (on Saturdays and sometimes on Sundays) it has been possible to work on one or more shifts, and do some maintenance and upgrades. First mention of Flying Spot Digitisers (Hough and Powell).
1961 – It became clear during the year that even more attention will have to be given to data handling and analysis, … which will require more and more machine time from computers with faster operation and larger memories.

Photo: IBM 709

1962 – general
- The importance of data handling as a central problem for the laboratory has become even more obvious with the growing flood of bubble chamber pictures and the beginning of what is likely to be a comparable flow of spark chamber data
- The 709 worked an average of 1.5 shifts during the year
- The Mercury ran 24 hours * 5 days throughout the year, with occasional running at week-ends
- A system … is being written to run (Mercury) Autocode programmes on the IBM 709!
- The IBM 1401 computer was delivered in November (for I/O handling) – CERN's third computer? (and first transistorised machine)

Photo: PDG of IBM France Baron de Waldner at the official opening?

1962 – the "offline chain"
- A new generation of programmes for the analysis of measurements of bubble chamber photographs was brought into use. The new programmes make use of the greater speed of the IBM 709 computer; they are written in Fortran
- Paper tape measurements read by REAP (Mercury), passed to the 709 via magtape
- Geometry reconstruction by THRESH
- Kinematic analysis by GRIND (which later needed an "ignore infinity" card on the CDC 6600!)

From the Electronics Experiments Committee, 19 Sept 1962 – Preiswerk (chair), Cassells, Wetherell, Rubbia, Dick
The situation … with respect to the analysis of spark chamber photographs was briefly discussed. With the influx of 10^5 pictures from the PS experiment S1 and … it is clear that the available facilities are rapidly becoming overloaded. A general discussion of the whole picture analysis problem would be forthcoming from Hine's study group. A possible and untapped source of effort might be available from universities. Participation by university personnel in the data taking stage would be essential.

1963
By September the IBM 709 computer was replaced by a transistorised IBM 7090, increasing by about a factor 4 the total computing capacity available at CERN. On the basis of the report of the "European Committee on the Future Computing Needs of CERN", the DG proposed to the Council the purchase of a large time-sharing computer – the CDC 6600 – to replace CERN's present computer by early 1965. The computing capacity (… estimated to be at least 15x CERN's present capacity) will allow considerable development of data handling techniques used in HEP, together with the more conventional computing work of the Laboratory. It is planned to exploit fully the time-sharing properties of the computer in order to allow simultaneous operation of various on-line applications (counter experiments, film measuring devices etc.). A great, but unfortunately not very realistic, vision.

Photo: IBM 7090 – October 1963. Hans Klein and Swoboda (IBM)

More 1963
- REAP moved from Mercury to 709
- BAKE-SLICE-SUMX introduced
- THRESH for heavy liquids being developed
- Manuals written! A four-week lecture course in Oct-Nov on the programme chain was attended by 120 physicists from all over the world
- Preparations to use the Mercury online with a sonic spark chamber experiment

1964
- The CDC 6600 will be delivered in Jan 1965 and the changeover from the IBM 7090 is planned to take 3 months
- By the end of the year the 7090 was operating on a 24*7 basis, processing an average of 350 jobs/day. Second IBM 1401 installed
- General Mercury service stopped in April and the machine moved to online work
- Sonic spark chamber analysis programmes were developed for an experiment using the SDS 920 computer online. 1 km data link to the South Hall (missing mass) and similar link to the SC vidicon experiment
- Programme Library started – with 80 general programmes
- Definition of CERN Fortran (to provide compatibility across machines)
- Detailed proposal being made to connect several IEPs to the CDC 6600
- New standard interface defined with CDC to enable HPDs to be attached directly to the 6600
From the Electronics Experiments Committee, 13 January 1964 – Puppi (chair), Munday, Heintze, Lock, Preiswerk, Sens, Winter
Data Handling Facilities: Several experiments experience severe difficulties in view of the limitations on the available data handling facilities. The EEC recommends as a general policy that experiments which have priority on the accelerators also have priority on the data handling facilities. It is considered of the greatest importance to organize the data analysis in such a way that a feed-back to the data taking phase of the experiment is possible. (i.e. you want online feedback!)

From the March 1964 Informal Meeting on Film-Less Spark Chamber Techniques and Associated Computer Use – General Discussion on Online Computer Use (led by Macleod):
"As far as online computers are concerned, there are those who are in favour and those who are against, and, by and large, these appear to correspond to those who have online computers and those who do not! There is nobody who has been near an on-line computer who has said that he did not like it. If I can start with Lindenbaum, he is essentially in favour of everything. He is not reticent about it, and he said that he thinks that his kind of experiment could use a 6600 full-time plus any other computers one can find on the East Coast!"

1965
- The IBM 7090 computer was completely overloaded from the beginning of the year and it was necessary to process work away from CERN until June
- The CDC 6600 was delivered in January, but delivery delays with the SIPROS time-sharing operating system, and technical problems with the hardware itself, prevented operation at anything like the planned capacity
- The IBM lease was terminated at the end of July
- Work on SIPROS was concentrated at CERN. The work is making good progress and it is planned to introduce SIPROS in January 1966
- Hardware reliability problems led to a 4-week overhaul in mid-October – when I think that it was the only computer onsite (apart from the Mercury)

Photo: CDC 6600 (console, card readers, line printers)
Photo: What's the problem? Lipps, Trenel, Deller and two CDC analysts

1966
- Most of the work of DD Division was centred around the difficult task of bringing the main computer, the CDC 6600, into full and reliable service, which was largely accomplished, with some temporary set-backs
- By the end of the year Control Data installed a 3400 to take some of the load, upgraded to a 3800 in August. Planned to upgrade the 3800 to a 6500 in spring 1967
- It was recognised that computing had become such an integral part of the scientific work of CERN that an interruption of even a day or so due to computer breakdown caused serious disruption to the work of the Laboratory
- Tender for the first ADP computer
- HPD1 and Luciole online to the 6600
- Two IEPs online to the 6600, planned to lead to FOCUS, but poor performance (memory) led to dropping that solution, and a CDC 3100 was installed to control the IEPs in August
- Data link SDS 920 – CDC 6600 used up to 5 hours/day, but running at only ~60% of the speed available on a dedicated 7090
- Another data link to an IBM 1800

Photo: CDC 3800 – Bernard Petiot at the console

1967
- Experience confirmed that the running-in problems of 1965 and 1966 had been overcome
- CDC 6400-6500 installed, and by end of 1967 both machines were running CERN SCOPE (i.e. not SIPROS, which was dead); drum and large disk had been added
- Jobs rose to 5'800/week, from ~400 users
- 10'000 tapes in the tape library
- FOCUS system started on CDC 3100. Intended primarily for a limited number of experiments at the PS using small online computers which will be connected by datalinks
- Graphics on CDC 3100 (display with light-pen)

Photos: CDC 6500; tape reel; display

AND A SUMMARY OF ALL OF THAT?

Summary (1/2)
- CERN had moved from one valve-based Mercury computer in October 1958, via the IBM 709 (Nov 1960–Sept 1963), then the transistorised 7090 (Sept 1963–July 1965) and the CDC 6600 (from Jan 1965), to maybe twenty transistorised computers in 1967
- CDC 3100s hovering around the 6600/6400
- Several computers controlling film-measurement machines
- And others controlling experiments online
- And starting on real special-purpose activity like ADP and accelerator control

Summary (2/2)
- We had learned the hard way that computing is an integral part of the life of a scientific laboratory
- That software and hardware reliability are vital
- And that the technology was not yet ready to handle efficiently the mix of a general-purpose computing load with devices such as IEPs and HPDs on the same machine
- The long lead-times between order and delivery (and for acceptance testing) come from another world
- We were also learning to balance expenditure between the big central computer and the computers at the experiments (not really suitable for full-blown analysis codes)




Card copying (and listing)

Plug boards on right; also card supply


Magnified enough you can spot card decks from Aguilar, Dufey, Gavillet, Kleinknecht, Lindelof and Salmeron


Card readers in real use



Looking for the bug


Computer Centre machine generations:
- Ferranti Mercury
- IBM 709/7090
- CDC 6600 (et al)
- CDC 7600 (et al), together with IBM 195/3081/3090 and Siemens
- HOPE/CSF/SHIFT servers, together with VAX 780/750/8600


The location of the "Computer Centre"
- The Mercury was installed on the ground floor of Building 1 – on the Jura side – in what are now ATLAS offices. You can find the location by looking for the bell which is still there
- I'm not sure exactly where the IBM 709/7090 was installed – I suspect close to, but not exactly where, the CDC was subsequently installed
- The CDC 6600 plus 6500 was installed where the Print Shop is now
- B513 was constructed during 1971 and the machines were moved there during 1972 (and maybe early 1973)

Photo: B513 in construction in 1971

The Move to B513
- I have always believed, with the wisdom of hindsight, that it was a huge mistake that the new Computer Centre was moved so far away in 1972
- It took almost all DD/CN/IT staff away from the natural centre of the lab, and made easy integration with the experiments much harder
- Of course, I must recognise that there was an immense amount of good vision behind the move too, and we certainly are benefiting from the space as we move towards LHC computing
- IMO just in the wrong place – we built B32 and the hostels much closer to CERN's natural centre afterwards

Photo: B513 – re-installation of the 6600 in 1972


Photos: IBM 370/168 – the CERN Unit; Siemens 7880 in 1985



Cray X/MP-48 (1988-1993)

Servers (NICE etc.)







Weekly interactive users 1987-2000

Windows 95


Number of Users each Week









Does Moore’s Law only apply to hardware?


Did we make any progress with software? The answer has to be YES
- In 1960 an experiment involved (I guess) several hundreds, up to a few thousands, of lines of code, running on one single computer, and which had been written by no more than ~3 programmers
- In 2000-2005 an LHC experiment involves several tens of millions of l.o.c., running in 1000-10'000 processors, and written by at least ~300 programmers
- So, over 40 years, 10^4-10^5-10^6 more l.o.c., involving O(100)x more authors

Delivering software to the end-user
- We (fortunately!) no longer need to duplicate card decks or magnetic tape to deliver software to colleagues
- We deliver it over the network – to the workstation
- If we are sharing it on a professional basis it can be configured to some agreed specification
- The world of CERNlib, SRT, AFS, ASIS, CVS, …
- The steady improvements here helped to deliver several really major packages which changed the face of physics computing

Photos: The "offline programmers" – Macleod, Bock, Brun, Bruyant, Burmeister, Carminati, Giani, Grote, Kellner, Marquina, McLaren, Metcalf, Norton, Onions, Pagiola, Palazzi, Renshall, Apostolakis, Zoll†, YGC†. The Program Librarians – James, …

Languages and Operating Systems
- Many software engineers thought that physicists were renegades and would only ever write in FORTRAN? But things moved: F-77, and also C (portable assembler) and C++ (object orientation); basic Java for interactivity; not to forget scripting languages – primarily to interact with OSs
- SCOPE, MVS+Wylbur, VM/CMS, VMS, Unix, Windows, Linux
- Digital and Norsk Data died for not recognising Unix quickly enough
- And in the future …? What role for Open Source?

Some important CERN software suites
- REAP, THRESH, GRIND
- SLICE, SUMX and then HBOOK
- PATCHY and ZEBRA
- GEANT3 and then GEANT4
- Leading onto PAW, ROOT, POOL and all the other software that will form the basis of LHC analysis software
- MAD and other accelerator design packages
- Not to forget things like the AIS software, EDMS, etc.

PAW
- Discussed in MEDDLE at the September 1986 meeting (first meeting with John Thresher present as responsible Director)
- "Towards a Physics Analysis Workstation" (December 1985) – Bock, Brun, Pape and Revol
- Brun et al., trying to bring "everything" together to create a convenient interactive interface for the physicist doing analysis
- A real de facto standard and huge success

Photo: René Brun's presentation of PAW at the April 1989 CHEP in Oxford

GEANT
- The basis of much (not all) of CERN's physics simulation work
- A series of releases aimed at increasing functionality and reliability … of which 3.14 (early days), 3.15 (end '91) and 3.21 (Spring '94) were some of the most important for LEP
- Effort then evolved/switched to GEANT4 (OO)

Photo: OPAL GEANT 3.21


MEASURING MACHINES
- As we have seen, these formed a critical part of the early computing load
- Multiple versions of IEPs were needed for many different films and chambers (both bubble- and spark-)
- Initially punching paper-tape – later controlled by computer
- Then machines with more/less automation and different scanning technologies: FSDs (mechanically generated spot scans), Spiral Readers (spiralling slit scan starting from vertex) and CRT scanners (Luciole, PEPR, ERASME)
- Until the arrival of machines of the PDP-6/10 class there was a continuing problem about how much programmed feedback could be made available to the operator – not enough memory space & floating-point power to run real reconstruction. But lots of good mechanical and electronic engineering


Photo: Hough and Powell

Photos: Various HPD (bubble chamber) programmers and users – Moorhead, Morrison, French, Lord, Ferran (2x), Antonioz-Blanc, Howie, Quercigh, Celnikier, Altaber, Sambles, Joosten, Evershed, Atherton. Missing: Gerard

Photo: Rinus Verkerk, Adolfo Fucci and one other (kneeling)

ONLINE COMPUTING AND DAQ
The online connection of CERN experiments to the Computer Centre had a long history (as we have already seen)

Photos: IBM 360/44 – Per Scharff-Hansen at the console; OMEGA CII 10070

ONSITE NETWORKING
- FOCUS
- RIOS
- Terminals
- CERNET
- The world of TCP/IP

Photo: FOCUS in August 1969 – Yule at console, Gerard-Weights standing on right

Photo: 1972 – a proto-RIOS in B112 (West Area). Michael Metcalf, Anton Frölich, and an unknown person at the printer
Photo: UA1 analysis? Note the Gandalf terminal switches

Photos: CERNET Modcomp team; Gigaswitch interoperability tests

OFFSITE NETWORKING
Broadly speaking we have moved from external connectivity at kbps in the early 1980s, to Mbps in the early 1990s, to Gbps in the early 2000s. By LHC start-up or soon after we are likely to reach a total of 100 Gbps. We will need that to handle the grid data flows.

Photo: François Flückiger connecting to Transpac?
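The kbps-to-Gbps trajectory implies a remarkably steady exponential. As a rough sketch (the endpoint values below are illustrative assumptions, not figures from the slides), one can back out the implied doubling time:

```python
import math

# Implied growth of CERN's external connectivity, kbps (early 1980s)
# to Gbps (early 2000s). Endpoints are assumed for illustration only.
rate_start_bps = 56e3    # assumed: tens of kbps, early 1980s
rate_end_bps = 10e9      # assumed: ~10 Gbps, early 2000s
years = 20

growth = rate_end_bps / rate_start_bps                   # overall factor
doubling_years = years * math.log(2) / math.log(growth)  # implied doubling time
```

With these assumptions the capacity grew by a factor of roughly 2 x 10^5, i.e. a doubling time of just over a year, faster than Moore's law for processors.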

Internet2 Land Speed Record – IPv4, Single and Multiple Streams
Raw metrics, 27 February 2003:
• 1.1 terabytes in 1 hour, 1 minute, 40 seconds
• Sunnyvale to Geneva, by way of Chicago – 10,037 kilometers
• Results: 2.38 gigabits per second; 23,888 terabit-meters/second

Record breaking team

CONTROLLING ACCELERATORS

Control systems play a crucial role, especially during the start-up of new accelerators and their subsequent operation. The SPS control system was responsible for several important computing advances (though the PS always had stronger real-time constraints). And the Nord story is an important part of CERN's European TT history. I could find more SPS than PS pictures.

Michael Crowley-Milling

Nord-10s for the SPS controls

Raymond Rausch

SPS console: Bent Stumpe at console

Controls Display in 1976

Controls Display (p-bar) in 1991

VARIOUS OTHER THINGS
ADP (and then AIS)
EU-funded computing activity
Griddery
Control systems at experiments
And still many other things

ADP 360/30 in March 1968: Lionel Grosset (invisible) at the console, Jean Gros standing

ADP 360/50 in April 1971: note the very early model IBM 3330 disks

IT database specialists: Shiers, Segura, Santiago, Düllmann (Moorhead inset)

EU-FUNDED COMPUTING ACTIVITIES

Bob Dobinson was a pioneer and was involved in several different projects (especially transputers and advanced networking). The Meiko CS-2 was a European example of the classic US "place a supercomputer at a national lab and they will do something useful with it" approach; in our case we used the Meiko to run NA-48 data taking, etc. And since 2001 DataGrid and EGEE (Fabrizio Gagliardi). Not to forget the partial support for direct CERN connectivity to GEANT (we share the Swiss connection with the Swiss national network SWITCH). And DataTAG and others.

GRIDS

See next: LCG Deployment Status, 25 May 2004 (courtesy: Oliver Keeble)

Summary:
LCG-2 is running as a production service
In 2004 we must show that we can handle the data: meeting the Data Challenges is the key goal of 2004
Anticipate further improvements in infrastructure, broadening of participation and increase in available resources

Timeline:
2004: demonstrate core data handling and batch analysis
2005: decisions on final core middleware
2006: installation and commissioning, initial service in operation
2007: first data

SOME DIFFERENT SORTS OF COMPUTING FOR HEP
Calculations: for experiments, accelerators, theory, engineering
Data acquisition
Data storage and processing
Simulation
Interactivity, graphics, VR
Community computing: email, program and data "management" (Web access), remote conferencing and collaborative tools
Information access (Web)
Databases
Networking
Symbolic computing
Control systems
Lattice QCD calculations

DATA HANDLING

The major challenge by far for CERN computing lies in the data.

Tapes being sent up from B513 basement

What they found when they got upstairs

3480 cartridge (200 MB)

STK silos in 2000

DATA HANDLING SOFTWARE

Moving between file structures (understood by machines) and structures of events grouped into various stages of analysis. FATMEN at LEP. A crucial determinant for Grid software: can we get wide consensus on how to address our data?

INFORMATION HANDLING

More of what I just mentioned. The world is hoping that computing can bring more order, and perhaps some intelligence (we have a bad track record!), to all sorts of information. Semantic Web, Semantic Grid, etc.
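The "how do we address our data" question can be made concrete with a toy catalogue mapping logical dataset names to physical replicas, in the spirit of FATMEN-style bookkeeping. This is purely a sketch of the idea; all names and paths below are invented, and FATMEN itself was a very different, Fortran-era system:

```python
# Toy file catalogue: logical names -> physical replicas (all names invented).
from collections import defaultdict

class Catalogue:
    def __init__(self):
        # Each logical name maps to a list of physical copies (tape, disk, ...)
        self._replicas = defaultdict(list)

    def register(self, logical, physical):
        self._replicas[logical].append(physical)

    def locate(self, logical):
        # A consumer asks only for the logical name; the catalogue resolves it.
        if logical not in self._replicas:
            raise KeyError("no replica registered for " + logical)
        return list(self._replicas[logical])

cat = Catalogue()
cat.register("/data/run1234/raw", "tape:VSN0042/file7")
cat.register("/data/run1234/raw", "disk:/scratch/run1234.raw")
print(cat.locate("/data/run1234/raw"))
```

The point of contention for Grid software is exactly the part this toy glosses over: getting everyone to agree on the logical naming scheme.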

TEXT PROCESSING

Something that we/I tend to forget. All over the world scientists "did it themselves" and took part in standardisation efforts. A good review in the 35th Anniversary Computer Newsletter Edition, by Michel Goossens.

THE WEB

Something that we (especially Tim BL) did right.
We had the problem: info sharing for LEP experiments
We had the infrastructure: reasonable networks had just started
We had just enough smart people
And we managed to put the critical software in the public domain

Robert Cailliau (+ Tim BL and Ted Nelson). Photo: Håkon Lie

PROSPECTS FOR THE FUTURE (COMPUTING AT CERN)

Complexity

Computing will never be easy because of complexity. We probably don't explain the source of the complexity enough.

Take a serious application from 1970, say 10'000 lines of code (l.o.c.). Assume that every 10 l.o.c. we encounter a 2-way logical decision, where the route taken depends on the input data, or on the prior state of calculations. That is a very conservative estimate. There are, therefore, 2^1000 possible routes through our code. 2^1000 = (2^10)^100 ≈ (10^3)^100 = 10^300 routes! A googol cubed. LHC experiments will use a total of some 10^7 lines of code!!

"The computer programs for bubble chamber experiments start with elegant and simple ideas, and end up complex and sophisticated. The computer spends most of its time processing un-problematic events. The programmer, on the other hand, spends most of their time foreseeing, or discovering, possible difficulties and programming the computer to deal with them." [Bock and Zoll in the 1972 CERN Courier, Central Computers in Bubble Chamber Analysis]
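The slide's counting argument is easy to verify mechanically; a quick check of the arithmetic (mine, not from the talk):

```python
# Illustrative check of the path-explosion arithmetic.
import math

lines_of_code = 10_000
branches = lines_of_code // 10      # one 2-way decision per 10 l.o.c.
routes = 2 ** branches              # 2**1000 possible routes through the code

googol = 10 ** 100
print(routes > googol ** 3)         # True: more than a googol cubed

# 2**1000 is about 10**301; the slide's 2**10 ~ 10**3 shortcut gives 10**300
print(round(branches * math.log10(2)))
```

The same reasoning, scaled to the 10^7 lines of LHC code, is why exhaustive path testing is hopeless and why defensive, decomposed design matters so much.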

The fight against complexity
Decompose problems: several small and (fairly) simple systems tend to beat complex monoliths
Try to think clearly
Be prepared to handle errors (easier said than done)

The natural tension between particle physics and computing
CERN experiments have to be mission-driven
CERN computer specialists cannot be entirely mission-driven
- They must pay attention to how computing technology (all the different wares) is evolving
- And how the economics of the industry evolves
- In order to find optimal solutions "for tomorrow"
Some elements of CERN computing need to be handled centrally, and others must be handled directly by the experiments. Finding the correct balance in that respect is an ongoing job for the IT and PH department managements.
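The "decompose and be prepared to handle errors" advice can be shown in miniature. This is an entirely schematic sketch of my own, not CERN software: a processing chain built from small steps, each of which reports failure explicitly so that one bad record does not bring down the whole run:

```python
# Schematic processing chain: small steps, explicit error handling.
def read_event(raw):
    if not raw:
        raise ValueError("empty record")
    return {"raw": raw}

def reconstruct(event):
    event["tracks"] = len(event["raw"])   # stand-in for real reconstruction
    return event

def process(records):
    good, bad = [], 0
    for raw in records:
        try:
            good.append(reconstruct(read_event(raw)))
        except ValueError:
            bad += 1                       # count the failure and carry on
    return good, bad

events, failures = process(["abc", "", "de"])
print(len(events), failures)               # 2 good events, 1 failure
```

Each step is small enough to reason about on its own, which is exactly what the complex monolith denies you.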

The impact of HEP computing on the wider world?
Impact on other sciences
Impact on society in general
Impact on developing countries
The +ve side of computing:
- Info access
- Great help for organising meetings and travel
- Enriching
The -ve side:
- Spam
- Crooks
- Mind-numbing

Programmers

To be a good programmer some part of your mind must sympathise with a machine, which has to do what it is told, and so must be told, in all gory detail, exactly what it must do. If you can communicate well with a machine, can you also communicate well with people? And can you communicate without your machine?

The companies

In 50 years of CERN computing several companies have been major players, and we should probably say major contributors to the work: CDC, Cray, IBM, DEC/Digital, HP, Norsk Data.
- At various times DEC and IBM had Joint Project teams of a serious size
- Now Open Lab provides a more general possibility for such interactions
Is that all only on the side of the hardware? What about software suppliers: Oracle, Microsoft?, Numerical Algorithms Group (NAG)?

Our successes and maybe some failures

The Web has to be the biggest success. But also CERNLIB as a sharing mechanism, and a lot of widely-used packages:
- REAP, GRIND, THRESH, SLICE
- HBOOK
- PAW
- GEANT, GEANT4
- Data acquisition software (PDP, Nord-10(0) and 50(0), VAX)
- Packages from outside "pure HEP"
And our external networking (especially CIXP).

Why did we have less impact (overall) on the development of early Internet standards?
- CERNET (1973 (concepts) to 1977 (deployment) to 1988 (turn-off)) served our onsite needs superbly but was CERN-specific
- HEP deployed DECnet very successfully in the wide area (plus wide use of mainframe solutions) and was slow to look at "peer-to-peer"
- But we were probably slow to catch on to INTER-networking

… some failures

We have been quite strong in the "industrial" deployment of software development, and software development distributed over the wide area, but have never been recognised as such by the broad community. Why?
- PATCHY was extremely advanced
- As were many of the ideas for machine independence behind "CERN Fortran"
- And distributed software development
I think that our problem here has been that (a) the people involved were very busy, driven by experiment deadlines, and (b) we did not see it as our job to make this information available to others: either potential users in other sciences or computer scientists and (commercial) software engineers who should have been kept informed. Computing tends to be too introverted.

As I said at the start, the real problem has always been, in my opinion, getting people to collaborate on a solution. But the successes predominate, by a long way. And it was:
- Challenging
- Fascinating
- Lots of hard work
- And great FUN

WIM KLEIN

Intro for the younger members of the audience: in the early days computing was quite difficult. Wim Klein was a "human computer" who provided numerical assistance, especially to generations of CERN theoretical physicists. He retired from CERN in 1976.

FINAL REMARKS

The PhotoLab Archives: time is running out. There is a lot of fascinating photographic material in the PhotoLab Archives, but only a tiny fraction has been scanned, and the rest is almost entirely without any descriptions. From the early years of CERN (1950s and 60s) many of the people shown in the photos are now >80 (or dead), and even the people who have some idea what the pictures might be about will soon retire. We urgently need to do a mass scan of the photo archive and make it Web-available so that past and present CERN staff are able to provide descriptions. We have to find the money to do this: it's our contribution to preserving a minimum "historical heritage".

Take-home messages

Computing for particle physics is challenging, and it is likely to stay that way. One big change of the next decade will be the move to fully mobile computing. We can also hope that there will be moves to improve the way in which most information is organised and made accessible. It will be important for CERN (and particle physics) to invest enough in computing infrastructure (including people, servers, network bandwidth and performance monitoring), since that is vital for each person's professional working efficiency. And make sure that computing stays FUN for both users and providers.

THE END

Or rather, the end of what I dare detain you with.

