
Academic Training Programme 2002-2003 - Proposed Topics

Copyright © 2002 CERN -- Academic Training

TOPICS


High Energy Physics

T0 Theory and general subjects

T1 Introduction to General Relativity and Black Holes
Introduction to the conceptual foundations of General Relativity and its mathematical framework. Discussion of exact solutions, gravitational waves, and experimental tests of General Relativity. The physics of black holes, energetics, and no-hair theorems. Quantum properties of black holes, Bekenstein entropy, and Hawking temperature.
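For orientation, the two quantities named last have simple closed forms (standard results, quoted here for reference): a Schwarzschild black hole of mass M carries an entropy proportional to its horizon area A and radiates at a temperature inversely proportional to its mass,
  \( S = k_B c^3 A / (4 G \hbar) \)   and   \( T_H = \hbar c^3 / (8 \pi G M k_B) \),
so larger black holes are colder: a solar-mass black hole has \( T_H \approx 6 \times 10^{-8} \) K.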
T2 Cosmology and particle physics
Observations of the density of the Universe from stars to galaxies, galaxy clusters, and even larger structures; observations of the fluctuations in the microwave background. Attempts to describe the observed structure of the Universe in terms of primordial fluctuations of the metric and of the inflaton field. Successes and open problems of the approach; extraction of the relevant cosmological parameters from present and future data.
T3 Status of Grand Unification
After a review of Grand Unified Theories, there will be a discussion of the present open theoretical problems and of future experimental tests of some GUT predictions. Experimental consequences, such as proton stability, will be discussed.
T4 Overview on Strings and Branes
A simple introduction to the fundamental concepts of string theory aimed at non-specialists, especially experimentalists. The lectures will also include recent developments in brane theory, explaining the connection with theories with large extra dimensions.
T5 Everything you always wanted to know about the quantum vacuum
The definition of the vacuum in quantum mechanics and in quantum field theory. The problem of divergences and renormalization. Physical consequences of the quantum vacuum and the Casimir effect. The cosmological constant: its impact on the evolution of the Universe, its emergence from data, and the theoretical attempts to understand its size.
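As a concrete example of such a physical consequence (the standard textbook result, quoted for reference): two parallel, perfectly conducting plates a distance a apart attract with a pressure
  \( F/A = - \pi^2 \hbar c / (240\, a^4) \),
of order 10 Pa at a = 100 nm, an effect that has been verified experimentally.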
T6 The theory of heavy ions
A review of the physics principles underlying heavy ion collisions, as well as of the theoretical tools available today to describe them, will be presented. The aims of the high-energy heavy ion programme at RHIC and the LHC will be discussed, and their impact on the understanding of high-temperature and high-density states of matter will be illustrated.

P0 Phenomenology and experiments

P1 CP violation and flavour physics
The flow of data from the B-factories is overwhelming. These lectures will discuss in detail the most recent progress on CP violation and flavour physics, based on data from BABAR and BELLE as well as from other experiments and facilities. Expectations for future measurements will also be addressed.
P2 Testing the foundations of quantum mechanics and applications in quantum optics
These lectures will summarize, explain, and put in perspective the developments of the last decades concerning the basics of quantum phenomena: Bell inequalities and the related experimental activities, quantum decoherence and its study with atoms and photons in a cavity, quantum entanglement and its possible applications, quantum information, Bose-Einstein condensates, and related developments.
P3 Physics at a linear e+e- collider
There is worldwide consensus that a linear e+e- collider with an energy of at least 500 GeV, extendable up to about 1 TeV, will be the ideal machine to complement the LHC. These lectures will expand on the physics programme of such a collider. An overview will be given of the different projects proposed. The physics programme will be discussed in detail, in particular Higgs physics, supersymmetry, extra dimensions, and strong electroweak symmetry breaking scenarios.
P4 Physics at future colliders beyond the LHC and a TeV class linear collider
These lectures will address physics questions which could be tackled by the next generation of colliders, beyond the LHC and a TESLA-like linear collider. Examples of such machines are a multi-TeV linear e+e- collider such as CLIC, a Very Large Hadron Collider (VLHC) with a centre-of-mass energy of up to 200 TeV, and a muon collider.
P5 Deep inside the proton: the unpolarised and polarised proton structure
During the last few years, measurements of both the polarised and unpolarised structure of the proton have become increasingly precise. HERA proton structure function measurements are very relevant for parton scattering processes at the LHC. How precisely do we know the structure of the proton? Furthermore, do we understand the polarised proton data, or is there still a spin puzzle?
P6 Measurements of dynamically evolving fundamental constants
After a review, as of 2002, of the fundamental constants of physics in the static limit (zero energy), we review the quantities whose ultraprecise knowledge is key for testing the SM. Besides the Z mass, known to 20 ppm, one needs to know the fine-structure constant alpha evaluated at the weak scale (M_Z), and the muon decay constant, Gmu. At higher order, testing the SM also requires knowledge of the strong coupling constant and of the fermion masses, evaluated at the right scale. A pedagogical introduction, a summary of the developments of the last decade, and the promises of ongoing and future programmes will be given.
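To illustrate why Gmu is known so precisely (the lowest-order textbook relation, with small radiative corrections omitted): the muon lifetime fixes the coupling directly,
  \( 1/\tau_\mu = G_\mu^2 m_\mu^5 / (192 \pi^3) \)  (natural units),
so an ultraprecise lifetime measurement translates into an ultraprecise Gmu.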
P7 The Hunt for the Higgs particle
After the closure of LEP, the quest for the enigmatic Higgs particle has moved to the high-energy hadron colliders. The increased luminosity of the Tevatron offers a chance for a light Higgs discovery. Intriguingly, the Tevatron could accumulate a sufficiently large data set around the same time the LHC is supposed to start up. The Higgs discovery potential and the precision of Higgs property measurements will be discussed in detail for both colliders.
P8 Light meson spectroscopy from low energy e+e- storage rings
The field of light-meson spectroscopy has recently received a new boost from the large quantity of data emerging from the phi factories and from the study of heavy-quark decays. Several issues, such as the existence of a glueball and the nature of the f0(400-1200) or sigma meson, are still open to investigation. This series will review the current status of the field from both its experimental and theoretical perspectives.
P9 Monte Carlos for the LHC
We propose a review of the rapidly evolving field of Monte Carlo event generators for high-energy hadronic collisions. A summary of the theoretical ingredients for such calculations, of the existing limitations and of the work taking place to improve these tools will be given.

I0 Instrumentation

I1 Gaseous Detectors: Then and Now
Since their introduction by Georges Charpak in the late sixties, multi-wire gaseous detectors have matured into standard equipment for particle physics experiments; they have also been very successfully developed for use in other fields: X-ray and medical imaging, UV and single-photon detection, neutron and crystal diffraction studies, etc. Their major limitation has been a modest rate capability. In the last decade, several high-rate, position-sensitive micro-pattern gas devices have been introduced, with inherently improved rate capability and localization accuracy. The state of the art, from the old to the new generation of gaseous detectors, will be reviewed.
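For orientation, the mechanism common to all these devices is avalanche multiplication: in the high-field region each drifting electron ionizes further gas molecules, giving, in the textbook case of a uniform field, a gain
  \( M = e^{\alpha d} \),
where alpha is the first Townsend coefficient and d the multiplication path; proportional devices typically operate at gains of order 10^4 to 10^5.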
I2 Developments in solid state detectors
Solid state detectors, particularly for tracking, have become standard equipment for high energy physics experiments. The LHC experiments will make use of such detectors on a large scale. These lectures will give a review of solid state detectors, discuss recent developments such as diamond detectors, and show some applications other than in large HEP detectors.
I3 Crystal detectors and their applications
Scintillating crystals play a growing role in particle physics and in other fields, such as medical instrumentation. A pedagogical introduction to the mechanisms governing scintillation and the behaviour of scintillating materials under irradiation will be given. The main developments achieved or under investigation on various scintillating materials will be described, as well as the different methods used for reading out the light. Some applications to calorimetry for HEP will be described in detail. The evolution of techniques based on crystals, such as computerized tomography and positron emission tomography, will be illustrated.
I4 Triggering and DAQ challenges at the LHC
The trigger and DAQ challenge for the experiments at the LHC (ALICE, ATLAS, CMS and LHCb) is formidable. The designs of the trigger and DAQ systems for these experiments will enter their decisive phase within the coming year. An introduction to the challenges will be given, followed by a discussion of the planned solutions.
I5 Detection of Cosmic Rays
Cosmic rays remain enigmatic, with several intriguing unanswered questions. These questions will be addressed by forthcoming new experiments, such as the Auger Project. An overview of the experimental techniques in present and future cosmic ray detection will be presented.
I6 Global positioning systems
Position monitoring via satellites is becoming an increasingly important topic in science and everyday life. Europe has recently launched Galileo, its own satellite positioning programme. Modern developments of GPS systems will be detailed and discussed.
I7 New developments in Astronomy experiments (ground and space based)
A review will be given of the major telescopes and observatories, in operation or foreseen, ground-based or flown on balloons and satellites, organized according to the nature of their survey (X-ray, optical, UV, IR, microwave, radio, etc.). Their main physics goals and their most recent results will be highlighted and explained. The key techniques involved will be described. Special emphasis will be placed on Large Scale Facilities.

Applied Physics Group

A0 Accelerators

A1 Radioactive ion beam science at CERN
The On-Line Isotope Separator ISOLDE is a facility dedicated to the production of a large variety of radioactive ion beams for a number of different experiments in the fields of nuclear and atomic physics, solid-state physics, life sciences and materials science. The facility is located at the Proton Synchrotron Booster (PSB) of CERN. Physics at ISOLDE is pursued in several directions. The results obtained have implications for the basic understanding of the structure of the atomic nucleus, but also for related fields such as nuclear astrophysics and weak-interaction physics. The possibility of pure radioactive implants opens access to the investigation of problems in solid-state physics.
Work is currently under way to accelerate the radioactive ions further, to energies high enough for nuclear reactions to occur. This experiment, called REX-ISOLDE, is starting operation this year and will address novel phenomena predicted to occur in extremely neutron-rich isotopes of light elements, as well as astrophysical processes.
At the end of the presentation the future of the ISOLDE facility will be discussed with a special emphasis on the potential synergies at CERN.
(see also http://isolde.web.cern.ch/ISOLDE/)
A2 Intercepting High Power Beams
The next generation of accelerators will have to master high power beams. Conceptual and engineering challenges of collimator systems, beam absorbers and secondary particle production targets will be addressed.
Today's beam-intercepting components cope with beams in the kW range. Those of future neutrino factories, spallation sources and compact linear colliders will be exposed to beam powers in the MW range, and will require handling the risks of high energy densities, elevated radiation doses, acoustic vibrations and liquid metals.
A3 SASE — Next generation Free Electron Lasers
Advances in the physics and technology of photoinjectors, linear accelerators, insertion devices and free-electron lasers now make it possible to generate coherent radiation in the X-ray region by means of the Self-Amplified Spontaneous Emission (SASE) process.
This radiation has much higher brightness, shorter pulses and greater coherence than present third-generation light sources.
The status of the physics and technology involved in a radiation source based on SASE will be reviewed, together with an overview of the ongoing activities in this field around the world.
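A useful quantitative handle (standard one-dimensional FEL theory, quoted here for orientation): starting from shot noise, the SASE radiation power grows exponentially along the undulator,
  \( P(z) \approx (P_0/9)\, e^{z/L_g} \),  with gain length  \( L_g = \lambda_u / (4 \pi \sqrt{3}\, \rho) \),
where lambda_u is the undulator period and rho the dimensionless Pierce parameter (typically of order 10^-3), until saturation at roughly rho times the electron-beam power.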
A4 Magnetic field measurement and mapping technologies
The lecture series will cover the following subjects:
  • Methods of using magnetic measurements for beam guidance in accelerators and for particle detection in spectrometer magnets.
  • Type and characteristics of devices and related mechanical and electronic equipment.
  • Magnetic field positioning and alignment.
The different topics may be grouped into 3 parts:
  1. Part I
    • Field measurement techniques: magnetometers (Hall, NMR, EPR, fluxgate, magnetoresistors); see the note following this outline
    • Fluxmeters (pick-up coils, rotating coils)
    • Other techniques (Faraday effect, particle beam trajectory, magnetostriction)
    • Combined magnetic and geometric measurements for alignment
  2. Part II
    • Magnetic measurements as production control
    • Geometric multipoles in accelerator magnets
    • Tolerance control for components
    • Magnetic properties of accelerator magnet components
    • Persistent currents in accelerator magnets
  3. Part III
    • Magnetic measurements as characterization
    • Field map in detector magnets
    • Injection behaviour in accelerator magnets
    • Multipoles during ramps, tracking of magnets
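As a note on the first bullet of Part I: NMR probes serve as absolute field references because protons in a sample precess at a frequency strictly proportional to the field,
  \( f = (\gamma_p / 2\pi)\, B \approx 42.58\ \mathrm{MHz/T} \times B \),
so a frequency measurement converts directly into a field value with ppm-level accuracy, provided the field is sufficiently homogeneous over the sample.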
A5 Plasma Physics Application to Heavy Ion Accelerators
The extraction and formation of high-brightness ion beams from plasmas is one of the hot topics in the physics of modern ion sources for heavy ion accelerators. In this respect, much of the experimental and theoretical effort of numerous groups in accelerator laboratories and universities is centred on the investigation of the plasma phenomena that determine the charge-state distribution and the intensity of ion beams. The generation of intense beams, in particular of multiply charged ions, is strongly influenced by the equilibrium between multi-step electron-impact ionization and recombination in the plasma. Related R&D activities are directed primarily at the measurement of plasma parameters and the development of adequate diagnostic techniques.
Plasma conditions and plasma generation methods specific to the different types of ion sources (LIS, MEVVA, EBIS, ECR) will be considered. The advantages and areas of application of each type of ion source will be discussed.
In the context of the next-generation accelerator facilities at CERN, ITEP-Moscow and GSI-Darmstadt, the generation of highly charged ions in laser-produced plasmas is emphasized.
Research into extreme states of matter with respect to temperature and pressure, characterized as dense non-ideal plasmas, is the most challenging aspect of plasmas generated by intense heavy ion beams. Special attention is paid to such plasmas in relation to fundamental aspects of the physics of High Energy Density in Matter and to Heavy Ion Inertial Fusion.
A6 The LHC Injector Chain
Linac-Booster-PS-SPS will all be used for LHC injection. The lectures will review the features of these faithful machines and underline the modifications required for the LHC era.

C0 Computing

C1 Neural Systems, Genetic Algorithms
Genetic Algorithms (GA) are a method of "breeding" computer programs and solutions to optimization or search problems by means of simulated evolution. Processes loosely based on natural selection, crossover, and mutation are repeatedly applied to a population of binary strings which represent potential solutions. Over time, the proportion of above-average individuals increases and fitter individuals are created, until a good solution to the problem at hand is found.
GAs are especially suitable for searches in large or multi-modal state spaces, or on n-dimensional surfaces, where they may offer significant advantages over more typical search or optimization techniques.
GAs have been used in adaptive system design, adaptive control, and finite automata. They also play an important role in parameter selection for neural networks.
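To make the loop described above concrete, here is a minimal illustrative sketch of a genetic algorithm in Python; the bit-counting ("OneMax") objective and all parameter values are arbitrary illustrative choices, not part of the lecture material:

  import random

  GENOME_LEN, POP_SIZE, GENERATIONS = 32, 50, 100
  CROSSOVER_RATE, MUTATION_RATE = 0.7, 0.01

  def fitness(genome):
      # Toy objective: maximize the number of 1-bits ("OneMax").
      return sum(genome)

  def tournament(pop):
      # Return the fitter of two randomly chosen individuals.
      a, b = random.sample(pop, 2)
      return a if fitness(a) >= fitness(b) else b

  def crossover(p1, p2):
      # Single-point crossover, applied with probability CROSSOVER_RATE.
      if random.random() < CROSSOVER_RATE:
          cut = random.randrange(1, GENOME_LEN)
          return p1[:cut] + p2[cut:]
      return p1[:]

  def mutate(genome):
      # Flip each bit independently with probability MUTATION_RATE.
      return [b ^ 1 if random.random() < MUTATION_RATE else b for b in genome]

  population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                for _ in range(POP_SIZE)]
  for _ in range(GENERATIONS):
      population = [mutate(crossover(tournament(population), tournament(population)))
                    for _ in range(POP_SIZE)]
  best = max(population, key=fitness)
  print(f"best fitness: {fitness(best)}/{GENOME_LEN}")

Selection pressure here comes only from the binary tournament; real applications typically add elitism and a problem-specific encoding.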
C2 Managing data - the issues today and tomorrow
For many years the dominant database technology has been the relational model, as implemented in products such as Oracle (deployed at CERN for nearly 20 years), Microsoft Access and Open Source packages such as PostgreSQL and MySQL.
More recently, systems referred to as Object Databases, e.g. Objectivity/DB, also deployed at CERN, or Object-Relational systems, have challenged the dominance of pure Relational systems.
Given this wide spectrum of systems, from Open Source to commercial, from pure Relational through Object-Relational to pure Object, how does one choose the most appropriate technology for the task at hand?
Database technology also includes data mining: the discovery of hidden facts contained in databases. Techniques such as statistical analysis and modelling, which find patterns and subtle relationships in data and infer rules that allow the prediction of future results, will be covered.
The proposed lectures will start with a brief overview and definition of the various technologies, followed by some guidelines for choosing the most appropriate system for a number of tasks. Finally, the evolution of mainstream database management systems over the past few years will be reviewed, together with predictions for the future.
C3 Internet Security Techniques
This series addresses the technical aspects of Internet security and how the different problems are solved. More specifically:
  • Web Surfing: What security risks am I exposed to when I surf on the Internet? Are my files safe? Can my private information be exposed? Can I trust information on the Internet?
  • Copying files: Is it safe to copy software from the Internet and is it legal? What are "plug-ins" and what can happen when I click "yes" to install them?
  • E-commerce: Is it safe to use my credit card on the Internet? What are the risks and what precautions can I take?
  • Viruses: What is a virus? What should I do when I receive a virus warning? How do I know if my computer is infected? How do I know if it is safe to open an attachment? Why do I need to run anti-virus software and what does it do?
  • SPAM e-mail: What is SPAM? Why do I receive so many unwanted e-mails? What can be done to prevent them?
  • Passwords: How are passwords discovered and how can I best protect mine?
  • Security Rules: What are CERN's security rules? What are my obligations and how can I be sure I am following them? Where can I get more information?
  • Security risks: What types of incidents are common on the Internet and specifically at CERN?
  • Technology Jargon: What is a firewall, intrusion detection ... ?
C4 High Performance Networking
The series will start with a historical introduction, covering what was regarded as high-performance message communication at the time and how that developed into today's standard computer network communication.
It will be followed by a far more technical part that uses the high-performance computer network standards of the 1990s, with 1 Gbit/s systems, as an introduction to an in-depth explanation of the three new 10 Gbit/s network and interconnect technology standards that already exist or are emerging. Where necessary for a good understanding, sidesteps will be included to explain important protocols, as well as relevant details of Wide Area Network (WAN) standards, including some basics of dense wavelength-division multiplexing (DWDM).
Some remarks will be made concerning the rapidly expanding applications of networked storage.
C5 Everything you always wanted to know about GRID and never dared to ask
Sometimes the Grid is called the next-generation Web. The Web makes information available in a transparent and user-friendly way. The Grid goes one step further, in that it enables members of a dynamic, multi-institutional virtual organisation to share distributed computing resources to solve an agreed set of problems in a managed and coordinated fashion. With the Grid, users should be unaware whether they are using the computer or data on their own desktop or any other computer or resource connected to the international network. Users get the resources they need, anytime and from anywhere, with the complexity of the Grid infrastructure being hidden from them.
The technology needed to implement the grid includes new protocols, services, and APIs for secure resource access, resource management, fault detection, and communication. Moreover, one introduces application concepts such as virtual data, smart instruments, collaborative design spaces, and meta-computations.
All over the world, national and international grid initiatives have been funded. In high-energy physics, the first phase of the LHC Computing Grid Project has recently been set up. Its role is to prepare, coordinate, and manage the international infrastructure needed to share and handle on the Grid the unprecedented amount of data (several petabytes per year) that the LHC experiments will generate starting around 2007. Architectures and resources have to be defined to fulfil the needs of the various participating scientific and engineering communities: over 6000 physicists and engineers from more than fifty countries in Europe, the Americas, Asia and elsewhere.
Experience and know-how have to be built up in linking tens of thousands of commodity components into tiers of varying complexity (from tens of thousands down to a few tens of nodes linked to the Grid). These managed components include CPUs, disks, network switches and mass storage, plus the manpower and other resources needed to make the whole setup function. Issues of scale, efficiency and performance, resilience, fault tolerance, total cost (acquisition, maintenance, operation), usability, and security have to be taken into account.
C6 Data Challenges for the LHC era: data bandwidth and storage
The LHC detectors have unprecedented precision and granularity. One physics event needs a few megabytes of storage. The experiments expect to store 10^9 events per year, plus about half this amount for simulation purposes.
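Putting these numbers together (an order-of-magnitude estimate, not an official figure): at roughly 2 MB per event, 10^9 real events plus half as many simulated ones give
  1.5 × 10^9 events/year × 2 MB/event ≈ 3 PB/year
per experiment, which sets the scale of the storage and bandwidth challenge.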
The lectures give an in-depth view of the challenges to be met when running a few thousand CPUs in parallel per LHC experiment, when storing a few dozen petabytes of data for analysis (in an effectively structured way), and when making CPU power and data easily accessible to physicists wherever they work.
C7 New Network Architectures
The network infrastructures we use today, in particular the Internet, rely on an architectural model which resulted from guiding principles laid down in the mid-1970s. At that time, mobility, wireless access, predictable quality of service, scalability to trillions of connected devices, and security were not considered stringent requirements. Over the past two years, active research has been conducted to revisit the network architecture and to determine whether it can be changed to align with these newer requirements. This series of lectures will present the state of the art of this work and discuss the status and future trends in network architectures.

E0 Engineering

E1 Modern Project Management Methods
To achieve objectives efficiently, any scientific endeavour should be a mixture of both creativity and rigour. In particular for very complex experiments involving large numbers of people, a minimum of quality assurance becomes a prerequisite for the success of the project!
Modern project management techniques are important tools that can be used to optimise results, on time and on budget.
This seminar covers a practical set of modern project management tools for organizing, structuring, estimating & costing, planning and scheduling, and for assessing the risks of projects in general, but most specifically of projects carried out in the fields of information systems, high technology, and scientific research.
E2 Fracture Mechanics and Risk Analysis
Precise computation of stresses by Finite Element Methods and the application of high-strength structural alloys in modern structures have resulted in weight savings and reduced safety factors. Fracture mechanics aims to limit operating stresses through design and to guarantee the expected service life of a structure containing flaws.
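The central quantity in this approach (standard linear-elastic fracture mechanics, quoted for reference) is the stress intensity factor of a flaw of size a under an applied stress sigma,
  \( K_I = Y \sigma \sqrt{\pi a} \),
where Y is a geometry factor of order one; the design requirement K_I < K_Ic, with K_Ic the fracture toughness of the material, bounds the allowable operating stress for the largest flaw that could escape inspection.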
E3 Surface Engineering and Surface Analysis
A review of surface engineering techniques (treatments of the surface and near-surface regions to perform functions that are distinct from those of the bulk material), coating techniques, and thin-film analysis and characterization, with a focus on HEP applications.
E4 Real Time Process Control
After an introduction to the hardware and programming of microcomputers for the surveillance and control of processes in real time, the term "real-time system" and the hardware structures typical of such systems will be discussed in detail.
A major point of focus is the critical assessment of optimal hardware and software solutions for a given simple problem related to the handling of real-time processes. The lecture may be subdivided into five parts:
  1. Definition of "real-time processes".
  2. Realisation of real-time systems.
  3. Hardware for real-time systems.
  4. Interrupt handling.
  5. Programming of real-time systems.
This lecture is neither a training course on PLCs or other peripheral devices, nor will it favour a particular type of software or operating system. It will, however, try to give in-depth knowledge about how real-time process control can best be conceived. By understanding the basic requirements for real-time process control, attendees will be better able to assess a real-time process control system that is about to be purchased or proposed for a particular installation or process.
E5 Applications of Finite Element Methods in High Energy Physics Equipment Design
Finite Element Analysis (FEA) is a computer based method for simulating or analysing the behaviour of engineering structures or components. By being able to predict the performance of a proposed design (mechanical, thermal, magnetic and electrical), FEA can assist in the development of research apparatus for high energy physics, providing engineering information for a more efficient and safer design, which cannot be obtained by traditional means.
E6 Nanotechnology — Manipulating Atoms
Today's manufacturing methods are very crude at the molecular level. Casting, grinding, milling and even lithography move atoms in great thundering statistical herds.
In the future, nanotechnology will let us take off the boxing gloves. We'll be able to snap together the fundamental building blocks of nature easily, inexpensively and in almost any arrangement that we desire. This will be essential if we are to continue the revolution in computer hardware beyond about the next decade, and will also let us fabricate an entire new generation of products that are cleaner, stronger, lighter, and more precise.
It is worth pointing out that the word "nanotechnology" has become very popular and is used to describe many types of research where the characteristic dimensions are less than about 1000 nanometers. For example, continued improvements in lithography have resulted in line widths that are less than one micron: this work is often called "nanotechnology." Sub-micron lithography is clearly very valuable (ask anyone who uses a computer!) but it is equally clear that lithography will not let us build semiconductor devices in which individual dopant atoms are located at specific lattice sites. Many of the exponentially improving trends in computer hardware capability have remained steady for the last 50 years. There is fairly widespread confidence that these trends are likely to continue for at least another ten years, but then lithography starts to reach its fundamental limits.
If we are to continue these trends we will have to develop a new "post-lithographic" manufacturing technology which will let us inexpensively build computer systems with mole quantities of logic elements that are molecular in both size and precision and are interconnected in complex and highly idiosyncratic patterns. Nanotechnology will let us do this.
The lectures will give an overview of the present status of nanotechnology both in fundamental and applied research.

R0 Other Topics in Applied Physics

R1 LHC technologies
The LHC will be, upon its completion in 2006 and for the next 20 years, the most advanced research instrument for high energy physics, providing access to the energy frontier above 1 TeV per elementary constituent. The LHC will make use of advanced superconducting technology — high-field Nb-Ti superconducting magnets operated in superfluid helium and a cryogenic ultra-vacuum system — to bring into collision intense beams of protons and ions at unprecedented values of centre-of-mass energy and luminosity.
After recalling the physics goals, performance challenges and design choices of the machine, the course will focus on the characteristics of the major technical systems, with particular emphasis on the relevant advances in the key technologies of superconducting and cryogenic devices.
R2 Modern Geodesy: Alignment of Machine and Detector Elements
Modern survey techniques, principles, instrumentation, and applied geodesy, with their application in the alignment of accelerator and detector elements, will be presented in the course. A synopsis of the different ways to define a reference system (from Global Geodetic to Local Object Systems) will also be included. Examples will be taken from the situation at CERN, including the evolution of the CERN Coordinate System (PS to SPS to LEP to LHC/CNGS) and the update of the vertical geodetic reference surface (datum), currently in progress. Subjects such as reference targets for machine and detector elements and the tracking of ground motion will be treated, and the issues associated with very long neutrino beam lines will be presented.
R3 Radiation Protection at High Energy Accelerators
Operating high-energy accelerators inevitably leads to prompt radiation, to activation of accelerator components, detector and shielding materials and to activation of the environment (including rock and earth), of cooling and of ground water and air. Measures must be taken to protect man and the environment as well as radiation sensitive equipment used in the accelerator construction.
During the construction phase of an accelerator, predictions are required of all parameters likely to cause exposure situations to ensure that radiation levels remain below internationally accepted limits. Monte-Carlo cascade calculations are the basic tool. From the results of radiation transport simulations, predictions can be made of shielding requirements and measures can be taken to mitigate the dangers of exposures to high levels of induced radioactivity for maintenance personnel. The quantities of radioactive air and water produced can be used as the source terms for environmental dispersion calculations in order to assess the exposure of persons living near the facility.
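As a schematic illustration of the Monte-Carlo method (a deliberately simplified toy in Python, not one of the production cascade codes; the attenuation length and shield thickness are illustrative numbers): the fraction of particles traversing a shield without interacting can be estimated by sampling exponentially distributed interaction depths.

  import math
  import random

  ATT_LENGTH = 117.0   # illustrative attenuation length, g/cm^2
  THICKNESS = 500.0    # illustrative shield thickness, g/cm^2
  N = 100_000          # number of sampled particles

  # Sample the depth of the first interaction from an exponential
  # distribution and count particles that cross the whole shield.
  passed = sum(1 for _ in range(N)
               if random.expovariate(1.0 / ATT_LENGTH) > THICKNESS)

  print(f"MC transmission:     {passed / N:.2e}")
  print(f"analytic exp(-x/L):  {math.exp(-THICKNESS / ATT_LENGTH):.2e}")

A production code additionally tracks the secondary particles produced in each interaction, which is what turns this one-line estimate into a full cascade simulation.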
During operation of accelerators and associated facilities (including shutdown periods), the main objective of radiation protection is to maintain checks on the levels of radiation using area, environmental and personal monitors and to ensure that exposures are kept at a minimum.
Finally, accelerators will have to be dismantled and radioactive components disposed of as waste in a manner that is safe for future generations.
This series of lectures will provide an overview of the physical phenomena causing prompt and remanent radiation fields at high-energy particle accelerators, including the propagation of cascades induced by beam particles in matter. The implications for shielding, design, and operational constraints are discussed, especially in view of current trends to significantly increase beam intensities in future facilities. Various aspects of operational radiation protection, based on CERN's own and the host states' regulations, are explained and discussed within the general perspective of industrial safety and hygiene. In particular, the three basic principles of justification, optimisation and limitation of exposure (and thus risk) will be discussed.
The lectures will also provide a résumé of current knowledge of effects induced by ionising radiation in biological matter, in particular in humans. It will be shown how this knowledge is used to assess radiation risks and how regulations, including dose limits, are derived.

O0 Physics and Society

O1 Patents: from ideas to royalties
The early identification and protection of intellectual property generated by the people working at CERN is essential to promote the idea that the Organization is a Centre of Excellence for Technology. In addition to its scientific reputation in High Energy Physics, CERN must seek wider public recognition. Means such as the promotion and transfer of technologies to industry and to domains of high importance for society, such as medicine and energy, are used to reach this goal. Published patents, copyrights, licenses and trademarks are the widely accepted means of communicating technical innovation to industry and society. As stated in CERN/FC/4126, the reasons for CERN to pursue such a proactive policy include:
  • “to ensure that CERN’s technical work and expertise are available to industry in its Member States, as far as is consistent with its scientific mission,
  • to make sure that the interest and usefulness of CERN’s technological work is widely understood,
  • to attract the best industrial partners, suppliers and individual collaborators,
  • to keep CERN at the forefront of relevant technologies.”
In addition, the identification and recognition of intellectual property is essential for the assessment of the contributions of individuals to the scientific program of the Organization.
After an introduction on intellectual property and property rights, the overall patenting process will be addressed starting from the identification and assessment of in house technologies, covering the licensing mechanism and strategy, concluding with the exploitation of the revenues generated by such a process. Other mechanisms such as collaboration agreements resulting from the joint exploitation of intellectual property with other research institutes and with industry will also be addressed and placed in the context of the overall technology transfer strategy.
O2 Energy concepts for the 21st century (2)
The lecture series held in spring 2000 provided an overview of the energy problem in relation to demography, risk evaluation, waste and cost, and discussed alternative energy sources such as the energy amplifier, americium and wind energy.
The new lecture series will complement it by addressing solar energy, ocean-stream energy, energy storage, and classical fossil energy, with its risks and the possibility of its replacement.
O3 Physics Experiments in the Space Program
Laboratory experiments have been performed on board satellites. What has been the outcome in terms of new results, precision, and scientific interest?
O4 Meteorites
The word meteorite inspires fear and curiosity. Several movies have portrayed meteorites as carriers of catastrophe and disaster, but not everybody is afraid of them. They are fascinating objects, full of information, studied by hundreds of scientists around the world.
This seminar will present the techniques used to analyze their composition and history, as well as the traces that they have left on our planet and on other planets. We will see Raman spectroscopy, mathematical models for simulating meteorite impacts, and many other techniques.
O5 Concepts for transport in the 21st century
Will future transport systems be safer and less energy hungry?
During the 19th and the first half of the 20th century, railroads were constructed and took a major part in land transport. During the 20th century, road transport took over; but at the start of the 21st century the limits of road traffic are becoming apparent: negative effects on the environment from combustion engines, a heavy death toll every year from transport accidents, overloading of the road network, and increasing risks and jams at mountain tunnels and bridges.
How do traffic experts see the picture for transport in the coming decades?
O6 Complex systems, chaos and prediction
or
«Complex systems, chaos, and the interface between physics and social science»

The analysis of complex systems represents a great challenge in many areas of pure and applied science. Different techniques have been developed during the past few years, including mechanisms for generating deterministic chaos, which have produced new theoretical insights. These have improved our understanding of data analysis in general, and of the prediction of phenomena in environmental and economic sciences in particular.

