
Commissioning of the ATLAS detector and combined beam test results



HAL Id: in2p3-00117272

http://hal.in2p3.fr/in2p3-00117272

Submitted on 30 Nov 2006

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.


Commissioning of the ATLAS detector and combined beam test results

P. Perrodo

To cite this version:

P. Perrodo. Commissioning of the ATLAS detector and combined beam test results. X Pisa Meeting on Advanced Detectors: Frontier Detectors for Frontier Physics, May 2006, La Biodola, Italy. pp.113-116, ⟨10.1016/j.nima.2006.10.286⟩. ⟨in2p3-00117272⟩


P. Perrodo, LAPP-IN2P3-CNRS/CERN, on behalf of the ATLAS collaboration

Commissioning of the ATLAS detector and combined test beam results

Xth Pisa Meeting, May 21-27 2006, Isola d’Elba


ATLAS road map

Timeline 2005-2008:
- Detector installation
- Combined test beam (1% of ATLAS)
- Integration, from detector to off-line; cosmic runs
- Global cosmic run
- First beams


Installed today

Liquid argon (LARG) and TILES calorimeters:
- All in the cavern; the barrel LARG cryostat is cold.
- The barrel front-end electronics are all in place, except the power supplies; only a small fraction of the readout is possible so far.
- The readout will grow in size with more power supplies.

Muon spectrometer (barrel and forward):
- Chambers under installation.
- A very small fraction of the readout is available; the forward wheels come next.

Inner detector (SCT+TRT, pixels later):
- The SCT+TRT barrel is integrated and has been tested on the surface.
- A good fraction of the readout is present and already in use.
- A long installation in the cavern follows.

Magnets:
- Solenoid: ~cold.
- Barrel toroid: pumping.
- Endcap toroid: end of September 2006.


ATLAS combined test beam

(Figure: the ATLAS combined test beam setup, with the x, y and z axes indicated.)


Test beam results with muons

Tracking: Muon Spectrometer - Inner Detector
- Back-extrapolation of a muon track to the inner detector (Pixel + SCT).
- Validation of the reconstruction software.
- Alignment procedure.
- Measurement of the tracking performance.
- Linear fit of Z(ID) versus Z(MS), in mm: slope = 1.02 ± 0.04, offset = -7.95 ± 0.42 (a minimal fit sketch follows below).

Performance: Calorimeters - Muon Spectrometer
- Muons (~300 GeV) undergoing bremsstrahlung in the calorimeters.
- Validation of the reconstruction software.
- Evaluation of the calorimeter - muon spectrometer inter-calibration.
- Quality of the simulation (Geant4).
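To make the alignment check above concrete, here is a minimal straight-line fit in the spirit of the Z(ID) versus Z(MS) comparison; the data points are synthetic, generated from the slide's slope and offset plus arbitrary noise, so only the fitting procedure itself is meant to be representative.

```python
# Minimal sketch of a Muon Spectrometer - Inner Detector alignment fit.
# The matched z positions below are SYNTHETIC (built from the slide's
# slope/offset plus noise); only the straight-line fit is the point.
import numpy as np

rng = np.random.default_rng(0)
z_ms = rng.uniform(-300.0, 300.0, 200)                  # back-extrapolated MS track z [mm]
z_id = 1.02 * z_ms - 7.95 + rng.normal(0.0, 5.0, 200)   # matched ID measurement [mm]

# Least-squares fit z_id = slope * z_ms + offset, with parameter uncertainties
(slope, offset), cov = np.polyfit(z_ms, z_id, deg=1, cov=True)
slope_err, offset_err = np.sqrt(np.diag(cov))
print(f"slope  = {slope:.2f} +/- {slope_err:.2f}")
print(f"offset = {offset:.2f} +/- {offset_err:.2f} mm")  # a non-zero offset signals a relative misalignment
```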


Detector integration

Exercise various combinations of ATLAS sub-systems:
- Detectors, DAQ and online databases; likewise with the DCS (slow control).
- Calorimeters and the calorimeter trigger: calibration, integration of the trigger.

Magnets:
- Barrel toroid tests: exercise all installed detectors, functional tests, cosmic runs.
- Solenoid mapping: exercise the calorimeter electronics.

Functional tests: assess the performance in operation, new errors, recovery procedures, stability of the data taking, calibration procedures.

Cosmic runs: data taking, online monitoring, full analysis chain exercised; detector studies (bad channels).

Repeat the exercise as the readout system grows in size.


Detector schematics

(Schematic: two detectors, each with its front-end electronics and local DAQ, are driven through TTCvi/LTP modules (Local Trigger Processors, possibly chained). The trigger logic with its L1 receivers, possibly taking a cosmic L1A, the central DAQ and the slow control (DCS) connect them. Configuration and conditions databases load and store parameters, events go to data storage, and the offline software, data analysis and online monitoring complete the chain.)

On-line databases:
- COOL as conditions DB (a toy sketch of the interval-of-validity idea follows below).
- Interfaces between COOL and PVSS.
- Configuration from ORACLE to PVSS.
- Various choices for configuration/conditions according to the features of the detectors.
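To make the conditions-database idea concrete, here is a toy sketch of interval-of-validity (IOV) storage of the kind a PVSS-to-COOL bridge populates; it is illustrative Python only, not the actual COOL/PyCool or PVSS API, and the folder and channel names are invented.

```python
# Toy interval-of-validity (IOV) conditions store -- illustrates the concept only,
# NOT the real COOL API.  A PVSS->COOL bridge would write (since, until, payload)
# records like these for each monitored channel (e.g. an HV setting).
from dataclasses import dataclass, field

@dataclass
class CondRecord:
    since: int        # start of validity (e.g. run number or timestamp)
    until: int        # end of validity (exclusive)
    payload: dict     # the conditions data, e.g. {"hv": 2000.0, "status": "OK"}

@dataclass
class CondFolder:
    name: str
    records: dict = field(default_factory=dict)   # channel id -> list[CondRecord]

    def store(self, channel: int, since: int, until: int, payload: dict) -> None:
        self.records.setdefault(channel, []).append(CondRecord(since, until, payload))

    def retrieve(self, channel: int, when: int) -> dict:
        """Return the payload whose interval of validity contains 'when'."""
        for rec in self.records.get(channel, []):
            if rec.since <= when < rec.until:
                return rec.payload
        raise KeyError(f"no conditions for channel {channel} at {when}")

# Hypothetical usage: an HV value valid for a range of timestamps
folder = CondFolder("/LAR/DCS/HV")    # invented folder name
folder.store(channel=42, since=1000, until=2000, payload={"hv": 2000.0, "status": "OK"})
print(folder.retrieve(channel=42, when=1500))
```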


Commissioning of the 150 ROS (Read-Out System) PCs to be completed this year: LARG, TILES, L1 muons.

Pre-series tests with ~10% of the full system, all functionalities present.

Q3 2006: 32 SFIs, 12 DFMs (DataFlow Managers), 2 L2SVs (LVL2 Supervisors), plus the switch.

Modular system: more PCs and switch cards will arrive between the end of 2006 and May 2007.

DAQ components

(Figure: the ATLAS trigger/DAQ architecture, exercised here with TILES + DAQ/HLT. The ~1600 Read-Out Links bring the data of events accepted by the first-level trigger from the Read-Out Drivers (RODs) in the cavern (UX15) to ~150 Read-Out Subsystem (ROS) PCs. Regions of Interest go through the RoI Builder to the LVL2 Supervisor, which steers a LVL2 farm of ~500 dual-CPU nodes on the surface; the DataFlow Manager and ~100 SubFarm Inputs (SFIs) perform event building over 10-Gigabit Ethernet network switches, exchanging event data requests, requested event data and delete commands, while a pROS collects the LVL2 results. The Event Filter (EF), ~1600 dual-CPU nodes, processes the built events, and ~30 SubFarm Outputs (SFOs) write accepted events to local storage at an event rate of ~200 Hz. Timing, Trigger and Control (TTC) signals run over dedicated links and VME.)


Event Builder only: comparison of the pre-series with the model in various configurations.


DAQ pre-series results

Combined system performance from measurements and model.

(Plot: Event Builder rate (Hz) versus the number of L2PUs, comparing the pre-series measurements with the at2sim model at a 3.5% accept ratio; the plateau corresponds to the number of events/s per SFI.)

Event Builder + Level 2 with a dummy L2 algorithm, 8 ROS and 8 SFIs: comparison of the pre-series with the model in various configurations (a toy saturation model is sketched below).
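To illustrate the shape of such a curve, here is a minimal toy model of the Event Builder rate versus the number of L2PUs. Only the 3.5% accept ratio and the 8-SFI setup come from the slide; the per-L2PU decision rate and per-SFI building capacity are hypothetical, so this is a sketch of the scaling argument, not the at2sim model itself.

```python
# Toy model: Event Builder rate vs number of L2PUs.
# Accept ratio (3.5%) and the 8-SFI setup come from the slide; the per-L2PU
# decision rate and per-SFI event-building capacity below are ASSUMED.
ACCEPT_RATIO = 0.035         # fraction of LVL2-accepted events (slide)
L2PU_DECISION_RATE = 6000.0  # Hz, decisions per second per L2PU (assumed)
N_SFI = 8                    # SubFarm Inputs in the pre-series test (slide)
SFI_BUILD_RATE = 600.0       # Hz, events built per second per SFI (assumed)

def eb_rate(n_l2pu: int) -> float:
    """Event Builder rate: limited either by the LVL2 accept rate or by the SFIs."""
    l2_accept_rate = n_l2pu * L2PU_DECISION_RATE * ACCEPT_RATIO
    return min(l2_accept_rate, N_SFI * SFI_BUILD_RATE)

for n in (2, 8, 16, 24):
    print(f"{n:2d} L2PUs -> EB rate ~ {eb_rate(n):6.0f} Hz")
# Small farms are LVL2-limited (linear rise); large farms saturate at N_SFI * SFI_BUILD_RATE,
# which is why the plateau reflects the number of events/s per SFI.
```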


Long runs (up to 24 hours): 8 ROS, 8 SFIs, 20 L2PUs, dummy algorithms. Stability observed.


DAQ pre-series results

(Plot: Event Builder rate (Hz) versus time (min) for the 8x8x20 configuration, TS=14, Acc=1; the rate is stable over the full run.)

Test with the Event Filter: real algorithms (Online 10.0.06) with a ROS emulation sending Geant events.


DCS components

(Figure: the DCS hierarchy. Global Control Stations (GCS) at the top provide the operator interface, data viewer, alarm status and web access, together with DCS_IS, the CIC and connections to the DAQ, the databases and the CERN infrastructure (LHC, DSS, magnets) via DIP. Subdetector Control Stations (SCS) for Tile, Pixel, SCT, TRT, MDT, TGC, RPC, CSC and LAr sit on the LAN below them. Each subdetector runs Local Control Stations (LCS) per partition (e.g. LCS EBA, BA, BC and EBC), which connect through PVSS and OPC servers over CAN fieldbus to the front-end systems: cooling, LV, HV (HEC and barrel HV), temperatures, FE crates and purity.)


Grouping of Sub-System Partitions

(Figure: grouping of sub-system partitions around the CTP (Central Trigger Processor). Each partition — Pixels, SCT, TRT, the L1 calorimeter trigger, LAr barrel, LAr EMEC, LAr HEC/FCAL, Tile barrel, Tile extended barrel, MDT barrel, MDT end-cap, RPC, TGC-A, TGC-C and CSC — has its own LTP (Local Trigger Processor) and LTP interface; a cosmic trigger signal can also be fed in.)

Partitioning of the detector, used for:
- Commissioning
- Calibration of the detectors during the LHC inter-fills

Independent groups of partitions can be run thanks to a special interface board; this provides large flexibility for commissioning. The CTP can receive cosmic trigger signals.


Level 1 trigger

(Figure: the Level-1 trigger system, with the calorimeter trigger and the muon barrel and end-cap triggers feeding the MuCTPI and the CTP.)

Integration milestones:
- June 06: final RODs; run with the lower sector, CTP, HLT cosmics; CTP in place; ROS integration, RoIBuilder and RPC.
- July 06: RoIBuilder, HLT; electronics for the TGC trigger (M1-C); configuration databases, combined running with the RPCs and with the HLT.
- Aug 06: add the calorimeters.
- Sept 06: CTP integration; first final sector logic.


Cosmic runs

Exercise the full functionality:
- Configuration DB, trigger, DAQ, slow control, HLT, on-line monitoring, event display, control room, shifts.
- Full calibration procedures; treatment of the bad channels.

Detectors available:
- LARG barrel, TILES barrel (limited readout), L1CALO trigger.
- Muon spectrometer lower sector.

Physics goals:
- Amplitude inter-calibration.
- Timing studies.
- Bad-channel characterization.


Cosmic trigger with the TILES calorimeter

(Figure: four groups of TILES super-drawers (A, B, C, D), 12 super-drawers (SD) each, on the top and bottom of the barrel, combined through four coincidence boards.)

Estimated rate well below 1 Hz (accidental coincidence probability ~400 Hz x 100 ns ~ 1E-5); a short numerical check is sketched below.

Goal for June: 2 x 8 SD on top and 2 x 8 SD on the bottom.
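As a rough cross-check of that estimate, here is the back-of-the-envelope accidental-coincidence arithmetic; the ~400 Hz singles rate and 100 ns window come from the slide, while assigning the same singles rate to both the top and bottom groups is an assumption made here.

```python
# Rough accidental-coincidence estimate for the TILES cosmic trigger.
# Numbers marked "slide" are taken from the slide; the split between top and
# bottom groups is an assumption for illustration.
singles_rate_top = 400.0      # Hz, singles rate of the top group (slide, ~400 Hz)
singles_rate_bottom = 400.0   # Hz, assumed equal for the bottom group
coincidence_window = 100e-9   # s, coincidence gate (slide, 100 ns)

# Probability that a bottom hit falls inside the gate opened by a top hit
p_accidental = singles_rate_bottom * coincidence_window   # ~4e-5, the ~1E-5 scale quoted
accidental_rate = singles_rate_top * p_accidental          # ~0.016 Hz

print(f"accidental probability ~ {p_accidental:.1e}")
print(f"accidental coincidence rate ~ {accidental_rate:.3f} Hz (well below 1 Hz)")
```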


Cosmic runs

Internal LARG inter-calibration:
- Amplitude versus η.
- Needs 40,000 muons/cell for 0.5% precision.
- Rate: 0.04 Hz projective (0.15 Hz non-projective); even 100 muons/cell means ~100 days of DAQ (see the sketch below).
- But the timing can already be understood at the 0.6 ns level.
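The statistics behind those numbers can be sketched as follows; the per-muon spread is inferred from the slide's own figures, while the cell count is a hypothetical value introduced only to show how a ~100-day estimate can arise from the 0.04 Hz projective rate.

```python
# Back-of-the-envelope statistics for the cosmic-muon LARG inter-calibration.
# Inputs marked "slide" are taken from the slide; the cell count is an ASSUMPTION
# introduced here only to show how the ~100 days figure can arise.
import math

target_precision = 0.005        # slide: 0.5% on the mean response per cell
n_for_target = 40_000           # slide: muons/cell needed for that precision
implied_spread = math.sqrt(n_for_target) * target_precision
print(f"implied per-muon relative spread ~ {implied_spread:.0%}")    # ~100% (Landau-like)

projective_rate = 0.04          # Hz, slide: useful (projective) cosmic-muon rate
n_cells = 3500                  # ASSUMED number of cells sharing that rate
muons_per_cell = 100            # slide: a first, modest goal
days = muons_per_cell * n_cells / projective_rate / 86400
print(f"~{days:.0f} days of DAQ for {muons_per_cell} muons/cell")    # ~100 days
```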

TILES response to MIPs:
- Measured at the combined test beam and in the cavern (different LV power supplies, hence different noise).


Cosmic runs

LARG-TILES inter-calibration:
- Response to MIPs at the combined test beam, compared to simulation, per layer and in total: 3% agreement.

TILES timing:
- From the comparison of various cells, a time resolution of 1.7 ns is found.
- The TILES time difference (top-bottom), after correction, agrees with the geometrical estimate to a precision of 1.8 ns.


Some rates

TILES:
- 1/16 of the barrel: 1 GB/day.

LARG:
- Electronic calibration ramps (100,000 channels in the barrel): 5.2 TB in transparent mode, 42 GB when averaged locally in the LARG DAQ.
- Calibration signal recording: 650 GB.

Cosmics at 10 Hz:
- TILES: 1.4 MB/s.
- LARG: 15 MB/s, for a maximum recording bandwidth of 20 MB/s.

Muons (lower sector):
- 20 Hz at 2 kB/event.

Autumn 06: the end-cap calorimeters come in; the Event Builder is needed to take the data flow (see the estimate below).
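Summing these rates shows how close the cosmic data flow already is to the 20 MB/s recording limit quoted above; the per-event sizes are simply derived from the slide's numbers and are indicative only.

```python
# Cosmic data-flow estimate from the rates quoted on the slide.
cosmic_rate_hz = 10.0

tile_bw = 1.4e6            # B/s at 10 Hz  -> ~140 kB/event
larg_bw = 15.0e6           # B/s at 10 Hz  -> ~1.5 MB/event
muon_bw = 20.0 * 2.0e3     # 20 Hz x 2 kB/event (lower sector)

total_bw = tile_bw + larg_bw + muon_bw
print(f"TILES event size ~ {tile_bw / cosmic_rate_hz / 1e3:.0f} kB")
print(f"LARG  event size ~ {larg_bw / cosmic_rate_hz / 1e6:.1f} MB")
print(f"Total ~ {total_bw / 1e6:.1f} MB/s of a ~20 MB/s recording limit")
# Adding the end-cap calorimeters in autumn 2006 pushes this beyond a single
# recording path, hence the need for the Event Builder.
```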


Barrel toroid test

- Test the barrel toroid (BT) as a separate object.
- Exercise everything already installed in the presence of the magnetic field; the infrastructure (LV, gas, cooling) is already installed around the detector.
- Operate the front-end electronics: LARG, TILES calorimeter, barrel muon (MDT, RPC) chambers.
- Muon spectrometer: alignment system, precise measurement of the field (B to 1-2 mT, for an integral ∫B·dl to 4x10^-3), effect of the surrounding structures.
- Take cosmics with the muon spectrometer, TILES and LARG.
- Similar issues apply for the solenoid mapping.

Run  Goal                        Current [kA]  Ramp time [h]  Total [h]  Recovery [days]
1    test at low current         5             0.3            3.9        --
2    test at 1/4 of full energy  10            0.7            2.9        --
3    test at 1/2 of full energy  15            1.0            3.6        --
4    test at 3/4 of full energy  18            1.3            3.5        --
5    test at full energy         20.5          1.4            3.8        --
6    fast dump, low current      5             0.3            0.9        ??
7    fast dump (quench)          15            1.0            1.3        ??
8    steady-state test           20.5          1.4            11.8       --
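The current steps in the table map onto the quoted energy fractions through the quadratic dependence of the stored magnetic energy on current, E = (1/2) L I^2; a one-line check:

```python
# The "fraction of full energy" labels in the table follow from E = (1/2) L I^2,
# i.e. the stored energy scales as the square of the current.
I_FULL = 20.5   # kA, full current from the table

for current_ka, label in [(5, "low current"), (10, "1/4"), (15, "1/2"), (18, "3/4"), (20.5, "full")]:
    fraction = (current_ka / I_FULL) ** 2
    print(f"{current_ka:5.1f} kA -> {fraction:.2f} of full stored energy ({label})")
# 10 kA -> 0.24, 15 kA -> 0.54, 18 kA -> 0.77, matching the 1/4, 1/2, 3/4 labels.
```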


Global cosmic run to first beams

Toward a global cosmic run (spring 2007)

Integrate the detectors and systems as they come and as they grow in size; debug the full chain from shifts to data analysis.

Cosmics can be used for the barrel part; their use for the end-caps is under investigation. Beam-gas events can be used for the end-caps: alignment, timing, inter-calibration. Run at a high L1 trigger rate with real events: a DAQ challenge.

Very first collisions: detector debugging and performance. With ~10-100 pb^-1, ~10^4 Z->ee and Z->μμ events, and also tt->blν bjj: trackers, calorimeters, muon alignment, jet energy scale and b-tagging (an illustrative yield estimate is sketched below). A lot of work and fun is coming!
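To put the ~10^4 figure in context, a rough yield estimate is N = L x σ x ε; the cross-sections and the overall efficiency used below are round values assumed for illustration, not numbers from the talk.

```python
# Rough event-yield estimate: N = integrated luminosity x cross-section x efficiency.
# Cross-sections and the selection efficiency are ballpark ASSUMPTIONS for 14 TeV.
lumi_pb = 10.0                        # pb^-1, low end of the 10-100 pb^-1 range
xsec_pb = {                           # sigma x BR, approximate assumed values [pb]
    "Z->ee":   2000.0,                # ~2 nb per lepton flavour
    "Z->mumu": 2000.0,
    "ttbar":    800.0,                # inclusive ttbar
}
efficiency = 0.3                      # assumed trigger x selection efficiency

for channel, sigma in xsec_pb.items():
    n_events = lumi_pb * sigma * efficiency
    print(f"{channel:8s}: ~{n_events:,.0f} events with {lumi_pb:.0f} pb^-1")
# The Z->ll channels give thousands to tens of thousands of events over the
# 10-100 pb^-1 range, consistent with the ~10^4 quoted on the slide.
```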

Thanks to: L. Chevalier, R. Nikolaidou, L. Pontecorvo, R. Teuscher, G. Unel, Th. Wengler
