Fermilab

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

MEMORANDUM OF UNDERSTANDING

 

Between the

 

MINOS Experiment and the Computing Division

 

 

 

 

June, 2000

 

 

 

 

 

 

 

 

TABLE OF CONTENTS

 

INTRODUCTION

I. PERSONNEL AND INSTITUTIONS

II. FERMILAB COMPUTING DIVISION

III. SPECIAL CONSIDERATIONS

SIGNATURES

APPENDIX I - PREP REQUEST

APPENDIX II - SUPPORT FOR NON-PREP ELECTRONICS

APPENDIX III - DAQ SYSTEM DESCRIPTION & REQUEST

APPENDIX IV - DATABASE RELATED REQUEST

APPENDIX V - NETWORKING RELATED REQUEST

APPENDIX VI - COMPUTING OFF-LINE ANALYSIS MODEL

APPENDIX VII - NUMI BEAM MONITORING DAQ REQUEST

 

INTRODUCTION

 

 

This is a memorandum of understanding between the Fermi National Accelerator Laboratory Computing Division and the experimenters of MINOS (E-875). The memorandum is intended solely for the purpose of providing a budget estimate and a work allocation for Fermilab, the funding agencies and the participating institutions. It reflects an arrangement that currently is satisfactory to the parties; however, it is recognized and anticipated that changing circumstances of the evolving research program will necessitate revisions. The parties agree to negotiate amendments to this memorandum which will reflect such required adjustments.

 

I. PERSONNEL AND INSTITUTIONS

 

Spokesperson: S. Wojcicki

Deputy Spokesperson: D. Ayres

MINOS Project Manager: A. Byon-Wagner

 

MINOS Computing Off-line liaison: J. Urheim

MINOS Computing On-line liaison: G. Pearce

 

Relevant MINOS System Managers:

Front-End Electronics J. Thron, Argonne

(Institutions: Harvard, Oxford, Argonne, Fermilab/PPD)

Trigger and DAQ System G. Pearce, RAL

(Institutions: RAL)

Database P. Border, Minnesota

(Institutions: Minnesota, IHEP-Protvino, Fermilab)

Detector Control System M. Marshak, Minnesota

(Institutions: Wisconsin, Minnesota)

PREP Electronics A. Para

(Institutions: Fermilab)

Computing at Soudan Lab: J. Meier

(Institutions: Soudan Mine/Minnesota, Fermilab)

 

 

II. FERMILAB COMPUTING DIVISION

    2.1 The Computing Division liaison is E. Buckley-Geer.

    2.2 The attached off-line analysis plan contains the experiment's present understanding of its analysis model from code development, through production, stripping, final data analysis and Monte Carlo. A more detailed quantitative description is given in Appendix VI. The Computing Division cannot guarantee, at this time, that these resources can be made available. The Computing Division, guided by priorities set by management, will attempt to allocate the available resources on a quarterly basis. The present request and amendments will be used in attempting to plan the laboratory's computing acquisition strategies.

    2.3 A plan for short-term test stands is currently being developed and a list based on the current understanding has been included. MINOS will negotiate an amendment to this agreement with the Computing Division as the plan is finalized. The number and type of test stands will also be the subject of an amendment to this MOU.

    2.4 A plan for the NuMI Beam Monitoring DAQ system is currently being developed. MINOS will submit an amendment as Appendix VII as the design and plans are finalized.

    2.5 Cost Accounting, FY00 and beyond: The Fermilab Computing Division will maintain budget codes specific to the MINOS experiment, and all MINOS-associated expenditures will be charged to those codes. Copies of the appropriate Laboratory Accounting System budget reports (2X-L4, ACT022, ACT070, ACT041) for all MINOS-specific budget codes will be supplied to the MINOS Project Management on a monthly basis.

 

 

* Summary of estimated resources needed

                                      Cost (K$)                   Manpower (Hours)
Task                                  FY01    FY02    >FY03       FY01    FY02    >FY03
PREP request
Support for non-PREP electronics
DAQ system request
Database related request
Networking related request
Computing Off-line
NuMI Beam Monitoring DAQ
Total

 

 

III. SPECIAL CONSIDERATIONS

 

    3.1 For the purpose of estimating budgets, specific products and vendors may be mentioned within this memorandum. At the time of purchasing, the Fermilab procurement policies shall apply. This may result in the purchase of different products and/or purchases from different vendors.

    3.2 The experiment spokesperson will undertake to ensure that no PREP or computing equipment is transferred from the experiment to another use except with the approval of, and through the mechanism provided by, the Computing Division management. He/she also undertakes to ensure that no modifications of PREP equipment take place without the knowledge and consent of the Computing Division management.

    3.3 Each institution will be responsible for maintaining and repairing both the electronics and the computing hardware supplied by them for the experiment. Any items for which the experiment requests that Fermilab perform maintenance and repair should appear explicitly in this agreement.

    3.4 If the experiment brings to Fermilab or the Soudan Laboratory on-line data acquisition or data communications equipment to be integrated with Fermilab-owned equipment, early consultation with the Computing Division is advised.

    3.5 At the completion of the experiment: The spokesperson is responsible for the return of all PREP equipment, computing equipment and non-PREP data acquisition electronics. If the return is not completed within one year after the end of running, the spokesperson will be required to furnish, in writing, an explanation for any non-return.

 

 

 

SIGNATURES

 

 

 

 

 

 

 

_________________________________

M. Shaevitz, Fermilab Associate Director for Research

 

 

 

 

_________________________________

M. Kasemann, Head of Fermilab Computing Division

 

 

 

 

 

________________________________

S. Wojcicki, MINOS Spokesperson

 

 

 

 

 

________________________________

A. Byon-Wagner, MINOS Manager

 

 

 

APPENDIX I - PREP REQUEST

 

 

Type 1: The following equipment will be in use until the completion of the experiment.

    1. LeCroy 1440 HV system for Near and Far detectors
    2. Test equipment needed at Near and Far detector halls

Location: Detector Hall at Fermilab

LRS 1440 system

Quantity    CLASS_NAME    TC
1           LRS: 1441(POWER SUPPLY,LV,1440 SYS    FA
2           LRS: 1442(POWER SUPPLY,LV,1440 SYS    FA
16          LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
1           LRS: 1449M(MAINFRAME,HV,1440 SYS    FB
1           LRS: 1445(CONTROLLER,HV,1440 SYS    FI
1           LRS: 222(GENERATOR,GATE,2CH,NIM    AN
1           LRS: 2132(INTERFACE,HV,CAMAC    FB

Test equipment

Quantity    CLASS_NAME
2           Oscilloscope (plus cart, probes)
1           Digitizing scope (plus cart, probes)
3           Digital voltmeter
2           Pulse generator
2           pocket pulser

CAMAC crates

Quantity    CLASS_NAME
2           CAMAC Crate
2           GPIB CAMAC Crate Controller

 
       
       

Location: Far Detector Hall at Soudan

LRS 1440 system

Quantity    CLASS_NAME    TC
9           LRS: 1441(POWER SUPPLY,LV,1440 SYS    FA
18          LRS: 1442(POWER SUPPLY,LV,1440 SYS    FA
108         LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
9           LRS: 1449M(MAINFRAME,HV,1440 SYS    FB
9           LRS: 1445(CONTROLLER,HV,1440 SYS    FI
5           LRS: 222(GENERATOR,GATE,2CH,NIM    AN
1           LRS: 2132(INTERFACE,HV,CAMAC    FB

Test equipment

Quantity    CLASS_NAME
2           Oscilloscope (plus cart, probes)
1           Digitizing scope (plus cart, probes)
3           Digital voltmeter
2           Pulse generator
2           pocket pulser

CAMAC crates

Quantity    CLASS_NAME
2           CAMAC Crate
2           GPIB CAMAC Crate Controller

 

 

 

Type 2: The following equipment will be returned by end of year 2003

    1. PMT test stands at Texas, Oxford, UCL, Tufts and Athens

Location: Texas-Austin

TAG#      SERIAL#    CLASS_NAME    TC
36535     418        KSC: 3388-G1A(INTERFACE,GPIB/CAMAC    CC
49998     94363      LRS: 1441(POWER SUPPLY,LV,1440 SYS    FA
513507    A37007     LRS: 1442(POWER SUPPLY,LV,1440 SYS    FA
513509    A32553     LRS: 1442(POWER SUPPLY,LV,1440 SYS    FA
45460     97656      LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
49876     A02639     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
49884     A02668     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
49892     A02637     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
60232     A30465     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
60349     A30386     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
60877     A30450     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
61424     A30416     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
44567     69529      LRS: 1445(CONTROLLER,HV,1440 SYS    FI
61350     A39246     LRS: 1449M(MAINFRAME,HV,1440 SYS    FB
46549     93199      LRS: 2132(INTERFACE,HV,CAMAC    FB
63490     A41711     LRS: 222(GENERATOR,GATE,2CH,NIM    AN
63380     A60915     LRS: 2249W(ADC,12CH,10B,CHRG,WIDE GATE,CAMAC    CG
523366    B23924     LRS: 2249W(ADC,12CH,10B,CHRG,WIDE GATE,CAMAC    CG
24731     1892       ORTEC: 109A(PREAMPLIFIER,X1,X10 FET    AH
13670     198        ORTEC: 114(POWER SUPPLY,PREAMP,12V,24V    FA

           

Location: Oxford

TAG#      SERIAL#    CLASS_NAME    TC
27352     157        JORWAY: 41(REGISTER,12B,OUT,CAMAC    CP
561719    473        JORWAY: 73A(CONTROLLER,CRATE,CAMAC,SCSI-2    CI
3180      6495       LRS: 128L(FAN-OUT,2CH,8-OUT,LIN,NIM    AG
44540     69644      LRS: 1441(POWER SUPPLY,LV,1440 SYS    FA
506441    69232      LRS: 1442(POWER SUPPLY,LV,1440 SYS    FA
513504    A37059     LRS: 1442(POWER SUPPLY,LV,1440 SYS    FA
38398     62261      LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
42408     92903      LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
57432     94238      LRS: 1445(CONTROLLER,HV,1440 SYS    FI
60158     98522      LRS: 1449M(MAINFRAME,HV,1440 SYS    FB
61974     A52139     LRS: 612A-AC(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
28349     36052      LRS: 612A-AC(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
28351     36050      LRS: 612A-AC(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
33706     54165      LRS: 612A-AC(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
36165     58924      LRS: 612A-AC(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
515607    A61920     LRS: 612A-AC(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
60658     3085       PHILLIPS: 710(DISCRIMINATOR,8CH,UPDATE,150MHZ    AD
68853     6119       PHILLIPS: 755(LOGIC,4CH,4-FOLD,MAJOR,150MHZ    AE

Location: Tufts

TAG#      SERIAL#    CLASS_NAME    TC
20056     16101      LRS: 222(GENERATOR,GATE,2CH,NIM    AN
9989      7661       LRS: 365AL(LOGIC,2CH,4-FOLD,MAJORITY,NIM    AE
34136     54184      LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
533314    A43267     LRS: 623B(DISCRIMINATOR,8CH,UPDATE,100MHZ,INH    AD
47912     679        VK: 5900(POWER SUPPLY,HV,2CH,NEGATIVE,MWPC    FC

Location: UCL

TAG#      SERIAL#    CLASS_NAME    TC
19204     14776      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
19995     15648      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
21409     17692      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
22857     19862      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
23066     14794      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
24897     25315      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
34137     54185      LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
36171     58950      LRS: 612A-AC(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH

Location: Athens

QUANTITY    CLASS_NAME    TC
1           JORWAY: 41(REGISTER,12B,OUT,CAMAC    CP
2           JORWAY: 73A(CONTROLLER,CRATE,CAMAC,SCSI-2    CI
1           NIM Scaler    VS
1           LRS: 1441(POWER SUPPLY,LV,1440 SYS    FA
1           LRS: 1442(POWER SUPPLY,LV,1440 SYS    FA
4           LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
1           LRS: 1445(CONTROLLER,HV,1440 SYS    FI
1           LRS: 1449M(MAINFRAME,HV,1440 SYS    FB
1           LRS: 2132(INTERFACE,HV,CAMAC) & cable    FB
3           LRS: 222(GENERATOR,GATE,2CH,NIM    AN
2           LRS: 2249W(ADC,12CH,10B,CHRG,WIDE GATE,CAMAC    CG
1           LRS: 2301 (QVT-CAMAC INTERFACE) & cable    FB
2           LRS: 365AL (MAJORITY LOGIC)    FB
1           LRS: 429A (LOGIC FAN-IN/FAN-OUT)    FB
3           LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
3           LRS: 821(DISCRIMINATORS)    AH
1           NIM/TTL LEVEL TRANSLATOR

     

  2. Vertical Slice Test stand at Rutherford

Location: RAL

QUANTITY    CLASS_NAME    TC
1           LRS: 1441(POWER SUPPLY,LV,1440 SYS    FA
2           LRS: 1442(POWER SUPPLY,LV,1440 SYS    FA
2           LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
1           LRS: 1445(CONTROLLER,HV,1440 SYS    FI
1           LRS: 1449M(MAINFRAME,HV,1440 SYS    FB
     

  3. Test stand for Scintillator module production factories at ANL, Caltech, UMN

Location: Caltech

TAG#      SERIAL#    CLASS_NAME    TC
20625     134        JORWAY: 41(REGISTER,12B,OUT,CAMAC    CP
30925     42493      LRS: 2232A(ADC,32CH,12B,DIFF VOLTAGE,SCAN,CAM    CG
30926     44360      LRS: 2232A(ADC,32CH,12B,DIFF VOLTAGE,SCAN,CAM    CG
19207     13811      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
19999     15730      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
21402     16996      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
24678     25324      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
25676     28530      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
25691     28523      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
25695     28482      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
71298     A49163     LRS: 2301(INTERFACE,QVT,CAMAC    CY
60772     A35460     LRS: 2365(LOGIC,MATRIX,8CH,CAMAC    CY
64944     A66845     LRS: 2365(LOGIC,MATRIX,8CH,CAMAC    CY
65186     A35308     LRS: 2371(REGISTER,DATA,ECL,CAMAC    CF
65188     A35316     LRS: 2371(REGISTER,DATA,ECL,CAMAC    CF
28733     37322      LRS: 429A(FAN-IN/OUT,4CH,LOGIC,NIM    AF
544562    C62908     LRS: 429A(FAN-IN/OUT,4CH,LOGIC,NIM    AF
37561     4643       LRS: 4416(DISCRIMINATOR,16CH,CAMAC    CW
36826     3517       LRS: 4416A(DISCRIMINATOR,16CH,CAMAC    CW
36833     3474       LRS: 4416A(DISCRIMINATOR,16CH,CAMAC    CW
66635     70832      LRS: 4616(CONVERTER,16CH,ECL/NIM/ECL,NIM    AR
25189     27595      LRS: 612(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
25192     27579      LRS: 612(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
30733     42779      LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
33698     54177      LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
36166     58926      LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
59647     97670      LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
516704    A80396     LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
34129     54170      LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
61977     A52164     LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
521313               LRS: ADP/QVT60(CABLE,QVT INTERFACE    CD

Location: ANL

TAG#      SERIAL#    CLASS_NAME    TC
561714    478        JORWAY: 73A(CONTROLLER,CRATE,CAMAC,SCSI-2    CI
501496    148        JORWAY: BH-1M(CABLE,BRANCH HIGHWAY,PARALLEL,1    CD

Location: Minnesota

TAG#      SERIAL#    CLASS_NAME    TC
549407    6013       BERTAN: 320N(POWER SUPPLY,HV,0-10KV@2MA    FB
502331    31         BLP: 1012(POWER SUPPLY,6@5A,12@2A,24@1A,NIM    AB
61381     7080       DSP: 860C(CRATE,CAMAC    CA
513612    7080       DSP: 860F(FAN,CRATE,CAMAC    CA
513291    7016       DSP: 860P(POWER SUPPLY,CAMAC,6@50;12@3;24@6    CB
8808                 FERMI: 11X2562(DIVIDER,HIGH VOLTAGE    FD
10774                FERMI: ES-7092(DIVIDER,HIGH VOLTAGE    FD
21099     132        FERMI: ES-7109(POWER SUPPLY,HV,2CH,NEGATIVE,M    FC
22757     251        FERMI: ES-7109(POWER SUPPLY,HV,2CH,NEGATIVE,M    FC
6656      1.41E+04   FLUKE: 415B(POWER SUPPLY,HV,3KV@30MA    FB
63110     404        JORWAY: 1880B(SCALER,2CH,VISUAL,NIM    AO
21156     A28710     LAMBDA: LPD-425A-FM(POWER SUPPLY,HV,2CH,0-250    FA
44536     69256      LRS: 1441(POWER SUPPLY,LV,1440 SYS    FA
59958     A37050     LRS: 1442(POWER SUPPLY,LV,1440 SYS    FA
514485    A62602     LRS: 1442(POWER SUPPLY,LV,1440 SYS    FA
42478     97749      LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
45410     97703      LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
45418     97695      LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
52939     A16582     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
57132     A28986     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
59943     A28969     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
60347     A30506     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
60348     A30469     LRS: 1443NF/12(CARD,HV,16CH,NEG,1440 SYS    FB
59953     A28304     LRS: 1445(CONTROLLER,HV,1440 SYS    FI
23063     19907      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
31762     47210      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
68116     A75674     LRS: 2249W(ADC,12CH,10B,CHRG,WIDE GATE,CAMAC    CG
27640     3.41E+05   LRS: 2249W(ADC,12CH,10B,CHRG,WIDE GATE,CAMAC    CG
30721     43371      LRS: 2551(SCALER,12CH,24B,100MHZ,CAMAC    CE
12075     9633       LRS: 365AL(LOGIC,2CH,4-FOLD,MAJORITY,NIM    AE
14955     11849      LRS: 621AL(DISCRIMINATOR,4CH,100MHZ,NIM    AD
15688     13059      LRS: 621AL(DISCRIMINATOR,4CH,100MHZ,NIM    AD
14407     12288      ORTEC: 401A(BIN,NIM    AA
537732    13268      PHILLIPS: 776(AMPLIFIER,16CH,X10,PMT,275MHZ    AH
51848     B031560    TEK: 2465(OSCILLOSCOPE,4CH,350MHZ    EA

 

 

Type 3: The following equipment will be returned by end of year 2002

    1. 4PP readout system and cosmic ray test stand at New Muon Lab

Location: New Muon Lab

TAG#      SERIAL#    CLASS_NAME    TC
21047     4119       BNC: 8010(GENERATOR,PULSE,50MHZ,NIM    AU
503224    29811      FERMI: 029029(FAN,CRATE,NIM    LC
503242    29774      FERMI: 029029(FAN,CRATE,NIM    LC
504355    P072186    FERMI: 029065(FAN,CRATE,NIM    LC
509997               FERMI: 2107(FAN,CRATE,NIM    LC
18762                FERMI: ES-7092(DIVIDER,HIGH VOLTAGE    FD
21583                FERMI: ES-7092(DIVIDER,HIGH VOLTAGE    FD
29022     31         INTERSTD: DTM-299(DISPLAY,CRATE DATAWAY,CAMAC    CC
43355     217        JORWAY: 41(REGISTER,12B,OUT,CAMAC    CP
58589     287        JORWAY: 41(REGISTER,12B,OUT,CAMAC    CP
61983     A39217     LRS: 222(GENERATOR,GATE,2CH,NIM    AN
540034    97778      LRS: 222(GENERATOR,GATE,2CH,NIM    AN
63488     A41860     LRS: 222(GENERATOR,GATE,2CH,NIM    AN
25206     26917      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
50700     A04448     LRS: 2249W(ADC,12CH,10B,CHRG,WIDE GATE,CAMAC    CG
25751     27626      LRS: 2341S(REGISTER,16CH,COINC,CAMAC    CF
8744      8417       LRS: 364AL(LOGIC,2CH,4-FOLD,MAJORITY,W/VETO,N    AE
14964     11276      LRS: 365AL(LOGIC,2CH,4-FOLD,MAJORITY,NIM    AE
15719     23531      LRS: 365AL(LOGIC,2CH,4-FOLD,MAJORITY,NIM    AE
63166     A63534     LRS: 365AL(LOGIC,2CH,4-FOLD,MAJORITY,NIM    AE
28751     38000      LRS: 429A(FAN-IN/OUT,4CH,LOGIC,NIM    AF
20325     15789      LRS: 620BL(DISCRIMINATOR,8CH,100MHZ,NIM    AD
30806     42075      LRS: 620BL(DISCRIMINATOR,8CH,100MHZ,NIM    AD
30815     46012      LRS: 620BL(DISCRIMINATOR,8CH,100MHZ,NIM    AD
32817     49853      LRS: 620BL(DISCRIMINATOR,8CH,100MHZ,NIM    AD
15970     13301      LRS: 621AL(DISCRIMINATOR,4CH,100MHZ,NIM    AD
15701     13014      LRS: 621AL(DISCRIMINATOR,4CH,100MHZ,NIM    AD
515366    891011     MECHTRON: 201(POWER SUPPLY,6V@10A,12V@3A,24V@    AB
511097               MECHTRON: 3034(BIN,NIM    AA
515369    892002     MECHTRON: 3034(BIN,NIM    AA
20797     505046     PD: 1570-M4(POWER SUPPLY,HV,3KV@40MA    FB
23584     512046     PD: 1570-M4(POWER SUPPLY,HV,3KV@40MA    FB
506096    211361     PD: AEC-320-9(POWER SUPPLY,6@10A,12@3A,24@1.5    AB
70847     7593       PHILLIPS: 726(TRANSLATOR,LVL,TTL/NIM/ECL,150M    AR
73255     8979       PHILLIPS: 794(GENERATOR,4CH,GATE/DELAY    AN
49756     5843       SEC: 850C(CRATE,CAMAC    CA
508725    5841       SEC: 850F(FAN,CRATE,CAMAC    CA
509624    6371       SEC: PCS850(POWER SUPPLY,6@50A,12@3A,24@6A,CA    CB
502301    18644      FERMI: 029029(FAN,CRATE,NIM    LC
14280                FERMI: 11X2562(DIVIDER,HIGH VOLTAGE    FD
23836                FERMI: ES-7092V(DIVIDER,HIGH VOLTAGE,VERNIER    FD
517046               FERMI: RFDVS(SCALER,VISUAL,3CH,100MHZ,RACKMOU    NA
9910      55425      FLUKE: 415B(POWER SUPPLY,HV,3KV@30MA    FB
12208     2361       GENRAD: 1340(GENERATOR,PULSE,0.2HZ TO 20MHZ    GA
31116     315        JORWAY: 1880B(SCALER,2CH,VISUAL,NIM    AO
50072     364        JORWAY: 1880B(SCALER,2CH,VISUAL,NIM    AO
41503     209        JORWAY: 41(REGISTER,12B,OUT,CAMAC    CP
57457     3677       LRS: 222(GENERATOR,GATE,2CH,NIM    AN
61980     A39155     LRS: 222(GENERATOR,GATE,2CH,NIM    AN
21407     17007      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
68119     A75691     LRS: 2249W(ADC,12CH,10B,CHRG,WIDE GATE,CAMAC    CG
68115     A75680     LRS: 2249W(ADC,12CH,10B,CHRG,WIDE GATE,CAMAC    CG
61735     A34329     LRS: 2249W(ADC,12CH,10B,CHRG,WIDE GATE,CAMAC    CG
523326    B23950     LRS: 2249W(ADC,12CH,10B,CHRG,WIDE GATE,CAMAC    CG
8749      8425       LRS: 364AL(LOGIC,2CH,4-FOLD,MAJORITY,W/VETO,N    AE
15799     11812      LRS: 364AL(LOGIC,2CH,4-FOLD,MAJORITY,W/VETO,N    AE
17762     15791      LRS: 428S(FAN-IN/OUT,4CH,LIN,NIM    AG
544279    B51868     LRS: 429A(FAN-IN/OUT,4CH,LOGIC,NIM    AF
544576    B51844     LRS: 429A(FAN-IN/OUT,4CH,LOGIC,NIM    AF
544575    B51839     LRS: 429A(FAN-IN/OUT,4CH,LOGIC,NIM    AF
62215     A29734     LRS: 465(LOGIC,3CH,4-FOLD,COINC,W/VETO,NIM    AE
544020    B38935     LRS: 465(LOGIC,3CH,4-FOLD,COINC,W/VETO,NIM    AE
61970     A52135     LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
34131     54176      LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
59656     97665      LRS: 612A(AMPLIFIER,12CH,X10,PHOTOMULT,NIM    AH
17578     12586      LRS: 620AL(DISCRIMINATOR,8CH,100MHZ,NIM    AD
17583     13400      LRS: 620AL(DISCRIMINATOR,8CH,100MHZ,NIM    AD
15918     13286      LRS: 621AL(DISCRIMINATOR,4CH,100MHZ,NIM    AD
72027     904014     MECHTRON: 151(BIN,NIM    AA
517007    902012     MECHTRON: 201(POWER SUPPLY,6V@10A,12V@3A,24V@    AB
543183    901015     MECHTRON: 201(POWER SUPPLY,6V@10A,12V@3A,24V@    AB
543182    905001     MECHTRON: 3034(BIN,NIM    AA
41118     4144       SEC: 850C(CRATE,CAMAC    CA
509865    6448       SEC: 850F(FAN,CRATE,CAMAC    CA
507862    5401       SEC: PCS850(POWER SUPPLY,6@50A,12@3A,24@6A,CA    CB

 

Type 4: Other test stand equipment, which will be returned by end of year 2000

Location: Pittsburgh

TAG#      SERIAL#    CLASS_NAME    TC
501942    391        BLP: 1011(POWER SUPPLY,6@5A,12@2A,24@1A,NIM    AB
502398    74         BLP: 1012(POWER SUPPLY,6@5A,12@2A,24@1A,NIM    AB
5198                 FERMI: ES-7092(DIVIDER,HIGH VOLTAGE    FD
21316     204        FERMI: ES-7109(POWER SUPPLY,HV,2CH,NEGATIVE,M    FC
22774     225        FERMI: ES-7109(POWER SUPPLY,HV,2CH,NEGATIVE,M    FC
25076     138        FERMI: ES-7125(POWER SUPPLY,HV,2CH,POSITIVE,M    FC
25095     109        FERMI: ES-7125(POWER SUPPLY,HV,2CH,POSITIVE,M    FC
11724     155        JORWAY: 1880(SCALER,2CH,VISUAL,NIM    AO
549360    408        JORWAY: 73A(CONTROLLER,CRATE,CAMAC,SCSI-2    CI
508248    E99198     LAMBDA: LCS-2-01(POWER SUPPLY,LV,0-7V@550MA    FA
515592    E35914     LAMBDA: LCS-2-01(POWER SUPPLY,LV,0-7V@550MA    FA
26640     30254      LRS: 222(GENERATOR,GATE,2CH,NIM    AN
31668     47266      LRS: 2228A(TDC,8CH,11B,CAMAC    CH
67810     A78125     LRS: 2229(TDC,8CH,11B,ECL,CAMAC    CH
68939     A78124     LRS: 2229(TDC,8CH,11B,ECL,CAMAC    CH
68940     A78123     LRS: 2229(TDC,8CH,11B,ECL,CAMAC    CH
68941     A78126     LRS: 2229(TDC,8CH,11B,ECL,CAMAC    CH
72671     A98804     LRS: 2229(TDC,8CH,11B,ECL,CAMAC    CH
20005     15738      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
55036     A10216     LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
20004     15724      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
24901     26966      LRS: 2249A(ADC,12CH,10B,CHRG,CAMAC    CG
1988      6793       LRS: 364(LOGIC,2CH,4-FOLD,MAJORITY,NIM    AE
7035      6742       LRS: 364(LOGIC,2CH,4-FOLD,MAJORITY,NIM    AE
40257     5287       LRS: 4434(SCALER,32CH,24B,ECL,CAMAC    CE
544033    B49750     LRS: 4616(CONVERTER,16CH,ECL/NIM/ECL,NIM    AR
11985     10336      LRS: 621L(DISCRIMINATOR,4CH,NIM    AD
11997     10575      LRS: 621L(DISCRIMINATOR,4CH,NIM    AD
513888    872024     MECHTRON: 3034(BIN,NIM    AA
16790                ORTEC: 401A(BIN,NIM    AA
24040     78         ORTEC: AD811(ADC,8CH,12B,PEAK VOLTAGE,CAMAC    CG
19282     26         ORTEC: ND027(REGISTER,OUTPUT,12B,CAMAC    CP
53491     6375       SEC: 850C(CRATE,CAMAC    CA
507972    5417       SEC: 850F(FAN,CRATE,CAMAC    CA
509053    5983       SEC: PCS850(POWER SUPPLY,6@50A,12@3A,24@6A,CA    CB

 

 

 

 

 

 

* Estimated resources needed from this section

                    Cost (K$)                   Manpower (Hours)
Task                FY01    FY02    >FY03       FY01    FY02    >FY03
Total

 

 

APPENDIX II - SUPPORT FOR NON-PREP ELECTRONICS

 

Background

 

The electronics to be maintained by the CD consists of:

 

The life cycle of any complex electronic module includes the initial development and certification, integration into the experiment’s data acquisition system, initial commissioning and achieving long term stability under operational running conditions. At any of these stages, problems with a module may be discovered and may require significant resources to understand and correct.

 

To plan for the timely solution of such problems, CD and MINOS agree on the following points:

 

 

Support Levels

 

There are differing levels of support that the Computing Division provides for any specific module:

 

Expediting Service: CD maintains spares for exchange and sends the broken module for repair to the vendor or repair organization. CD will then follow up with the vendor to ensure timely repair and return of the modules.

 

Limited Repair Support: The Computing Division will perform initial checks and verification of the performance of the module. It will perform well-defined repairs for failure modes that are easy to diagnose, fix and test. For more complex failure modes, CD will return the module to an expert for diagnosis or repair.

 

Full Repair Support: CD personnel are fully trained to support modules classified at this level. The module is repaired and fully tested before returning it to the experiment for use. Repair parts are kept in stock or are readily available.

 

 

Equipment Base/Quantity

 

The smooth operation of the MINOS experiment and of the CD repair process requires a suitable number of spare modules to be available both at the experiment, to ensure fast replacement during operations, and in the CD PREP Pool, to avoid the need for unreasonably short module repair times. The planned spares are listed below for each module covered by this agreement. Typically, 10% spare modules are planned for large-quantity modules, while the spare fraction is larger for small-quantity items. Equipment designated as "counting room" spares at the experiment should not become part of the established readout system or be deployed in new or existing systems, except to replace defective units.

 

 

Support Infrastructure & Scheduling

 

The required test stands, related equipment and software will be provided as detailed specifically below. The CD repair organization will review modules at their prototype stage, and may request changes to the testing procedures and software before accepting responsibility for the repair or diagnosis of any module. The actual acceptance of responsibility for support of a specific module by the CD will be through an E-mail to this effect from the PREP pool manager to the CD and MINOS management.

 

 

Field Modifications/Engineering Change Orders

 

The CD repair organization will be part of the planning for any change orders needed on the production modules. Manpower for implementation of resulting changes will be negotiated as needed.

 

 

ES&H Issues

 

A representative of the CD will participate in the "MINOS Printed Circuit Board Review Committee" review of all new non-commercial PC boards covered under this MOU to ensure the CD ES&H concerns are addressed at an early stage. This representative will supply the relevant documentation to the CD repair organization.

 

 

Logistics

 

The logistics for processing and repair of modules covered under this MOU will follow the normal PREP Pool channels. Any module sent off-site for repair will be shipped through the PREP expediting service. All modules covered under this MOU will either be tracked by the Computing Division equipment database or be regarded as consumables. MINOS will adopt the PREP inventory database for the inventorying, tracking, changing and movement of the module. MINOS will review generated reports, pay attention to their accuracy and perform QA on the data presented.

 

 

Restrictions

 

A module may be deemed uneconomical to repair if the resources required are excessive. This decision to retire a module is made on a case by case basis, jointly by CD and MINOS.

 

 

Reporting

 

Quarterly reports will be provided to Computing Division and MINOS management giving the number of modules serviced and the resources and costs expended, and providing charge-backs for expensive parts. This is an extension of the currently existing process used to charge costs associated with the repair and support of electronic equipment. It is expected that hybrids and other special-component spares from the production fabrication will be supplied by the design engineers for support of the modules.

 

 

 

 

Type 5: Commercial hardware which will need PREP maintenance support

 

 

 

VME 9U crates

 

Quantity: 40

Support Infrastructure: CD will develop a check-in verification procedure. This procedure will be generated by the manufacturer, ESD and ETT/PPD.

Type of CD Service: Limited Repair Support. Initial inspection will be provided in house. Minimal electrical maintenance will be provided and most of the mechanical maintenance is provided during initial check-in. Any failure beyond basic diagnosis would be returned to the vendor.

Inventory Tracking: Tracked in the PREP inventory database.

CD & MINOS Contacts: Carl Schirtzinger (CD), Craig Drennan (ETT/PPD)

 

 

Quantity

Area (System)

Designer

Contact

Fabricator

Tester

Support

8

Near

CD

CD

16

Far

CD

CD

6

Test stands

CD

CD

1

Testbeam module

CD

CD

9

Spares

CD

CD

 

 

VME Crate Power Supplies

 

Quantity: 36 power supply assemblies

Support Infrastructure:

Type of CD Service: Support level depends on type of supply

Inventory Tracking: Tracked in the PREP inventory database.

Open Issues & Notes: There will be two models of power supplies, which are defined by their output voltages and current ratings. The two models will have identical electrical inputs and physical packaging (housed in one relay-rack-mounted enclosure). Detailed information on the contact and fabricator will be updated in fall of 2000.

CD & MINOS Contacts: Hank Connor (CD), Craig Drennan (ETT/PPD)

 

Quantity

Area (System)

Designer

Contact

Fabricator

Tester

Support

8

Near

16

Far

CD

CD

12

Spares

CD

CD

 

 

 

VME Crate Fan units

 

Quantity: 32 Fan tray assemblies for VME 9U crates

Support Infrastructure:

Type of CD Service: Support level depends on type of fan assembly

Inventory Tracking: Tracked in the PREP inventory database.

Open Issues & Notes: There will be one type of fan tray assembly. Detailed information on the contact and fabricator will be updated in fall of 2000.

CD & MINOS Contacts: Hank Connor (CD), Craig Drennan (ETT/PPD)

 

Quantity

Area (System)

Designer

Contact

Fabricator

Tester

Support

8

Near

16

Far

CD

CD

8

Spares

CD

CD

 

 

6U VME crates

 

Quantity: approx. 57 crates

Support Infrastructure: CD will develop a check-in verification procedure. This procedure would be generated by the manufacturer and ESD.

Type of CD Service: Limited Repair Support. Initial inspection would be provided in house. Minimal electrical check-out will be provided and most of the mechanical maintenance would be provided during initial check-in. Any failure beyond basic diagnosis would be returned to the vendor.

Inventory Tracking: Tracked in the PREP inventory database.

Open Issues & Notes:

CD & MINOS Contacts: Carl Schirtzinger (CD), Gary Drake (MINOS/ANL)

 

Quantity

Area (System)

Designer

Contact

Fabricator

Tester

Support

48

Near

CD

CD

6

Spares

CD

CD

3

Test stands

CD

CD

 

 

 

Rack Monitor

 

Quantity: 74 BiRa RPS-8884 Rack Protection System units

Support Infrastructure: CD will develop a check-in verification procedure. This procedure will be generated by the manufacturer, ESD and ETT/PPD. If any non-generic modules or equipment are needed for the test setup, MINOS will provide the funds needed.

Type of CD Service: Assistance with procurement (funds for the purchase of units will come from the MINOS project). Limited Repair Support. Initial inspection will be provided in house. Minimal electrical maintenance will be provided and most of the mechanical maintenance is provided during initial check-in. Any failure beyond basic diagnosis would be returned to the vendor.

Inventory Tracking: Tracked in the PREP inventory database.

CD & MINOS Contacts: Carl Schirtzinger (CD), Marvin Marshak (MINOS/UMN)

 

Quantity

Area (System)

Designer

Contact

Fabricator

Tester

Support

32

Near

30

Far

CD

CD

12

Spares

CD

CD

 

 

 

 

Type 6: Custom hardware which will need PREP maintenance support

 

 

 

VARC modules for Far Detector

 

Quantity: 72 9U VME readout boards

Support Infrastructure: Harvard will provide documentation for how to build a simplified test stand. Harvard will also provide copies of testing software developed at Harvard. This software will also be used for testing of VMM and VFB modules. MINOS will supply a crate and specific/custom clock board or any other non-generic equipment needed for the test stand.

Type of CD Service: Full Repair Support, but hard failures will be returned to Harvard.

Inventory Tracking: Tracked in the PREP inventory database.

Open Issues & Notes: CD will be provided a prototype with the software soon.

CD & MINOS Contacts: Tim Kasza (CD), J. Oliver (Harvard)

 

Quantity    Area (System)      Designer    Contact    Fabricator    Tester     Support    Software
48          Far                Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard
3           Testbeam module    Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard
8           Teststand          Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard
13          Spares             Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard

 

VMM modules for Far Detector

 

Quantity: 330 mezzanine board plug-ins for VARC modules

Support Infrastructure: Harvard will provide documentation for how to build a simplified test stand. Harvard will provide copies of testing software developed at Harvard.

Type of CD Service: Full Repair Support, but hard failures will be returned to Harvard.

Inventory Tracking: Tracked in the PREP inventory database.

Open Issues & Notes: CD will be provided a prototype with the software soon.

CD & MINOS Contacts: Tim Kasza (CD), J. Oliver (Harvard)

 

Quantity    Area (System)      Designer    Contact    Fabricator    Tester     Support    Software
242         Far                Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard
16          Testbeam module    Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard
24          Teststand          Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard
48          Spares             Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard

 

 

VFB Far Detector Front End Board

 

Quantity: 626 stand-alone boards

Support Infrastructure: Oxford will provide documentation, test stand with a test jig and spare parts (custom IC’s and connectors). Harvard will provide copies of testing software developed at Harvard. Oxford will train and be available for consultation as needed.

Type of CD Service: Full Repair Support, but hard failures will be returned to Oxford.

Inventory Tracking: Tracked in the PREP inventory database.

CD & MINOS Contacts: Tim Kasza (CD), A. Weber (Oxford)

 

 

Quantity    Area (System)      Designer    Contact    Fabricator    Tester     Support    Software
484         Far                Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard
32          Testbeam module    Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard
48          Teststand          Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard
60          Spares             Harvard     Harvard    Harvard       Harvard    CD/Harv    Harvard

 

 

 

 

* Estimated resources needed from this section

                    Cost (K$)                   Manpower (Hours)
Task                FY01    FY02    >FY03       FY01    FY02    >FY03
Total

 

 

APPENDIX III - DAQ SYSTEM DESCRIPTION and REQUEST

 

The UK, specifically the Rutherford Appleton Laboratory, has responsibility for designing and providing the data acquisition and triggering system (DAQ) for both MINOS detectors (Near and Far). While the front end electronics for the Near and Far detectors are substantially different, the DAQ system has been designed to be essentially the same for both. This affords greater simplicity to the system as a whole and greatly reduces overheads in developing, maintaining and running the data acquisition. The design is modular, uses off-the-shelf commercial hardware and has a strong software component, giving the system a high degree of flexibility, which copes comfortably with the MINOS requirements.

 

The MINOS detectors use a simple and continuously active readout architecture. The amplitude and times of all photo-multiplier signals are digitized by the front end electronics and buffered for readout by the DAQ system. Up to 16 crates of readout are envisaged in the baseline design for both detectors though fewer than this are likely to be needed at the Near Detector. The crates have a VME backplane and each contains an on-board processor (the Readout Processor, ROP) that is responsible for the readout, control and monitoring of the front end electronics and for completing time stamping using a GPS synchronised clock. Data are taken from the Front End Readout Boards (ROBs) in synchronised time blocks of fixed but programmable length and buffered for transfer over a fast PVIC network (a high speed, PCI-to-PCI transparent interface) to a farm of Linux PCs (the Trigger Farm). Here the software event selection (triggering) is performed along with some data processing. The use of software rather than hardware for triggering provides valuable flexibility in the selection of events and avoids the complexities and expense of hardwired trigger logic. Events satisfying the MINOS trigger criteria along with data from monitoring and calibration analyses are transmitted over a network to the DAQ computer for further processing and analysis before being archived and distributed to online consumers. The VxWorks operating system is used in the VME processors where true real time performance is required and Linux is used on the DAQ PC's (including the farm). The low level DAQ software will be written in C/C++ and C++/ROOT will be used in the high level software.
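
As a purely illustrative aside, the sketch below shows the flavour of a software event selection applied to fixed-length time blocks of digitized PMT data, which is the approach described above. All type and function names in it are invented for this illustration; they are not the MINOS DAQ software or its API, and the simple clustering condition is only a stand-in for whatever criteria the trigger farm will actually apply.

    // Illustrative only: a toy software trigger over fixed-length time blocks.
    // None of these names belong to the real MINOS DAQ code.
    #include <cstddef>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    struct Digit {                 // one digitized PMT pulse from the front end
      double        time_ns;      // GPS-synchronised time stamp
      double        charge_adc;   // pulse amplitude
      std::uint16_t channel;      // readout channel number
    };

    struct TimeBlock {             // fixed (but programmable) length block of digits
      double             t0_ns;
      double             length_ns;
      std::vector<Digit> digits;
    };

    // Toy selection: keep the block if enough digits cluster inside a short window,
    // a stand-in for the coincidence conditions a real trigger algorithm would apply.
    bool passesTrigger(const TimeBlock& block, std::size_t minDigits, double window_ns) {
      for (std::size_t i = 0; i < block.digits.size(); ++i) {
        std::size_t n = 0;
        for (const Digit& d : block.digits)
          if (d.time_ns >= block.digits[i].time_ns &&
              d.time_ns <  block.digits[i].time_ns + window_ns)
            ++n;
        if (n >= minDigits) return true;
      }
      return false;
    }

    int main() {
      // One fabricated time block containing a small cluster of hits.
      TimeBlock block{0.0, 50000.0, {{100.0, 12.0, 1}, {110.0, 8.0, 2}, {115.0, 5.0, 3}}};
      std::printf("block accepted: %s\n", passesTrigger(block, 3, 100.0) ? "yes" : "no");
      return 0;
    }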

 

The conceptual design of the DAQ, Triggering, Clock and Timing systems was fully documented for the Dec '99 design review and is available on the MINOS web site. This should be referenced for further details.

 

The data will be written directly to mass storage in the FCC from both sites (for which support will be required), with local storage devices provided as a backup solution (probably DLT). The DAQ system is designed to cope with a maximum raw data rate of 5 MB/s at the Far Detector and 2.5 MB/s at the Near Detector; the major contribution to these raw data rates comes from radioactivity and single photo-electron PMT noise. The triggered event rate is dominated by the Near Detector, where the average rate for the high energy beam will be <0.5 MB/s; data compression is expected to reduce this considerably. The Far Detector data rate is very small (<2.5 KB/s). Further discussion of the size and processing of the data sample may be found in Appendix VI.

 

Some major milestones are given below. The MINOS schedule is unusual in that it covers two essentially independent detectors at two different sites and requires some systems to be operational a full 2.5 years before the neutrino beam to allow sufficient time for the installation and commissioning of the two large detectors.

ONLINE MONITORING

 

The purpose of the online monitoring system is to provide a means of assessing the quality of the data in the MINOS detectors, to provide summaries of detector performance and to raise alarms if the data quality falls below an acceptable level. A basic system is expected to be in place at the beginning of far detector installation in March 2001.

 

The system will perform elementary reconstruction of data using algorithms provided by the offline software framework. ROOT will be used to provide the graphical display of monitored quantities and the monitoring data should be accessible both onsite and at remote institutions.

 

The processing requirements of the online monitoring system are somewhat difficult to estimate at this time. However, much of the monitoring information (for example, hit maps) will be compiled from off-spill cosmic muons, thus avoiding the high instantaneous rates and event separation/reconstruction issues that place high processing demands on on-spill data. One PC at each site should be sufficient to meet the online monitoring processing needs. These PCs will be provided by the UK group as part of the DAQ hardware.

 

We will require support from Fermilab Computing Division for the use of ROOT (a similar request is made in the "Software and Personnel Resources" section of Appendix VI).

 

MILESTONES

 

Oct 2000 Far Detector Vertical Slice tests begin at RAL

Mar 2001 Begin installation of basic DAQ at the Far Detector

Apr 2001 Commission with first Far Detector plane(s)

Jul 2001 Cosmic ray read out from 10 planes of detector

May 2002 Far Detector SM1 complete and operational

May 2002 Near Detector Vertical Slice tests begin

Sep 2002 Near Detector installation and commissioning begins

May 2003 Far Detector SM2 complete

Sep 2003 Near Detector complete

 

DAQ Equipment Provided by UK(RAL)

 

 

 

DAQ Related Request

 

The PC's and PC related equipment required for the DAQ system will be sourced in the U.S.A. for reasons of price and support, as will PCs for other components of the MINOS detector system. We request:

 

  1. Feedback and advice from Linux PC evaluations performed by Fermilab CD;
  2. Advice on sourcing PC equipment and assistance with ordering through Fermilab;
  3. Maintenance support for PC's, disks, tape drives, KVM switches;
  4. Disk array at each site for local, short term data storage (RAID or equivalent); capacity to be specified.
  5. Two DLT (or equivalent if agreed) tape drives at each site (four total) as a backup solution for the network links to the FCC mass storage.
  6. Two laptop computers for roving diagnostics, one at each detector site;
  7. A VME bus analyser for each site;
  8. A PCI bus analyser for each site;
  9. Test stand support;

Software consultancy and support for a C++/ROOT system is requested by the offline group; this will be sufficient for the DAQ and Monitoring.

 

 

* Estimated resources needed from this section

                    Cost (K$)                   Manpower (Hours)
Task                FY01    FY02    >FY03       FY01    FY02    >FY03
Total

 

 

 

APPENDIX IV – DATABASE RELATED REQUEST

 

 

The current plans are to use the ORACLE database for storing the calibration information. We will require the following resources from Fermilab:

 

  1. Four computers for use as database servers at the Soudan mine and Fermilab. There will be one server per site, with the third and fourth used as hot spares. They will be either Intel machines running Linux or Sun machines running Solaris; we have not made this choice yet. The amount of disk required is not precisely known but is expected to be small. We expect to use RAID disk for increased reliability. Some type of backup procedure will be required.

  2. Use of 25 of the concurrent user licenses available at Fermilab.

  3. 100 hours/year of database consulting help from the Fermilab Computing Division.

  4. 100 hours/year of FIX and FIX database consulting help from the Fermilab Computing Division.

  5. Database administration.

 

 

* Estimated resources needed from this section

                    Cost (K$)                   Manpower (Hours)
Task                FY01    FY02    >FY03       FY01    FY02    >FY03
Total

 

 

 

 

 

APPENDIX V – NETWORKING RELATED REQUEST

 

Help will be requested with the design and implementation of networking to and within the Far and Near Detector Halls. We will require the following resources or support from Fermilab.

 

  1. Design and implementation of the LANs at the Soudan and Near detector sites.

  2. Data network connection to FCC mass storage from the Soudan and Near detector sites.

  3. Network management for specifying, installing and running the networks at the Soudan and Near detector halls.

  4. Design, definition and implementation of site cyber security, and security advice to subsystems.

 

 

 

* Estimated resources needed from this section

                    Cost (K$)                   Manpower (Hours)
Task                FY01    FY02    >FY03       FY01    FY02    >FY03
Total

 

 

APPENDIX VI - COMPUTING OFF-LINE ANALYSIS MODEL

 

 

INTRODUCTION

 

This appendix deals with the offline computing needs for the MINOS experiment. The offline needs can be broken into the following areas:

 

 

This document will address the resources required in each of the preceding areas. These resources include number of tapes, amount of CPU, disk space and number of people.

 

 

EXPERIMENTAL DATA

 

In Tables I and II we list the data volumes per year that we expect to record to tape for MINOS from the far and near detectors respectively, assuming a high energy beam. The number of neutrino interactions from the far detector is small, on the order of 22,000 per year. The rate from the far detector is dominated by 1 Hz of cosmic ray μ interactions. The neutrino interaction rate from the near detector will be somewhat larger, namely 105 Hz for the high energy beam. Of this 105 Hz, a small fraction, < 1 Hz, is in the central 25 cm of the target region and will be used for neutrino oscillation studies. A somewhat larger region will be used for studies of conventional neutrino physics (about 4 Hz). There will also be 11 Hz of events where a neutrino interacts upstream in the rock and produces a μ in the detector. The near detector DAQ system is capable of recording the full 250 Hz of cosmic ray μ seen by the near detector but it is expected that we will record only a fraction of these for full reconstruction and this is reflected in the numbers in Table II. The far detector assumes 3×10^7 seconds in one year (cosmic rays are always there) and the near detector assumes an effective year of 10^7 seconds. For simplicity, 1 Kbyte ≡ 1000 bytes. The data is expected to expand by about a factor of 5 after processing.

 

 

Data handling and storage

 

The total raw + processed data volume per year from both near and far detectors is about 6.5 Terabytes. Assuming a tape capacity of 20 Gbytes (current STK technology) we would require a minimum of about 330 tapes/year to archive the complete data from both detectors (see note below about writing partially full tapes). This is about 5% of the current STK library per year. The physics data, both neutrino oscillation and conventional, will be stored permanently on disk (about 170 Gbytes/year). How much of the remaining data is stored permanently on disk is yet to be determined. As an example, given current disk capacities and prices of 50 Gbytes and $28/Gbyte we would require about 110 disks/year for the full 5.5 Terabytes of processed data at a cost of about $160,000 per year for JBOD. We expect disk capacities to continue to increase and prices to continue to drop making it feasible to store all the data on disk by the time MINOS begins data taking, should this be desirable.
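
The tape and disk counts quoted above follow from simple arithmetic on the stated figures; the short sketch below (illustrative C++ only) just makes that arithmetic explicit, using the capacities and price given in this paragraph.

    // Back-of-the-envelope check of the storage figures quoted above
    // (6.5 TB/year total, 5.5 TB/year processed, 20 GB tapes, 50 GB disks at $28/GB).
    // The text rounds the results to "about 330 tapes" and "about $160,000".
    #include <cmath>
    #include <cstdio>

    int main() {
      const double total_data_GB     = 6500.0;   // raw + processed, per year
      const double processed_data_GB = 5500.0;   // processed only, per year
      const double tape_capacity_GB  = 20.0;     // current STK cartridge
      const double disk_capacity_GB  = 50.0;
      const double disk_cost_per_GB  = 28.0;     // US dollars

      std::printf("tapes/year : %.0f\n", std::ceil(total_data_GB / tape_capacity_GB));
      std::printf("disks/year : %.0f\n", std::ceil(processed_data_GB / disk_capacity_GB));
      std::printf("disk cost  : $%.0f per year (JBOD)\n", processed_data_GB * disk_cost_per_GB);
      return 0;
    }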

 

 

Sample          Rate/second (Hz)    Events/year    Event Size (Kbytes)    Data Volume/year (GB)    Processed Data Volume/year (GB)
ν events        0.0022              22000          2.4                    0.05                     0.25
Cosmic ray μ    1                   3×10^7         2.4                    72                       360
Total           1                   3×10^7                                72                       360

Table I Event rates for the far detector

 

 

Sample            Rate/second (Hz)    Events/year (10^7)    Event Size (Kbytes)    Data Volume/year (GB)    Processed Data Volume/year (TB)
ν events          105                 105                   0.8                    840                      4.2
μ events          11                  11                    0.8                    88                       0.44
Cosmic ray μ      11                  11                    0.8                    88                       0.44
Total             127                 127                                          1016                     5.08
Oscillation ν     0.26                0.26                  0.8                    2                        0.01
Conventional ν    4                   4                     0.8                    32                       0.16

Table II Event rates for the near detector

 

 

The default mode of operation will be to record data on disk locally at the far detector and transfer it over the network to Fermilab/FCC (Feynman Computing Center) for archiving to tape. There will need to be a machine in FCC to receive the data and transfer it to the robot. This machine will require a local disk cache as a rate-adapting buffer because the speed of data transfer is too low to transfer it directly to tape. The low rates mean that it will take a very long time to accumulate enough data to fill a tape; therefore we will most likely not write full tapes. We will probably also desire to make several copies of the raw data. We expect to make use of existing Fermilab tools (Enstore) to do the tape archiving. As a backup against network failures there will also be the capability to write data tapes at the far detector. Depending on the length of the downtime, the data may just be transferred over the network once it is available again, or tapes may be shipped to Fermilab to be read and archived; note that the tape drives at the detectors will most likely not be the same as those in the robot. The choice of tape drives has not yet been made. It may be possible to use the same type of drives as chosen for Run II.

The near detector data will be logged to disk locally at the experimental hall and then transferred to FCC to be archived to tape. Local tape drives will also exist as a backup in case of extended unavailability of the network connection or the robot. The same machine in FCC will be used to do the archiving as for the far detector. The rates are 100 Kbyte/sec and 2.4 Kbyte/sec for the near and far detectors respectively.
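
To illustrate why partially full tapes are expected, the following sketch (illustrative C++ only) estimates how long it would take to fill one 20 Gbyte cartridge at the logging rates quoted above, using the earlier convention that 1 Kbyte ≡ 1000 bytes.

    // Time to fill one 20 GB tape at the quoted logging rates
    // (100 Kbyte/sec near detector, 2.4 Kbyte/sec far detector).
    #include <cstdio>

    int main() {
      const double tape_KB         = 20.0e6;    // 20 GB, with 1 Kbyte = 1000 bytes
      const double near_rate_KBps  = 100.0;
      const double far_rate_KBps   = 2.4;
      const double seconds_per_day = 86400.0;

      std::printf("near detector: %.1f days to fill one tape\n",
                  tape_KB / near_rate_KBps / seconds_per_day);
      std::printf("far detector : %.0f days to fill one tape\n",
                  tape_KB / far_rate_KBps / seconds_per_day);
      return 0;
    }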

 

 

DATA COPYING

 

The data copying needs are not yet known. The samples for oscillation studies are small and can probably be transferred over the network. Collaborators at overseas institutions such as the UK and Russia may want a complete copy of all the data. In the worst case, if we assume that we need to make 5 copies of the complete sample, then this will be 27.5 TB per year, which is about 460 tapes assuming an export medium such as Exabyte Mammoth with 60 Gbyte cartridges. This is a modest amount of copying (about 3-4 weeks) which a central tape copy facility should be able to satisfy.

 

 

OFFLINE DATA PROCESSING

 

The offline processing for both the near and far detectors will be done at Fermilab. A summary of the processing needs is given in Table III.

Sample           SpecInt sec/event    Events/year (10^7)    SpecInt/year    Number of CPUs/year
Near detector    7                    127                   600             27
Far detector     18                   3                     36              2
Total            25                   130                   636             29

Table III Processing needs for near and far detectors

 

 

These numbers are based on the existing MINOS code which is in Fortran. The code is currently being re-written using Object Oriented techniques and C++. This may cause some slow down of the code. For the purposes of this document the processing time per event will be given in SpecInt seconds per event (SpecInt sec/event) and the CPU requirements will be given in SpecInt. We are using the SpecInt95 measurements that are available at http://www.spec.org. To calculate the number of CPUs we have used an Intel motherboard SE440BX2 550 MHz Pentium III rated at 22 SpecInt95. In the early stages it will be necessary to reprocess the data as the algorithms are being refined at the same time as new data is being processed. Two complete processing passes through the data are assumed.
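
The Table III entries can be reproduced from the figures quoted in this section; the sketch below (illustrative C++ only) carries out that arithmetic. The only added assumption is that the processing capacity is sustained over roughly 3×10^7 seconds per year, which is what the tabulated SpecInt values imply.

    // Reproduces the Table III arithmetic: SpecInt-sec/event x events/year x 2 passes,
    // converted to a sustained SpecInt capacity over ~3e7 s/year and then to the
    // 22-SpecInt95 PCs (550 MHz Pentium III) quoted in the text.
    #include <cmath>
    #include <cstdio>

    int main() {
      const double passes        = 2.0;     // two complete processing passes assumed
      const double year_seconds  = 3.0e7;   // assumed sustained processing time per year
      const double cpu_specint95 = 22.0;    // SE440BX2 550 MHz Pentium III rating

      struct Sample { const char* name; double specint_sec_per_event; double events_per_year; };
      const Sample samples[] = {
        {"Near detector", 7.0, 127.0e7},
        {"Far detector", 18.0,   3.0e7},
      };

      for (const Sample& s : samples) {
        const double specint = s.specint_sec_per_event * s.events_per_year * passes / year_seconds;
        std::printf("%-13s : %4.0f SpecInt/year, about %.0f CPUs\n",
                    s.name, specint, std::ceil(specint / cpu_specint95));
      }
      return 0;
    }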

 

 

OFFLINE ANALYSIS

 

We plan to perform physics analysis of the data at Fermilab as well as at other MINOS collaborating institutions.

 

We expect Fermilab to provide a central machine(s) in FCC with access to a central mass storage system for data that does not fit permanently on disk. This machine(s) will act as a data server, with the data being available via AFS or similar distributed means so that it is accessible to collaborators offsite and to desktops at Fermilab. This machine(s) will also act as a central location for the MINOS software. We expect that people will make extensive use of desktop computing given the modest size of most of the data samples and the continued growth of disk capacity.

 

 

MONTE CARLO GENERATION AND STORAGE

 

There are two types of Monte Carlo required for MINOS: simulation of neutrino interactions in the detector, for oscillation measurements and conventional neutrino physics, and simulation of the neutrino beam, to understand features of the beam such as beam profiles, flux, etc. In both cases the requirements are not precisely known, so the numbers here are based on present knowledge. We assume here that we will generate the samples at Fermilab, but the possibility may also exist to generate them at collaborating institutions and ship them to Fermilab for storage.

 

Physics Monte Carlo

For studies of oscillations we expect to need about 10 times more Monte Carlo events than data for the far detector; this is a negligible amount (220,000 events/year). For the near detector we require that the statistical uncertainty of the Monte Carlo be negligible compared to the statistical uncertainty of the far detector data. Hence a sample equal to the near detector data sample will be sufficient (the relevant rate is 0.26 Hz in the central 25 cm radius of the target region). For studies of conventional neutrino physics the Monte Carlo sample needs to be about twice the data sample size, as the measurements will be dominated by systematic uncertainties. The current execution time for the simulation is 103 SpecInt sec/event. If we include the reconstruction then it becomes 110 SpecInt sec/event to simulate and reconstruct a physics event. The needs per year are summarized in Table IV. To calculate the number of CPUs we have used an Intel motherboard SE440BX2 550 MHz Pentium III rated at 22 SpecInt95. This data will also need to be stored in the central mass storage system at Fermilab and be accessible on the MINOS central analysis machine. The samples for the oscillation studies will be kept permanently on disk (about 500 Gbytes/year).

 

 

 

 

 

Events/year         8.8×10^7
SpecInt/year        323
Number CPUs/year    15
Event size          57 Kbytes
Data stored/year    5 TB

Table IV Physics Monte Carlo needs per year

 

 

 

Beam Monte Carlo

 

Currently a single run takes about two CPU weeks on an SGI O200 180 MHz CPU rated at 8 SpecInt95 to obtain sufficient statistics. The data volume produced per run is about 240 Mbytes. It is expected that a factor of 10 times longer runs will be needed for the physics analysis and that a few hundred of these runs will be required. To calculate the number of CPUs we have used an Intel motherboard SE440BX2 550 MHz Pentium III rated at 22 SpecInt95. The needs are summarized in Table V.
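
The Table V entries follow from the run time and scaling factors quoted above; the sketch below (illustrative C++ only, with the same ~3×10^7 s/year assumption as before) reproduces them.

    // Reproduces the Table V arithmetic: a current beam run = two CPU-weeks on an
    // 8-SpecInt95 SGI O200, future runs ~10x longer, 200 runs/year, served by
    // 22-SpecInt95 PCs; each lengthened run writes ~2.4 GB (10 x 240 MB).
    #include <cmath>
    #include <cstdio>

    int main() {
      const double week_seconds  = 7.0 * 86400.0;
      const double sgi_specint95 = 8.0;
      const double scale_up      = 10.0;
      const double runs_per_year = 200.0;
      const double year_seconds  = 3.0e7;   // assumed sustained processing time per year
      const double cpu_specint95 = 22.0;
      const double run_output_GB = 10.0 * 0.24;

      const double specint_sec_per_run = 2.0 * week_seconds * sgi_specint95 * scale_up;
      const double specint_per_year    = specint_sec_per_run * runs_per_year / year_seconds;
      std::printf("SpecInt-sec/run : %.1e\n", specint_sec_per_run);
      std::printf("SpecInt/year    : %.0f\n", specint_per_year);
      std::printf("CPUs/year       : %.0f\n", std::ceil(specint_per_year / cpu_specint95));
      std::printf("Storage/year    : %.0f GB\n", run_output_GB * runs_per_year);
      return 0;
    }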

 

 

SpecInt sec/run     9.7×10^7
Runs/year           200
SpecInt/year        645
Number CPUs/year    30
Storage/year        480 Gbytes

Table V Beam Monte Carlo needs per year

 

 

About 17 CPUs per year would be required to simulate, using the detector simulation, the approximately 10^8 events that these 200 runs would produce. The overall data volumes and processing needs for MINOS for 5 years of data taking are summarized in Table VI.

 

 

 

Full Data Sample            Events       Raw Data    Processed Data    Processing
Near detector               63.5×10^8    5.5 TB      27.5 TB           3000 SpecInt
Far detector                1.5×10^8     0.36 TB     1.8 TB            180 SpecInt

Full Monte Carlo samples    Events       Processed Data    Processing
Physics Monte Carlo         4.4×10^8     25 TB             1615 SpecInt
Beam Monte Carlo            1000 runs    2.4 TB            3225 SpecInt

ν Oscillation samples       Events       Processed Data    Processing
Near detector data          13×10^6      15 GB             31 SpecInt
Far detector data           110,000      0.25 GB           negligible
Monte Carlo                 40×10^6      2.3 TB            110 SpecInt

Conventional ν samples
Near detector data          2×10^8       800 GB            94 SpecInt
Monte Carlo                 4×10^8       23 TB             1465 SpecInt

Table VI Summary of data volumes and processing needs for 5 years of data taking

 

 

 

SOFTWARE and PERSONNEL RESOURCES

 

The MINOS experiment has a working Fortran simulation and reconstruction package that is currently being used for detector optimization and physics studies. The decision was made in summer 1999 to replace this system with a new Object Oriented system based on C++. It was felt that, although there is a working Fortran system, the tools it is based on, such as ZEBRA and ADAMO, would not be supported over the life of the experiment and it would therefore be sensible to make the transition to an OO system.

 

The software group in MINOS is in the process of specifying what this new system will look like. This means it is hard to make definitive and complete requests for personnel and support. However, we can say some things about this area.

  1. The current proposal is to use ROOT as the batch/interactive framework. This is a somewhat more widespread use than intended by the Run II experiments. We will need support from the Fermilab ROOT team to cover this mode of using ROOT.
  2. During the requirements phase we would greatly benefit from the expertise of the C++ experts (Jim Kowalkowski and Marc Paterno) in giving us advice. We have already had some informal meetings with them and would like to continue these as their schedule permits.
  3. Once we have a clearer picture of what the pieces of the system should look like we will be able to produce a schedule with milestones. This will allow us to specify projects where we would like design/programming help from Computing Division (CD) personnel. It may be appropriate to hire consultants for some tasks.
  4. We have not yet decided on a configuration management system. However, SRT (SoftRelTools) is a strong contender. Should we choose this we would expect support from CD including responding to bug fixes and requests for additional features.
  5. Help on database issues has already been covered in a previous section of this Appendix.
  6. We will want to make use of products such as leak checkers (PURIFY) and CASE tools, in which case we will require licenses for MINOS on various platforms. MINOS currently has NO plans to use the KAI compiler. The intention is to use egcs and native vendor compilers where appropriate. We would therefore require versions of products such as ROOT which were built with these compilers.
  7. We will require help from the ISD department in CD to set up the data archiving system as we will be using the tools that they have developed.

 

* Estimated resources needed from this section

                    Cost (K$)                   Manpower (Hours)
Task                FY01    FY02    >FY03       FY01    FY02    >FY03
Total

 

 

 

APPENDIX VII – NUMI Beam Monitoring DAQ SYSTEM DESCRIPTION and request

 

 

The NuMI beam monitoring DAQ system description will be added at a later time, including the list of equipment needed from CD (& PREP) and milestones.

 

 

 

* Estimated resources needed from this section

                    Cost (K$)                   Manpower (Hours)
Task                FY01    FY02    >FY03       FY01    FY02    >FY03
Total