Computing for REDTOP

Computing infrastructure

Most of the computing for REDTOP is handled through the Open Science Grid (OSG) project. In order to get access, you need to apply for an account and link it to the REDTOP project. Detailed instructions for the process are given below.

In addition to the OSG, we plan to set up two Tier1 systems, at the Autonomous University of Puebla (MEX) and at Northern Illinois University (US). Instructions for running the software on those facilities will be provided as soon as the setup is complete.

Instructions for getting an OSG account

All sign-up, login, and running on the OSG is handled through the OSG Connect front-end. In order to get an account, sign up using this link and fill in all the requested information. The OSG Connect project established for REDTOP computing is: REDTOP. You have to specify that in the application form.

Once the form has been submitted, a ticket is opened at Globus and you will be contacted by a University of Chicago representative for a short telephone interview. Usually within 24 hours, your account is ready and you will be able to log in using the following link: https://www.globusid.org/login.

Complete documentation for the OSG is found at: https://docs.globus.org/resource-provider-guide/. Some instructions are also available here. If you have trouble connecting to the OSG, please check that your firewall is configured according to the following policy.
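
If a login attempt hangs or is refused, a quick way to distinguish a firewall problem from an account problem is to test the connection to the login node directly (this assumes the node listens on the standard SSH port 22):

    # a timeout here points to a firewall/network issue, while a password or
    # key prompt means the connection itself is fine
    ssh -v your_user@login.osgconnect.net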

Running REDTOP software on the OSG

At present, only the Ilcroot simulation framework is set up to compile and run on the OSG. The slic simulation framework and the lcsim reconstruction framework will be available shortly.

Basic instructions to run an ilcroot test job on the OSG

Once you have an account on the OSG:

  • log in to login.osgconnect.net using your new account: ssh your_user@login.osgconnect.net
  • copy /stash2/project/@REDTOP/users/test/jobs to your home directory and follow the procedure:
    cp -ia /stash2/project/@REDTOP/users/test/jobs .
    cd jobs
  • the directory contains:
    1. log directory, with the logs of the grid job;
    2. redtop-ilcroot-test.sh, the script that will run ilcroot on the grid worker node to simulate 10 events, each with 3 single particles in random directions in REDTOP (a rough outline is sketched after this list);
    3. redtop-ilcroot-test.grid, the HTCondor submit description file for the grid job (an illustrative example also follows this list).
  • to run the test job, use the following command:
    condor_submit redtop-ilcroot-test.grid
  • when the job has been submitted, you will see in the log directory a log file named redtop-ilcroot-test-########-0.log; it will contain the log of the grid job.
  • to check the status of the job, you can use the command
    condor_q your_username
  • when the job is complete (it takes a few minutes to run), the log file will provide some statistics about the job. Two more log files will appear in the log directory: redtop-ilcroot-test-########-0.error and redtop-ilcroot-test-########-0.output. These are the standard error and the standard output of the job you have just run.
  • the directory used to submit the job will also contain the file redtop-ilcroot-output.tar.gz, a tar-ball of the job output with the most relevant files produced by geant4, namely the hits for each subdetector of REDTOP, the MC truth information, and the log of the ilcroot simulation of the hits in REDTOP:
    • Run_0/MUPOL.Hits.root
    • Run_0/OTPC.Hits.root
    • Run_0/ADRIANO.Hits.root
    • Run_0/Kinematics.root
    • Run_0/gilc.root
    • Run_0/TrackRefs.root
    • Run_0/sim_hits.log
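
For orientation, the following is a rough, hypothetical outline of what a wrapper script such as redtop-ilcroot-test.sh typically does; the setup path and macro name are placeholders, so refer to the actual script in the jobs directory:

    #!/bin/bash
    # set up the ilcroot environment on the worker node (placeholder path)
    source /path/to/ilcroot-env.sh
    # run the simulation: 10 events, 3 single particles each in random
    # directions (the macro name is a placeholder)
    ilcroot -b -q sim.C
    # pack the most relevant output files for transfer back to the submit node
    tar -czf redtop-ilcroot-output.tar.gz Run_0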
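
The .grid file uses the standard HTCondor submit description syntax. A minimal sketch of what such a file might contain, assuming the log file naming shown above, is:

    universe                = vanilla
    executable              = redtop-ilcroot-test.sh
    log                     = log/redtop-ilcroot-test-$(Cluster)-$(Process).log
    error                   = log/redtop-ilcroot-test-$(Cluster)-$(Process).error
    output                  = log/redtop-ilcroot-test-$(Cluster)-$(Process).output
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    queue 1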

If you have a local installation of ilcroot, you can copy the tar-ball to your local machine, decompress it, and run the ilcroot event display to look at the events: MC particles and hits. Unfortunately, the OSG node login.osgconnect.net has no graphics libraries installed, so the ilcroot event display is disabled there.
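
For example, assuming the tar-ball is still in the jobs directory of your OSG home area, you can fetch and unpack it on your local machine with:

    scp your_user@login.osgconnect.net:jobs/redtop-ilcroot-output.tar.gz .
    tar -xzf redtop-ilcroot-output.tar.gz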

This is just a test job. We are working on a more configurable procedure to simulate the needed events on the OSG grid.

Data storage

All data generated for the simulation are stored at the following FTP address: ftp://t1015-svn.fnal.gov. The server is password protected and the data are read-only; ask your PI for the credentials. The data are grouped according to the detector geometry version used in the simulation. Below is a list of the file types currently available:

  • .hepevt: the primary interaction files from GenieHad in hepevt format
  • .stdhep: the primary interaction files from GenieHad in stdhep format
  • .slcio: the output files from the Slic (geant4) simulation. These contain the hits and the Monte Carlo truth for the event
  • .aida: reconstruction and/or analysis files obtained with the lcsim reconstruction framework.
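
The files can also be retrieved non-interactively, for example with wget; the path below is only a placeholder, and the credentials are the ones provided by your PI:

    wget --ftp-user=your_user --ftp-password=your_password \
        ftp://t1015-svn.fnal.gov/<geometry_version>/<file>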

Detailed information on the Monte Carlo generated events for physics and detector studies is available at the Monte Carlo Data page.
