VictoriaSanchezMartinez - IFIC - ... under construction ...
Preparing your Digital Certificate
The following steps should be done to install your certificate in the shell (see the twiki pages
WorkBookStartingGrid and
AtlasDataProcessingAtIFIC2015), either on LXPLUS or on AFS:
~> ssh -X vicsanma@ui06.ific.uv.es ### ssh -X vsanchez@lxplus.cern.ch
~> cd $HOME
~> mkdir -p .globus
~> cd .globus
~> cp /*/myCertificate.p12 .
~> openssl pkcs12 -in myCertificate.p12 -clcerts -nokeys -out usercert.pem
~> openssl pkcs12 -in myCertificate.p12 -nocerts -out userkey.pem
~> chmod 400 userkey.pem
~> chmod 444 usercert.pem
~> source /afs/cern.ch/project/gd/LCG-share/current/etc/profile.d/grid_env.sh
~> voms-proxy-init -voms atlas
usercert.pem is the certificate file and
userkey.pem is the private key file.
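To check that the grid proxy has been created correctly, you can inspect it (a quick sanity check; voms-proxy-info comes with the same grid environment sourced above):
~> voms-proxy-info --all ### shows the subject, VO attributes and remaining lifetime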
The certificate
myCertificate.p12 should also be copied across to your laptop and imported into the browser:
- Preferences (or Tools) → Advanced → Encryption → View Certificates → Your Certificates → Import
To confirm that the browser certificate is installed and working correctly, check that you are able to log into
AMI (click on Account Validation). If you have any expired certificates, this is a good time to remove them - but be careful not to remove current ones.
Athena
The
Athena framework (
AthenaFramework-I and
AthenaFramework-II) is an enhanced version of the Gaudi framework that was originally developed by the LHCb experiment, but is now a common ATLAS-LHCb project and is in use by several other experiments including GLAST and HARP. Athena and Gaudi are concrete realizations of a component-based architecture (also called Gaudi) which was designed for a wide range of physics data-processing applications. The fact that it is component-based has allowed flexibility in developing both a range of shared components and, where appropriate, components that are specific to the particular experiment and better meet its particular requirements.
The
ATLAS software is divided into packages, and these are managed through the use of a configuration management tool, CMT. This is used to copy ("check out") code from the main ATLAS repository and handle linking and compilation.
You can list all the available Athena releases as follows:
~> ssh -X vicsanma@ui06.ific.uv.es
~> ls $VO_ATLAS_SW_DIR/software/
(Beware: this used to be
~> ls ${VO_ATLAS_SW_DIR}/prod/releases ).
Athena 15.X.Y
Valid for Athena releases of the form
15.X.Y. You must prepare your account in the following way. Note that the cmthome directory does not have to be in $HOME; it can be in any sub-directory, but if so you will need to amend all the following examples accordingly.
~> ssh -X vicsanma@ui06.ific.uv.es
~> cd $HOME
~> mkdir -p AthenaTestArea/15.X.Y/cmthome
~> cd AthenaTestArea/15.X.Y/cmthome
~> emacs requirements & ###this file is shown below
~> source $VO_ATLAS_SW_DIR/software/15.X.Y/cmtsite/setup.sh
~> cmt config
~>
~> cd $HOME
~> source AthenaTestArea/15.X.Y/cmthome/setup.sh -tag=15.X.Y,setup
~>
~> echo $TestArea
~> cd $TestArea
~> cmt show versions PhysicsAnalysis/AnalysisCommon/UserAnalysis
~> cmt co -r UserAnalysis-00-15-04 PhysicsAnalysis/AnalysisCommon/UserAnalysis
~> cd PhysicsAnalysis/AnalysisCommon/UserAnalysis/cmt/
~> cmt config
~> source setup.sh
~> cmt make
The
requirements file looks like this:
#---------------------------------------------------------------------
set CMTSITE STANDALONE #set the cmtsite???#
###old###set SITEROOT ${VO_ATLAS_SW_DIR}/prod/releases/rel_15-24/
set SITEROOT ${VO_ATLAS_SW_DIR}/software/15.X.Y/
macro ATLAS_DIST_AREA ${SITEROOT}
macro ATLAS_TEST_AREA ${HOME}/AthenaTestArea
apply_tag projectArea
macro SITE_PROJECT_AREA ${SITEROOT}
macro EXTERNAL_PROJECT_AREA ${SITEROOT}
apply_tag simpleTest
set SVNROOT svn+ssh://vsanchez@svn.cern.ch/reps/atlasoff
apply_tag noSVNROOT
use AtlasLogin AtlasLogin-* $(ATLAS_DIST_AREA)
#set CMTCONFIG i686-slc5-gcc43-opt
#---------------------------------------------------------------------
Every time you open a new terminal, you must
set up Athena again:
~> cd $HOME
~> source AthenaTestArea/15.X.Y/cmthome/setup.sh -tag=15.X.Y,setup
~> cd $TestArea
~> cd PhysicsAnalysis/AnalysisCommon/UserAnalysis/run
Athena 16.X.Y
Valid for Athena releases of the form
16.X.Y.
~> cd $HOME
~> source $VO_ATLAS_SW_DIR/local/setup.sh
~> mkdir -p AthenaTestArea/16.X.Y ###old###export AtlasSetup=${VO_ATLAS_SW_DIR}/prod/releases/rel_16-2/AtlasSetup
~> export AtlasSetup=${VO_ATLAS_SW_DIR}/software/16.X.Y/AtlasSetup
~> alias asetup='source $AtlasSetup/scripts/asetup.sh'
~> asetup 16.X.Y --testarea=$HOME/AthenaTestArea --svnroot=svn+ssh://vsanchez@svn.cern.ch/reps/atlasoff --multitest --dbrelease "<latest>"
~>
~> cd $TestArea
~> pwd
~> cmt co -r UserAnalysis-00-15-04 PhysicsAnalysis/AnalysisCommon/UserAnalysis
~> cd PhysicsAnalysis/AnalysisCommon/UserAnalysis/cmt/
~> cmt config
~> source setup.sh
~> cmt make
~>
~> cd ../run
~> get_files AnalysisSkeleton_topOptions_AutoConfig.py ###equivalent to: cp ../share/AnalysisSkeleton_topOptions_AutoConfig.py .
Every time you open a new terminal, you must
set up Athena again:
~> cd $HOME
~> source /lustre/ific.uv.es/sw/atlas/local/setup.sh
~> export AtlasSetup=${VO_ATLAS_SW_DIR}/software/16.X.Y/AtlasSetup
~> alias asetup='source $AtlasSetup/scripts/asetup.sh'
~> asetup 16.X.Y --testarea=$HOME/AthenaTestArea --svnroot=svn+ssh://vsanchez@svn.cern.ch/reps/atlasoff --multitest
~> cd $TestArea/PhysicsAnalysis/AnalysisCommon/UserAnalysis/run/
MadGraph
Twikis, web pages and general information about
MadGraph:
MadGraph is a matrix-element generator. Given the process, MadGraph automatically creates the amplitudes for all the relevant subprocesses and produces the mappings for the integration over the phase space. This process-dependent information is packaged into
MadEvent, and a self-contained code is produced that can be downloaded from the web site and allows the user to calculate
cross sections and to obtain
unweighted events. Alternatively, events can also be generated directly from the web, by filling in a form and letting the code run on the MadGraph clusters. Once the events have been generated - the event information (e.g. particle IDs, momenta, spin, colour connections) is stored in the “
Les Houches Event Files” (LHEF) format - they may be passed directly to a shower Monte Carlo program (an interface is available for
Pythia) or be used as input for combined matrix-element/shower calculations. A series of standard plots and a ROOT file containing the parton-level events are also created automatically for events generated over the web. Optionally, ROOT files containing events at the Pythia level or after a generic detector simulation (PGS) can be made available.
The code is written in Fortran77 and has been developed using the g77 compiler under Linux. The code is parallel in nature and is optimized to run on a PC farm. At present, the supported PBS batch-managing systems are Torque, proPBS and Condor, at the Italian, American and Belgian clusters respectively. Process-specific codes are self-contained and therefore do not need any external library. In principle the matrix-element generator can handle any user request; however, the code is limited by the maximum number of final-state QCD particles. Currently, the package is limited to
ten thousand diagrams per subprocess. So, for example, W+5 jets, which has been calculated, is close to its practical limit.
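For reference, an LHEF file is plain text with an XML-like layout; a minimal sketch of its structure (the contents shown are descriptive, not from a real run) looks like this:
<LesHouchesEvents version="1.0">
<init>
### beam and process information: beam IDs, beam energies, PDF sets, cross sections...
</init>
<event>
### one header line: number of particles, process ID, event weight, scale, alpha_QED, alpha_QCD
### then one line per particle: PDG ID, status, mother indices, colour flow, px py pz E m, lifetime, spin
</event>
</LesHouchesEvents>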
MadGraph4
MadGraph4. Once this package is compiled, it can be used in any other directory. This can be useful if you want to use the GRID. It was possible to install MG4 on local computers (evaluQ) and then run the package in the UserInterface's temporary area, but we did not manage to make it work on a Mac laptop.
~> ssh -X vicsanma@evaluQ.ific.uv.es
~> cd $DIRECTORY ###directory where you want to work
~> gunzip MG4_vX.Y.Z.tar.gz
~> tar -xvf MG4_vX.Y.Z.tar
~> cd MG4_vX.Y.Z/
~> make
~> mkdir myTest
~> cp -a Template/ myTest/
~> cd myTest/
~> emacs Cards/proc_card.dat & ###set the process you want
~> bin/newprocess ###it may take a few minutes...
~> emacs Cards/param_card.dat & ###modify some properties like mass, width, couplings...
~> emacs Cards/run_card.dat & ###modify some properties like Nº events, Ebeam, pdf, scales, cuts...
~> bin/generate_events
0 ###run mode (0 = serial run on this machine)
testName ###name given to this run
The most important output of the generation is
testName_unweighted_events.lhe.gz, within the
Events/ folder. In this folder there is another file named
testName_banner.txt, which is a summary of all the cards.dat.
In the
HTML/ folder you can find a
crossx.html file, which contains more information and the Feynman diagrams of the process.
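A quick way to inspect the output of a run (a minimal sketch; testName is the run name chosen above):
~> cd Events/
~> gunzip -c testName_unweighted_events.lhe.gz > testName_unweighted_events.lhe
~> grep -c "<event>" testName_unweighted_events.lhe ### number of generated events
~> head -n 30 testName_banner.txt ### summary of the run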
MadGraph5
MadGraph5. In order to use MG5 in Ubuntu, we must have
Python 2.6.5 and
gcc 4.4.3 (for now, we could not make it work at the UserInterface). With MG5 we can work in two different ways:
1) As we worked with MG4
~> ssh -X vicsanma@evaluQ.ific.uv.es
~> cd $DIRECTORY ###directory where you want to work
~> gunzip MadGraph5_vX.Y.Z.tar.gz
~> tar -xvf MadGraph5_vX.Y.Z.tar
~>
~> cd MadGraph5_vX.Y.Z/
~> mkdir myTest
~> cp -a Template/ myTest/
~> cd myTest/
~> emacs Cards/proc_card.dat & ###set the process you want
~> bin/newprocess ###it may take a few minutes...
~> emacs Cards/param_card.dat & ###modify some properties like mass, width, couplings...
~> emacs Cards/run_card.dat & ###modify some properties like Nº events, Ebeam, pdf, scales, cuts...
~> bin/generate_events
0 ###run mode (0 = serial run on this machine)
testName ###name given to this run
2) From bin/mg5
~> ssh -X vicsanma@evaluQ.ific.uv.es
~> cd $DIRECTORY ###directory where you want to work
~> gunzip MadGraph5_vX.Y.Z.tar.gz
~> tar -xvf MadGraph5_vX.Y.Z.tar
~>
~> cd MadGraph5_vX.Y.Z/
~> bin/mg5
mg5> import model MODEL ###you pick the MODEL
mg5> generate p p > t t~ ###PROCESS you want
mg5> output myTest ###name of the output directory
mg5> exit
~> cd myTest/
~> emacs Cards/param_card.dat &
~> emacs Cards/run_card.dat &
~> bin/generate_events
0
testName
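The interactive mg5 session above can also be scripted; a minimal sketch, assuming bin/mg5 accepts a command file as its argument (myCommands.mg5 is a hypothetical file name):
~> cat > myCommands.mg5 <<EOF
import model sm
generate p p > t t~
output myTest
EOF
~> bin/mg5 myCommands.mg5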
LHAPDF
To use
mstw2008lo in MadGraph5, users need to install LHAPDF (see the twiki page
Using mstw2008lo in MadGraph5 ); this PDF set is the one recommended for MC12.
~> cd $DIRECTORY
~> gunzip lhapdf-5.8.8b1.tar.gz
~> tar -xvf lhapdf-5.8.8b1.tar
~> cd lhapdf-5.8.8b1
~> ./configure
~> make
~> sudo make install
~> bin/lhapdf-getdata cteq
~> bin/lhapdf-getdata mstw
Check that all the libraries you need have been copied correctly; if not, copy them manually. After these steps, LHAPDF is correctly installed!
~> cd $DIRECTORY/MadGraph5_v1_3_33
~> mkdir myTest
~> cd myTest
~> cp -a ../Template/ .
~> mv Template/* .
~> rm -r Template/
~> more README.lhapdf ## Follow these steps to get the libraries which you need
~> emacs Cards/proc_card_mg5.dat &
~> bin/newprocess_mg5
~> emacs Cards/param_card.dat &
~> emacs Cards/run_card.dat &
In your run_card.dat, you have to set the following:
# 'cteq6l1' = pdlabel ! PDF set
'lhapdf' = pdlabel ! PDF set
21000 = lhaid ! PDF number used ONLY for LHAPDF
NOTE: in the
lhapdf_5.8.8b1/ folder you can find a file named
PDFsets.index. This file shows the correspondence between LHAPDF set names and ID codes.
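For example, to look up the ID codes of the MSTW sets (a simple check, assuming you are in the directory that contains the untarred lhapdf-5.8.8b1/):
~> grep -i mstw lhapdf-5.8.8b1/PDFsets.index ### 21000 should appear next to MSTW2008lo68cl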
Table1 -
This table compares the cross sections for samples with the same PDF but different center-of-mass energies. The samples have been generated with MadGraph5.v1.3.33
(the generated process was
pp > o1 > tt~, where o1=KKG with m=1TeV):
Table2 -
This table shows the
matching at different center-of-mass energies. The samples have been generated with MadGraph5.v1.3.33
(the generated process was
pp > o1 > tt~ (+) pp > o1 > tt~g, where o1=KKG with m=1TeV):
KKgluon with MSTW2008
KKgluon samples have been generated in three final states: dileptonic, semileptonic and all-hadronic. In the following table, you can get the
cards.dat used in the generation with
MadGraph5_v1.3.33 & lhapdf_5.8.8b1(pdf=mstw2008) (using
topBSM_v4.tar):
Table3.1
(Beware: these cards are for Ecm=7TeV!!!)
Table3.2 -
This table shows some information obtained in the generation at
7TeV and
8TeV (ptj,pta,ptl=0 and ptjmax,ptamax,ptlmax=infinity):
Table3.3 -
This table shows some information obtained in the generation of the
pp > o1 > tt~ process with
pdf=mstw2008 (ptj,pta,ptl=0 and ptjmax,ptamax,ptlmax=infinity):
(Beware: you have to change the values of
mass,
width,
scales and
beam energy within the
param_card.dat and
run_card.dat).
NEW Matching Study
All the samples have been generated with
MadGraph5_v1.3.33 & lhapdf_5.8.8b1(pdf=mstw2008).
This is an example of how the matching value affects the cross section. To carry out this study, we've used different processes, all of them with a
semileptonic final state.
Table4.1 -
This table shows the different processes generated:
Table4.2 -
This table shows the
MATCHING STUDY using different initial processes, for a KKG
mass=500GeV,
Ecm=8TeV and
xqcut=10GeV. The
param_card.dat used is the same for all the samples (
param_card.dat). At MadGraph, jet has been defined as
j = g u c d s u~ c~ d~ s~:
(you can find the ControlCards at /afs/cern.ch/user/v/vsanchez/public/Matching/ControlCards)
OLD Matching Study
This is an example of how the matching value affects the cross section. To carry out this study, we've used different processes:
Table5.1 -
This table shows the different initial processes used:
Table5.2 -
This table shows the
MATCHING STUDY using different initial processes, for a KKG
mass=1600GeV and
Ecm=7TeV. The param_card.dat used is the same for all the samples (
param_card.dat). At MadGraph, jet has been defined as
j = g u c d s u~ c~ d~ s~:
Table5.3 -
This table shows the
MATCHING STUDY using different initial processes, for a KKG
mass=1600GeV and
Ecm=8TeV. The param_card.dat used is the same for all the samples (
param_card.dat). At MadGraph, jet has been defined as
j = g u c d s u~ c~ d~ s~:
Table5.4 -
This table shows the
MATCHING STUDY using different initial processes, for a KKG
mass=500GeV and
Ecm=8TeV. The param_card.dat used is the same for all the samples (
param_card.dat). At MadGraph, jet has been defined as
j = g u c d s u~ c~ d~ s~:
To obtain the plot of Truth Jets with pT>20GeV, you have to get both files,
classTRUTH.C and
classTRUTH.h, and run them over the
my.truth.ntup.root file as follows:
~> cd FOLDER ###this folder contains my.truth.ntup.root and classTRUTH.*
~> root -l -q classTRUTH.C+
~> root
root> .L classTRUTH.C+
root> classTRUTH* t = new classTRUTH; t->Loop();
root> ### save canvas3.C
root> ### save canvas4.C

KKG's codes:
- MadGraph: 6000048.
- Pythia: 5100021.
Validation process (mc12)
To verify that the LHEF samples have been made properly and that the new JobOption works, we have to run
Generate_trf.py (it takes around 10 min) and
Reco_trf.py (it takes around 25 min) over the
file.tar.gz (this is the unweighted.lhe file) and use your JobOption. With this
script you can obtain the
my.truth.ntup.root...
With this
package you can obtain the
my.truth.ntup.root and the validation plots...

To generate the new JO files: get these two files (
script and
template) and run "
source New_JO_cambiarNombre.sh". You will then obtain all the JO files for the KKG.

README:
~> Beware!!! Log in to @lxplus440 (it works with SL5)
~> Download the package validation_MC12_vsm.tar and untar it.
~> Copy the JobOptions file and the group file (with extensions .events and .tar.gz) into the validation folder (this will be your work directory).
~> Make sure the following files exist inside the folder:
parse_evgen_log.py
MC12JobOpts-00-04-85_v0.tar.gz
(http://atlas-computing.web.cern.ch/atlas-computing/links/kitsDirectory/EvgenJobOpts/)
(more /cvmfs/atlas.cern.ch/repo/sw/Generators/MC12JobOptions/tag --> MC12JobOptions-00-10-78/)
~> Open the runGenerationValidation.sh file and set the suitable variables:
Athena release (17.2.4.8 for these files and scripts)
(ls /cvmfs/atlas.cern.ch/repo/sw/software/x86_64-slc6-gcc47-opt/17.8.0/AtlasProduction/17.8.0.3/)
Mass ($MASS)
RunNumber ($RUNNUMBER)
JobOptions name ($JOBOPTIONS)
GroupFile name ($GroupFile)
Work directory name ($WORKdirectory)
~> Inside the work directory, execute "source runGenerationValidation.sh"
~> Finally, you'll obtain several plots...
~> ls /cvmfs/atlas.cern.ch/repo/sw/software/17.2.4/AtlasProduction/17.2.4.8/Generators/EvgenJobTransforms/scripts/
### Generate_trf.py
~> ls /cvmfs/atlas.cern.ch/repo/sw/software/17.2.4/AtlasProduction/17.2.4.8/Generators/MC12JobOptions/
### TODO bin cmt common gencontrol i686-slc5-gcc43-opt share
~> ls /cvmfs/atlas.cern.ch/repo/sw/software/17.2.4/AtlasProduction/17.2.4.8/InstallArea/jobOptions/MC12JobOptions/
### Pythia8_AU2_MSTW2008LO_Common.py ...
~> ls /cvmfs/atlas.cern.ch/repo/sw/software/x86_64-slc5-gcc43-opt/17.2.4/AtlasProduction/17.2.4.8/InstallArea/share/bin/Generate_trf.py
~> ls /cvmfs/atlas.cern.ch/repo/sw/software/x86_64-slc5-gcc43-opt/17.2.4/AtlasProduction/17.2.4.8/InstallArea/share/bin/Reco_trf.py
~>
export JOBOPTSEARCHPATH=/cvmfs/atlas.cern.ch/repo/sw/Generators/MC12JobOptions/latest/common:$JOBOPTSEARCHPATH
export JOBOPTSEARCHPATH=/cvmfs/atlas.cern.ch/repo/sw/Generators/MC12JobOptions/latest/gencontrol:$JOBOPTSEARCHPATH
export JOBOPTSEARCHPATH=/cvmfs/atlas.cern.ch/repo/sw/Generators/MC12JobOptions/latest/share/DSID182xxx:$JOBOPTSEARCHPATH

README for SLC6: for now it doesn't work!!! The only supported MC12 evgen release is 17.2.X.Y. Also, for MC12 reco it is 17.2.X.Y again, or 17.3.X.Y for
upgrade samples...
MC & DATA - produced samples
Input files generated with MadGraph5 and different PDFs.
See our private
ProducedSamples, generated by the Exotics Physics Group at IFIC with different MadGraph releases.
MC11 - KKgluon
The produced KKgluon MC samples have these features:
- MG_ME_V4.4.57
- PDF set = cteq6l1
- Ecm=7TeV
- ptj=20; pta,ptl=10 and ptjmax,ptamax,ptlmax=infinity (1d5)
Table6.1 -
This table shows some features of the
KKgluon samples generated with MadGraph4.
The produced KKgluon MC samples have these features:
- MadGraph5_v1.3.33
- PDF set = cteq6l1
- Ecm=7TeV
- ptj=20; pta,ptl=10 and ptjmax,ptamax,ptlmax=infinity (1d5)
Table6.2 -
This table shows some features of the
KKgluon samples. The
control_card.dat is a summary of the
proc_card.dat,
param_card.dat and
run_card.dat used in the production. You can run this script (
script) to get all the samples.
MC12 - KKgluon
All the produced KKgluon-MC-samples have these features:
- MadGraph5_v1.3.33
- lhapdf_5.8.8b1(pdf=mstw2008 - lhapdf=21000)
- Ecm=8TeV
- ptj,pta,ptl=0 and ptjmax,ptamax,ptlmax=infinity
Table7.1 -
This table shows some features of the
KKgluon samples. The
ControlCard.txt is a summary of the
proc_card.dat,
param_card.dat and
run_card.dat used in the production. You can run these two scripts (
script1 and
script2) to get all the samples, but you need to use these templates (
KKG_run_card_template.dat and
KKG_param_card_template.dat):
The cross section versus the mass of the KKG is shown below:
The "peak" is related to the mass-threshold effect.
Table7.2 -
This table contains some samples of
KKgluon with different widths. The
KKGancho_proc_card_mg5.dat is the same for all. You can run this
script to get all the samples, but you need to use these templates (
KKGancho_run_card_template.dat and
KKGancho_param_card_template.dat):
Table7.3 -
This table contains some samples of
nominal KKgluon with 1TeV mass. The generated process contains the
semileptonic final state
(
p p > o1 > t t~, (t > b w+, w+ > l+ vl), (t~ > b~ w-, w- > j j)). Here you can find the
cards.dat used (
proc_card.dat,
run_card.dat &
param_card.dat). The produced samples have these features:
- MadGraph5_v1.3.33
- lhapdf_5.8.8b1(pdf=mstw2008 - lhapdf=21000)
- Ecm=8TeV
- ptj,pta,ptl=0 and ptjmax,ptamax,ptlmax=infinity
KKgluon
Graviton
Running the Full Chain
In order to produce Monte Carlo events on which to perform an analysis, a Full Chain of steps needs to be taken from Generation to the production of Analysis Object Data (AOD), as shown in the diagram below. Because of the time this process takes (especially the Simulation stage), it is unlikely that most users will produce many events themselves; they will instead rely on centrally produced events. However, limited test samples of specific channels may be necessary from time to time. Each of these stages is described separately, but they can be combined if required. (See the ATLAS Computing Workbook
twiki )
You can short-circuit the Full Chain by using
Atlfast, which provides a fast simulation of the whole chain by taking the generated events and smearing them to produce AOD directly. Atlfast can in fact take input from any of the event generation, simulation, digitization, or Event Summary Data (ESD) files.
Schematic representation of the Full Chain Monte Carlo production
Here are some commands to run the Full Chain step by step (at IFIC). This example is for
Athena release 15.6.14.13:
Preparing the work environment:
~> ssh -X vicsanma@ui06.ific.uv.es
~> source AthenaTestArea/15.6.14.13/cmthome/setup.sh -tag=AtlasProduction,15.6.14.13,setup ###setup Athena
~> get_files Evgen_trf.py ### if you want Atlfast use EvgenCorrAtlfast_trf.py
~> get_files MC10.115554.Pythia_MadGraph_KKGluonTTbar_1000.py ### here, the jobOptions which you need
~> get_files AtlasG4_trf.py
~> get_files Digi_trf.py
~> get_files Reco_trf.py
To make the steps cleaner and avoid mistakes, you can create
five folders, one for each step, and then put the corresponding
*_trf.py file inside each one.
1. GENERATION: If you need to know what every argument means, you can run
~> Evgen_trf.py -h and see all the options.
~> Evgen_trf.py --omitvalidation=testEventMinMax ecmEnergy=7000. runNumber=115554 firstEvent=1 maxEvents=10 randomSeed=1324354657 jobConfig=MC10.115554.Pythia_MadGraph_KKGluonTTbar_1000.py outputEvgenFile=pythia.pool.root inputgeneratorfile=group10.phys-gener.madgraph.115554.KKGluonTTbar1000.TXT.v1._00001.tar > athena_gen.out 2>&1
2. SIMULATION:
~> AtlasG4_trf.py inputEvgenFile=pythia.pool.root outputHitsFile=g4hits.pool.root maxEvents=10 skipEvents=0 randomSeed=1234 geometryVersion=ATLAS-GEO-16-00-00 conditionsTag=OFLCOND-SIM-BS7T-02 > athena_sim.out 2>&1
3. DIGITIZATION:
~> Digi_trf.py inputHitsFile=g4hits.pool.root outputRDOFile=g4digi.pool.root maxEvents=10 skipEvents=0 geometryVersion=ATLAS-GEO-16-00-00 conditionsTag=OFLCOND-DR-BS7T-ANom-16 > athena_digi.out 2>&1
4. RECONSTRUCTION:
~> Reco_trf.py inputRDOFile=g4digi.pool.root outputESDFile=esd.pool.root maxEvents=10 skipEvents=0 geometryVersion=ATLAS-GEO-16-00-00 conditionsTag=OFLCOND-DR-BS7T-ANom-16 autoConfiguration='everything' > athena_rec.out 2>&1
5. PRODUCTION-AOD:
~> Reco_trf.py inputESDFile=esd.pool.root outputAODFile=aod.pool.root maxEvents=10 skipEvents=0 geometryVersion=ATLAS-GEO-16-00-00 conditionsTag=OFLCOND-DR-BS7T-ANom-16 autoConfiguration='everything' > athena_aod.out 2>&1
Creating a dataset
Since we have generated all our samples with MadGraph, we must change some particle codes, as the codes in MadGraph differ from the codes in Pythia. The file name should describe what it contains, for instance:
1. Changing the particle codes.
You have to replace all the KKG codes in the file.lhe. You can do it in two ways:
~> emacs test_kkg_unweighted_events.lhe &
### Edit -> Search -> Replace
### 6000048 EnterKey 5100021 EnterKey !key
### close emacs
~> sed 's/6000048/5100021/' test_kkg_unweighted_events.lhe > new_test_kkg_unweighted_events.lhe
~> rm test_kkg_unweighted_events.lhe
~> mv new_test_kkg_unweighted_events.lhe test_kkg_unweighted_events.lhe
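Equivalently, GNU sed can do the replacement in place; the /g flag makes sure every occurrence on a line is replaced:
~> sed -i 's/6000048/5100021/g' test_kkg_unweighted_events.lhe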
2. Changing the file name and extension. The following steps should be performed.
~> mv test_kkg_unweighted_events.lhe group10.phys-gener.madgraph.115554.KKGluonTTbar1000.TXT.v1._00001.events
###this last name is inside the jobOptions MC10.115554.Pythia_MadGraph_KKGluonTTbar_1000.py
~> tar -cvf group10.phys-gener.madgraph.115554.KKGluonTTbar1000.TXT.v1._00001.tar group10.phys-gener.madgraph.115554.KKGluonTTbar1000.TXT.v1._00001.events
###be careful: tar -cvf NAME.tar NAME.events (no NAME.events.tar)
3. Creating a dataset. To create our dataset we are going to use DQ2, so we need to set up:
~> ssh -X vicsanma@ui06.ific.uv.es
~> voms-proxy-init -voms atlas ### or voms-proxy-init -voms atlas:/atlas/phys-gener/Role=production
~> source /afs/cern.ch/atlas/offline/external/GRID/ddm/DQ2Clients/setup.sh
We change the value of the local_site variable to store the dataset on IFIC's disk:
~> echo $DQ2_LOCAL_SITE_ID ### probably is CERN-PROD_SCRATCHDISK
~> export DQ2_LOCAL_SITE_ID=IFIC-LCG2_SCRATCHDISK ### IFIC-LCG2_LOCALGROUPDISK
To make tasks easier, let's create some
variables:
~> NAME="group.phys-gener.${GENERATOR}.${DATASETNUMBER}.${DESCRIPTION}.TXT.${mcXY}_${VERSION}"
~> DATASET=${NAME}_i11
~> CONTAINER=${NAME}/
~> directory=/lustre/ific.uv.es/grid/atlas/t3/vicsanma/ALLsamples
~> uploaddirectory=${directory}/samples_kkg
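For example, for the KKG sample used above, the variables could be filled in like this (illustrative values, built from the file name used earlier in this section):
~> GENERATOR=madgraph
~> DATASETNUMBER=115554
~> DESCRIPTION=KKGluonTTbar1000
~> mcXY=mc10
~> VERSION=v1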
We
create the dataset in this way:
~> dq2-put -s ${uploaddirectory} ${DATASET}
To ensure that there are no problems with the file names and datasets, we have to run a validation script (the file can be downloaded
here):
~> python validate_ds2eos.py ${NAME}
If all went fine, we can freeze the dataset (make sure that once the dataset has been frozen, its state is COMPLETE):
~> dq2-freeze-dataset ${DATASET}
Now we should register the dataset and the container (the latter is optional):
~> dq2-register-subscription ${DATASET} CERN-PROD_PHYS-GENER
~> dq2-register-container ${CONTAINER}
~> dq2-register-datasets-container ${CONTAINER} ${DATASET}
and
it is ready for use. If all is right, you will be able to get it with DQ2.
- If you want to add other datasets inside the container above:
~> dq2-register-datasets-container ${CONTAINER} Dataset_1 Dataset_2 Dataset_3
- If you need to remove a dataset from only one site:
~> dq2-delete-replicas DatasetName Site
- If you need to completely erase a dataset:
~> dq2-delete-replicas DatasetName Site
- If you need to change the dataset name, you should do the following (beware: this step doesn't remove the old dataset, it creates a new one):
~> dq2-put -D OLDdatasetName NEWdatasetName
~> 0-20 ###here you select the files you want (in this case there are 20 files and I want all 20)
~> y ### yes
~> y ### yes
Using a Good Run List to handle real DATA and MC samples
These steps have been done
@lxplus.cern.ch. I tried to do it at IFIC, but I did not succeed. (The last time these steps were done correctly was on 29/04/2012.)
Getting the suitable GRL
If you are confused about which GRL to use with which version of the data, you can take a look at the table in the following
Good Run List for Analysis twiki. Once you have found the suitable GRL, look at this
list and find more information about your GRL (periods, runs, luminosity...) in
GRL_LUMI.txt.
For example, if you are working with top quarks, you can "download" the GRL from:
YourUserName@ui06.ific.uv.es~> /afs/cern.ch/user/a/atlasdqm/www/grlgen/Top/
Setting up CMT
(This subsection can be skipped if a release has already been set up.)
1. Create a
cmthome directory in your home directory:
~> ssh -X vsanchez@lxplus.cern.ch
~> cd $HOME
~> mkdir -p GoodRunList/cmthome
~> cd GoodRunList/cmthome
2. Make your login
requirements file - use your favorite editor to create a file called
requirements and paste these lines within it:
#---------------------------------------------------------------------
set CMTSITE CERN
set SITEROOT /afs/cern.ch
set ATLAS_DIST_AREA ${SITEROOT}/atlas/software/dist
macro ATLAS_TEST_AREA ${HOME}/GoodRunList/scratch0/15.6.4
apply_tag 32
apply_tag AtlasOffline
apply_tag setupCMT
apply_tag setup
apply_tag oneTest
apply_tag simpleTest
use AtlasLogin AtlasLogin-* $(ATLAS_DIST_AREA)
#---------------------------------------------------------------------
Note: Line 4 (
macro ATLAS_TEST_AREA): change the path so that it points to wherever you want to work. If you have scratch space (by default at CERN this is called scratch0), it is recommended that you work in there. Of course, if you want to make a new directory inside this space, that is fine.
Save and exit this file
3. Now set up CMT as follows:
~> source /afs/cern.ch/sw/contrib/CMT/v1r20p20090520/mgr/setup.sh
~> cmt config
4. And then make a working directory:
~> mkdir -p $HOME/GoodRunList/scratch0/15.6.4
You only have to do this once.
5. Now you can set up release 15.6.4:
~> cd $HOME
~> source GoodRunList/cmthome/setup.sh -tag=15.6.4,32
You will have to do this every time you want to use your new release. See also the next section.
Now check your environment:
~> echo $CMTPATH
At
CERN this should look something like:
/afs/cern.ch/user/InitialOfYourUserName/YourUserName/GoodRunList/scratch0/15.6.4:/afs/cern.ch/atlas/software/builds/AtlasOffline/15.6.4:/afs/cern.ch/atlas/software/builds/AtlasSimulation/15.6.4:/afs/cern.ch/atlas/software/builds/AtlasAnalysis/15.6.4:/afs/cern.ch/atlas/software/dist/AtlasSettings/AtlasSettings-03-02-37/share:/afs/cern.ch/atlas/software/builds/AtlasTrigger/15.6.4:/afs/cern.ch/atlas/project/tdaq/prod/dqm-common/dqm-common-00-10-00:/afs/cern.ch/atlas/software/builds/AtlasReconstruction/15.6.4:/afs/cern.ch/atlas/software/builds/AtlasEvent/15.6.4:/afs/cern.ch/atlas/software/builds/AtlasConditions/15.6.4:/afs/cern.ch/atlas/software/builds/AtlasCore/15.6.4:/afs/cern.ch/atlas/project/tdaq/prod/tdaq-common/tdaq-common-01-14-00:/afs/cern.ch/atlas/software/builds/DetCommon/15.6.4:/afs/cern.ch/atlas/offline/external/GAUDI/v20r4p6:/afs/cern.ch/atlas/offline/external/LCGCMT/LCGCMT_56c
Then:
~> echo $TestArea
This should look like:
/afs/cern.ch/user/InitialOfYourUserName/YourUserName/GoodRunList/scratch0/15.6.4
6. Go into your
$TestArea; now we can install the good run list package(s).
Setting up the good run list package(s)
1. Go to your
$TestArea:
~> cd $HOME
~> source GoodRunList/cmthome/setup.sh -tag=15.6.4,32
~> cd $TestArea
2. Now we check out the recommended version of the good run list and related packages. If you are working on a remote installation, do not forget to get a Kerberos ticket:
~> kinit -5 CernUserName ###put the password once.
Then check out the packages you may need. The latest tags are given at Recommended release and tags.
~> cmt co -r GoodRunsListsUser-00-00-12 DataQuality/GoodRunsListsUser
~> cmt co -r GoodRunsLists-00-00-96 DataQuality/GoodRunsLists
~> cmt co -r LumiBlockComps-00-02-11 LumiBlock/LumiBlockComps
~> cmt co -r CoolRunQuery-00-03-35 Database/CoolRunQuery
3. Now we can compile:
~> cd $TestArea/Database/CoolRunQuery/cmt/
~> cmt make; source setup.sh
~> cd $TestArea/DataQuality/GoodRunsLists/cmt/
~> cmt make; source setup.sh
~> cd $TestArea/LumiBlock/LumiBlockComps/cmt/
~> cmt make; source setup.sh
~> cd $TestArea/DataQuality/GoodRunsListsUser/cmt/
~> cmt make; source setup.sh
4. It is useful to make a script
init.sh. Open your favorite editor to create
init.sh, and insert the following lines:
echo $TestArea
source $HOME/GoodRunList/cmthome/setup.sh -tag=15.6.4,32
source $TestArea/Database/CoolRunQuery/cmt/setup.sh
source $TestArea/DataQuality/GoodRunsLists/cmt/setup.sh
source $TestArea/LumiBlock/LumiBlockComps/cmt/setup.sh
source $TestArea/DataQuality/GoodRunsListsUser/cmt/setup.sh
cd $TestArea
Then do
source init.sh. You will have to do this every time you want to use the good run list packages in a new shell.
5. You are now ready to use the good run list tools!
t-tbar asymmetries
Useful information, papers, notes... about the
ttbar asymmetry:
- Workshop@CERN - Top physics: from charge asymmetry to the boosted regime (from 02/05/2012 to 04/05/2012).
- Summary of the article A charge asymmetry measurement for high mass t-tbar pairs (ATL-COM-PHYS-2012-786, not yet published) - slides in English or Spanish.
At hadron colliders, perturbative QCD predicts that the
top quark will be emitted preferentially in the direction of the incoming
valence quark and the
antitop in the direction of the incoming
sea antiquark. Although
ttbar production is predicted to be symmetric under charge conjugation at leading order (LO), at next-to-leading order (NLO) the processes
q qbar → t tbar g (FSR) and
q g → t tbar q (ISR) exhibit a small asymmetry in the differential distributions of the top and antitop, due to interference between initial-state (ISR) and final-state (FSR) gluon emission (a negative contribution to our signal,
ttbar+1jet). The
q qbar → t tbar process also possesses an asymmetry due to the interference between the Born and box diagrams (a positive contribution,
ttbar+0jet).
At the LHC, with
pp collisions, the
dominant mechanism for ttbar production is the gg fusion process, which is charge symmetric, while
ttbar production via
qqbar or
qg is small in most of the phase space.
Nevertheless, QCD predicts a small excess of centrally produced antitop quarks, while top quarks are produced, on average, at higher absolute rapidities (on average, the valence quark carries a larger momentum fraction than the sea antiquark).
With the top quark preferentially emitted in the direction of the initial quarks in the
ttbar rest frame, the boost into the laboratory frame drives the
top mainly in the
forward or backward directions, while
antitops are kept more in the
central region, as can be seen in the following figure:
The
ttbar cross section at high invariant masses is sensitive to
new physics (BSM) and sets constraints on the masses and couplings of any new particles giving rise to the
ttbar asymmetry. The
lepton charge asymmetry may also be affected by BSM physics, and several authors have proposed it as a sensitive probe of the top quark polarization.
To compute the charge asymmetry, it is necessary to count tops (or leptons) with specific properties (e.g. positive or negative rapidity, high transverse momentum...). If this counting is done in the central region of the detector, a net charge asymmetry arises, called the
central charge asymmetry. This is defined as follows:
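A standard form of this definition (a reconstruction following Kuhn and Rodrigo, counting top and antitop quarks inside the central region |y| <= y_C):
$$ A_C(y_C) = \frac{N_t(|y| \le y_C) - N_{\bar{t}}(|y| \le y_C)}{N_t(|y| \le y_C) + N_{\bar{t}}(|y| \le y_C)} $$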
where
y is the
rapidity.
However, the ATLAS and CMS collaborations use the charge asymmetry:
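(a plausible reconstruction, built from the difference of the absolute pseudorapidities of the top and antitop):
$$ A_C = \frac{N(\Delta|\eta| > 0) - N(\Delta|\eta| < 0)}{N(\Delta|\eta| > 0) + N(\Delta|\eta| < 0)}, \qquad \Delta|\eta| = |\eta_t| - |\eta_{\bar{t}}| $$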
where
η is the
pseudorapidity. The SM prediction for this asymmetry at the LHC is about 1%:
If we want to measure the lepton charge asymmetry, using leptons produced in the top decay, we must take care, because the central charge asymmetry can only be measured in dileptonic final states.
More...
Handling real data samples
Let's suppose we want to work with real data samples stored on our computer. One of these samples is called
data_K_mu.ntuple_all.root. For our analysis we need to work event by event, so we need to generate a skeleton analysis class for the tree inside the sample. The following steps should be done:
MakeClass
Go to the folder where the sample is stored.
~> cd folderSamples/
~> root data_K_mu.ntuple_all.root
root [1] t2->MakeClass("ClassVIKI"); ### ClassVIKI.C and ClassVIKI.h are the files created by this function
root [2] .L ClassVIKI.C
root [3] ClassVIKI* t = new ClassVIKI;
root [4] t->Loop();
root [5] .q
Then you have to modify
ClassVIKI.C and declare the new variables in
ClassVIKI.h. Every time you modify
ClassVIKI.C, you have to recompile it and run:
~> emacs ClassVIKI.C &
~> emacs ClassVIKI.h &
~> root -l -q ClassVIKI.C+
~>
~> root
root [1] .L ClassVIKI.C
root [2] ClassVIKI* t = new ClassVIKI; t->Loop();
Running Ganga locally

Let's suppose we want to work with real data samples stored in the T3 space at IFIC. As the samples are very large, we need to run Ganga locally. The procedure is similar to the one above, but you need to repeat all the steps (simply copy-pasting the file.C and file.h does not work).
~> ssh -X vicsanma@ui06.ific.uv.es
~> cd /lustre/ific.uv.es/grid/atlas/t3/vicsanma/Asymmetries
~> root data_K_mu.ntuple_all.root
root [1] t2->MakeClass("ClassVIKIui"); ### ClassVIKIui.C and ClassVIKIui.h are the files created by this function
root [2] .L ClassVIKIui.C
root [3] ClassVIKIui* t = new ClassVIKIui;
root [4] t->Loop();
root [5] .q
~> emacs ClassVIKIui.C &
### let's save the histogram/canvas at the end of the file:
c1->Write();///LNA
c1->Update();
c1->SaveSource("canvas_ASYMMETRYhistui.C");
~> emacs ClassVIKIui.h &
~> root -l -q ClassVIKIui.C+
~> root
root [1] .L ClassVIKIui.C
root [2] ClassVIKIui* t = new ClassVIKIui; t->Loop();
If everything works so far, we can continue...

Inside
ClassVIKI.h, the following lines should be modified (note the pointer syntax and the negated check, as in the MakeClass skeleton):
TFile *f = (TFile*)gROOT->GetListOfFiles()->FindObject("/lustre/ific.uv.es/grid/atlas/t3/sgonzale/elena/data_E_mu.ntuple_all.root");
if (!f) {
f = new TFile("/lustre/ific.uv.es/grid/atlas/t3/sgonzale/elena/data_E_mu.ntuple_all.root");
}

It is necessary to create a ROOT script and a Ganga script, called
ejecutar.C and
LocalRoot.py, for instance.
The
ejecutar.C file is like this:
###---------------------------------------------------------------------
#include <iostream>
#include <string>
void ejecutar()
{
gROOT->ProcessLine(".L ClassVIKIui.C");
ClassVIKIui* t = new ClassVIKIui; t->Loop();
}
###---------------------------------------------------------------------
The
LocalRoot.py file is like this:
###---------------------------------------------------------------------
j = Job()
j.name='Exercice Root'
j.application=Root()
j.application.script='ejecutar.C'
j.application.version='5.30.01'
j.inputsandbox=['ClassVIKIui.C','ClassVIKIui_C.so','ClassVIKIui_C.d','ClassVIKIui.h']
j.outputsandbox=['output.test.root']
j.backend=Local()
j.submit()
###---------------------------------------------------------------------

Now we are able to run our scripts, doing the following:
### set up Root
~> source /lustre/ific.uv.es/sw/ific/sw/root/5.30.01/slc5_amd64_gcc41/root/bin/thisroot.sh
### set up Ganga
~> source /afs/ific.uv.es/project/atlas/software/ganga/install/etc/setup-atlas.sh
~> export GANGA_CONFIG_PATH=GangaAtlas/Atlas.ini
~> root -l -q ClassVIKIui.C+
~> ganga LocalRoot.py
~> ganga
In [1]: jobs
In [2]: jobs(XYZ).peek('stderr','more') ###to see the out errors
In [3]: jobs(XYZ).peek('stdout','more') ###to see the screen output
In [4]: jobs(XYZ).peek('output.test.root','root -l')
In [5]: ctrl+D
~> cd /afs/ific.uv.es/user/v/vicsanma/gangadir/workspace/vicsanma/LocalXML/XYZ/output/
~> root output.test.root
root [1] TBrowser V;
Inside
output.test.root you can see the saved canvas (in this case,
c1).

If we've run several jobs with different inputs (data samples), we can merge all the
output.test.root files as follows:
~> cd DirectoryChoosed
~> hadd output.ALLtest.root $HOME/gangadir/workspace/vicsanma/LocalXML/*/output/output.test.root
My study

Here is my study and the results:
TopRootCore & TopD3PDBoosted - 2011
... Using TopRootCore modified by the IFIC members.
Twiki: TopD3PDBoosted twiki
Recommended: TopRootCoreRelease-11-00-00-05 or higher &
TopD3PDBoosted-11-00-15
Community e-group: atlas-phys-top-d3pd-analysis@cernSPAMNOT.ch & atlas-phys-top-TopD3PDBoosted
Working at: lxplus303
Data & MC: 2011

To download the
TopRootCore package (release 11-00-00-05) we are going to work at
lxplus. The necessary steps are:
ssh -Y vsanchez@lxplus303.cern.ch
~> source startRoot5-32.sh ### This script is below
~> export CERN_USER=vsanchez
~> mkdir TopRootCore-2011 && cd TopRootCore-2011
~> svn co svn+ssh://$CERN_USER@svn.cern.ch/reps/atlasoff/PhysicsAnalysis/TopPhys/TopRootCoreRelease/tags/TopRootCoreRelease-11-00-00-05/ TopRootCoreRelease-11-00-00-05
~> cd
~> cd TopRootCore-2011/TopRootCoreRelease-11-00-00-05/share
~> ./build-all.sh
~> exit
Now the TopRootCore package is ready to be used.

Every time you want to use TopRootCore for data and MC 2011, you must set it up:
ssh -Y vsanchez@lxplus303.cern.ch
~> source startRoot5-32.sh
~> cd TopRootCore-2011/
~> cd RootCore/scripts/
~> source setup.sh
~> cd ../../

To check that TRC works correctly, we are going to run a little example. I prefer to run code in a separate run directory, which we will need to set up with links to the data files and the binaries:
ssh -Y vsanchez@lxplus303.cern.ch
~> source startRoot5-32.sh
~> cd TopRootCore-2011/
~> mkdir run && cd run
~> ln -s ../RootCore/data .
~> ln -s ../TopD3PDAnalysis/control .
~> ln -s ../TopD3PDAnalysis/bin .
~> cd ../
~> source RootCore/scripts/setup.sh
~>
~> cd PackageToCompile/
~> cd cmt
~> make -f Makefile.RootCore

We need to put the input file into
file_list.txt.
~> cd TopRootCore-2011/run/ ###VERY IMPORTANT!!! Be sure you are at run/ directory when you run the cut-flow
~> cp control/file_list.txt .
~> emacs file_list.txt & ### add the NTUP*.root name and NEW LINE!!!
~>
~> DATASET=mc11_7TeV.105200.T1_McAtNlo_Jimmy.merge.NTUP_TOPBOOST.e835_s1272_s1274_r3043_r2993_p841_tid706226_00
~> FILE=NTUP_TOPBOOST.706226._000961.root.1
~> dq2-get -f $FILE $DATASET
~>
~> mkdir OUT_mc11_$FILE
~>
~> CutFlow -f file_list.txt -o OutputHistos_CutFlow.root -mcType mc11c -p control/settings.txt > outTable_CutFlow.txt
~> CutFlowCorr -f file_list.txt -o OutputHistos_CutFlowCorr.root -mcType mc11c -p control/settings.txt > outTable_CutFlowCorr.txt
~>
~> mv OutputHistos_CutFlow.root OUT_mc11_$FILE/
~> mv outTable_CutFlow.txt OUT_mc11_$FILE/
~> mv OutputHistos_CutFlowCorr.root OUT_mc11_$FILE/
~> mv outTable_CutFlowCorr.txt OUT_mc11_$FILE/

To download the
TopD3PDBoosted-11-00-15 package (see the
official twiki page and the
releases), we have to run:
ssh -Y vsanchez@lxplus303.cern.ch
~> source startRoot5-32.sh
~> export CERN_USER=vsanchez
~> cd TopRootCore-2011
###~> svn co svn+ssh://$CERN_USER@svn.cern.ch/reps/atlasoff/PhysicsAnalysis/TopPhys/TopD3PDBoosted/tags/TopD3PDBoosted-11-00-15/ TopD3PDBoosted
~> svn co svn+ssh://$CERN_USER@svn.cern.ch/reps/atlasoff/PhysicsAnalysis/TopPhys/TopD3PDBoosted/tags/TopD3PDBoosted-11-00-15/ TopD3PDBoosted-11-00-15
~>
~> cd
###~> cd TopRootCore-2011/TopD3PDBoosted/FilesToReplace
~> cd TopRootCore-2011/TopD3PDBoosted-11-00-15/FilesToReplace
~> cd seedToolRelated_TopD3PDCorr
~> source cp.sh
~> cd ../TopAnalysisBase
~> source cp.sh
~> cd ../TopD3PDSelection
~> source cp.sh
~>
~> cd
~> cd TopRootCore-2011/TopRootCoreRelease-11-00-00-05/share
~> ./build-all.sh
~>
~> exit
and now, the TopRootCore (2011) package is ready to be used with the Boosted Selection.
Running BoostedCutFlow 2011

In this
web page you can find all the MC and DATA
%NTUP_TOPBOOST% datasets.
NTUP_TOPBOOST contains the standard NTUP_TOP variables, plus extra "boosted" variables and the LCW-calibrated clusters to enable jet creation on the fly. For a full list of the most recent branches, see
Sept24_MC_branches.h and
Sept24_data_branches.h (last updated: Oct 6, 2012, with D3PDs created using BoostedTopD3PDMaker-00-00-11)

To run an example locally, you can get a file and save it in your lxplus /tmp/:
~> ssh -Y vsanchez@lxplus303.cern.ch
~> cd /tmp/vsanchez/samples/MC
~> FILE=NTUP_TOPBOOST.706226._000961.root.1
~> DATASET=mc11_7TeV.105200.T1_McAtNlo_Jimmy.merge.NTUP_TOPBOOST.e835_s1272_s1274_r3043_r2993_p841_tid706226_00
~> CONTAINER=mc11_7TeV.105200.T1_McAtNlo_Jimmy.merge.NTUP_TOPBOOST.e835_s1272_s1274_r3043_r2993_p841/
~> dq2-get -f $FILE $DATASET
### ~> FILE=NTUP_TOP.00993413._000001.root.1
### ~> DATASET=mc12_8TeV.105200.McAtNloJimmy_CT10_ttbar_LeptonFilter.merge.NTUP_TOP.e1193_s1469_s1470_r3542_r3549_p1230_tid00993413_00
### ~> CONTAINER=

Every time you make a change in the files, you should recompile the package:
[@lxplus303]~>
~> cd TopRootCore-2011/
~> ./RootCore/scripts/compile.sh ###to compile all the packages
~>
~> cd TopRootCore-2011/FOLDERtoCOMPILE/cmt/
~> make -f Makefile.RootCore
Running MC (NO correction) 2011:
~> BoostedCutFlow -f mc_ch_file.txt -rndmType debug -mcType mc12 -boost -jetTrigger
### BoostedCutFlow -f mc_ch_file.txt -mcType mc12 -boost -rndmType debug
These options give the spreadsheet numbers. Use
-jetTrigger even if no trigger is applied, to “pick up” the lumi blocks.
- debug will operate with SeedTool, due to the replaced files (no correction).
- When -jetTrigger is not specified, lepTrigger is the default one.
Running Data (NO correction) 2011:
~> BoostedCutFlow -f mc_ch_file.txt -boost -jetTrigger
Running D3PD2MiniSLBoost 2011
Running MC (with correction) 2011:
~> ./D3PD2MiniSLBoost -f mc_ch_file.txt -rndmType debug -mcType mc12 -boost -jetTrigger
These options give the spreadsheet numbers. Use
-jetTrigger even if no trigger is applied.
- debug will operate with SeedTool, due to the replaced files (no correction).
- When -jetTrigger is not specified, lepTrigger is the default one.
Running Data 2011:
~> ./D3PD2MiniSLBoost -f Data_file_list.txt -boost -jetTrigger
- This will produce a mini NTUP.
- Same as running on the grid.
- To add new variables, do it in Root/MiniSLBoosted.cxx.
TopRootCore & TopD3PDBoosted - 2012
... Using TopRootCore modified by the IFIC members.
Recommended: TopRootCoreRelease-12-01-04 &
TopD3PDBoosted-12-00-08
Community e-group: atlas-phys-top-d3pd-analysis@cernSPAMNOT.ch &
atlas-phys-top-TopD3PDBoosted@cernSPAMNOT.ch
Working at: lxplus303
Data & MC: 2012

To download the
TopRootCore package we are going to work at
lxplus. The necessary steps are:
ssh -Y vsanchez@lxplus303.cern.ch
~> source startRoot5-32.sh ### This script is below
~> export CERN_USER=vsanchez
~> mkdir TopRootCore-2012 && cd TopRootCore-2012
~> TRCR=TopRootCoreRelease-12-01-17
~> svn co svn+ssh://$CERN_USER@svn.cern.ch/reps/atlasoff/PhysicsAnalysis/TopPhys/TopRootCoreRelease/tags/$TRCR/ $TRCR
~> cd
~> cd TopRootCore-2012/$TRCR/share
~> ./build-all.sh
~> exit
Now the TopRootCore package is ready to be used.
The
startRoot5-32.sh (or
setRoot.sh) script is like this:
###---------------------------------------------------------------------
#!/bin/bash
cd $HOME
source /afs/cern.ch/sw/lcg/contrib/gcc/4.3/x86_64-slc5/setup.sh
cd /afs/cern.ch/sw/lcg/app/releases/ROOT/5.32.00/x86_64-slc5-gcc43-opt/root/
source bin/thisroot.sh
#source /afs/cern.ch/sw/lcg/app/releases/ROOT/5.32.00/x86_64-slc5-gcc43-opt/root/bin/thisroot.sh
export LD_LIBRARY_PATH=$ROOTSYS/lib:$LD_LIBRARY_PATH
export PATH=$ROOTSYS/bin:$PATH
cd $HOME
###---------------------------------------------------------------------

Every time you want to use TopRootCore, you must set it up:
ssh -Y vsanchez@lxplus303.cern.ch
~> source startRoot5-32.sh
~> cd TopRootCore-2012/
~> source RootCore/scripts/setup.sh

Every time you download a new package, you must compile it as:
ssh -Y vsanchez@lxplus303.cern.ch
~> cd TopRootCore-2012/PackageToCompile/
~> cd cmt
~> make -f Makefile.RootCore
If you need help, you can contact the community e-groups
atlas-phys-top-d3pd-analysis@cernSPAMNOT.ch and
atlas-phys-top-TopD3PDBoosted@cernSPAMNOT.ch, which consist of members of the TopWorkingGroup.

To check that TRC works correctly, we are going to run a little example. I prefer to run code in a separate run directory, which we will need to set up with links to the data files and the binaries:
ssh -Y vsanchez@lxplus303.cern.ch
~> source startRoot5-32.sh
~> cd TopRootCore-2012/
~> mkdir run && cd run
~> ln -s ../RootCore/data .
~> ln -s ../TopD3PDAnalysis/control .
~> ln -s ../TopD3PDAnalysis/bin .
~> cd ../
~> source RootCore/scripts/setup.sh
~>
~>
~> cd TopRootCore-2012/run/ ###VERY IMPORTANT!!! Be sure you are at run/ directory when you run the cut-flow
~> cp control/file_list.txt .
~> emacs file_list.txt & ### add the NTUP*.root name and NEW LINE!!!
~>
~> mkdir SAMPLES && cd SAMPLES/
~> DATASET=mc12_8TeV.105200.McAtNloJimmy_CT10_ttbar_LeptonFilter.merge.NTUP_TOP.e1193_s1469_s1470_r3542_r3549_p1230_tid00993413_00
~> FILE=NTUP_TOP.00993413._000001.root.1
~> dq2-get -f $FILE $DATASET
~>
~> cd ../
~> mkdir OUT_CutFlow_mc12_$FILE
~>
~> CutFlow -f file_list.txt -o OutputHistos_CutFlow.root -mcType mc12 -p control/settings.txt > outTable_CutFlow.txt
~> mv OutputHistos_CutFlow.root OUT_CutFlow_mc12_$FILE/
~> mv outTable_CutFlow.txt OUT_CutFlow_mc12_$FILE/
~>
~> CutFlowCorr -f file_list.txt -o OutputHistos_CutFlowCorr.root -mcType mc12 -p control/settings.txt > outTable_CutFlowCorr.txt
~> mv OutputHistos_CutFlowCorr.root OUT_CutFlow_mc12_$FILE/
~> mv outTable_CutFlowCorr.txt OUT_CutFlow_mc12_$FILE/
### VERY IMPORTANT!!! Be sure you are at run/ directory when you run the cut-flow.

The TopD3PDBoosted package provides a
single-lepton cut flow for the Boosted ttbar (%NTUPBOOST% D3PD's) event selection along with the reconstruction of the selected event to build the ttbar system. The package can be used by packing the D3PD information into the mini-EDM (event data model) defined in BoostedElectronSelection, BoostedMuonSelection, BoostedJetSelection and FatJetSelection classes.
To download the
TopD3PDBoosted-12-00-08 package (see the
official twiki page and the
releases), we have to run:
ssh -Y vsanchez@lxplus303.cern.ch
~> source startRoot5-32.sh
### If you are using TopRootCoreRelease-12-01-04, TopD3PDBoosted-12-00-08 is INSIDE the package!
###~> cd TopRootCore-2012
###~> svn co svn+ssh://vsanchez@svn.cern.ch/reps/atlasoff/PhysicsAnalysis/TopPhys/TopD3PDBoosted/tags/TopD3PDBoosted-12-00-09 TopD3PDBoosted
###~> cd
###~> cd TopRootCore-2012/TopD3PDBoosted/FilesToReplace
###~> cd 2012/
###~> cd seedToolRelated_TopD3PDCorr
###~> source cp.sh
###~> cd ../TopAnalysisBase
###~> source cp.sh
###~> cd ../TopD3PDSelection2
###~> source cp.sh
###~> cd
~>
~> cd TopRootCore-2012/TopD3PDBoosted/FilesToReplace/2012/
~> source cpall.sh ### WhichPeriod.cxx and WhichPeriodSvn.cxx must not be replaced. If these files are replaced, go into the folder that contains them, remove them and run "svn update"
~> cd
~>
~> cd TopRootCore-2012/TopRootCoreRelease-12-01-04/share
~> cp $HOME/TopRootCore-2012/TopD3PDBoosted/FilesToReplace/2012/packages.txt .
~> ./build-all.sh
~> exit
and now the TopRootCore package is ready to be used with the Boosted Selection. Here you can also get the following files:
packages.txt,
cp_TopD3PDSelection2.sh,
cpall.sh,
WhichPeriod_TRC.cxx.

We need to create two new files (
mc_ch_file.txt and
Data_file_list.txt) inside the
run/ folder, with the full paths of the files you want to run over. Go to the folder where these files are stored and do:
~> cd $HOME/samples
~> FILE=NTUP_TOP.00993413._000001.root.1
~> DATASET=mc12_8TeV.105200.McAtNloJimmy_CT10_ttbar_LeptonFilter.merge.NTUP_TOP.e1193_s1469_s1470_r3542_r3549_p1230_tid00993413_00
~> dq2-get -f $FILE $DATASET ### or ### dq2-get -n1 $DATASET
~>
~> DATASETmc=XXXXXXXXX
~> cd $DATASETmc
~> echo $PWD > mc_ch_file.txt
~> emacs mc_ch_file.txt & ### add the NTUP*.root name and NEW LINE!!!
~> cp mc_ch_file.txt $HOME/TopRootCore-2012/run/
~>
~> DATASETdata=XXXXXXXXX
~> cd $DATASETdata
~> echo $PWD > Data_file_list.txt
~> emacs Data_file_list.txt & ### add the NTUP*.root name
~> cp Data_file_list.txt $HOME/TopRootCore-2012/run/
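A shell alternative to the emacs editing above (a sketch, assuming the list format is one full file path per line with a trailing newline, which is what the echo $PWD step suggests):
~> ls $PWD/NTUP_TOP*.root* > mc_ch_file.txt
~> cat mc_ch_file.txt ### check the paths before copying the list to run/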

To run an example locally, you can get a file and save it in your lxplus /tmp/:
~> ssh -Y vsanchez@lxplus303.cern.ch
~> cd /tmp/vsanchez/samples/MC
~> FILE=NTUP_TOP.00993413._000001.root.1
~> DATASET=mc12_8TeV.105200.McAtNloJimmy_CT10_ttbar_LeptonFilter.merge.NTUP_TOP.e1193_s1469_s1470_r3542_r3549_p1230_tid00993413_00
~> dq2-get -f $FILE $DATASET ### or ### dq2-get -n1 $DATASET
### ~> FILE=NTUP_TOPBOOST.706226._000961.root.1
### ~> DATASET=mc11_7TeV.105200.T1_McAtNlo_Jimmy.merge.NTUP_TOPBOOST.e835_s1272_s1274_r3043_r2993_p841_tid706226_00
### ~> CONTAINER=mc11_7TeV.105200.T1_McAtNlo_Jimmy.merge.NTUP_TOPBOOST.e835_s1272_s1274_r3043_r2993_p841/

Every time you make a change in the files, you should recompile the package:
[@lxplus403]~>
~> cd TopRootCore-2012/
~> ./RootCore/scripts/compile.sh ###to compile all the packages
~>
~> cd TopRootCore-2012/FOLDERtoCOMPILE/cmt/
~> make -f Makefile.RootCore
Running BoostedCutFlow

In this
web page you can find all the MC and DATA
%NTUP_TOPBOOST% datasets.
NTUP_TOPBOOST contains the standard NTUP_TOP variables, plus extra "boosted" variables and the LCW-calibrated clusters to enable jet creation on the fly. For a full list of the most recent branches, see
Sept24_MC_branches.h and
Sept24_data_branches.h (last updated: Oct 6, 2012, with D3PDs created using BoostedTopD3PDMaker-00-00-11)

Be sure you are in the
run/ folder and that you've done the TRC and ROOT setup:
~> ssh -Y vsanchez@lxplus303.cern.ch
~> source startRoot5-32.sh
~> cd TopRootCore-2012
~> source RootCore/scripts/setup.sh
~> cd run/
Running MC (NO correction):
~> BoostedCutFlow -f mc_ch_file.txt -boost -mcType mc12 -rndmType debug > OUTscreen.txt
### BoostedCutFlow -f mc_ch_file.txt -rndmType debug -mcType mc12 -boost -jetTrigger > OUTscreen.txt
These options give the spreadsheet numbers. Use
-jetTrigger even if no trigger is applied, to “pick up” the lumi blocks.
- debug will operate with SeedTool, due to the replaced files (no correction).
- When -jetTrigger is not specified, lepTrigger is the default one.
Running Data (NO correction):
~> BoostedCutFlow -f mc_ch_file.txt -boost ### you can add -dataStream Egamma (or whatever you need)
Running D3PD2MiniSLBoost

D3PD2MiniSLBoost's options:
~> D3PD2MiniSLBoost --help
---- Starting D3PD2MiniSL
Usage: D3PD2MiniSLBoost [OPTION]
The list of options are:
-f <value> input file list file (default: "control/file_list.txt")
-h print this message
-mcType <value> set the MC type (mc11b, mc11c, mc12, atlfast2) (default: use directory name.)
-nf <value> number of input files to consider (default: 1)
-n <value> maximum number of events to process (default: all events)
-nocache turn off the TTreeCache, reduce read failures with DPM RFIO
-o <value> output ROOT file (default "OutputHistos.root")
-i <value> index of the output file name (default nothing)
-p <value> parameters file (default: "control/settings.txt")
-rndmType <value> reseeding strategy (once, file, debug) (default: once)
-skn <value> number of events to skip (default: 0)
-sknf <value> number of files to skip (default: 0)
-vin validate input files (default: false)
-dataStream data stream name: Egamma,Muons.
-useLooseElectrons use loose lepton definition.
-useLooseMuonss use loose lepton definition.
-boost use boosted branches (default: false)
-JetTrigger use Jet Trigger (default: false)
-useTruthParticles use TruthInfo or not (default: false)
-useClusteris use (default: false)
-findEvent <value> use this to run on one event number, this is good for debuging
Running boosted regime for MC and data:
~> D3PD2MiniSLBoost -f mc_file_list.txt -boost -mcType mc12 -rndmType debug > OUTscreen2.txt
###~> D3PD2MiniSLBoost -f mc_ch_file.txt -rndmType debug -mcType mc12 -boost -addGenerator > OUTscreen3.txt
###~> D3PD2MiniSLBoost -f mc_ch_file.txt -rndmType debug -mcType mc12 -boost -useTruthParticles > OUTscreen4.txt
~> D3PD2MiniSLBoost -f data_file_list.txt -boost -dataStream "VAL"
Where "VAL" stands for either Egamma or Muons
Running resolved regime for MC and data:
~> D3PD2MiniSLBoost -f MC_file_list.txt -mcType mc12 -rndmType debug
~> D3PD2MiniSLBoost -f data_file_list.txt -dataStream "VAL" ###(same as above)
Running OLD:
~> D3PD2MiniSLBoost -f mc_ch_file.txt -rndmType debug -mcType mc12 -boost -jetTrigger > OUTscreen2.txt
###~> D3PD2MiniSLBoost -f mc_ch_file.txt -rndmType debug -mcType mc12 -boost -jetTrigger -addGenerator > OUTscreen3.txt
###~> D3PD2MiniSLBoost -f mc_ch_file.txt -rndmType debug -mcType mc12 -boost -jetTrigger -useTruParticles > OUTscreen4.txt
These options give the spreadsheet numbers. Use
-jetTrigger even if no trigger is applied.
- debug will operate with SeedTool, due to the replaced files (no correction).
- When -jetTrigger is not specified, lepTrigger is the default one.
~> ./D3PD2MiniSLBoost -f Data_file_list.txt -boost -jetTrigger
- This will produce a mini NTUP.
- Same as running on the grid.
- To add new variables, do it in Root/MiniSLBoosted.cxx.
Skimmed samples
Our purpose is to make the
%NTUP_TOPBOOST% samples (data and MC) much smaller, so we'll run these samples through TopD3PDBoosted (inside TopRootCore). This package provides a
single-lepton cut flow for the boosted ttbar (%NTUPBOOST% D3PDs) event selection, along with the reconstruction of the selected event to build the ttbar system. All the
output .root files obtained after running D3PD2MiniSLBoost will be used in all subsequent steps of the Charge Asymmetry analysis (unfolding, corrections...).
My analysis with TopD3PDBoosted & TopRootCore
See new twiki
VictoriaBoosted.
.git Repository (IFIC-IFAE)
Code to compute several charge asymmetries. It has been developed by the IFAE charge asymmetry group.
Input files
The input files, since they are very large (about 12 GB), must be hosted at
/tmp/:
~> ssh -Y vsanchez@lxplus403.cern.ch
~> asetup 16.7.0 ### if this doesn't work, try with "asetup 17.0.0"
~> cd /tmp/vsanchez/IFAE
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/asymmetry_ntuples
~> xrdcp root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/asymmetry_ntuples/Histogrammer-01-15-02.tgz .
~> tar -xvf Histogrammer-01-15-02.tgz
~> cd Histogrammer-01-15-02
~> cd nominal
~> ls MUON/Dibosons_MUON_OutputHisto.root ### for instance
If this doesn't work, I have saved a copy on my laptop:
MAC~> cd $HOME/Analysis/IFIC_IFAE/git_repositories
MAC~> scp Histogrammer-01-15-02.tgz vsanchez@lxplus403.cern.ch:/tmp/vsanchez/IFAE
The code
The work directory is
asymmetry.git/asymmetryAtlas/, where all the code that we need is stored:
~> ssh -Y vsanchez@lxplus403.cern.ch
~> asetup 16.7.0 ### if this doesn't work, try with "asetup 17.0.0"
~> cd IFIC_IFAE/asymmetry.git/asymmetryAtlas
~>
~> git checkout -b branchVIKI ### VERY IMPORTANT
~> emacs python/setup.py & ### add Location in L10
~> emacs python/Histogrammer.py & ### add hetaleptoncharge
~> emacs python/analyzer.py & ### add hetaleptoncharge
~> git commit -m 'change A,B' python/setup.py
~> git commit -m 'change A,B' python/Histogrammer.py
~> git commit -m 'change A,B' python/analyzer.py
~>
~> python python/analyzer.py nominal
Output files
The output files are saved in the
plots/ folder:
~> ssh -Y vsanchez@lxplus403.cern.ch
~> asetup 16.7.0 ### if this doesn't work, try with "asetup 17.0.0"
~> cd IFIC_IFAE/asymmetry.git/asymmetryAtlas
~> cd plots/nominal
~> ls muon/*.root
~> ls muon/*.png
To copy the output files to my laptop:
MAC~> cd $HOME/Analysis/IFIC_IFAE/git_repositories/out_asymmetryAtlas/
MAC~> scp vsanchez@lxplus403.cern.ch:/afs/cern.ch/user/v/vsanchez/IFIC_IFAE/asymmetry.git/asymmetryAtlas/plots/nominal/muon/*.png .
MAC~> open *.png
Useful macros
Here are a few macros, from different areas, which may be useful:

If you want to
plot the ttbar mass, you should download this package:
ttbar_mass. You have to put the ROOT file name in
macro_histogram_mtt.cxx, and run it with ROOT.

If you need to generate many graphs automatically, download this package:
crearNtuplas_auto.tar and run the file
pasar_por_PloTop_viki.sh.

If you want to
create a file.root (like an ntuple)
from a file.lhe (MadGraph's output), you must use this package (
createFileRoot) and run:
~> cd createFileRoot/
~> make
~> ./plot_top_marcel

If you have several samples with similar names (for instance, testName.generator.X.Y.lhe, with X=1,2,3,...,10 and Y=10,20,30,40,50), you can
change the file names (for example, to NewTestName.user.X.Y.lhe) with this
program. Whether these samples have been generated with MadGraph or Pythia, you can also
change the particle codes. Also, if you want to merge all these samples, you can use this
script.

When you want to
compare a DATA sample with the equivalent MC samples, you must take into account the integrated luminosity, the cross section and the total number of events produced in MC in order to compute a "scale factor". This factor is necessary to compare DATA with MC. Here is a small
program which allows you to do so.
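The usual normalization: each MC event is weighted by
$$ w = \frac{\sigma \, L_{\mathrm{int}}}{N_{\mathrm{gen}}} $$
where σ is the sample cross section, L_int the integrated luminosity of the data and N_gen the total number of generated MC events, so that the MC is scaled to the number of events expected in data.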

If you have several ntuples with extensions different from
.root, for instance .root1 or .root__DQ2_748574, you can change all these extensions to .root with this
executable file (you have to remove the .txt extension first), and then
merge all the ntuples. The following steps should be performed:
~> ssh -X vicsanma@ui06.ific.uv.es
~> cd $DIRECTORY #it may be /tmp/ or /T3/, and it should contain extension.py
~> cp /way/ALLfiles.root* .
~> python extension.py #here you are running the executable file downloaded
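A minimal shell sketch of what such a renaming does (an assumption about the behaviour of extension.py: it strips everything after .root from each file name; note that two inputs mapping to the same name would collide):
~> for f in *.root?*; do mv "$f" "${f%%.root*}.root"; done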

To draw specific branches from different ntuples (that is, from data, from background, from your signal...), you can use this
macro; if you only need to compare two branches with a given cut, use this
one. Be careful to change all the file names, paths...

To create a dataset registered in DQ2 from D4PDs that are stored in the T3 space (
script).

To reduce the number of branches in a D3PD and add new ones, use this
package; it contains a Makefile.

To compile MakeClass files (.C and .h) with a Makefile, you can use one of these packages:
packageLOCAL or
packageGRID. With the local package, just run
source cppmake.sh; ./skimD3PD_addTRUTHbranch. With the grid package, just run
source runPRUN.sh.

If....
fbu 0.0.1
PyFBU: Implementation of the Fully Bayesian Unfolding algorithm described in
physics.data-an/1201.4612. The software is based on the Markov Chain Monte Carlo sampling toolkit
PyMC.
- quickstart - quick start instructions
- tutorial - simple tutorial
- source - to download fbu-0.0.1.tar.gz (md5)
The first time, do:
cd /afs/cern.ch/work/v/vsanchez/
virtualenv fbu
cd fbu
source bin/activate
pip install numpy==1.7.0
pip install fbu
### git clone https://github.com/gerbaudo/fbu.git (Alternatively one can check out the development version of the code from the GitHub repository)
### cat requirements.txt | xargs pip install
### python tests/pymc_test/unfold.py
From then on, do:
cd /afs/cern.ch/work/v/vsanchez/fbu
source bin/activate
...
...
...
deactivate
eos
~> ssh -Y vsanchez@lxplus403.cern.ch
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/TRC
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/VALIDATION
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/asymmetry_ntuples
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/MatchingSTUDY
~>
~>
~> xrdcp root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/MatchingSTUDY/validationWITHjets.tar .
~> xrdcp root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/MatchingSTUDY/7TeV/validation_vsm_MatchingStudy.tar .
~>
~> FILE=Plots_test1.tar
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/MatchingSTUDY/7TeV
~> xrdcp $FILE root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/MatchingSTUDY/7TeV/$FILE
~> xrdcp root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/MatchingSTUDY/7TeV/$FILE .
~>
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/MatchingSTUDY/8TeV
~> xrdcp $FILE root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/MatchingSTUDY/8TeV/$FILE
~> xrdcp root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/MatchingSTUDY/8TeV/$FILE .
~>
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/MatchingSTUDY/8TeV/mass500
~> xrdcp $FILE root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/MatchingSTUDY/8TeV/mass500/$FILE
~> xrdcp root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/MatchingSTUDY/8TeV/mass500/$FILE .
~>
~>
~> xrdcp FILE.tar root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/FOLDER/FILE.tar
~> xrdcp root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/FOLDER/FILE.tar .
~>
~> eos rm /eos/atlas/user/v/vsanchez/FOLDER/FILEtoREMOVE.tar
~> eos rm -r /eos/atlas/user/v/vsanchez/FOLDERtoREMOVE
~> xrdcp root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/TRC_testSamples_February2014/MC/117050/el.root .
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/TRC_testSamples_February2014/
~> eos mkdir /eos/atlas/user/v/vsanchez/Samples_AnalysisTop-1.5.0_June2014/Data/HadDelayed_Dominik
~> xrdcp $FILE root://eosatlas.cern.ch//eos/atlas/user/v/vsanchez/Samples_AnalysisTop-1.5.0_June2014/Data/HadDelayed_Dominik/$FILE
~> xrd eosatlas dirlist /eos/atlas/user/v/vsanchez/Samples_AnalysisTop-1.5.0_June2014/Data/HadDelayed_Dominik
Work space at CERN:
~> cd /afs/cern.ch/work/v/vsanchez
~> cd /afs/cern.ch/work/v/vsanchez/private
~> cd /afs/cern.ch/work/v/vsanchez/public
My Links
KK-gluon
Athena
DQ2
TopRootCore
BoostedTopRootCore
ATLAS offline software tutorial
Twiki rules
Victoria Sánchez Martínez
--- IFIC ---
Edificio Institutos de Investigación
Paterna, apartado 22085
E-46071 Valencia (Spain)
