Useful Tricks

PixelOfflineToDo


Annealing
SiPixelTemplateDBObject_phase1_38T_2018_v8

/afs/cern.ch/cms/CAF/CMSALCA/ALCA_TRACKERALIGN/MP/MPproduction/mp2853/jobData/jobm3/alignments_MP.db = TrackerAlignment_mp2853_forAnnealingStud


321832 (a while after annealing) [	 jobs]: crab_AnnealingStud_Run321832_SingleMuon | on eos with LA and reso, cleaned, give to Tanja |
--> copied
/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/LA/AnnealingStud_Run321832_SingleMuon/
/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/Resolution/AnnealingStud_Run321832_SingleMuon/

# ---------------------------------------------------------------------------------------------------------------------------------------------------
320841 (just after annealing) [212 jobs]: crab_AnnealingStud_Run320841_SingleMuon | LA is copied to eos |
rsync -azv --update LA* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/LA/AnnealingStud_Run320841_SingleMuon/
--> LA copied
rsync -azv --update Re* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/Resolution/AnnealingStud_Run320841_SingleMuon/
--> Residuals copied

# ---------------------------------------------------------------------------------------------------------------------------------------------------
319992 (Run-C, before annealing) [248 jobs]: crab_AnnealingStud_Run319992_SingleMuon_v3 | crab outputted
rsync -azv --update LA* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/LA/AnnealingStud_Run319992_SingleMuon/
--> LA copied
rsync -azv --update Re* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/Resolution/AnnealingStud_Run319992_SingleMuon/
--> Residuals copied

# ---------------------------------------------------------------------------------------------------------------------------------------------------
319528, Run-C [265 jobs]: crab_AnnealingStud_Run319528_SingleMuon_v2 | crab outputted
rsync -azv --update LA* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/LA/AnnealingStud_Run319528_SingleMuon/
--> LA copied
rsync -azv --update Re* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/Resolution/AnnealingStud_Run319528_SingleMuon/
--> Residuals copied

# ---------------------------------------------------------------------------------------------------------------------------------------------------
317512, Run-B [440 jobs]: crab_AnnealingStud_Run317512_SingleMuon_v2 | crab outputted
rsync -azv --update LA* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/LA/AnnealingStud_Run317512_SingleMuon/
--> LA copied
rsync -azv --update Re* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/Resolution/AnnealingStud_Run317512_SingleMuon/
--> Residuals copied

# ---------------------------------------------------------------------------------------------------------------------------------------------------
316879, Run-A [190 jobs]: crab_AnnealingStud_Run316879_SingleMuon_v1 | crab outputted
rsync -azv --update LA* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/LA/AnnealingStud_Run316879_SingleMuon/
--> LA copied
rsync -azv --update Re* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/Resolution/AnnealingStud_Run316879_SingleMuon/
--> Residuals copied

# ---------------------------------------------------------------------------------------------------------------------------------------------------
316457, Run-A [1042 jobs]: crab_AnnealingStud_Run316457_SingleMuon_v1 | crab outputted
rsync -azv --update LA* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/LA/AnnealingStud_Run316457_SingleMuon/
--> LA copied
rsync -azv --update Re* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/Resolution/AnnealingStud_Run316457_SingleMuon/
--> Residuals copied


So far I use 320688 --> that is the worst situation.
We need a run after 320812 --> PCL is applied,
one after they upload the ML alignment,
and one after all conditions are fixed ...
so four altogether.




# print rm commands for files smaller than ~6 MB in the current directory (does not delete anything by itself)
ls -l . | awk '{if  ($5 < 6000000) print "rm "$9}'
scp LA_* tvami@lxplus.cern.ch:/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/Monitoring/2018/LA/AnnealingStud_Run319992_SingleMuon/.

Alignment tags 2018

Tags with updated tracker alignment and APE conditions for the Summer18 ReReco of the eras ABC(D). The alignment constants are the result of a dedicated alignment campaign; details and validation results were presented at the tracker alignment meeting on September 4, 2018 [*]. The tags also contain the previous offline alignment history. [*] https://indico.cern.ch/event/688852/#b-279830-tracker-alignment-tra

Template insights from Morris

The 1D template position estimation is insensitive to smallish changes of the charge scale [+-10% should be no problem]. The actual algorithm does truncate pixel charges that are too large, to avoid delta-ray effects [so it is formally sensitive to the charge scale]. The shape fits use the pixel-by-pixel charges to estimate the uncertainties in the denominators of the chi-square function. Finally, the estimated hit uncertainties and probabilities do depend very much on the charge scale [few-percent changes are noticeable]. A lot of cluster charge indicates a lot of delta-ray activity and larger errors. Each template calibration should remove the scale effects.

How to do HipPy

For the startup, this is probably the best object:
https://cms-conddb.cern.ch/cmsDbBrowser/list/Prod/tags/TrackerAlignment_StartUp2018_v2
it's the first one we uploaded to the GT this year.


Instructions for running hippy:

We work in this directory:
/afs/cern.ch/cms/CAF/CMSALCA/ALCA_TRACKERALIGN2/HipPy
The easiest way to start is to cmsenv here, even if that's not the release you eventually want to use (later the script will make a new CMSSW area for you with the proper release):
/afs/cern.ch/cms/CAF/CMSALCA/ALCA_TRACKERALIGN2/HipPy/TS1highvoltage/CMSSW_10_1_6

Then you can run this script.
makeHippyCampaign.py foldername --scram-arch (...) --cmssw (...) --merge-topic hroskes:hippy-scripts --merge-topic hroskes:common-tkal-dataset

This takes a while and makes a folder /afs/cern.ch/cms/CAF/CMSALCA/ALCA_TRACKERALIGN2/HipPy/foldername, which contains your CMSSW release and folders called Jobs/ and run/.  Jobs/ is where the jobs eventually happen.  run/ is where all the scripts are.

Inside run/ there are a few things:
DataFiles - here you have to set up a txt file with a list of data files to run on.  To get the list, you can run this script.
writedatasetfile.py -d /.../.../ALCARECO --first-run ... --last-run ... --hippy myfilelist.txt -j neventsperjob -m maxevents

For collisions, neventsperjob=10000 typically makes each iteration take about 20 minutes.  I suggest starting with that and then checking how many jobs there will be (by doing wc -l myfilelist.txt).  If it's too many you can either increase neventsperjob or set maxevents.

Then you can get rid of the COSMICS and CDCs lines in data_example.lst and set the minbias line to point to your txt file.

IOV - you want a file that points to the first run of your dataset

Configurations - the only thing you have to touch here is the "common" file.  Here you set the global tag, the conditions to override (templates, maybe the starting alignment), and the alignables.  To align at ladder level, uncomment the lines "TrackerP1PXBLadder,111111" and "TrackerP1PXECBlade,111111".  The six digits refer to x y z theta_x theta_y theta_z.  It's probably better to turn off z in FPIX, so change it to 110111.
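For example, after uncommenting both lines and switching off z in FPIX, the two alignable lines in the common file would read:
TrackerP1PXBLadder,111111
TrackerP1PXECBlade,110111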

You can also change minimumNumberOfHits if you want.  That's about all we normally change.

Then, in the main folder, the last thing to change is submit_template.sh.  You just have to edit the variables that start out commented.
hpnumber is the ID number of the alignment.  It refers to the name of the folder in /afs/cern.ch/cms/CAF/CMSALCA/ALCA_TRACKERALIGN2/HipPy/alignments.  You can just increment the last number by 1.
common, lstfile, IOVfile - these are the names of the files you set up
alignmentname - this is the name of a folder in Jobs where it gets run
niterations - typically we do 10
Then the script requires you to git commit what you've done.  It's a little git repository with all these scripts.

Then finally you can start running.  This has to be in a screen session.  I usually use this script to start the session:
/afs/cern.ch/work/h/hroskes/public/forTanja/startscreen
it prints out how to get back to the cmsenv and directory and emails me which lxplus I'm on so that I can get back to it.

Inside the screen you just do ./submit_whatever.sh and it should run.

Checking int lumi

brilcalc lumi -u /nb --begin 6469 --end 
6469 is the first run in 2018. To install, have a look at https://cms-service-lumi.web.cern.ch/cms-service-lumi/brilwsdoc.html

New geometry to sqlite files

/data/vami/projects/Phase2/Telescope/CMSSW_10_0_0_pre1/src/Geometry/TrackerPhase2TestBeam/python
in the file Phase2TestBeamGeometryXML_cfi.py modify
    geomXMLFiles = cms.vstring(
        'Geometry/TrackerCommonData/data/trackerParameters.xml',
         # World volume creation and default CMS materials
then go to
/data/vami/projects/Phase2/Telescope/CMSSW_10_0_0_pre1/src/CondTools/Geometry/test
vim trackergeometrywriter.py
modify the Geometry line to
process.load('Configuration.Geometry.GeometryTrackerPhase2TestBeam_cff')


toPut = cms.VPSet(
    cms.PSet(record = cms.string('IdealGeometryRecord'),tag = cms.string('TKRECO_Geometry_Phase2Telescope')),
    cms.PSet(record = cms.string('PGeometricDetExtraRcd'),tag = cms.string('TKExtra_Geometry_Phase2Telescope')),
    cms.PSet(record = cms.string('PTrackerParametersRcd'),tag = cms.string('TKParameters_Geometry_Phase2Telescope'))
)
What I learned from 2D Templates

cmsrel CMSSW_10_1_0_pre2
cd CMSSW_10_1_0_pre2/src/
cmsenv
git cms-merge-topic 22458

To trigger the new CPE, uncomment the last two lines in
RecoTracker/TransientTrackingRecHit/python/TTRHBuilderWithTemplate_cfi.py

To get the correct label, modify
CalibTracker/SiPixelESProducers/plugins/SiPixel2DTemplateDBObjectESProducer.cc
and change the line
std::string label = "";
to
std::string label = "denominator";

scram b

cmsDriver.py SingleMuPt10_pythia8_cfi --conditions 101X_upgrade2018_realistic_Candidate_2018_03_15_16_26_46 -n 10 --era Run2_2017 --eventcontent FEVTDEBUG --relval 25000,100 -s GEN,SIM --datatier GEN-SIM --beamspot Realistic25ns13TeVEarly2017Collision --geometry DB:Extended 

cmsDriver.py step2  --conditions 101X_upgrade2018_realistic_Candidate_2018_03_15_16_26_46 -s DIGI:pdigi_valid,L1,DIGI2RAW,HLT:@relval2017 --datatier GEN-SIM-DIGI-RAW -n 10 --geometry DB:Extended --era Run2_2017 --eventcontent FEVTDEBUGHLT --filein file:SingleMuPt10_pythia8_cfi_GEN_SIM.root

cmsDriver.py step3 --conditions 101X_upgrade2018_realistic_Candidate_2018_03_15_16_26_46 -n 10 --era Run2_2018 --eventcontent RECOSIM,MINIAODSIM,DQM --runUnscheduled -s RAW2DIGI,L1Reco,RECO,RECOSIM,EI,PAT,VALIDATION:@standardValidation+@miniAODValidation,DQM:@standardDQM+@ExtraHLT+@miniAODDQM --datatier GEN-SIM-RECO,MINIAODSIM,DQMIO --geometry DB:Extended --filein file:step2_DIGI_L1_DIGI2RAW_HLT.root --no_exec



Open the file step3_RAW2DIGI_L1Reco_RECO_RECOSIM_EI_PAT_VALIDATION_DQM.py
and include the line
process.load('CalibTracker.SiPixelESProducers.SiPixel2DTemplateDBObjectESProducer_cfi')


For data:
cmsDriver.py step3 --conditions 101X_dataRun2_Prompt_Candidate_2018_03_26_19_48_11 -n 10 --era Run2_2018 --eventcontent RECOSIM,MINIAODSIM,DQM --runUnscheduled -s RAW2DIGI,L1Reco,RECO,RECOSIM,EI,PAT,VALIDATION:@standardValidation+@miniAODValidation,DQM:@standardDQM+@ExtraHLT+@miniAODDQM --datatier GEN-SIM-RECO,MINIAODSIM,DQMIO --geometry DB:Extended --filein root://cms-xrd-global.cern.ch//store/data/Run2017F/ZeroBias/RAW/v1/000/305/406/00000/0049642B-86B7-E711-920F-02163E013657.root --no_exec --data


cmsDriver.py RECO -s RAW2DIGI,L1Reco,RECO --data --scenario pp --conditions 101X_dataRun2_Prompt_Candidate_2018_03_26_19_48_11 --era Run2_2017 --process NTUPLE --eventcontent RECO --datatier RECO --filein root://cms-xrd-global.cern.ch//store/data/Run2017F/ZeroBias/RAW/v1/000/305/406/00000/0049642B-86B7-E711-920F-02163E013657.root  --python_filename=run_Phase1PixelTree_Data_101X_cfg.py --runUnscheduled -n 10 --no_exec

Workflow tricks

A typical workflow for the relval of the 2017 configuration (using the phase1 pixel; also true for 2018 and 2019, where the only relevant change is for HCAL) can be found like this, taking SingleMuPt10 as an example:
runTheMatrix.py -n | grep 2017 | grep SingleMuPt10
--> if you type this command yourself, you will see that the workflow number is 10007

Therefore, if you then type
runTheMatrix.py -l 10007 -ne
--> it gives you the cmsDriver commands for this particular workflow, which are actually used in pull request tests and in relvals.
--> typing the command above, you can see that the era actually used is --era Run2_2017

Inserting the name of this era in the GitHub search leads you to the config listing the various detector eras that build up the 2017 CMS era:
please look here:
https://github.com/cms-sw/cmssw/blob/master/Configuration/Eras/python/Era_Run2_2017_cff.py
where you can see that for the pixel it is phase1Pixel.

So it looks like what you did (adding the modifier) was good.

For the workflow you were testing, 11624 (2019 configuration), you can try the same and find out that the era is --era Run3, so you could look at:
https://github.com/cms-sw/cmssw/blob/master/Configuration/Eras/python/Era_Run3_cff.py
which is loading
from Configuration.Eras.Era_Run2_2018_cff import Run2_2018
which in turn loads the 2017 one, so we are in the same configuration for the pixel as before.


Something to think about on the pixel team side is whether we would need to deploy eras according to the year (as HCAL does). So far the 2017 pixel is only different from the <2016 pixel (for obvious reasons :) ) but is the same as in 2018 and 2019. Perhaps one should think about having different reconstruction (or simulation) configurations/parameters depending on the year, even for the same pixel detector. Maybe this is irrelevant, but in that case the era mechanism would make it easy to implement.
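As a reminder of how an era modifier acts on a config, here is a minimal sketch (the phase1Pixel modifier and its import path are standard CMSSW; the module and parameter names below are hypothetical, for illustration only):

import FWCore.ParameterSet.Config as cms
from Configuration.Eras.Modifier_phase1Pixel_cff import phase1Pixel

# hypothetical module with a default (pre-phase1) parameter value
myPixelProducer = cms.EDProducer("MyPixelProducer",
    someThreshold = cms.double(1.0)
)
# this override is applied automatically whenever the process is built with an
# era that contains phase1Pixel (Run2_2017, Run2_2018, Run3, ...):
phase1Pixel.toModify(myPixelProducer, someThreshold = 2.0)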

Config files tricks

# Description: example of how to set up many input files. After 255 files, put the next files into another .extend() block
import FWCore.ParameterSet.Config as cms
myfilelist = cms.untracked.vstring()
myfilelist.extend([
  'file:/data/vami/projects/pilotBlade/ppProcessing/Results/PilotMinBias_13TeV_cfi_py_RAW2DIGI_L1Reco_RECO2900Event.root',
  ])
#myfilelist.extend([
#  ])
# then hand the list to the PoolSource of the cfg:
# process.source = cms.Source("PoolSource", fileNames = myfilelist)

DQM tricks

A way to get just the histograms you are interested in from the DQMGUI.
From lxplus, do:
cmsrel CMSSW_9_4_0
cd CMSSW_9_4_0/src/
cmsenv
dqm-access -w -c -f '/PixelPhase1/Phase1_MechanicalView/PXBarrel/digi_occupancy_*_PXLayer_*' -e "match('/ZeroBias1/Run2017G-PromptReco-v.*/DQMIO',dataset)" -s https://cmsweb.cern.ch/dqm/offline/data/json

dqm-access --help | less

The last command can be generalized, e.g.:

  • If you want to fetch histograms from all periods, use Run2017.* instead of Run2017G (this is a plain regular expression, you can assemble the one you like)
  • If you are interested in the forward too, I'd suggest making a second query (if you repeat the same command with a different histogram target, the output file in your local directory will be overwritten...)
  • If you are interested only in some specific run, you could use: -e "run == XYZ and match...."

LA tree production

Pixel tree production

Setup procedure is explained here
https://github.com/BenjaminMesic/PixelTree-ProductionTool
(instructions.txt is deprecated, so please ignore it)

After you finish the setup, open TreeProduction.py and set these four lines:
https://github.com/BenjaminMesic/PixelTree-ProductionTool/blob/master/TreeProduction.py#L16-L19
(The CMSSW entry is not mandatory but is ok for bookkeeping.)

After you have done all that, just run the script. It will create a bunch of scripts (in the batch folder) ready to be sent to the batch system.
Before sending, you can open one of the python scripts and check that everything is set up correctly. You
can even change the number of events to e.g. 5 and run the script interactively.

How does the script actually create jobs?
There is a template script which is used for making the jobs.
https://github.com/BenjaminMesic/PixelTree-ProductionTool/blob/master/templates/pixel.py
(Note that there are many different templates but only the one called pixel.py is actually used;
the other templates are for different configurations, i.e. cosmics, VCAL, etc. The most recent ones
have phase1 in their name. If you want to use another template, it must be renamed to pixel.py.)

If you are going to send jobs, the output will automatically be stored on EOS
https://github.com/BenjaminMesic/PixelTree-ProductionTool/blob/master/TreeProduction.py#L33
I don't know if you have permissions to store there. In any case, you need to manually create the directory
(from line 33) on EOS. If you do make files on EOS, please fill in the Google doc with what you did so
that I can follow what is going on.

Right now, sending the jobs is not controlled by a configuration variable, so you need to (un)comment
https://github.com/BenjaminMesic/PixelTree-ProductionTool/blob/master/TreeProduction.py#L235
in order to actually send the jobs.

LA plots production

root -l -b -q  'LALayer.C("input.list",1,"outfile.root")'

The input list is a txt file with the list of files containing the LA trees (it can also be a single ROOT file; in that case the second argument should be 0).
The third argument is the output ROOT file with the histograms.
The 2017 LA trees are stored in the eos directory:
/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/LATrees/2017/
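If needed, the input list can be built with a few lines of Python (just a sketch; it assumes the fuse-mounted EOS path above and writes input.list in the current directory):

# build input.list from the LA-tree ROOT files in the EOS directory above
import glob

tree_dir = "/eos/cms/store/group/dpg_tracker_pixel/comm_pixel/LATrees/2017/"
with open("input.list", "w") as out:
    for fname in sorted(glob.glob(tree_dir + "*.root")):
        out.write(fname + "\n")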

Linux tricks

# show the 10 largest files/directories under the current directory
du -hsx * | sort -rh | head -10
# collect all .root files under a given path into a text file
locate "*.root" | grep "/data/vami/backup/vami/projects/*" > rootFiles.txt

Changing payloads

# ----------------------------------------------------------------------
process.GlobalTag.toGet = cms.VPSet(
  cms.PSet(
           record = cms.string("#RecordName1"),
           tag = cms.string("#TagName1"),
           connect = cms.untracked.string("frontier://FrontierProd/CMS_CONDITIONS")
          ),
  cms.PSet(
           record = cms.string("#RecordName2"),
           tag = cms.string("#TagName2"),
           connect = cms.untracked.string("frontier://FrontierProd/CMS_CONDITIONS")
          )
)
# ----------------------------------------------------------------------
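As a concrete example, overriding the pixel templates with the tag quoted at the top of this page would look roughly like this (a sketch; double-check the record/tag pairing in the condDB browser, and depending on the release connect may need to be a tracked cms.string):

# sketch: override the phase-1 pixel template payload via the GlobalTag
process.GlobalTag.toGet = cms.VPSet(
  cms.PSet(
           record = cms.string("SiPixelTemplateDBObjectRcd"),
           tag = cms.string("SiPixelTemplateDBObject_phase1_38T_2018_v8"),
           connect = cms.untracked.string("frontier://FrontierProd/CMS_CONDITIONS")
          )
)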

edm Tricks

edmConfigDump your_cfg.py > dumped_cfg.py

CRAB notes

If we want to resubmit the jobs in another task, we run crab report, which creates a file in results/ that should be added as the lumiMask in the next task.
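In the CRAB config of the follow-up task this becomes a single line (a sketch; the path and JSON name depend on what crab report actually wrote into results/):

# example path only; point it at the file crab report produced
config.Data.lumiMask = 'crab_projects/crab_MyOldTask/results/notFinishedLumis.json'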

config.JobType.pyCfgParams = ['globalTag=80X_mcRun2_...']
['valami=1', 'masik=2', 'harmadik=3']
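The second line just shows the general format: a list of 'name=value' strings with placeholder names. On the cfg side these parameters are typically picked up with VarParsing, roughly like this (a sketch; the option name matches the globalTag example above):

import FWCore.ParameterSet.VarParsing as VarParsing

options = VarParsing.VarParsing('analysis')
options.register('globalTag', '',   # default, overridden via pyCfgParams
                 VarParsing.VarParsing.multiplicity.singleton,
                 VarParsing.VarParsing.varType.string,
                 'Global tag to be used')
options.parseArguments()
# later in the cfg, e.g.: process.GlobalTag.globaltag = options.globalTag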

AlCaDB contact notes

Misc

process.GlobalTag.DumpStat = cms.untracked.bool(True)

#!/bin/tcsh

set storage=/afs/cern.ch/work/t/tvami/public/BadComponentAtPCL/CMSSW_11_0_X_2019-06-09-2300/src

foreach era (`/bin/ls $storage | grep Run | grep -v sh`)
    #echo $era
    foreach subfolder (`/bin/ls $storage/$era | grep 3`)
	echo $era $subfolder
	echo $storage/$era/$subfolder/promptCalibConditions.db
	conddb_import -c sqlite_file:SiPixelQualityFromDbRcd_other_Ultralegacy2018_v0_mc.db    -f sqlite_file:$storage/$era/$subfolder/promptCalibConditions.db -i SiPixelQualityFromDbRcd_other    -t SiPixelQualityFromDbRcd_other_Ultralegacy2018_v0_mc
	conddb_import -c sqlite_file:SiPixelQualityFromDbRcd_stuckTBM_Ultralegacy2018_v0_mc.db -f sqlite_file:$storage/$era/$subfolder/promptCalibConditions.db -i SiPixelQualityFromDbRcd_stuckTBM -t SiPixelQualityFromDbRcd_stuckTBM_Ultralegacy2018_v0_mc
	conddb_import -c sqlite_file:SiPixelQualityFromDbRcd_prompt_Ultralegacy2018_v0_mc.db   -f sqlite_file:$storage/$era/$subfolder/promptCalibConditions.db -i SiPixelQualityFromDbRcd_prompt   -t SiPixelQualityFromDbRcd_prompt_Ultralegacy2018_v0_mc
    end
end

Modifying IOV boundary in a local sqlite file:
$ sqlite3 TrackerSurfaceDeformations_v9_offline.db 
SQLite version 3.22.0 2018-01-22 18:45:57
Enter ".help" for usage hints.
sqlite> UPDATE IOV SET SINCE=1 WHERE SINCE=271866;
sqlite> .q
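The same change can be done from Python with the standard sqlite3 module (equivalent sketch):

import sqlite3

# shift the IOV boundary 271866 down to 1, as in the sqlite3 session above
conn = sqlite3.connect("TrackerSurfaceDeformations_v9_offline.db")
conn.execute("UPDATE IOV SET SINCE=1 WHERE SINCE=271866")
conn.commit()
conn.close()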

All DB test config

MinBias

 cmsDriver.py -s GEN,SIM,DIGI,L1,DIGI2RAW,RAW2DIGI,L1Reco,RECO --evt_type MinBias_13TeV_pythia8_TuneCUETP8M1_cfi --conditions auto:phase1_2017_realistic --era Run2_2017 --geometry DB:Extended --fileout file:GENSIMRECO_MinBias.root --python_filename=PhaseI_MinBias_cfg.py --runUnscheduled -n 10
https://github.com/jkarancs/PhaseIPixelNtuplizer/blob/1a635835a54364a467758f4550215798b1e28208/test/run_PhaseIPixelNtuplizer_MinBias_cfg.py

MuonGun

https://github.com/cms-analysis/DPGAnalysis-SiPixelTools/blob/master/PixelHitAssociator/phase1/run_PixelHitAssociator_PhaseI_MuPt10_cfg.py
Options to run this:
cmsRun phase1/run_PixelHitAssociator_PhaseI_MuPt10_cfg.py saveRECO=1 useTemplates=0 useLocalLASim=1 useLocalLA=1 useLocalGenErr=1 useLocalTemplates=1 maxEvents=10000 noMagField=1 outputFileName=gen_phase1_91X_MC0T_10k.root 
cmsRun phase1/run_PixelHitAssociator_PhaseI_MuPt10_cfg.py saveRECO=1 useTemplates=1 useLocalLASim=1 useLocalLA=1 useLocalGenErr=1 useLocalTemplates=1 maxEvents=10000 noMagField=1 outputFileName=tem_phase1_91X_MC0T_10k.root

Data:
cmsRun phase1/run_PixelHitAssociator_PhaseI_MuPt10_cfg.py saveRECO=1 useTemplates=0 useLocalLASim=1 useLocalLA=1 useLocalGenErr=1 useLocalTemplates=1 maxEvents=10000 outputFileName=gen_phase1_91X_Data38T_10k.root
cmsRun phase1/run_PixelHitAssociator_PhaseI_MuPt10_cfg.py saveRECO=1 useTemplates=1 useLocalLASim=1 useLocalLA=1 useLocalGenErr=1 useLocalTemplates=1 maxEvents=10000 outputFileName=tem_phase1_91X_Data38T_10k.root

Template&GenErr DB resolutions

https://github.com/cms-analysis/DPGAnalysis-SiPixelTools/blob/master/PixelHitAssociator/MCResolution.C#L70-L71
https://github.com/cms-analysis/DPGAnalysis-SiPixelTools/blob/master/PixelHitAssociator/MCResolution.C#L120-L130

LA DB

Geometrical coordinates are defined in: https://github.com/cms-analysis/DPGAnalysis-SiPixelTools/blob/master/PixelDBTools/test/SiPixelLorentzAngleDBLoader.cc#L109-L126
Exceptions are to be defined from rawID: https://github.com/cms-analysis/DPGAnalysis-SiPixelTools/blob/master/PixelDBTools/test/SiPixelLorentzAngleDBLoader.cc#L96-L107

Root notes

Standard things

htemp->GetXaxis()->SetRangeUser(-1.7,1.7);
TCanvas *c2_2 = new TCanvas("c2_2", "2_2",0,0,1355,523);

TH2D (const char *name, const char *title, 
      Int_t nbinsx, Double_t xlow, Double_t xup, 
      Int_t nbinsy, Double_t ylow, Double_t yup)
 

Custom canvas

TH2* h, std::string canname, 
      int gx = 0, int gy = 0,
      int histosize_x = 500, int histosize_y = 500,
      int mar_left = 80, int mar_right = 120,
      int mar_top = 20, int mar_bottom = 60,
      int title_align = 33, float title_y = 1.0, float title_x = 1.0,
      std::string draw="COLZ", bool norm=false, bool log=false
 
Some explanations:
  • title_align can be 11, 12, 13, 31, 32, or 33 (33 = upper right). For horizontal alignment the following convention applies: 1=left adjusted, 2=centered, 3=right adjusted;
    for vertical alignment: 1=bottom adjusted, 2=centered, 3=top adjusted.

prelim_lat_(double xmin, double xmax, double ymin, double ymax, bool in, double scale = -1)
draw_lat_(250, 198.0, "Module X", 1, 0.04, 0.0, 21);

-- TamasVami - 2016-01-14

Topic attachments:
  • LALayer.C (9.7 K, 2017-07-21, TamasVami)
  • SiPixelLorentzAngleTree_.C (1.5 K, 2017-07-21, TamasVami)
  • SiPixelLorentzAngleTree_.h (7.2 K, 2017-07-21, TamasVami)
  • tdrstyle.C (4.9 K, 2017-07-21, TamasVami)