Tracker Alignment How-To

Mille/Pede Tracker Alignment

This page serves as a how-to for running the MillePede tracker alignment framework. Commands are shown as a step-by-step guide with brief descriptions.

I. Mille/Pede setup used for CRAFT11

The common alignment area for MP can be found here:


1) Setup CMSSW environment (was CMSSW_3_11_0_hackMP for 2010)

cd /afs/
source /afs/

OR for tcsh shell users

source /afs/

(As an example, in the subdirectory /afs/ you can find the file which I have used for the CRAFT11 alignment using CMSSW_3_11_0; under ../mp635 are the same config files for CMSSW_4_1_4_patch2_IOVpatch, and under .../mp0735 the config files for CMSSW_4_1_4_patch2_IOVpatch3.)

3) Go to the common alignment area

cd /afs/ 

The script will create a new directory, and you have to add a short comment describing it in the file MP_ali_list.txt.

Copy the py config files (e.g. from under ../mp0735) also into the newly created folder and adjust them if needed.

Adjust the variables in the Perl script. For example

my $mssDir = '/castor/';

is the directory where the Millepede binary files are stored. Adjust this path after you have created that directory, e.g. with

nsmkdir /castor/


4) The variable %confhash describes the jobs to be set up. You have to provide the config file (py), the file with the input data (data), the number of jobs (njobs), and submit, which should always be set to 1.
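The shape of %confhash can be illustrated with a Python dict (the real script uses Perl; all names below are hypothetical):

```python
# Hypothetical sketch of the job description held in %confhash, written
# as a Python dict for illustration only (the real script is Perl).
confhash = {
    "craft11_cosmics": {            # made-up job-group name
        "py": "alignment_cfg.py",    # CMSSW config file for the Mille jobs
        "data": "data_cosmics.txt",  # text file listing the input data
        "njobs": 50,                 # number of Mille jobs to set up
        "submit": 1,                 # should always be 1
    },
}

# Basic sanity checks mirroring the rules stated in the text.
for name, job in confhash.items():
    assert job["submit"] == 1, "submit must always be 1"
    assert job["njobs"] > 0
```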

5) Afterwards, execute the script:

perl > log 2>&1 

6) Now submit the Mille jobs to the CAF with: 1000

The console output will look something like:

bsub -J craft11align -q cmscaf1nd -m g_cmscaf /afs/
 Job <130792415> is submitted to queue <cmscaf1nd>.
bsub -J craft11align -q cmscaf1nd -m g_cmscaf /afs/
 Job <130792416> is submitted to queue <cmscaf1nd>.


you can check the job status. Once everything is labelled as DONE, you get the output with 

this will then recognize whether all jobs finished correctly. If some did not finish correctly, the mps tool will label them as FAILED.

To resubmit the failed jobs do: FAIL

The retried jobs will once more go to the SETUP state (you can verify this by running '' again), and can be resubmitted with the 1000

command. The number of tries (which is reported by the mps_stat command) is incremented by one. More documentation: SWGuideMillepedeProductionSystem
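The state bookkeeping described above can be sketched as follows; this is an illustration of the SETUP/DONE/FAILED logic, not the actual mps code:

```python
# Sketch (not the real mps implementation): resubmitting FAILED jobs
# puts them back into SETUP and increments the try counter.
def resubmit_failed(jobs):
    """jobs: dict name -> {'state': str, 'tries': int}; mutated in place."""
    for job in jobs.values():
        if job["state"] == "FAILED":
            job["state"] = "SETUP"   # ready to be fired again
            job["tries"] += 1        # the count reported by mps_stat

jobs = {
    "job001": {"state": "DONE",   "tries": 1},
    "job002": {"state": "FAILED", "tries": 1},
}
resubmit_failed(jobs)
# job002 is now back in SETUP with tries == 2; job001 is untouched
```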

7) Repeat the above steps until you have no FAILED jobs. If ALL jobs are DONE, get the output with 

The output is the collected/filtered/refitted tracks, out of which the input for the alignment matrix problem is created for the Pede job. These inputs are then saved under the relevant subjob folders.

8) Then submit the Pede job with: -mf 

9) The results are stored for each Mille job and the Pede job under jobData, whereas the Pede output is stored in jobData/jobm. In the latter directory the result file millepede.res can also be found, if all went well with the Pede job.

10) THIS STEP IS NOT NEEDED ANYMORE, PLEASE SKIP: With the aid of a small script ( ) that I have copied to /afs/ you can adjust the labels in the millepede.res file. Executing the script in the same directory will produce several files called millepede.res_0, millepede.res_1 and so on. millepede.res_0 corresponds to period A2 and millepede.res_1 to period B. Consequently millepede.res_6 is the file for CRAFT11. Copy millepede.res_6 to a different directory and rename it to millepede.res .

11) Copy file from jobData/jobm to that directory, open the file and change

process.AlignmentProducer.algoConfig.mode = 'pede'

to

process.AlignmentProducer.algoConfig.mode = 'pedeRead'

and execute the script with


This procedure will then create a new alignment db file in the jobData/jobm folder.
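The mode switch in step 11 can also be scripted; a minimal sketch, assuming the config is a plain-text Python file (the function name is made up):

```python
# Sketch: switch the Pede config from running mode to read-back mode,
# as described in step 11.  'switch_to_pede_read' is a hypothetical helper.
def switch_to_pede_read(path):
    with open(path) as f:
        text = f.read()
    # Replace only the algoConfig.mode assignment; the trailing quote
    # ensures an already-switched 'pedeRead' value is left untouched.
    text = text.replace("algoConfig.mode = 'pede'",
                        "algoConfig.mode = 'pedeRead'")
    with open(path, "w") as f:
        f.write(text)
```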

II. Offline Validation Tools

This step is to rerun the track fitting using various alignment.db files, e.g. including the one that has just been created by you with the MPS system above.

First, there are (at least) two TWiki pages you should read: SWGuideTkGeomComparisonTool TkAlAllInOneValidation

The common validation area can be found in AFS under


1) You should source the CMSSW installation under CMSSW_3_11_0, or CMSSW_4_1_2.

cd /afs/
source /afs/

OR for tcsh shell users

source /afs/

2) Go to the CAF working folder

cd /afs/

create some test folder

mkdir test_mine
cd test_mine/

Copy the config files:

cp /afs/ .
cp /afs/ .

(Or see /afs/ )

The interpretation of these config files can be read from here:

You always need to provide a default validation ini file, plus a specific one which may override default values. The most relevant is the field, e.g.

mode=compare offline split
compare|EoYperiodF = Tracker SubDets
compare|TwistFreeperiodF = Tracker SubDets
compare|IDEAL = Tracker SubDets
compare|CRAFT11fromEoY = Tracker SubDets
compare|CRAFT11fromTwistFree = Tracker SubDets
compare|IDEAL=Tracker SubDets

Here you just say that you have a new alignment scenario, called IsoMu, and that for it the alignments_MP.db file is in the dbpath. Additionally you want to compare Tracker and SubDets between IsoMu and the others (EoYperiodF, TwistFreeperiodF, etc.). You run the validation (track refit) jobs as: -c defaultValidation_Cosmics_CMSSW_3_11_0.ini,craft11_pixelalignment.ini -N blablaname --getImages
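The ini fragment above can be read with a standard ini parser. A sketch of how the mode and compare entries would be extracted (the [validation] section name is a guess; the keys mirror the example):

```python
import configparser

# Sketch: read the compare entries of a validation-style ini file.
# The "[validation]" section name is an assumption; the keys come from
# the fragment shown above.
ini_text = """
[validation]
mode = compare offline split
compare|EoYperiodF = Tracker SubDets
compare|IDEAL = Tracker SubDets
"""

cfg = configparser.ConfigParser()
cfg.optionxform = str          # keep the case of keys like "compare|IDEAL"
cfg.read_string(ini_text)

sec = cfg["validation"]
modes = sec["mode"].split()    # ['compare', 'offline', 'split']
# scenario name -> list of structures to compare
compares = {k.split("|", 1)[1]: v.split()
            for k, v in sec.items() if k.startswith("compare|")}
```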

This command will send the jobs to the batch queuing system. You can check the status of the jobs with:

bjobs -a

The configs and log files will be stored in blablaname/. The jobs are running on cmsexpress.

3) (THIS IS NOT NEEDED, JUST FOR INFORMATION) As an example in /afs/ the configs and log files for the CRAFT11 validation are stored. The results can be found in /afs/

4) If all your jobs are successfully done, you have to execute a script, called , which should exist in your blahblahname folder:

cp /afs/ .

But be aware that this shell script is calling some ROOT/CINT macros. The script is generated automatically using the information in the .ini files. The ROOT/CINT commands are something like:

#run comparisons
#merge for OfflineValidation
root -q -b '/afs/"/afs/|5|5 , /afs/|4|4 , /afs/|2|2 , /afs/|3|3 , /afs/|1|1")'

Sometimes some files are empty (e.g. all tracks were filtered out) and as a result CINT reports an error, or cannot run. Then you have to edit the script manually to exclude those files from the command(s). In the commands the files are separated by commas, so just remove from the list the ones which generate the runtime errors.
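Pruning the empty files from such a comma-separated list can be scripted; a sketch, assuming the path|color|style entry format shown in the command above (the helper name is made up):

```python
import os

# Sketch: drop entries whose ROOT file is missing or empty from a
# comma-separated "path|color|style , path|color|style" argument list;
# empty files make CINT fail at runtime.
def filter_empty(arg_list):
    kept = []
    for entry in arg_list.split(","):
        entry = entry.strip()
        path = entry.split("|")[0].strip()
        # Keep only entries whose file exists and is non-empty.
        if os.path.exists(path) and os.path.getsize(path) > 0:
            kept.append(entry)
    return " , ".join(kept)
```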

If all is well you should see console output:

plot medianX>>myhisto(50,  -0.005000 , 0.005000) vs #modules
Info in <TCanvas::Print>: eps file /tmp/radbal/craft11_pixelalignment2/6170172907/ExtendedOfflineValidation_Images/DmedianR_TPB.eps has been created
plot medianX>>myhisto(50,  -0.005000 , 0.005000) vs #modules
Info in <TCanvas::Print>: eps file /tmp/radbal/craft11_pixelalignment2/6170172907/ExtendedOfflineValidation_Images/DmedianR_TIB.eps has been created
plot medianX>>myhisto(50,  -0.005000 , 0.005000) vs #modules
Info in <TCanvas::Print>: eps file /tmp/radbal/craft11_pixelalignment2/6170172907/ExtendedOfflineValidation_Images/DmedianR_TID.eps has been created
plot medianX>>myhisto(50,  -0.005000 , 0.005000) vs #modules
Info in <TCanvas::Print>: eps file /tmp/radbal/craft11_pixelalignment2/6170172907/ExtendedOfflineValidation_Images/DmedianR_TOB.eps has been created
plot medianX>>myhisto(50,  -0.005000 , 0.005000) vs #modules
Info in <TCanvas::Print>: eps file /tmp/radbal/craft11_pixelalignment2/6170172907/ExtendedOfflineValidation_Images/DmedianR_TEC.eps has been created

Apart from the files above stored under /tmp/radbal/..., all the output ROOT and image files are also stored under a similar path:


where $USERNAME/blahblahname is your working folder.

5) Finally, you can look at the plots like:

display /tmp/radbal/craft11_pixelalignment2/6170172907/ExtendedOfflineValidation_Images/DmedianR_TPB.eps

which should look something like:

6) Make sure you always know which plots are the important ones, as the MPS and the validation generate a lot of plots.

Useful links and literature

The role of various scripts and the 2011 alignment validation

How to produce a new alignment .db file?

1. Choose a CMSSW version for alignment:

cd into the directory (for instance: /afs/

and run the setup scripts

source /afs/

OR for tcsh shell users

source /afs/

2011 alignment validation:

Do we need to do anything?

2. To create a new directory to work in (using the previously defined CMSSW version) do the following:

Go to the common alignment area:

cd /afs/

and create a new directory using the following script:


This will ask you to make a note about the purpose of the new directory in a txt file.

2011 alignment validation:

What exactly needs to be done? One folder for MinBias and another for Isolated Muon? Per person?

3. Get the python, perl and txt (?) files and run Mille

Now you have to get hold of the various perl and python scripts which are responsible for submitting the alignment jobs with the desired settings.

This usually happens by copying already existing scripts from some of the mpXYZV directories in the common alignment area into your newly created folder.



And run the perl script
mps_fire 1000

Check the status, get the output:

Resubmit failed jobs: FAIL 1000

The output files will be located in jobData.

2011 alignment validation:

BALINT MinBias : scripts are in mp0735

BALINT: Isolated Muon: scripts are in mp0734

Both folders contain various python scripts. For instance, in mp0735 I see the following ones: , , , and . However, it only has a single perl file:

The $millescript and $pedescript variables point to the old release area (CMSSW_4_1_4_patch2_IOVpatch3). Is it ok?

How does a "path" get selected in %confhash? Should we change any of them?

Gero sent dataset names. Are they in %confhash?

4. Run Pede -mf

The output .db will be located in jobData/jobm.

How to compare the content of the newly created .db file with the default one?

1. Set up CMSSW and create a new directory for validation:

cd /afs/
source /afs/

OR for tcsh shell users

source /afs/

Go to the validation area and create a directory:

cd /afs/
mkdir test_mine
cd test_mine/

2011 alignment validation:

JUSTYNA: /afs/

2. Create an .ini file

You have to set the path of your newly created .db file in the ini (a la


You also need a .ini file with the default .db file.

The default ini defines the dataset to be used. The configurable one defines the various scenarios. The names of the scenarios will appear in the validation figures. In case of a new alignment db file, a new scenario needs to be defined with a unique name.
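The default/configurable split behaves like ordinary ini layering: values read later override values read earlier. A sketch of the idea with a standard parser (the section and key contents are illustrative):

```python
import configparser

# Sketch: the second (specific) ini overrides keys set in the default one,
# mirroring how the tool is given "-c default.ini,specific.ini".
default_ini = "[localGeneral]\ndataset = /Cosmics/defaultSample\ncolor = 1\n"
specific_ini = "[localGeneral]\ndataset = /Cosmics/myNewScenario\n"

cfg = configparser.ConfigParser()
cfg.read_string(default_ini)
cfg.read_string(specific_ini)   # a later read wins for duplicate keys

# 'dataset' now comes from the specific file, 'color' from the default.
```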

2011 alignment validation:

JUSTYNA: (default)




JUSTYNA: (configurable)



3. Submit jobs to make the comparisons: -c defaultValidation_Cosmics_CMSSW_3_11_0.ini,craft11_pixelalignment.ini -N blablaname --getImages

The output will be stored in the "blablaname" folder.

Check the status of the jobs by

bjobs -a

4. Produce the validation figures:

Copy TkAlMerge.sh from "blablaname" to the main directory and run it: ./

If the script crashes due to missing files, edit the script.

Part of the output root files and images will be located at /afs/ .

There is another folder created at /tmp/USERNAME/ which also contains images. Since /tmp is usually cleaned, it makes sense to make a backup of this folder.
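Backing up the /tmp images can be automated; a minimal sketch (the function and example paths are hypothetical):

```python
import os
import shutil

# Sketch: copy the /tmp image folder to a safe location before /tmp is
# cleaned.  'backup_images' and the example paths are made up.
def backup_images(src, dst):
    if os.path.isdir(src):
        shutil.copytree(src, dst, dirs_exist_ok=True)
        return True
    return False

# e.g. backup_images("/tmp/USERNAME/blahblahname_images",
#                    os.path.expanduser("~/validation_backup"))
```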

2011 alignment validation:

Which figures are interesting?

- DMR plots

- Chi2/ndf

- residuals

- DRR plots

Justyna's talk:

Alignment campaign 2011

This is the so-called track-based validation, which follows the MillePede alignment. It is the first part of a complete validation process: it creates the ROOT file which is the input for the so-called offline validation, the plotting part of the validation.

1. Set up CMSSW:
cd /afs/
source /afs/

or for tcsh shell users

source /afs/

2. Go to the validation area and set up a directory
cd /afs/
mkdir whatever
cd whatever
3. Copy the default and configurable ini files
cp /afs/ .
cp /afs/ .
cp /afs/ .
cp /afs/ .

Rename the configurable ini files according to the IOV you want to analyse.

4. Modify the configurable ini files by changing the "dataset" in [localGeneral]

Justyna's talk:


5. Produce the ROOT files -c defaultValidation_IsoMuonsDef_CMSSW_4_2_4_patch1.ini,E_odd_validationIsoMuon_isosampleDef_4_2_4_patch1.ini  -N working_dir_Def --getImages

and -c defaultValidation_IsoMuons_CMSSW_4_2_4_patch1.ini,E_odd_validationIsoMuon_isosample_4_2_4_patch1.ini  -N working_dir --getImages

Check the status of the jobs by

bjobs -a

The output ROOT files will be located at /afs/

6. Run the macros to create the validation figures

Once the jobs are done, copy the macros for creating the validation figures:

cp /afs/ .
cp /afs/ .
cp /afs/ .
cp /afs/ .
cp /afs/ .

Rename the first three of them, change the paths of the ROOT files, and change all appearances of "E_odd_" in the macros according to your IOV. These macros will create all the validation plots: DRR, DMR, Chi2/ndf, and residuals.
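The renaming and prefix replacement can be scripted; a sketch, assuming the "E_odd_" prefix from the text (the new prefix and helper name are made up):

```python
# Sketch: copy an "E_odd_"-prefixed macro under a new IOV prefix and
# replace every occurrence of the old prefix inside it.  "F_even_" is a
# made-up example IOV tag.
def retarget_macro(src, old_iov="E_odd_", new_iov="F_even_"):
    with open(src) as f:
        text = f.read()
    dst = src.replace(old_iov, new_iov)          # new file name
    with open(dst, "w") as f:
        f.write(text.replace(old_iov, new_iov))  # new prefix inside
    return dst
```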

The figures will be put into the /tmp/USERNAME directory.

Run the macros (the others are not to be run directly):

root -x -b -q E_odd_isosample_PlotTheResiduals_8geom.C
root -x -b -q E_odd_TkAlExtendedOfflineValidation.C

September 2011 alignment (Krisztian's guesses)


1. Setup the CMSSW version (pure guess based on this email):

cd /afs/
source /afs/


source /afs/

2. Setup a new directory on the common alignment area (DO THIS ONLY ONCE!)

cd /afs/

Do not forget to edit MP_ali_list.txt.

3. Copy the config files

from ../mp0789 and edit them according to Joerg's and Gero's instructions. About the dataset to be used please see Gero's email.

The available global tags can be seen at . Joerg recommends the GR_P_V20::All global tag, but Krisztian finds that the latest global tag for CMSSW_4_2_X is GR_P_V22::All. Any advice on which global tag should be used would be much appreciated.

For comparison purposes please see the list of Krisztian's actions:

- working directory: mp0837

- I edited the  as per Joerg's and Gero's instructions. The chosen dataset and global tag: (i) /SingleMu/Run2011A-TkAlMuonIsolated-v6/ALCARECO (run range: 172620 - 175770) and (ii) GR_P_V22::All

- I copied an official JSON file into my directory: Cert_160404-176309_7TeV_PromptReco_Collisions11_JSON.txt (created on the 23rd of September)

- The converted and edited JSON file: convertedJson.txt. The is taught to use this file.

- I updated the startgeometry.txt

- I DID NOT update alignable.txt!!! See Gero's (already linked) email

- I changed the value of the $millescript, $pedescript, and $mpsdir variables from ...TkAl1 -> ...TkAl3. NOTE THAT THIS WAS NOT SUGGESTED BY JOERG!

4. Create jobs, submit them, check their status, and get the output files

perl 1000

Resubmit the failed jobs with FAIL

until all are successfully done.

5. Run Pede -mf

The output .db will be located in jobData/jobm.

MC alignment, 10.2011


Important emails on the topic (time ordered list): Joerg's summary of his work, Gero's 1st, Gero's 2nd, Roberto's 1st, Gero about weighting, Gero about Pede job being failed with CPU time issue, Gero about Pede job being failed due to some other reason, Gero about the meaning of method 1a)

Gero on plotting geometry comparisons: email1, email2, email3

Roberto's new geometries: email

Joerg's talk: Joerg's talk mentioned in Gero's first email

Part 1

cd /afs/

source /afs/


source /afs/

Then repeat Gero's work copying his mp0895 directory; no modification to be made yet:

cd /afs/ etc.

Part 2

Comparing geometries

The instructions were included in Roberto's email: email

scramv1 project -n CMSSW_4_2_8_patch7_StartGeo CMSSW CMSSW_4_2_8_patch7
cd CMSSW_4_2_8_patch7_StartGeo/src


cvs co -A Alignment/OfflineValidation
cvs co UserCode/castello/Alignment/ToolforMisalignment/plugins
cp -r UserCode/castello/Alignment/ToolforMisalignment/plugins/TrackerGeometryCompare.h Alignment/OfflineValidation/plugins/
cp -r UserCode/castello/Alignment/ToolforMisalignment/plugins/ Alignment/OfflineValidation/plugins/

rm -R UserCode
scram b

Now for running, you have to replace in the cfg contained in the test/ directory (BUT THESE FILES ARE NOT THERE):



with the two geometries (sqlite files) you want to compare and extract the Deltas:


At this point the Deltas will be automatically applied (running TkAlCompareCommonTracker) to the ideal geometry in the output sqlite file: IDEAL+Deltas (change the inputs in the file according to the names), and a sqlite file is produced as output. This output could be used directly as the starting geometry to be included in your alignment job, as we did for the one I produced for the MC scenario.

Validation from Ádám

Set up your working area

1. Set up the CMSSW version
cd /afs/

source /afs/

or (if you want to use the tcsh shell)

source /afs/

2. Go to the validation area and create a new directory
cd /afs/
mkdir <New_DIRECTORY>

3. Copy .root files
cp -r /afs/ .


4. Modify the given .root files (better: the ROOT scripts, i.e. .C files???)
According to the All-in-One validation tool (, there are 4 different validations. This recipe presents 3 of them (Offline validation, Geometry comparison and Track Splitting).

Offline validation

4a. Offline validation
Scripts for offline validation can be found at <NEW_DIRECTORY>/Histograms/offline.

4a1. Modify and run TkAlExtendedOfflineValidation.C
Open the TkAlExtendedOfflineValidation.C file with your favourite editor (e.g. vi TkAlExtendedOfflineValidation.C).

At line 11 there is an object declaration:
PlotAlignmentValidation p("/afs/", "IDEALplus_02Delta MC_42_V15B", 1, 1);
At this line the validation results of the first geometry can be defined with the following syntax:
PlotAlignmentValidation p(<Path_of_validation_result>, <Name_of_the_result>, <Color_of_line>, <Style_of_line>);

New validation results can be added with the following lines:
p.loadFileList("/afs/", "IDEALplus_033Delta MC_42_V15B", 2, 1);

The syntax is the following:
p.loadFileList(<Path_of_validation_result>, <Name_of_the_result>, <Color_of_line>, <Style_of_line>);

Run TkAlExtendedOfflineValidation.C as
root -x -b -q TkAlExtendedOfflineValidation.C

4a2. Modify and run isosample_PlotTheResiduals_8geom.C
Open isosample_PlotTheResiduals_8geom.C with your favourite editor (e.g. vi isosample_PlotTheResiduals_8geom.C).

From line 67 to line 81 there are TFile* declarations which open the files with the validation results of the geometries. Originally, this file is optimized for 8 validation results. Therefore if you want to run this macro for a different number of validation results, you should modify the following lines (I give an example for 3 geometries):

At line 181: Remove source code from line 183 to line 187 because these lines can modify the maximum value of the plots.
At line 247: Modify the name of the validation results of geometries.
At line 273: Remove source code from line 276 to line 280 because these lines modify the subtitles (TLegend) of the plots.
At line 283: Remove source code from line 299 to line 317 because these lines reference to removed variables.
At line 329: Remove source code from line 333 to line 337 because these lines want to draw unused plots.

At line 613: Remove source code from line 616 to line 620 because these lines want to modify the maximum value of the plots.
At line 729: Remove source code from line 741 to line 759 because these lines modify the subtitles (TLegend) of the plots.
At line 762: Remove source code from line 792 to line 845 because these lines reference to removed variables.
At line 924: Remove source code from line 951 to line 955 because these lines are responsible for drawing plots.

Run isosample_PlotTheResiduals_8geom.C as
root -x -b -q isosample_PlotTheResiduals_8geom.C

Geometry comparison

4b. Geometry Comparison
These histograms are created by the All-in-One validation tool.

Track splitting

4c. Track Splitting
The script for Track Splitting can be found at <NEW_DIRECTORY>/Histograms/split/

Open cosmicSplittingValidation_v7.C with your favourite editor (e.g. vi cosmicSplittingValidation_v7.C).

From line 343 to line 350 there are input file declarations. Originally, cosmicSplittingValidation_v7.C has been optimized for 7 validation results of geometries. Therefore you should modify this macro if you want to execute it for a different number of validation results (I give an example for 3 geometries). Note that the source code from line 357 to line 364 gives the names of the geometries.

At line 328: Modify the nGeoms variable to the given number. Our case: const int nGeoms = 3;
At line 343: Remove source code from line 347 to line 350 because these lines reference other input files.
At line 357: Remove source code from line 361 to line 364 because these lines set the names of geometries which are not used.
At line 379: Remove source code from line 382 to line 385 because these lines reference invalid variables.
At line 391: Remove source code from line 394 to line 397 because these lines also reference invalid variables.

Run cosmicSplittingValidation_v7.C as
root -x -b -q cosmicSplittingValidation_v7.C


5. Results of the macros
The results of these macros can be found in the /tmp/<USERNAME> directory.

Z->mumu validation

The instructions were included in Roberto's email: email

Use 42X.

cmsrel CMSSW_4_2_8_patch7
cd CMSSW_4_2_8_patch7/src
cvs co -A UserCode/castello/Alignment/ZmumuValidation.tar
cd UserCode/castello/Alignment
tar -xf ZmumuValidation.tar

Uncompressing results in errors, but some (all?) files are actually produced.

After this step Roberto says to ``compile it from CMSSW_4_2_3_patch5/src''. This directory indeed seems to be produced, but scram fails to run in that directory. If I copy MuonAnalysis/MomentumScaleCalibration/ from Roberto's directory to the src of my local CMSSW release and try to compile it, it fails.

Pal's way (2012 january)

Initialize your working area:

cd /afs/
source /afs/
cd /afs/
mkdir -p test_hidas
cd test_hidas
mkdir Zmumu_Roberto
cd Zmumu_Roberto

Set up your code when doing it first time:

rfcp /castor/ .
tar xvf ZmumuValidation.tar

cd CMSSW_4_2_3_patch5
scramv1 b ProjectRename
scramv1 b clean
rm -r tmp
cd src/MuonAnalysis/MomentumScaleCalibration/test/RooFit
scramv1 b clean
rm CompareBias_cc.d
cd ../../../../..

scramv1 b

cd src/MuonAnalysis/MomentumScaleCalibration/test/

source /afs/

Next time and later do only this initialization:

cd CMSSW_4_2_3_patch5
cd src/MuonAnalysis/MomentumScaleCalibration/test/

source /afs/

Follow the instructions of README.txt found in this directory:

 Quick instructions on how to run Zmumu validation (R. Castello)

1) set up CAF environment:
source /afs/
source /afs/

depending on your shell

2) ALCARECO REFIT: Run specifying the input files (ALCAReco) to
  use and the TRK geometry (change the sqlite file or the tag in the
  ESPrefer) to be used in the refitting step.
  (By default the output file is saved in the /tmp area, so please change it.)

3) TREE PRODUCTION: Run the Zmumu validator to produce the tree (the default input is
  the refitted file previously saved in the /tmp).
  Move into /test/tree_production and run

cmsRun mode=tree geometryName=TEST

  The output tree is: zmumuTree_TEST.root

4) HISTOGRAMS PRODUCTION: Run the Zmumu validator to produce the histograms
  (on the previously produced tree), by selecting the desired eta range for the muons.
  This step is pretty fast because it is accessing the tree.

cmsRun mode=histogram geometryName=TEST etaMin1=-2.1 etaMax1=2.1 etaMin2=-2.1 etaMax2=2.1

  OR the default mode without specifying ranges ( --> etaMin/Max=-2.5/2.5)

cmsRun mode=histogram geometryName=TEST

 The output file containing the interesting histograms is 0_histogram_TEST.root

5) FITTING Z MASS: extract the interesting histograms by fitting the Z mass in each bin
  using the RooFit macro --> go to the test/RooFit directory and copy the previously produced

  0_histogram_TEST_1.root (for the first geometry)
  0_histogram_TEST_2.root (for the second geometry)
  in the test/RooFit directory and then run :

source RooFitSetup (set the Roofit environment on lxplus)
root -l (after changing the correct input file names in the macro)

 The output file CompareBias.root contains the histograms of interest.
 The individual canvases are saved in BiasCheck_TEST1.root and BiasCheck_TEST2.root, please use
 personal macros to extract and overlay the plots you want. You have
 some examples in /Macros

At step 5) above, some extra attention is needed if you are using (t)csh: tcsh will not work here. RooFitSetup is a (ba)sh initialization script, from which it would be very straightforward to write a (t)csh one; but then the fit will not build, because some header files would not be found. Switch the shell in your window and do this step in bash instead.

February alignment - Krisztian

Email from Roberto about our responsibility and our job: email


cd /afs/
source /afs/
cd /afs/

This results in the mp1055 directory. I took my setup used for aligning data from mp1029 and copied it here. I also copied here the example 0 T config file from mp1046 (see Roberto's above-mentioned email).

cp ../mp1046/

Observation: in this config file the alignment database is overwritten by hand! In earlier config files we used (for instance, ) this was not done. As far as I understand the comment in the config file, this manual overwriting was done to get around some problem with ESPrefer. This should be checked with Gero! Does it mean that startgeometry.txt or alignables.txt has no effect now, and the starting geometries are to be adjusted directly in the config file?

For the alignment of the 4 T data this will need to be modified. After these steps I removed the unnecessary files from the directory and made the following modifications to the remaining files:

- several things in (the location of the MillePede scripts, endpoint directories, the config file to be run, global tag, json file)

Check with Gero: global tag (I use GR_P_V28)? JSON?

Test 0T dataset: /Cosmics/Commissioning12-TkAlCosmics0T-v1/ALCARECO

To produce the list of ROOT files corresponding to the above-mentioned dataset do:

dbs search --query='find file where dataset=/Cosmics/Commissioning12-TkAlCosmics0T-v1/ALCARECO and file.numevents>0' | grep -i store > bla.txt

Add "CastorPool=cmscaf" to the beginning of bla.txt to create the final input file list for the script.
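Prepending the CastorPool line can be scripted; a minimal sketch (the helper name is made up):

```python
# Sketch: put "CastorPool=cmscaf" as the first line of the dbs output
# file, as described above.  'prepend_castor_pool' is a hypothetical helper.
def prepend_castor_pool(path, pool_line="CastorPool=cmscaf"):
    with open(path) as f:
        body = f.read()
    with open(path, "w") as f:
        f.write(pool_line + "\n" + body)
```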

I made a simple JSON file, which accepts everything, and fed it in.
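An accept-everything file of this kind can be generated from the usual certification-JSON layout, which maps run numbers to lists of [first, last] lumisection intervals; the run range and limits below are made up:

```python
import json

# Sketch: build the content of an "accept everything" JSON file.
# Certification JSON maps run number -> list of [first_ls, last_ls]
# lumisection intervals; the run range here is a made-up example.
def accept_all_json(first_run, last_run, max_ls=99999):
    return {str(run): [[1, max_ls]] for run in range(first_run, last_run + 1)}

accept_all_text = json.dumps(accept_all_json(186000, 186002))
```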

Run Mille and then Pede by:

perl 1000 FAIL -mf

These complete successfully.

Topic attachments
Attachment  Size  Date  Who  Comment
DmedianR_TPB.png  5.3 K  2011-04-05  RadicsBalint  DmedianR_TPB.png example
Gero_PedeFails.log  4.7 K  2011-10-29  KrisztianKrajczar
Gero_PedeFailure.log  9.8 K  2011-10-31  KrisztianKrajczar
Gero_Sept_Summary.txt  3.9 K  2011-09-23  KrisztianKrajczar
Gero_dataset.txt  2.3 K  2011-09-25  KrisztianKrajczar
Gero_first.log  6.0 K  2011-10-26  KrisztianKrajczar
Gero_oneimportant.log  11.0 K  2011-10-27  KrisztianKrajczar
Gero_plotting.log  2.1 K  2011-11-02  KrisztianKrajczar
Gero_plotting2.log  3.6 K  2011-11-02  KrisztianKrajczar
Gero_plotting3.log  13.3 K  2011-11-10  KrisztianKrajczar
Gero_resubmit.log  7.3 K  2011-10-31  KrisztianKrajczar
Gero_second.log  4.7 K  2011-10-26  KrisztianKrajczar
Joerg_Sept_Summary.txt  6.4 K  2011-09-23  KrisztianKrajczar
Joerg_directory.txt  2.2 K  2011-09-23  KrisztianKrajczar
Joerg_earlierRelease.log  5.1 K  2011-10-28  KrisztianKrajczar
Roberto_Febr.log  2.8 K  2012-02-29  KrisztianKrajczar
Roberto_GeomManipulation.txt  3.8 K  2012-01-05  KrisztianKrajczar
Roberto_newGeometries.log  5.0 K  2011-11-04  KrisztianKrajczar
roberto_first.log  6.7 K  2011-10-26  KrisztianKrajczar
Topic revision: r46 - 2012-03-13 - PalHidas