When condition data are read from /cvmfs/cms-opendata.cern.ch, the full database is read before the job starts. It can amount to several GB, while the analysis jobs typically need only a fraction of these condition data.
Downloading all these files can be avoided by having the needed database files available locally.
For example, TriggerInfoTool/GeneralInfoAnalyzer can be run by copying the needed databases into the local container and modifying the config file to access them locally instead of through the full condition database.
The database files needed for the GeneralInfoAnalyzer job are:
~/CMSSW_5_3_32/src/TriggerInfoTool $ ls -lh *db
-rw-r--r-- 1 cmsusr cmsusr 84K Jun 20 14:54 AlCaRecoHLTpaths8e29_1e31_v13_offline.db
-rw-r--r-- 1 123 130 584K Jan 21 2016 L1GtPrescaleFactorsAlgoTrig_CRAFT09v2_hlt.db
-rw-r--r-- 1 123 130 260K Jan 21 2016 L1GtPrescaleFactorsTechTrig_CRAFT09v2_hlt.db
-rw-r--r-- 1 cmsusr cmsusr 38K Jun 20 18:27 L1GtStableParameters_CRAFT09_hlt.db
-rw-r--r-- 1 123 130 231K Jan 21 2016 L1GtTriggerMaskAlgoTrig_CRAFT09v2_hlt.db
-rw-r--r-- 1 123 130 214K Jan 21 2016 L1GtTriggerMaskTechTrig_CRAFT09v2_hlt.db
-rw-r--r-- 1 123 130 34K Jan 21 2016 L1GtTriggerMaskVetoAlgoTrig_CRAFT09_hlt.db
-rw-r--r-- 1 123 130 125K Jan 21 2016 L1GtTriggerMaskVetoTechTrig_CRAFT09v2_hlt.db
-rw-r--r-- 1 123 130 410K Jan 21 2016 L1GtTriggerMenu_CRAFT09_hlt.db
and they can be copied to the container from a host with /cvmfs/cms-opendata-conddb.cern.ch mounted.
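Pointing the config file at one of the copied files could look like the sketch below. This is only an illustration: the ESSource label, record name, and tag name are assumptions chosen to match the file name, and the actual ones depend on which records the analyzer reads.

```python
import FWCore.ParameterSet.Config as cms

process = cms.Process("Demo")

# Sketch: read one condition record from a local sqlite file instead of
# the full condition database. Record and tag names here are illustrative.
process.l1GtTriggerMenu = cms.ESSource("PoolDBESSource",
    connect = cms.string("sqlite_file:L1GtTriggerMenu_CRAFT09_hlt.db"),
    toGet = cms.VPSet(
        cms.PSet(
            record = cms.string("L1GtTriggerMenuRcd"),
            tag = cms.string("L1GtTriggerMenu_CRAFT09_hlt")
        )
    )
)
# Prefer this local source over the global tag for this record.
process.es_prefer_l1GtTriggerMenu = cms.ESPrefer("PoolDBESSource", "l1GtTriggerMenu")
```

A similar block would be needed for each of the database files listed above, which is what motivates the single main db file described below.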
An update: @clelange prepared a new main db file that can be used to avoid the separate process blocks for each database file in the config file.
It is a stripped version of /cvmfs/cms-opendata-conddb.cern.ch/FT_53_LV5_AN1.db and points to a local FT_53_LV5_AN1 directory instead of /cvmfs/cms-opendata-conddb.cern.ch/FT_53_LV5_AN1 for the database files.
The usage would then be:
- have that file in the directory where the job is run
- have the local db files in a local FT_53_LV5_AN1 directory
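With that setup, the config-file change could reduce to redirecting the global tag to the local stripped file, roughly as in this sketch (the connect string and global tag value are taken from the file name above; treat the exact values as assumptions):

```python
import FWCore.ParameterSet.Config as cms

process = cms.Process("Demo")
process.load("Configuration.StandardSequences.FrontierConditions_GlobalTag_cff")

# Sketch: point the global tag at the stripped local main db file,
# which in turn resolves the per-record files in the local
# FT_53_LV5_AN1 directory.
process.GlobalTag.connect = cms.string("sqlite_file:FT_53_LV5_AN1.db")
process.GlobalTag.globaltag = "FT_53_LV5_AN1::All"
```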
@caredg for the record:
The modifications needed for the config file are:
and the output is: