Physically, an event is the result of a single readout of the detector electronics and of the signals that will (in general) have been generated by particles (tracks and energy deposits) present in a number of bunch crossings. The task of the online Trigger and Data Acquisition System (TriDAS) is to select, out of the millions of events recorded in the detector, the most interesting 100 or so per second, and to store them for further analysis. To qualify, an event has to pass two independent sets of tests, or Trigger Levels. The tests range from simple and short in duration (Level-1) to sophisticated ones requiring significantly more time to run (High Levels 2 and 3, collectively called the HLT). In the end, the HLT system creates the RAW data events, which are written to storage.
Original Source: TriggerSystem
The HLT contains many trigger paths, each corresponding to a dedicated trigger (such as a prescaled single-electron trigger or a 3-jets-with-MET trigger). A path consists of several steps (software modules), each performing a well-defined task such as unpacking (raw to digi), reconstruction of physics objects (electrons, muons, jets, MET, etc.), making intermediate decisions that trigger more refined reconstruction in subsequent modules, or calculating the final decision for the trigger path. The CMSSW Framework/EDM ensures that if an intermediate filter decision on a trigger path is negative, the rest of the path is not executed (it is skipped) and that trigger is regarded as rejecting the event. To save CPU time, each reconstruction step is followed by a filter, so that time-consuming reconstruction code is not run once it is already clear that its output will not be needed.
In general it is expected that all HLT trigger paths are run, even if the event is already accepted by a path. In case this turns out to be too time-consuming, a truncated mode of HLT operations should be foreseen where the HLT short-circuits after the first accept (and after the triggers needed to classify the event for a primary data set and output stream are computed) and does not run the rest of the triggers. Presumably, the triggers not run online could be run in the offline reconstruction step to compute all trigger bits (for events written out) in order to get a complete trigger picture allowing trigger efficiency studies.
Each HLT trigger path must be seeded by one or more L1 trigger bits: the first filter module in each HLT path looks for a suitable L1 seed (consisting of L1 bit[s] and L1 object[s]) as the starting point for that specific HLT trigger.
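As a purely schematic illustration of this structure, the sketch below shows how such a path might be expressed in a CMSSW python configuration: an L1 seed filter, an intermediate reconstruction step, and a filter on the reconstructed objects, chained in a cms.Path. The module labels, class names and parameters here are invented placeholders, not the actual HLT menu.

import FWCore.ParameterSet.Config as cms

process = cms.Process("HLT")

# L1 seed filter: the path continues only if the required L1 bit(s) fired.
# "ExampleL1SeedFilter" and its parameter are placeholders, not real CMSSW classes.
process.hltL1sExampleSeed = cms.EDFilter("ExampleL1SeedFilter",
    L1SeedsLogicalExpression = cms.string("L1_SingleMu7")
)

# Intermediate reconstruction step (placeholder producer)
process.hltExampleMuonReco = cms.EDProducer("ExampleMuonProducer")

# Filter on the reconstructed objects: if it rejects, the remaining modules
# of the path are skipped and the path rejects the event.
process.hltExampleMuonFilter = cms.EDFilter("ExampleMuonFilter",
    minPt = cms.double(10.0)
)

# The trigger path: modules run in order, and a negative filter decision
# terminates the path early.
process.HLT_ExampleMu10_v1 = cms.Path(
    process.hltL1sExampleSeed +
    process.hltExampleMuonReco +
    process.hltExampleMuonFilter
)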
Original Source: SWGuideHighLevelTrigger
Two persistent HLT products are available:
TriggerResults: a product (subclassed from the HLTGlobalStatus object) containing all the usual decision bits. The TriggerResults product (available for events written to output) allows access to the configuration and trigger decisions, i.e., all the usual "trigger bits". The corresponding code can be found in DataFormats/Common/interface/TriggerResults.h and DataFormats/Common/interface/HLTGlobalStatus.h.
TriggerEvent: summarising the "L3" trigger collections and "L3" filter decisions. The corresponding code can be found in DataFormats/HLTReco/interface/TriggerEvent.h.
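As a quick way to inspect both products, the FWLite python sketch below loops over events and prints each HLT path decision, plus the size of the TriggerEvent object summary. It assumes an AOD-style input file (the file name is a placeholder) and the usual product labels TriggerResults::HLT and hltTriggerSummaryAOD::HLT; adapt these to your sample.

from DataFormats.FWLite import Events, Handle

# Placeholder input file: replace with your own AOD sample
events = Events("myAOD.root")

trigResults = Handle("edm::TriggerResults")
trigResultsLabel = ("TriggerResults", "", "HLT")
trigEvent = Handle("trigger::TriggerEvent")
trigEventLabel = ("hltTriggerSummaryAOD", "", "HLT")

for event in events:
    event.getByLabel(trigResultsLabel, trigResults)
    event.getByLabel(trigEventLabel, trigEvent)

    # Map the decision bits to the HLT path names
    names = event.object().triggerNames(trigResults.product())
    for i in range(trigResults.product().size()):
        decision = "accepted" if trigResults.product().accept(i) else "rejected"
        print("%s : %s" % (names.triggerName(i), decision))

    # The TriggerEvent summary gives access to the "L3" objects and filters
    print("number of trigger objects: %d" % trigEvent.product().sizeObjects())
    break  # inspect only the first event in this sketch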
Additionally, the package HLTrigger/HLTcore contains several analyzers pulling out the trigger information. You can use these analyzers directly (see their cfi files in the python subdirectory) or copy relevant code pieces into your own modules:
TriggerSummaryAnalyzerAOD: an analyzer printing the content of the TriggerEvent product.
HLTEventAnalyzerAOD: an analyzer combining the information from the TriggerResults and TriggerEvent products.
The HLTEventAnalyzerAOD plugin makes use of the helper class HLTConfigProvider (also in HLTrigger/HLTcore), which extracts the HLT configuration (paths, modules) from the provenance.
Note: this helper class must be initialised by calling its init(...) method from the beginRun() method of the plugin that uses it. The reason it has to be (re-)initialised in beginRun() is that the HLT configuration can (only) change at the boundary between runs.
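For the first option (running a provided analyzer directly from its cfi file), a minimal configuration could look like the sketch below. The cfi and module names are assumed to follow the usual HLTrigger/HLTcore python/ naming (triggerSummaryAnalyzerAOD_cfi), and the input file name is a placeholder; check the package's python subdirectory for the exact names in your release.

import FWCore.ParameterSet.Config as cms

process = cms.Process("TRIGGERINFO")

# Placeholder input file: point this to your AOD sample
process.source = cms.Source("PoolSource",
    fileNames = cms.untracked.vstring("file:myAOD.root")
)
process.maxEvents = cms.untracked.PSet(input = cms.untracked.int32(10))

# Load the analyzer as configured in its cfi file
# (name assumed to follow the HLTrigger/HLTcore python/ layout)
process.load("HLTrigger.HLTcore.triggerSummaryAnalyzerAOD_cfi")

# Run it: it prints the content of the TriggerEvent product
process.p = cms.Path(process.triggerSummaryAnalyzerAOD)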
Original Source: Persistent Trigger Results Objects
Find the software and usage instructions in
and in TriggerAnalyzer and TriggerObjectAnalyzer packages in
For each dataset, the possible HLT trigger paths are listed in the dataset record. Each HLT path is composed of different algorithms; these are defined in the data-taking configuration files, which are given in
The exact parameters (such as pT or eta cuts) for each component of the HLT path can be found in the data-taking configuration file.
As an example, if you are looking for information about the HLT_Dimuon10_Jpsi trigger, search for the string process.HLT_Dimuon10_Jpsi in the data-taking configuration file corresponding to the event range of your interest. You will find the corresponding CMSSW path, which consists of all the modules (i.e., specifically python-configured CMSSW code) and/or sequences (groups of modules) that make up that trigger path. You can then look for each of these modules or sequences in the same configuration file until you find the parameters with which they are configured.
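If the configuration file is too large to browse by hand, a small script can extract a path definition and list its modules and sequences. This is only a sketch: the local file name hlt_config.py is an assumption, and the regular expression relies on the usual cms.Path( moduleA + moduleB + ... ) layout of the dumped configuration.

import re

# Assumed local copy of the data-taking configuration file
with open("hlt_config.py") as f:
    config = f.read()

# Look for the path definition, e.g.
#   process.HLT_Dimuon10_Jpsi_v3 = cms.Path( process.A + process.B + ... )
match = re.search(
    r"process\.(HLT_Dimuon10_Jpsi\w*)\s*=\s*cms\.Path\(([^)]*)\)",
    config)

if match:
    print("Path: %s" % match.group(1))
    # List the modules and sequences making up the path
    for step in match.group(2).split("+"):
        print("  %s" % step.strip())
else:
    print("Path not found in this configuration file")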
As an example, you will find one of the modules of this path, hltDimuon10JpsiL3Filtered, by searching for process.hltDimuon10JpsiL3Filtered, and you will see that it is an instantiation of the class HLTMuonDimuonL3Filter:
process.hltDimuon10JpsiL3Filtered = cms.EDFilter( "HLTMuonDimuonL3Filter",
followed by the values of the different parameters for this algorithm. All HLT algorithm classes and the definitions of their parameters can be found in HLTrigger; the example class above is defined in HLTMuonDimuonL3Filter.h.