Job Life Cycle Management Libraries for CMS Workflow Management Projects
Scientific analysis and simulation require the processing and generation of millions of data samples. These processing and generation tasks often comprise multiple smaller tasks distributed over multiple (computing) sites. This paper discusses the Compact Muon Solenoid (CMS) workflow infrastructure, and specifically the Python-based workflow library used for so-called task life cycle management. The CMS workflow infrastructure consists of three layers: high-level specification of the various tasks based on input/output datasets, life cycle management of task instances derived from the high-level specification, and execution management. The workflow library is the result of a convergence of three CMS subprojects that respectively deal with scientific analysis, simulation, and real-time data aggregation from the experiment.
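The life cycle management layer described above can be pictured as a small state machine over task instances derived from a specification. The sketch below is purely illustrative, assuming a minimal set of states and transitions; the class, state, and dataset names are hypothetical and are not the actual CMS workflow library API.

```python
# Illustrative sketch of task life cycle management: a task instance
# derived from a high-level (input/output dataset) specification moves
# through a fixed set of states. All names here are hypothetical.

class LifecycleError(Exception):
    """Raised on an illegal life cycle transition."""

class Task:
    # Allowed transitions between life cycle states (assumed, not CMS's).
    TRANSITIONS = {
        "new": {"queued"},
        "queued": {"running", "failed"},
        "running": {"complete", "failed"},
        "failed": {"queued"},  # a failed task may be resubmitted
        "complete": set(),
    }

    def __init__(self, name, input_dataset, output_dataset):
        self.name = name
        self.input_dataset = input_dataset
        self.output_dataset = output_dataset
        self.state = "new"

    def advance(self, new_state):
        """Move to `new_state`, enforcing the transition table."""
        if new_state not in self.TRANSITIONS[self.state]:
            raise LifecycleError(
                f"{self.name}: illegal transition {self.state} -> {new_state}")
        self.state = new_state
        return self.state

# Example run-through with made-up dataset names.
task = Task("rereco-example", "/ExampleRun/RAW", "/ExampleRun/RECO")
task.advance("queued")
task.advance("running")
task.advance("complete")
print(task.state)  # -> complete
```

The transition table makes the legal life cycle explicit, so an illegal move (e.g. re-running a completed task without resubmission) fails loudly rather than silently corrupting state.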
The CMS workload management system
CMS has started the process of rolling out a new workload management
system. This system is currently used for reprocessing and Monte Carlo
production, with tests under way to use it for user analysis.
It was decided to combine, as much as possible, the
production/processing, analysis, and T0 codebases so as to reduce
duplicated functionality and make the best use of limited developer
and testing resources.
The system now includes central request submission and management
(Request Manager), a task queue for parcelling up and distributing
work (WorkQueue), and agents which process requests by interfacing
with disparate batch and storage resources (WMAgent).
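The request-to-agent flow above can be sketched as a queue that parcels a request into smaller work units, which agents then pull and execute. This is a minimal sketch under assumed semantics; the function, class, and request names are illustrative and do not reflect the real Request Manager, WorkQueue, or WMAgent interfaces.

```python
# Hypothetical sketch of the flow: a request is split into work units
# (the WorkQueue's parcelling role), which pulling agents acquire and
# process (the WMAgent role). All names and sizes are illustrative.

from collections import deque

def split_request(request_name, n_events, chunk=1000):
    """Parcel a request into work units of at most `chunk` events."""
    units, first = [], 0
    while first < n_events:
        last = min(first + chunk, n_events)
        units.append({"request": request_name, "first": first, "last": last})
        first = last
    return units

class WorkQueue:
    """Toy central queue distributing work units to agents."""

    def __init__(self):
        self._queue = deque()

    def enqueue_request(self, name, n_events, chunk=1000):
        self._queue.extend(split_request(name, n_events, chunk))

    def acquire(self):
        """Hand the next unit to a pulling agent, or None if empty."""
        return self._queue.popleft() if self._queue else None

queue = WorkQueue()
queue.enqueue_request("example-mc-request", n_events=2500, chunk=1000)

completed = []
while (unit := queue.acquire()) is not None:
    # A real agent would submit this unit to a batch system and stage
    # output to storage; here we just record it as processed.
    completed.append(unit)

print(len(completed))  # -> 3
```

Pull-based distribution like this lets heterogeneous agents take work at the rate their local batch and storage resources allow, which matches the decoupling the components above describe.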