
    Provisioning of data locality for HEP analysis workflows

    The heavily increasing amount of data produced by current experiments in high energy particle physics challenges both end users and providers of computing resources. The boosted data rates and the complexity of analyses require huge datasets to be processed in short turnaround cycles. Usually, data storage and computing farms are deployed by different providers, which leads to data delocalization and a strong dependence on the interconnection transfer rates. The CMS collaboration at KIT has developed a prototype enabling data locality for HEP analysis processing via two concepts. A coordinated and distributed caching approach that reduces the limiting factor of data transfers by joining local high-performance devices with large background storages was tested. Thereby, a throughput optimization was reached by selecting and allocating critical data within user workflows. A highly performant setup using these caching solutions enables fast processing of throughput-dependent analysis workflows.
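
    As an illustration of the selection step, the following minimal sketch (with hypothetical file names and access counts, not the KIT prototype itself) pins the files with the highest access density to a local high-performance cache until its volume is exhausted.

        from dataclasses import dataclass

        @dataclass
        class DatasetFile:
            name: str          # logical file name
            size_gb: float     # file size in GB
            accesses: int      # observed accesses by analysis workflows

        def select_for_cache(files, cache_capacity_gb):
            """Greedily pick the files with the highest access density
            (accesses per GB) until the local cache volume is exhausted."""
            ranked = sorted(files, key=lambda f: f.accesses / f.size_gb, reverse=True)
            cached, used = [], 0.0
            for f in ranked:
                if used + f.size_gb <= cache_capacity_gb:
                    cached.append(f)
                    used += f.size_gb
            return cached

        # Example: a 2 TB local cache in front of a large background storage.
        files = [
            DatasetFile("ntuple_A.root", 500, accesses=40),
            DatasetFile("ntuple_B.root", 1200, accesses=15),
            DatasetFile("ntuple_C.root", 800, accesses=35),
        ]
        print([f.name for f in select_for_cache(files, cache_capacity_gb=2000)])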

    Boosting Performance of Data-intensive Analysis Workflows with Distributed Coordinated Caching

    Data-intensive end-user analyses in high energy physics require high data throughput to reach short turnaround cycles. This leads to enormous challenges for storage and network infrastructure, especially when facing the tremendously increasing amount of data to be processed during High-Luminosity LHC runs. Including opportunistic resources with volatile storage systems into the traditional HEP computing facilities makes this situation more complex. Bringing data close to the computing units is a promising approach to solve throughput limitations and improve the overall performance. We focus on coordinated distributed caching by coordinating workflows to the most suitable hosts in terms of cached files. This allows optimizing the overall processing efficiency of data-intensive workflows and using the limited cache volume efficiently by reducing the replication of data across distributed caches. We developed the NaviX coordination service at KIT, which realizes coordinated distributed caching using an XRootD proxy cache server infrastructure and the HTCondor batch system. In this paper, we present the experience gained in operating coordinated distributed caches on cloud and HPC resources. Furthermore, we show benchmarks of a dedicated high-throughput cluster, the Throughput-Optimized Analysis System (TOpAS), which is based on the above-mentioned concept.
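
    The coordination step can be pictured as an affinity score between a job's input files and the files already held by each cache host. The sketch below uses made-up host and file names and is not the NaviX implementation, which couples this matching to the XRootD proxy caches and the HTCondor batch system.

        def cache_affinity(job_inputs, host_cache):
            """Fraction of a job's input data (by volume) already cached on a host."""
            total = sum(job_inputs.values())
            hit = sum(size for name, size in job_inputs.items() if name in host_cache)
            return hit / total if total else 0.0

        def best_host(job_inputs, caches):
            """Pick the host whose cache overlaps most with the job's inputs,
            so cached data is reused instead of being replicated elsewhere."""
            return max(caches, key=lambda host: cache_affinity(job_inputs, caches[host]))

        # Hypothetical cache contents per host and one job's inputs (file -> size in GB).
        caches = {
            "worker01": {"ntuple_A.root", "ntuple_B.root"},
            "worker02": {"ntuple_C.root"},
        }
        job = {"ntuple_A.root": 500, "ntuple_B.root": 1200, "ntuple_C.root": 800}
        print(best_host(job, caches))   # worker01 already holds 1700 of 2500 GB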

    Federation of compute resources available to the German CMS community

    The German CMS community (DCMS) as a whole can benefit from the various compute resources available to its different institutes. While Grid-enabled and National Analysis Facility resources are usually shared within the community, local and recently enabled opportunistic resources like HPC centers and cloud resources are not. Furthermore, there is no shared submission infrastructure available. Via HTCondor's [1] mechanisms to connect resource pools, several remote pools can be connected transparently to the users and therefore be used more efficiently by a multitude of user groups. In addition to the statically provisioned resources, dynamically allocated resources from external cloud providers as well as HPC centers can also be integrated. However, the usage of such dynamically allocated resources gives rise to additional complexity. Constraints on access policies of the resources, as well as workflow necessities, have to be taken care of. To maintain a well-defined and reliable runtime environment on each resource, virtualization and containerization technologies such as virtual machines, Docker, and Singularity are used.
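
    A short, hedged illustration of the last point, assuming a hypothetical container image name: the same payload command is wrapped in whichever container runtime a resource provides, so Grid, cloud, and HPC nodes expose a uniform runtime environment.

        import shutil

        IMAGE = "docker://example/cms-analysis:latest"   # hypothetical analysis image

        def wrap_in_container(command):
            """Wrap the payload in an available container runtime so jobs see
            the same environment on statically and dynamically provisioned nodes."""
            if shutil.which("singularity"):
                return ["singularity", "exec", IMAGE, *command]
            if shutil.which("docker"):
                return ["docker", "run", "--rm", IMAGE.removeprefix("docker://"), *command]
            return list(command)   # bare resource: rely on a pre-installed environment

        if __name__ == "__main__":
            print(wrap_in_container(["cmsRun", "analysis_cfg.py"]))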

    Mastering Opportunistic Computing Resources for HEP

    As a result of the excellent LHC performance in 2016, more data than expected have been recorded, leading to a higher demand for computing resources. It is already foreseeable that for the current and upcoming run periods a flat computing budget and the expected technology advances will not be sufficient to meet the future requirements. This results in a growing gap between supplied and demanded resources. One option to reduce the emerging lack of computing resources is the utilization of opportunistic resources such as local university clusters, public and commercial cloud providers, HPC centers, and volunteer computing. However, to use opportunistic resources, additional challenges have to be tackled. At the Karlsruhe Institute of Technology (KIT), an infrastructure to dynamically use opportunistic resources is being built up. In this paper, tools, experiences, future plans, and possible improvements are discussed.

    HEP Analyses on Dynamically Allocated Opportunistic Computing Resources

    The current experiments in high energy physics (HEP) have huge data rates. To process the measured data, an enormous number of computing resources is needed, and the demand will further increase with upgraded and newer experiments. To fulfill this ever-growing demand, the allocation of additional, potentially only temporarily available, non-HEP-dedicated resources is important. These so-called opportunistic resources can not only be used for analyses in general but are also well suited to cover the typical unpredictable peak demands for computing resources. For both use cases, the temporary availability of the opportunistic resources requires dynamic allocation, integration, and management, while their heterogeneity requires optimization to maintain high resource utilization by allocating the best-matching resources. Finding the best-matching resources to allocate is challenging due to the unpredictable submission behavior as well as an ever-changing mixture of workflows with different requirements. Instead of predicting the best-matching resource, we base our decisions on the utilization of resources. For this reason, we are developing the resource manager TARDIS (Transparent Adaptive Resource Dynamic Integration System), which manages and dynamically requests or releases resources. The decision of how many resources TARDIS has to request is implemented in COBalD (the Opportunistic Balancing Daemon) to ensure the further allocation of well-used resources while reducing the amount of insufficiently used ones. TARDIS allocates and manages resources from various resource providers such as HPC centers or commercial and public clouds while ensuring dynamic allocation and efficient utilization of these heterogeneous opportunistic resources. Furthermore, TARDIS integrates the allocated opportunistic resources into one overlay batch system, which provides a single point of entry for all users. In order to provide the dedicated HEP software environment, we use virtualization and container technologies. In this contribution, we give an overview of the dynamic integration of opportunistic resources via TARDIS/COBalD at our HEP institute as well as how user analyses benefit from these additional resources.
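
    The utilization-driven feedback loop described above can be illustrated with a toy controller. This is not COBalD's actual API; the function name, thresholds, and numbers below are assumptions made purely for illustration: demand grows while the already allocated resources are well used and shrinks when allocation stays low.

        def adjust_demand(demand, utilisation, allocation,
                          low=0.5, high=0.9, step=1.1):
            """Toy utilisation-driven controller in the spirit of COBalD:
            request more resources while current ones are well used,
            release resources that remain insufficiently used."""
            if utilisation > high:      # resources are busy: scale demand up
                return demand * step
            if allocation < low:        # resources sit idle: scale demand down
                return demand / step
            return demand               # in between: keep the current demand

        demand = 100.0   # e.g. cores requested from an HPC center or cloud provider
        for util, alloc in [(0.95, 0.90), (0.96, 0.92), (0.40, 0.30)]:
            demand = adjust_demand(demand, util, alloc)
            print(round(demand, 1))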

    Exploring the Anticancer Activity of Tamoxifen-Based Metal Complexes Targeting Mitochondria

    Two new 'hybrid' metallodrugs of Au(III) (AuTAML) and Cu(II) (CuTAML) were designed featuring a tamoxifen-derived pharmacophore to ideally synergize the anticancer activity of both the metal center and the organic ligand. The compounds have antiproliferative effects against human MCF-7 and MDA-MB-231 breast cancer cells. Molecular dynamics studies suggest that the compounds retain the binding activity to the estrogen receptor (ERα). In vitro and in silico studies showed that the Au(III) derivative is an inhibitor of the seleno-enzyme thioredoxin reductase, while the Cu(II) complex may act as an oxidant of different intracellular thiols. In breast cancer cells treated with the compounds, a redox imbalance characterized by a decrease in total thiols and increased reactive oxygen species production was detected. Despite their different reactivities and cytotoxic potencies, a great capacity of the metal complexes to induce mitochondrial damage was observed, as shown by their effects on mitochondrial respiration, membrane potential, and morphology.

    A Prospective Pilot Study to Identify a Myocarditis Cohort who may Safely Resume Sports Activities 3 Months after Diagnosis

    International cardiovascular society recommendations on the return to sports activities following acute myocarditis are based on expert consensus in the absence of prospective studies. We prospectively enrolled 30 patients with newly diagnosed myocarditis, based on clinical parameters, laboratory measurements, and cardiac magnetic resonance imaging, with mildly reduced or preserved left ventricular ejection fraction (LVEF) and a follow-up of 12 months. Cessation of physical activity was recommended for 3 months. The average age was 35 (19–80) years, with 73% male patients. One case of non-sustained ventricular tachycardia was recorded during a 48-hour Holter electrocardiogram. Except for this case, all patients were allowed to resume physical exercise after 3 months. At the 6-month (n = 26) and 12-month (n = 19) follow-up, neither cardiac events nor worsening LVEF were recorded. The risk of cardiac events at 1 year after diagnosis of myocarditis appears to be low after resumption of exercise at 3 months among patients who recover from acute myocarditis.

    Myocarditis following COVID-19 vaccine: incidence, presentation, diagnosis, pathophysiology, therapy, and outcomes put into perspective. A clinical consensus document supported by the Heart Failure Association of the European Society of Cardiology (ESC) and the ESC Working Group on Myocardial and Pericardial Diseases

    Over 10 million doses of COVID-19 vaccines based on RNA technology, viral vectors, recombinant protein, and inactivated virus have been administered worldwide. Although generally very safe, post-vaccine myocarditis can result from adaptive humoral and cellular, cardiac-specific inflammation within days and weeks of vaccination. Rates of vaccine-associated myocarditis vary by age and sex, with the highest rates in males between 12 and 39 years. The clinical course is generally mild, with rare cases of left ventricular dysfunction, heart failure, and arrhythmias. Mild cases are likely underdiagnosed, as cardiac magnetic resonance imaging (CMR) is not commonly performed even in suspected cases and not at all in asymptomatic and mildly symptomatic patients. Hospitalization of symptomatic patients with electrocardiographic changes and increased plasma troponin levels is considered necessary in the acute phase to monitor for arrhythmias and potential decline in left ventricular function. In addition to the evaluation of symptoms, electrocardiographic changes, and elevated troponin levels, CMR is the best non-invasive diagnostic tool, with endomyocardial biopsy being restricted to severe cases with heart failure and/or arrhythmias. The management, beyond guideline-directed treatment of heart failure and arrhythmias, includes non-specific measures to control pain. Anti-inflammatory drugs such as non-steroidal anti-inflammatory drugs and corticosteroids have been used in more severe cases, with only anecdotal evidence for their effectiveness. In all age groups studied, the overall risks of SARS-CoV-2 infection-related hospitalization and death are far greater than the risks from post-vaccine myocarditis. This consensus statement serves as a practical resource for physicians in their clinical practice to understand, diagnose, and manage affected patients. Furthermore, it is intended to stimulate research in this area.