73 research outputs found

    StochKit-FF: Efficient Systems Biology on Multicore Architectures

    Full text link
    The stochastic modelling of biological systems is an informative and, in some cases, highly appropriate technique, which may, however, be more computationally expensive than other modelling approaches, such as differential equations. We present StochKit-FF, a parallel version of StochKit, a reference toolkit for stochastic simulations. StochKit-FF is based on the FastFlow programming toolkit for multicores and exploits the novel concept of selective memory. We evaluate StochKit-FF on a model of HIV infection dynamics, with the aim of extracting information from efficiently run experiments, here in terms of average and variance and, in the longer term, of more structured data. Comment: 14 pages + cover page
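
    The "selective memory" idea can be pictured with a minimal, hedged C++ sketch (this is not the StochKit-FF or FastFlow API; the toy decay model, seeding and names are illustrative assumptions): each parallel replica simulates one stochastic trajectory, and only the running sums needed for the average and variance are retained, never the full set of sample paths.

        // Minimal sketch of "selective memory": parallel replicas, but only
        // mean/variance accumulators survive. Not StochKit-FF/FastFlow code.
        #include <algorithm>
        #include <iostream>
        #include <mutex>
        #include <random>
        #include <thread>
        #include <vector>

        int main() {
            const int replicas = 8, steps = 100;
            std::vector<double> sum(steps, 0.0), sumsq(steps, 0.0);
            std::mutex merge_mtx;
            std::vector<std::thread> workers;
            for (int r = 0; r < replicas; ++r) {
                workers.emplace_back([&, r] {
                    std::mt19937 rng(r);                      // per-replica seed
                    std::poisson_distribution<int> events(2); // toy event count
                    std::vector<double> traj(steps);
                    double x = 1000.0;                        // toy population
                    for (int t = 0; t < steps; ++t) {
                        x = std::max(0.0, x - events(rng));   // toy stochastic decay
                        traj[t] = x;
                    }
                    std::lock_guard<std::mutex> lock(merge_mtx);
                    for (int t = 0; t < steps; ++t) {         // merge, then drop traj
                        sum[t] += traj[t];
                        sumsq[t] += traj[t] * traj[t];
                    }
                });
            }
            for (auto& w : workers) w.join();
            for (int t = 0; t < steps; t += 25) {             // report mean/variance only
                double mean = sum[t] / replicas;
                std::cout << "t=" << t << " mean=" << mean
                          << " var=" << (sumsq[t] / replicas - mean * mean) << '\n';
            }
        }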

    Compression and strength behaviour of viscose/polypropylene nonwoven fabrics

    Get PDF
    Compression and strength properties of viscose/polypropylene nonwoven fabrics have been studied. Compression behaviour of the nonwoven samples (sample compressibility, sample thickness loss and sample compressive resilience) has been analysed considering the magnitude of applied pressure, fabric weight, fabric thickness, and the porosity of the samples. Based on the calculated porosity of the samples, pore compression behaviour (pore compressibility, porosity loss and pore compressive resilience) is determined. Equations for the determination of pore compressibility, porosity loss, and pore compressive resilience are established. Tensile strength and elongation, as well as bursting strength and ball traverse elongation, are also determined. The results show that the sample compression behaviour as well as the pore compression behaviour depend on the magnitude of applied pressure. At a high level of applied pressure, a sample with higher compressibility has lower sample compressive resilience. Differences in pore compressibility and porosity loss between the investigated samples have also been registered, but not in pore compressive resilience. A sample with higher fabric weight, higher thickness, and lower porosity shows lower sample compressibility, pore compressibility, sample thickness loss, porosity loss, and tensile elongation, but higher tensile strength, bursting strength, and ball traverse elongation.
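
    For orientation, one common textbook formulation of the quantities named above (standard textile relations, not necessarily the exact equations established in the paper; w is areal weight, t thickness, \rho_f fibre density, and t_0, t_p, t_r the initial, compressed and recovered thicknesses):

        \phi = 1 - \frac{w}{t\,\rho_f}, \qquad C = \frac{t_0 - t_p}{t_0} \times 100\%, \qquad R = \frac{t_r - t_p}{t_0 - t_p} \times 100\%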

    Quality of clothing fabrics in terms of their comfort properties

    Get PDF
    The quality of various clothing woven fabrics with respect to their comfort properties, such as electro-physical properties, air permeability, and compression properties, has been studied. The fabrics are produced from cotton and cotton/polyester fibre blends in plain, twill, satin and basket weaves. Results show that cotton fabrics have lower values of volume resistivity, air permeability and compressive resilience, but higher values of effective relative dielectric permeability and compressibility, compared to fabrics produced from cotton/PES fibre blends. Regression analysis shows a strong linear correlative relationship between the air permeability and the porosity of the woven fabrics, with a very high coefficient of linear correlation (0.9807). It is also observed that comfort properties are determined by the structure of the woven fabrics (raw material composition, type of weave) as well as by the fabrics' surface condition. The findings of the study have been used for estimating the quality of the woven fabrics in terms of their comfort properties by application of a ranking method. It is concluded that the group of cotton fabrics exhibits better comfort quality than the group of cotton/PES blend fabrics.
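
    As a one-line check on the strength of that correlation (an illustrative back-calculation, not a figure quoted in the abstract), the coefficient of determination is

        r^2 = 0.9807^2 \approx 0.962,

    i.e. roughly 96% of the variance in air permeability is accounted for by porosity under the fitted linear relationship.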

    Analysing Astronomy Algorithms for GPUs and Beyond

    Full text link
    Astronomy depends on ever-increasing computing power. Processor clock rates have plateaued, and increased performance now appears in the form of additional processor cores on a single chip. This poses significant challenges to the astronomy software community. Graphics Processing Units (GPUs), now capable of general-purpose computation, exemplify both the difficult learning curve and the significant speedups exhibited by massively parallel hardware architectures. We present a generalised approach to tackling this paradigm shift, based on the analysis of algorithms. We describe a small collection of foundation algorithms relevant to astronomy and explain how they may be used to ease the transition to massively parallel computing architectures. We demonstrate the effectiveness of our approach by applying it to four well-known astronomy problems: Högbom CLEAN, inverse ray-shooting for gravitational lensing, pulsar dedispersion and volume rendering. Algorithms with well-defined memory access patterns and high arithmetic intensity stand to receive the greatest performance boost from massively parallel architectures, while those that involve a significant amount of decision-making may struggle to take advantage of the available processing power. Comment: 10 pages, 3 figures, accepted for publication in MNRAS
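
    The closing claim can be read through the standard roofline model (a gloss on the abstract, not a formula it states). Writing W for the floating-point work of a kernel and Q for the bytes it moves to and from memory, the arithmetic intensity and the attainable throughput on a device with peak rate P_peak and memory bandwidth B are

        I = \frac{W}{Q}, \qquad P \approx \min\left(P_{\text{peak}},\; I \cdot B\right).

    Kernels with high I are compute-bound and can approach P_peak, while low-intensity or branch-heavy kernels are capped by I \cdot B, which is why memory-regular, arithmetic-dense algorithms benefit most.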

    Refactoring GrPPI: Generic Refactoring for Generic Parallelism in C++

    Get PDF
    Funding: EU Horizon 2020 project, TeamPlay (https://www.teamplay-xh2020.eu), Grant Number 779882; UK EPSRC Discovery, Grant Number EP/P020631/1; and Madrid Regional Government, CABAHLA-CM (ConvergenciA Big dAta-Hpc: de Los sensores a las Aplicaciones), Grant Number S2018/TCS-4423. The Generic Reusable Parallel Pattern Interface (GrPPI) is a useful abstraction over different parallel pattern libraries, allowing the programmer to write generic patterned parallel code that can easily be compiled to different backends such as FastFlow, OpenMP, Intel TBB and C++ threads. However, rewriting legacy code to use GrPPI still involves code transformations that can be highly non-trivial, especially for programmers who are not experts in parallelism. This paper describes software refactorings that semi-automatically introduce instances of GrPPI patterns into sequential C++ code, as well as safety-checking static analysis mechanisms that verify that introducing patterns into the code does not introduce concurrency-related bugs such as race conditions. We demonstrate the refactorings and safety-checking mechanisms on four simple benchmark applications, showing that we are able to obtain, with little effort, GrPPI-based parallel versions that achieve good speedups (comparable to those of manually produced parallel versions) using different pattern backends.
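
    A hedged sketch of the kind of transformation such refactorings automate: a sequential loop rewritten as a GrPPI map that can be retargeted by swapping the execution policy. The include path and the exact grppi::map and execution-policy signatures vary across GrPPI versions, so treat them as assumptions rather than a definitive usage.

        #include <cstddef>
        #include <vector>
        #include "grppi/grppi.h"   // assumed umbrella header; adjust to your install

        // Before: sequential C++ loop.
        std::vector<double> scale_seq(const std::vector<double>& in) {
            std::vector<double> out(in.size());
            for (std::size_t i = 0; i < in.size(); ++i)
                out[i] = 2.0 * in[i];
            return out;
        }

        // After: the same computation as a patterned map; swapping the policy
        // (e.g. parallel_execution_omp, _tbb, _ff) retargets the backend.
        std::vector<double> scale_par(const std::vector<double>& in) {
            grppi::parallel_execution_native ex;
            std::vector<double> out(in.size());
            grppi::map(ex, in.begin(), in.end(), out.begin(),
                       [](double x) { return 2.0 * x; });
            return out;
        }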

    SignS: a parallelized, open-source, freely available, web-based tool for gene selection and molecular signatures for survival and censored data

    Get PDF
    Background: Censored data are increasingly common in many microarray studies that attempt to relate gene expression to patient survival. Several new methods have been proposed in the last two years. Most of these methods, however, are not available to biomedical researchers, leading to many re-implementations from scratch of ad hoc, and suboptimal, approaches to survival data. Results: We have developed SignS (Signatures for Survival data), an open-source, freely available, web-based tool and R package for gene selection, building molecular signatures, and prediction with survival data. SignS implements four methods which, according to existing reviews, perform well and, being of a very different nature, offer complementary approaches. We use parallel computing via MPI, leading to large decreases in user waiting time. Cross-validation is used to assess predictive performance and the stability of solutions, the latter an issue of increasing concern given that there are often several solutions with similar predictive performance. Biological interpretation of results is enhanced because genes and signatures in models can be sent to other freely available online tools for examination of PubMed references, GO terms, and KEGG and Reactome pathways of selected genes. Conclusion: SignS is the first web-based tool for survival analysis of expression data, and one of the very few with biomedical researchers as target users. SignS is also one of the few bioinformatics web-based applications to extensively use parallelization, including fault tolerance and crash recovery. Because of its combination of implemented methods, use of parallel computing, code availability, and links to additional databases, SignS is a unique tool, and will be of immediate relevance to biomedical researchers, biostatisticians and bioinformaticians.
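
    A hedged MPI sketch (not SignS code, which is an R/MPI web application) of the general pattern behind the reported drop in user waiting time: independent cross-validation folds distributed round-robin over ranks, with results combined by a reduction. The fold_error() function is a hypothetical stand-in for fitting and scoring one survival model.

        #include <mpi.h>
        #include <cstdio>

        static double fold_error(int fold) {
            return 0.1 * (fold % 5);   // placeholder for a real fit/score on one fold
        }

        int main(int argc, char** argv) {
            MPI_Init(&argc, &argv);
            int rank, size;
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);
            MPI_Comm_size(MPI_COMM_WORLD, &size);

            const int folds = 10;
            double local = 0.0;
            for (int f = rank; f < folds; f += size)   // round-robin fold assignment
                local += fold_error(f);

            double total = 0.0;
            MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
            if (rank == 0)
                std::printf("mean CV error: %.3f\n", total / folds);

            MPI_Finalize();
        }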

    Impact of renal impairment on atrial fibrillation: ESC-EHRA EORP-AF Long-Term General Registry

    Get PDF
    Background: Atrial fibrillation (AF) and renal impairment share a bidirectional relationship with important pathophysiological interactions. We evaluated the impact of renal impairment in a contemporary cohort of patients with AF. Methods: We utilised the ESC-EHRA EORP-AF Long-Term General Registry. Outcomes were analysed according to renal function by the CKD-EPI equation. The primary endpoint was a composite of thromboembolism, major bleeding, acute coronary syndrome and all-cause death. Secondary endpoints were each of these separately, including ischaemic stroke, haemorrhagic event, intracranial haemorrhage, cardiovascular death and hospital admission. Results: A total of 9306 patients were included. The proportions of patients with no, mild, moderate and severe renal impairment at baseline were 16.9%, 49.3%, 30.0% and 3.8%, respectively. AF patients with impaired renal function were older, more likely to be female, and had worse cardiac imaging parameters and multiple comorbidities. Among patients with an indication for anticoagulation, prescription of these agents was reduced in those with severe renal impairment (p < .001). Over 24 months, impaired renal function was associated with a significantly greater incidence of the primary composite outcome and of all secondary outcomes. Multivariable Cox regression analysis demonstrated an inverse relationship between eGFR and the primary outcome (HR 1.07 [95% CI 1.01–1.14] per 10 ml/min/1.73 m² decrease), which was most notable in patients with eGFR <30 ml/min/1.73 m² (HR 2.21 [95% CI 1.23–3.99] compared to eGFR ≥90 ml/min/1.73 m²). Conclusion: A significant proportion of patients with AF suffer from concomitant renal impairment, which impacts their overall management. Furthermore, renal impairment is an independent predictor of major adverse events, including thromboembolism, major bleeding, acute coronary syndrome and all-cause death, in patients with AF.
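
    To read the per-decrement hazard ratio: under the log-linearity assumed by the fitted Cox model, the HR compounds multiplicatively, so, as an illustration rather than a result reported above, a 60 ml/min/1.73 m² lower eGFR corresponds to

        \text{HR} = 1.07^{6} \approx 1.50.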

    Clinical complexity and impact of the ABC (Atrial fibrillation Better Care) pathway in patients with atrial fibrillation: a report from the ESC-EHRA EURObservational Research Programme in AF General Long-Term Registry

    Get PDF
    Background: Clinical complexity is increasingly prevalent among patients with atrial fibrillation (AF). The ‘Atrial fibrillation Better Care’ (ABC) pathway approach has been proposed to streamline a more holistic and integrated approach to AF care; however, there are limited data on its usefulness among clinically complex patients. We aimed to determine the impact of the ABC pathway in a contemporary cohort of clinically complex AF patients. Methods: From the ESC-EHRA EORP-AF General Long-Term Registry, we analysed clinically complex AF patients, defined by the presence of frailty, multimorbidity and/or polypharmacy. A K-medoids cluster analysis was performed to identify different groups of clinical complexity. The impact of an ABC-adherent approach on major outcomes was analysed through Cox regression analyses and delay of event (DoE) analyses. Results: Among 9966 AF patients included, 8289 (83.1%) were clinically complex. Adherence to the ABC pathway in the clinically complex group reduced the risk of all-cause death (adjusted HR [aHR] 0.72, 95% CI 0.58–0.91), major adverse cardiovascular events (MACEs; aHR 0.68, 95% CI 0.52–0.87) and the composite outcome (aHR 0.70, 95% CI 0.58–0.85). Adherence to the ABC pathway was also associated with a significant reduction in the risk of death (aHR 0.74, 95% CI 0.56–0.98) and the composite outcome (aHR 0.76, 95% CI 0.60–0.96) in the high-complexity cluster; similar trends were observed for MACEs. In DoE analyses, an ABC-adherent approach resulted in significant gains in event-free survival for all the outcomes investigated in clinically complex patients. Based on the absolute risk reduction at 1 year of follow-up, the number needed to treat for ABC pathway adherence was 24 for all-cause death, 31 for MACEs and 20 for the composite outcome. Conclusions: An ABC-adherent approach reduces the risk of major outcomes in clinically complex AF patients. Ensuring adherence to the ABC pathway is essential to improve clinical outcomes among clinically complex AF patients.
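
    The numbers needed to treat quoted above are the reciprocals of the absolute risk reductions at 1 year, so they imply (an illustrative back-calculation, not figures stated in the abstract):

        \text{ARR} = \frac{1}{\text{NNT}}: \qquad \frac{1}{24} \approx 4.2\%\ \text{(death)}, \qquad \frac{1}{31} \approx 3.2\%\ \text{(MACEs)}, \qquad \frac{1}{20} = 5.0\%\ \text{(composite)}.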

    Impact of clinical phenotypes on management and outcomes in European atrial fibrillation patients: a report from the ESC-EHRA EURObservational Research Programme in AF (EORP-AF) General Long-Term Registry

    Get PDF
    Background: Epidemiological studies in atrial fibrillation (AF) illustrate that clinical complexity increases the risk of major adverse outcomes. We aimed to describe European AF patients’ clinical phenotypes and analyse the differential clinical course. Methods: We performed a hierarchical cluster analysis based on Ward’s method and squared Euclidean distance using 22 clinical binary variables, identifying the optimal number of clusters. We investigated differences in clinical management, use of healthcare resources and outcomes in a cohort of European AF patients from a Europe-wide observational registry. Results: A total of 9363 patients were available for this analysis. We identified three clusters: Cluster 1 (n = 3634; 38.8%), characterized by older patients and prevalent non-cardiac comorbidities; Cluster 2 (n = 2774; 29.6%), characterized by younger patients with a low prevalence of comorbidities; and Cluster 3 (n = 2955; 31.6%), characterized by prevalent cardiovascular risk factors/comorbidities. Over a mean follow-up of 22.5 months, Cluster 3 had the highest rates of cardiovascular events, all-cause death, and the composite outcome (combining the previous two) compared to Cluster 1 and Cluster 2 (all P < .001). Adjusted Cox regression showed that, compared to Cluster 2, Cluster 3 (hazard ratio (HR) 2.87, 95% confidence interval (CI) 2.27–3.62; HR 3.42, 95% CI 2.72–4.31; HR 2.79, 95% CI 2.32–3.35) and Cluster 1 (HR 1.88, 95% CI 1.48–2.38; HR 2.50, 95% CI 1.98–3.15; HR 2.09, 95% CI 1.74–2.51) showed a higher risk for the three outcomes, respectively. Conclusions: In European AF patients, three main clusters were identified, differentiated by the presence of comorbidities. Both the non-cardiac and the cardiac comorbidity clusters were associated with an increased risk of major adverse outcomes.
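
    For reference, Ward's method with squared Euclidean distance merges, at each step, the pair of clusters A, B whose union least increases the within-cluster sum of squares; with centroids \bar{x}_A, \bar{x}_B, the merge cost is

        \Delta(A, B) = \frac{|A|\,|B|}{|A| + |B|}\, \lVert \bar{x}_A - \bar{x}_B \rVert^2.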

    Application instrumentation for performance analysis and tuning with focus on energy efficiency

    No full text
    Profiling and tuning of parallel applications is an essential part of HPC. Analysis and elimination of application hot spots can be performed using many available tools, which also provide resource consumption measurements for instrumented parts of the code. Since complex applications behave differently in each part of the code, it is essential to be able to insert instrumentation to analyse these parts separately. Because each performance analysis or autotuning tool can bring different insights into an application's behaviour, it is valuable to analyse and optimise an application using a variety of them. We present our shared C/C++ API, inserted on request, for the most common open-source HPC performance analysis tools, which simplifies the process of manual instrumentation. Besides manual instrumentation, profiling libraries provide different methods of instrumentation. Of these, binary patching is the most universal mechanism, and it greatly improves the user-friendliness and robustness of a tool. We provide an overview of the most commonly used binary patching tools and describe a workflow for using them to implement a binary instrumentation tool for any profiler or autotuner. We have also evaluated the minimum overhead of manual and binary instrumentation.
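
    A hedged, self-contained C++ sketch of the region-based shared-API idea described above (hypothetical names, not the paper's actual library): in a real build, region_begin/region_end would dispatch to the selected profiler or autotuner backend via preprocessor flags; here a plain std::chrono timer stands in so the sketch runs on its own.

        #include <chrono>
        #include <cstdio>
        #include <string>
        #include <unordered_map>

        namespace instr {
        using clock = std::chrono::steady_clock;
        static std::unordered_map<std::string, clock::time_point> live;

        void region_begin(const std::string& name) {
            live[name] = clock::now();            // a backend hook would go here
        }

        void region_end(const std::string& name) {
            auto it = live.find(name);
            if (it == live.end()) return;         // unmatched end: ignore
            auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                          clock::now() - it->second).count();
            std::printf("[instr] %s: %lld us\n", name.c_str(),
                        static_cast<long long>(us));
            live.erase(it);
        }
        } // namespace instr

        int main() {
            instr::region_begin("hot_loop");
            double acc = 0.0;
            for (int i = 0; i < 1000000; ++i) acc = acc + i * 0.5;  // instrumented hot spot
            instr::region_end("hot_loop");
            std::printf("acc=%f\n", acc);         // keep the loop from being optimised away
        }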