69 research outputs found

    A proposed quantitative methodology for the evaluation of the effectiveness of Human Element, Leadership and Management (HELM) training in the UK

    In 2006, a review of maritime accidents found that non-technical skills (NTSs) were the single largest contributing factor in such incidents. NTSs comprise both interpersonal and cognitive elements, including situational awareness, teamwork, decision making, leadership, management and communication skills. In a crisis, good NTSs allow a deck officer to quickly recognise that a problem exists and then harness the resources at their disposal to bring the situation back under control safely and efficiently. This paper has two aims. The first is to develop a methodology that enables educators to quantitatively assess the impact of Maritime and Coastguard Agency (MCA)-approved Human Element, Leadership and Management (HELM) training on deck officers' NTSs, with a view to identifying further training requirements. The second is to determine whether the HELM training provided to develop the NTSs of trainee deck officers is fit for purpose. To achieve these aims, a three-phase approach was adopted. First, a taxonomy of deck officers' NTSs was established, behavioural markers were identified, and the relative importance of each attribute was calculated using the analytic hierarchy process (AHP). Next, a set of scenarios was identified for assessing deck officers' NTSs in a ship bridge simulator environment. A random sample of students who had completed the Chief Mate (CM) programme was drawn, and data on their NTS-related performance in the scenarios were collected. Finally, the collected data were fed into the evidential reasoning (ER) algorithm to produce utility values, from which the effectiveness of the HELM training the students had received was evaluated.
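    The AHP weighting step described in this abstract can be sketched as follows. This is a minimal illustration, not the study's actual analysis: the three attributes and the Saaty-scale pairwise judgements are hypothetical placeholders.

```python
# Sketch of AHP attribute weighting: derive relative-importance weights
# from a pairwise-comparison matrix via power iteration, which converges
# to the (normalized) principal eigenvector of a positive matrix.

def ahp_weights(matrix, iters=200):
    """Return AHP priority weights (normalized principal eigenvector)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(v)
        w = [x / total for x in v]
    return w

# Hypothetical Saaty-scale judgements for three NTS attributes
# (situational awareness, teamwork, decision making) -- not study data.
pairwise = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)
```

    In a full AHP application one would also check the consistency ratio of the judgement matrix before accepting the weights.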

    Crystalline phases involved in the hydration of calcium silicate-based cements: Semi-quantitative Rietveld X-ray diffraction analysis

    Chemical comparison of the powder and hydrated forms of calcium silicate cements (CSCs), and calculation of the changes in tricalcium silicate (Ca3SiO5) and calcium hydroxide (Ca(OH)2), are essential for understanding their hydration processes. This study aimed to evaluate and compare these changes in ProRoot MTA, Biodentine and CEM cement. Powder and hydrated forms of tooth-coloured ProRoot MTA, Biodentine and CEM cement were subjected to X-ray diffraction (XRD) analysis with Rietveld refinement to semi-quantitatively identify and quantify the main phases involved in their hydration process. Data were reported descriptively. A reduction in Ca3SiO5 and formation of Ca(OH)2 were seen after hydration of ProRoot MTA and Biodentine; in the case of CEM cement, however, no reduction in Ca3SiO5 and no formation of Ca(OH)2 were detected. The highest percentages of amorphous phases were seen in the Biodentine samples. Ettringite was detected in the hydrated forms of ProRoot MTA and CEM cement but not in Biodentine.
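    Semi-quantitative Rietveld phase analysis of the kind used here commonly converts refined scale factors to weight fractions with the Hill-Howard relation. The sketch below is illustrative only: the phase names are from the abstract, but the numerical values are hypothetical, not the study's refined parameters.

```python
# Hill-Howard relation for Rietveld quantitative phase analysis:
# the weight fraction of phase i is proportional to S*Z*M*V, where
# S = refined scale factor, Z = formula units per unit cell,
# M = formula mass, V = unit-cell volume.

def rietveld_weight_fractions(phases):
    """phases: {name: (S, Z, M, V)} -> {name: weight fraction}."""
    szmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
    total = sum(szmv.values())
    return {name: val / total for name, val in szmv.items()}

# Hypothetical refined values for a two-phase mixture (not study data)
fractions = rietveld_weight_fractions({
    "Ca3SiO5": (1.2e-5, 9, 228.3, 2190.0),
    "Ca(OH)2": (4.0e-5, 1, 74.1, 54.5),
})
```

    The amorphous content reported in the study would require an additional internal- or external-standard step, since the relation above only distributes the crystalline fraction.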

    The Coevolution of Virulence: Tolerance in Perspective

    Coevolutionary interactions, such as those between host and parasite, predator and prey, or plant and pollinator, evolve subject to the genes of both interactors. It is clear, for example, that the evolution of pollination strategies can only be understood with knowledge of both the pollinator and the pollinated. Studies of the evolution of virulence, the reduction in host fitness due to infection, have nonetheless tended to focus on parasite evolution. Host-centric approaches have also been proposed, for example under the rubric of “tolerance”, the ability of hosts to minimize virulence without necessarily minimizing parasite density. Within the tolerance framework, however, there is room for more comprehensive measures of host fitness traits, and for fuller consideration of the consequences of coevolution. For example, the evolution of tolerance can change selection on parasite populations, which should provoke parasite evolution even though tolerance is not directly antagonistic to parasite fitness. As a result, considering the potential for parasite counter-adaptation to host tolerance, whether evolved or medically manipulated, is essential for a cohesive theory of biotic partnerships and for robust disease control strategies.

    Determinants of costs and the length of stay in acute coronary syndromes: a real-life analysis of more than 10 000 patients

    Aims: The aim of this study was to investigate inpatient costs of acute coronary syndromes (ACS) in Switzerland and to assess the main cost drivers associated with this disease. Methods and Results: We used the national multicenter registry AMIS (Acute Myocardial Infarction in Switzerland), which includes a representative set of 65 hospitals and a total of 11,623 patient records. The following cost modules were analyzed: hospital stay, percutaneous coronary interventions (PCI) and thrombolysis. Expenses were assessed using data from official Swiss national statistical sources. Mean total costs per patient were 12,101 Euro (median 10,929 Euro; 95% CI: 1,161–27,722 Euro). The length of stay ranged from one to 129 days, with a mean of 9.5 days (median 8.0 days; 95% CI: 1–23). Overall costs were independently influenced by age, gender and existing co-morbidities, e.g. cerebrovascular disease and diabetes (p < 0.0001). Conclusion: Our study determined specific causes of the high costs associated with hospital treatment in a large representative sample. The results should highlight unnecessary expenses and help policy makers evaluate the base case for a DRG (Diagnosis Related Groups) scenario in Switzerland. Cost weighting of the identified secondary diagnoses should be considered in the calculation and coding of a primary diagnosis for ACS.

    Multifractal and entropy analysis of resting-state electroencephalography reveals spatial organization in local dynamic functional connectivity

    Functional connectivity of the brain fluctuates even in the resting state. It has recently been reported that fluctuations of global functional network topology, and of individual connections between brain regions, express multifractal scaling. To expand on these findings, in this study we investigated whether multifractality is an inherent property of dynamic functional connectivity (DFC) at the regional level as well. Furthermore, we explored whether local DFC shows region-specific differences in its multifractal and entropy-related features. DFC analyses were performed on 62-channel, resting-state electroencephalography recordings of twelve young, healthy subjects. Surrogate data testing verified the true multifractal nature of regional DFC, which could be attributed to the presumed nonlinear nature of the underlying processes. Moreover, we found a characteristic spatial distribution of local connectivity dynamics: frontal and occipital regions showed stronger long-range correlation and a higher degree of multifractality, whereas the highest entropy values were found over the central and temporal regions. The revealed topology reflected the underlying resting-state network organization of the brain well. The presented results and the proposed analysis framework could improve our understanding of how resting-state brain activity is spatio-temporally organized and may provide potential biomarkers for future clinical research.
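    The long-range correlation and multifractal measures mentioned here build on detrended fluctuation analysis. Below is a minimal monofractal DFA sketch under simplifying assumptions: the study applies the multifractal generalization (MFDFA, repeating the fluctuation computation over a range of moments), and the input here is simulated white noise standing in for an EEG-derived connectivity signal.

```python
import math
import random

def dfa_fluctuations(signal, scales):
    """Monofractal DFA: integrate the mean-centred signal, linearly
    detrend it in non-overlapping windows of each scale n, and return
    the RMS fluctuation F(n) per scale."""
    mean = sum(signal) / len(signal)
    profile, acc = [], 0.0
    for x in signal:
        acc += x - mean
        profile.append(acc)          # cumulative-sum "profile"
    fluct = []
    for n in scales:
        resid_sq = []
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            # closed-form least-squares line fit within the window
            tm, ym = (n - 1) / 2.0, sum(seg) / n
            beta = sum((t - tm) * (y - ym) for t, y in enumerate(seg)) \
                 / sum((t - tm) ** 2 for t in range(n))
            alpha = ym - beta * tm
            resid_sq.extend((y - (alpha + beta * t)) ** 2
                            for t, y in enumerate(seg))
        fluct.append(math.sqrt(sum(resid_sq) / len(resid_sq)))
    return fluct

# Illustrative input: white noise, not actual EEG data
random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
scales = [8, 16, 32, 64, 128]
fn = dfa_fluctuations(noise, scales)
# For uncorrelated noise, F(n) grows roughly as n**0.5 (Hurst H near 0.5);
# stronger long-range correlation would push this scaling exponent higher.
hurst = (math.log(fn[-1]) - math.log(fn[0])) \
      / (math.log(scales[-1]) - math.log(scales[0]))
```

    A proper exponent estimate would fit the whole log-log curve rather than its endpoints; the endpoint slope is used here only to keep the sketch short.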

    Impact of clinical phenotypes on management and outcomes in European atrial fibrillation patients: a report from the ESC-EHRA EURObservational Research Programme in AF (EORP-AF) General Long-Term Registry

    Background: Epidemiological studies in atrial fibrillation (AF) show that clinical complexity increases the risk of major adverse outcomes. We aimed to describe the clinical phenotypes of European AF patients and analyse their differential clinical course. Methods: We performed a hierarchical cluster analysis based on Ward's method and squared Euclidean distance using 22 binary clinical variables, identifying the optimal number of clusters. We investigated differences in clinical management, use of healthcare resources and outcomes in a cohort of European AF patients from a Europe-wide observational registry. Results: A total of 9363 patients were available for this analysis. We identified three clusters: Cluster 1 (n = 3634; 38.8%), characterized by older patients and prevalent non-cardiac comorbidities; Cluster 2 (n = 2774; 29.6%), characterized by younger patients with a low prevalence of comorbidities; and Cluster 3 (n = 2955; 31.6%), characterized by patients with prevalent cardiovascular risk factors/comorbidities. Over a mean follow-up of 22.5 months, Cluster 3 had the highest rates of cardiovascular events, all-cause death and the composite outcome (combining the previous two) compared with Clusters 1 and 2 (all P < .001). An adjusted Cox regression showed that, compared with Cluster 2, Cluster 3 (hazard ratio (HR) 2.87, 95% confidence interval (CI) 2.27–3.62; HR 3.42, 95% CI 2.72–4.31; HR 2.79, 95% CI 2.32–3.35) and Cluster 1 (HR 1.88, 95% CI 1.48–2.38; HR 2.50, 95% CI 1.98–3.15; HR 2.09, 95% CI 1.74–2.51) showed a higher risk for the three outcomes respectively. Conclusions: Three main clusters were identified in European AF patients, differentiated by the presence of comorbidities. Both the non-cardiac and the cardiac comorbidity clusters were associated with an increased risk of major adverse outcomes.
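    The clustering procedure named in the Methods (Ward's method with squared Euclidean distance) can be sketched in a few lines. This is a toy illustration: the 2-D points below stand in for patients' 22-dimensional binary feature vectors and are not registry data.

```python
def sq_euclidean(a, b):
    # squared Euclidean distance, the metric named in the Methods
    return sum((x - y) ** 2 for x, y in zip(a, b))

def ward_clusters(points, k):
    """Agglomerative clustering with Ward's minimum-variance criterion,
    updating merged-cluster distances with the Lance-Williams recurrence
    and stopping when k clusters remain."""
    members = {i: [i] for i in range(len(points))}
    size = {i: 1 for i in range(len(points))}
    dist = {(i, j): sq_euclidean(points[i], points[j])
            for i in range(len(points)) for j in range(i + 1, len(points))}
    while len(members) > k:
        a, b = min(dist, key=dist.get)        # closest pair of clusters
        d_ab = dist.pop((a, b))
        n_a, n_b = size.pop(a), size.pop(b)
        new = max(members) + 1                # fresh id for the merge
        members[new] = members.pop(a) + members.pop(b)
        size[new] = n_a + n_b
        for c in list(members):
            if c == new:
                continue
            d_ac = dist.pop((min(a, c), max(a, c)))
            d_bc = dist.pop((min(b, c), max(b, c)))
            # Lance-Williams update for Ward's criterion
            dist[(c, new)] = ((n_a + size[c]) * d_ac
                              + (n_b + size[c]) * d_bc
                              - size[c] * d_ab) / (n_a + n_b + size[c])
    return [sorted(m) for m in members.values()]

# Toy 2-D points forming two obvious groups (not patient data)
points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
clusters = ward_clusters(points, 2)
```

    In practice one would run this (for example via scipy.cluster.hierarchy) over the full dendrogram and choose the number of clusters from a criterion such as the one the authors used, rather than fixing k in advance.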

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
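    As a pointer to how associations like these are quantified, here is a minimal unadjusted odds-ratio calculation with a Wald 95% confidence interval. The counts are illustrative placeholders: the study's OR of 0.60 came from an adjusted multivariable model, not a raw 2x2 table.

```python
import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a/b = events/non-events with the exposure (checklist used),
    c/d = events/non-events without it."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    low = math.exp(math.log(or_) - z * se_log)
    high = math.exp(math.log(or_) + z * se_log)
    return or_, low, high

# Hypothetical counts: deaths/survivors with vs without checklist use
or_, low, high = odds_ratio_wald_ci(120, 1880, 190, 1810)
```

    A multivariable logistic regression, as used in the study, produces the same kind of OR but adjusted for patient and disease factors.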

    Global challenges for seagrass conservation

    Seagrasses, flowering marine plants that form underwater meadows, play a significant global role in supporting food security, mitigating climate change and supporting biodiversity. Although progress is being made to conserve seagrass meadows in select areas, most meadows remain under significant pressure, resulting in declining meadow condition and loss of function. Effective management strategies need to be implemented to reverse seagrass loss and enhance their fundamental role in coastal ocean habitats. Here we propose that seagrass meadows globally face a series of significant common challenges that must be addressed from a multifaceted and interdisciplinary perspective in order to achieve global conservation. The six main global challenges to seagrass conservation are: (1) a lack of awareness of what seagrasses are and limited societal recognition of the importance of seagrasses in coastal systems; (2) the status of many seagrass meadows is unknown, and up-to-date information on status and condition is essential; (3) understanding threatening activities at local scales is required to target management actions accordingly; (4) expanding our understanding of interactions between the socio-economic and ecological elements of seagrass systems is essential to balance the needs of people and the planet; (5) seagrass research should be expanded to generate scientific inquiries that support conservation actions; and (6) increased understanding of the linkages between seagrass and climate change is required to adapt conservation accordingly. We also explicitly outline a series of proposed policy actions that will enable the scientific and conservation community to rise to these challenges. We urge the seagrass conservation community to engage stakeholders, from local resource users to international policy-makers, to address the challenges outlined here, in order to secure the future of the world's seagrass ecosystems and maintain the vital services they supply.

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent respectively; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation of at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion The proportion of patients receiving end stomas after left-sided colorectal resection differed globally by country income, beyond what case mix alone explained.