Competing risks models and time-dependent covariates
New statistical models for analysing survival data in an intensive care unit context have recently been developed. Two models that offer significant advantages over standard survival analyses are competing risks models and multistate models. Wolkewitz and colleagues used a competing risks model to examine survival times for nosocomial pneumonia and mortality. Their model was able to incorporate time-dependent covariates and could therefore examine how risk factors that changed over time affected the chances of infection or death. We briefly explain how an alternative modelling technique, based on logistic regression, can more fully exploit time-dependent covariates for this type of data.
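The logistic-regression alternative mentioned above rests on expanding each patient's stay into a "person-period" dataset with one record per day at risk, so a covariate can take a different value on each day and an ordinary logistic model can then be fitted to the daily outcome. A minimal sketch of that expansion step follows; the field names and example patient records are hypothetical, not from the study data.

```python
def person_period_rows(patients):
    """Expand each ICU stay into one row per day at risk.

    Each patient dict holds: id, days (length of stay), event
    ('pneumonia', 'death', or None), and ventilated_from (the day
    mechanical ventilation started, or None) as an example
    time-dependent covariate.
    """
    rows = []
    for p in patients:
        for day in range(1, p["days"] + 1):
            rows.append({
                "id": p["id"],
                "day": day,
                # covariate value *on this day*, so it can change over time
                "ventilated": (p["ventilated_from"] is not None
                               and day >= p["ventilated_from"]),
                # the outcome is recorded only on the day the event occurs
                "event": p["event"] if day == p["days"] else None,
            })
    return rows

patients = [
    {"id": 1, "days": 3, "event": "pneumonia", "ventilated_from": 2},
    {"id": 2, "days": 2, "event": None, "ventilated_from": None},
]
rows = person_period_rows(patients)
# 5 daily rows in total; patient 1 counts as 'ventilated' on days 2-3
# and has the pneumonia outcome recorded only on day 3
```

A logistic regression fitted to these daily rows (outcome vs. day and covariates) approximates the cause-specific hazard while letting covariates vary day by day, which is the advantage the abstract alludes to.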
Health service pathways for patients with chronic leg ulcers: identifying effective pathways for facilitation of evidence based wound care
Background: Chronic leg ulcers cause long term ill-health for older adults and the condition places a significant burden on health service resources. Although evidence on effective management of the condition is available, a significant evidence-practice gap is known to exist, with many suggested reasons, e.g. multiple care providers and the costs of care and treatments. This study aimed to identify effective health service pathways of care which facilitated evidence-based management of chronic leg ulcers.
Methods: A sample of 70 patients presenting with a lower limb leg or foot ulcer at specialist wound clinics in Queensland, Australia were recruited for an observational study and survey. Retrospective data were collected on demographics, health, medical history, treatments, costs and health service pathways in the previous 12 months. Prospective data were collected on health service pathways, pain, functional ability, quality of life, treatments, wound healing and recurrence outcomes for 24 weeks from admission.
Results: Retrospective data indicated that evidence based guidelines were poorly implemented prior to admission to the study, e.g. only 31% of participants with a lower limb ulcer had an ankle-brachial pressure index (ABPI) or duplex assessment in the previous 12 months. On average, participants accessed care 2–3 times/week for 17 weeks from multiple health service providers in the twelve months before admission to the study clinics. Following admission to specialist wound clinics, participants accessed care on average once per week for 12 weeks from a smaller range of providers. The median ulcer duration on admission to the study was 22 weeks (range 2–728 weeks). Following admission to wound clinics, implementation of key indicators of evidence based care increased (p < 0.001) and Kaplan-Meier survival analysis found the median time to healing was 12 weeks (95% CI 9.3–14.7). Implementation of evidence based care was significantly related to improved healing outcomes (p < 0.001).
Conclusions: This study highlights the complexities involved in accessing expertise and evidence based wound care for adults with chronic leg or foot ulcers. Results demonstrate that access to wound management expertise can promote streamlined health services and evidence based wound care, leading to efficient use of health resources and improved health.
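The median time-to-healing reported in the results comes from a Kaplan-Meier estimate, which handles ulcers still unhealed at the end of follow-up (censored observations) rather than discarding them. A minimal pure-Python sketch of the estimator follows; the durations (in weeks) are made-up illustration data, not the study's, and `healed=0` marks a censored ulcer.

```python
def kaplan_meier(durations, healed):
    """Return [(t, S(t))] at each distinct healing time.

    durations: weeks of follow-up per ulcer; healed: 1 if the ulcer
    healed at that time, 0 if follow-up ended without healing (censored).
    """
    event_times = sorted({t for t, e in zip(durations, healed) if e})
    curve, s = [], 1.0
    for t in event_times:
        at_risk = sum(1 for d in durations if d >= t)      # still unhealed at t
        events = sum(1 for d, e in zip(durations, healed) if e and d == t)
        s *= 1 - events / at_risk                          # product-limit step
        curve.append((t, s))
    return curve

def median_survival(curve):
    """First time at which the survival curve drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached within follow-up

# toy data: seven ulcers, two censored (healed flag 0)
curve = kaplan_meier([3, 5, 5, 7, 9, 12, 14], [1, 1, 0, 1, 1, 1, 0])
# median time to healing for this toy data is 9 weeks
```

Censored ulcers still contribute to the at-risk counts up to their censoring time, which is why the median here can differ from a naive median of the healed durations alone.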
Microcontact Printing: How the Reusability of Stamps Without Reinking Affects Cell Adhesion
Microcontact printing is a method that utilizes a polydimethylsiloxane (PDMS) stamp to pattern extracellular matrix (ECM) onto a substrate, which can then be used to adhere biological materials such as proteins and cells. This technique is effective for studying, maintaining, and isolating biological variables. Specifically, it has been used for creating neural networks and understanding cell adhesion and differentiation. Stamps are often reinked with the ECM substrate before each use, a time-consuming process. Others continue to reuse the stamp without reinking to shorten the process. Thus, it is necessary to understand the effects that stamping without reinking has on cell adherence. This was investigated by fabricating three replicate PDMS stamps using the columns on pennies as a mold. Each stamp was used to stamp gelatin into three separate well plates without reinking the gelatin between uses. Cells were then seeded onto the stamped ECM and fixed after 24 hours. The cell cytoskeletons were dyed with a DAPI/TRITC-phalloidin/PBST-T solution and imaged using a fluorescence microscope; cell adhesion was quantified by calculating confluency in ImageJ software. No statistically significant relationship was found between cell adhesion and the number of repeated uses. However, stamp 1 showed cell adhesion, with higher confluency values, on each repeated use, while the other stamps showed little or no cell adhesion. The small sample size and variability among the stamps introduced during fabrication may explain the lack of statistical significance. A larger sample size and higher-quality stamps in future iterations could statistically support the conclusion that cell adhesion decreases as a stamp is used repeatedly without reinking. Thus, it is necessary to reink the stamp before each use in applications such as studying cell adhesion, proliferation, and differentiation.
Is it worth screening elective orthopaedic patients for carriage of Staphylococcus aureus? A part-retrospective case–control study in a Scottish hospital
Background: With recent focus on methicillin-resistant Staphylococcus aureus (MRSA) screening, methicillin-susceptible S. aureus (MSSA) has been overlooked. MSSA infections are costly and debilitating in orthopaedic surgery.
Methods: We broadened MRSA screening to include MSSA for elective orthopaedic patients. Preoperative decolonisation was offered if appropriate. Elective and trauma patients were audited for staphylococcal infection during two 6-month periods (A: January to June 2013, MRSA screening; B: January to June 2014, MRSA and MSSA screening). Trauma patients are not screened pre-surgery and so provided a control. MSSA screening costs of a modelled cohort of 500 elective patients were offset by changes in the number and costs of MSSA infections to demonstrate the change in total health service costs.
Findings: Trauma patients showed similar infection rates during both periods (p=1). In period A, 4 (1.72%) and 15 (6.47%) of 232 elective patients suffered superficial and deep MSSA infections, respectively, compared with 6 superficial (2%) and 1 deep (0.3%) infection among 307 elective patients during period B. For any MSSA infection, risk ratios were 0.95 (95% CI 0.41 to 2.23) for trauma and 0.28 (95% CI 0.12 to 0.65) for elective patients (period B vs period A). For deep MSSA infections, risk ratios were 0.58 (95% CI 0.20 to 1.67) for trauma and 0.05 (95% CI 0.01 to 0.36) for elective patients (p=0.011). There were 29.12 fewer deep infections in the modelled cohort of 500 patients, with a cost reduction of £831 678 for 500 patients screened.
Conclusions: MSSA screening for elective orthopaedic patients may reduce the risk of deep postoperative MSSA infection with associated cost-benefits.
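The elective-patient risk ratio for any MSSA infection can be reproduced directly from the counts in the abstract (19 infections, i.e. 4 superficial + 15 deep, among 232 period-A patients vs 7, i.e. 6 + 1, among 307 period-B patients) using the standard Wald confidence interval on the log risk ratio. A short sketch:

```python
import math

def risk_ratio(events_b, n_b, events_a, n_a, z=1.96):
    """Risk ratio of group B vs group A with a Wald 95% CI on log(RR)."""
    rr = (events_b / n_b) / (events_a / n_a)
    # standard error of log(RR) for two independent binomial proportions
    se = math.sqrt(1/events_b - 1/n_b + 1/events_a - 1/n_a)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# counts from the abstract: period B (MRSA+MSSA screening) vs period A
rr, lo, hi = risk_ratio(7, 307, 19, 232)
# reproduces the reported 0.28 (95% CI 0.12 to 0.65)
```

The same function applied to the trauma counts (not fully itemised in the abstract) would serve as the unscreened control comparison.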
A high-resolution map of human evolutionary constraint using 29 mammals.
The comparison of related genomes has emerged as a powerful lens for genome interpretation. Here we report the sequencing and comparative analysis of 29 eutherian genomes. We confirm that at least 5.5% of the human genome has undergone purifying selection, and locate constrained elements covering ∼4.2% of the genome. We use evolutionary signatures and comparisons with experimental data sets to suggest candidate functions for ∼60% of constrained bases. These elements reveal a small number of new coding exons, candidate stop codon readthrough events and over 10,000 regions of overlapping synonymous constraint within protein-coding exons. We find 220 candidate RNA structural families, and nearly a million elements overlapping potential promoter, enhancer and insulator regions. We report specific amino acid residues that have undergone positive selection, 280,000 non-coding elements exapted from mobile elements and more than 1,000 primate- and human-accelerated elements. Overlap with disease-associated variants indicates that our findings will be relevant for studies of human biology, health and disease.
Towards net zero in agriculture: future challenges and opportunities for arable, livestock and protected cropping systems in the UK
The agricultural sector faces multiple challenges linked to increased climate uncertainty, causing severe shocks including increased frequency of extreme weather events, new pest and disease risks, soil degradation, and pre- and post-harvest food losses. This situation is further exacerbated by geopolitical instability and volatility in energy prices impacting on fertiliser supplies and production costs. Net zero strategies are vital to achieve both food security and address negative environmental impacts. This perspective paper reviews and assesses the most viable options (actions) to achieve net zero with a focus on the arable/livestock and protected cropping sectors in the UK. The methodology was based on a synthesis of relevant literature, coupled with expert opinions using the holistic PESTLE (Political, Economic, Social, Technological, Legal and Environmental) approach to categorise actions, leading to formulation of a roadmap to achieve net zero. The PESTLE analysis indicated that there are technically and economically viable actions available which need to be prioritised depending on the ease of their implementation within the two crop sectors investigated. These actions include (i) policy changes that are better aligned to net zero; (ii) circular economy approaches; (iii) connectivity and accessibility of information; (iv) increased resilience to shocks; (v) changing diets, nutrition and lifestyles; (vi) target setting and attainment; and (vii) farm economics and livelihoods. The outputs can be used by stakeholders and decision makers to inform policy and drive meaningful changes in global food and environmental security.
Determinants of serum zinc in a random population sample of four Belgian towns with different degrees of environmental exposure to cadmium
This report investigated the distribution of serum zinc and the factors determining serum zinc concentration in a large random population sample. The 1977 participants (959 men and 1018 women), 20–80 years old, constituted a stratified random sample of the population of four Belgian districts, representing two areas with low and two with high environmental exposure to cadmium. For each exposure level, a rural and an urban area were selected. The serum concentration of zinc, frequently used as an index for zinc status in human subjects, was higher in men (13.1 μmole/L, range 6.5–23.0 μmole/L) than in women (12.6 μmole/L, range 6.3–23.2 μmole/L). In men, 20% of the variance of serum zinc was explained by age (linear and squared term, R = 0.29), diurnal variation (r = 0.29), and total cholesterol (r = 0.16). After adjustment for these covariates, a negative relationship was observed between serum zinc and both blood (r = −0.10) and urinary cadmium (r = −0.14). In women, 11% of the variance could be explained by age (linear and squared term, R = 0.15), diurnal variation in serum zinc (r = 0.27), creatinine clearance (r = −0.11), log γ-glutamyltranspeptidase (r = 0.08), cholesterol (r = 0.07), contraceptive pill intake (r = −0.07), and log serum ferritin (r = 0.06). Before and after adjustment for significant covariates, serum zinc was, on average, lowest in the two districts where the body burden of cadmium, as assessed by urinary cadmium excretion, was highest. These results were not altered when subjects exposed to heavy metals at work were excluded from analysis.
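The "adjustment for covariates" above was done by multiple regression; the simplest version of the same idea is a first-order partial correlation, which measures the zinc-cadmium association after removing the part of each variable explained by a single shared covariate such as age. This is a sketch of that idea, not the study's actual procedure, and all example values are hypothetical.

```python
import math

def pearson_r(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def partial_r(x, y, z):
    """Correlation of x and y after adjusting both for one covariate z."""
    rxy, rxz, ryz = pearson_r(x, y), pearson_r(x, z), pearson_r(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz**2) * (1 - ryz**2))

# hypothetical values: zinc falls and cadmium rises across five subjects
zinc = [13.8, 13.1, 12.6, 12.0, 11.4]   # serum zinc, μmole/L
cadmium = [0.4, 0.7, 0.9, 1.2, 1.5]     # urinary cadmium
age = [25, 35, 45, 55, 65]
adj = partial_r(zinc, cadmium, age)      # association net of age (negative here)
```

With several covariates the study would instead residualise both variables on the full regression model, but the interpretation of the resulting adjusted r is the same.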
Mechanisms and Evolutionary Patterns of Mammalian and Avian Dosage Compensation
A large-scale comparative gene expression study reveals the different ways in which the chromosome-wide gene dosage reductions resulting from sex chromosome differentiation events were compensated during mammalian and avian evolution.
Sushi in the United States, 1945-1970
Sushi first achieved widespread popularity in the United States in
the mid-1960s. Many accounts of sushi’s US establishment foreground
the role of a small number of key actors, yet underplay
the role of a complex web of large-scale factors that provided the
context in which sushi was able to flourish. This article critically
reviews existing literature, arguing that sushi’s US popularity
arose from contingent, long-term, and gradual processes. It examines
US newspaper accounts of sushi during 1945–1970, which
suggest the discursive context for US acceptance of sushi was
considerably more propitious than generally acknowledged.
Using California as a case study, the analysis also explains
conducive social and material factors, and directs attention to
the interplay of supply- and demand-side forces in the favorable
positioning of this “new” food. The article argues that the US
establishment of sushi can be understood as part of broader
public acceptance of Japanese cuisine