
    The factor structure of the Forms of Self-Criticising/Attacking & Self-Reassuring Scale in thirteen distinct populations

    There is considerable evidence that self-criticism plays a major role in the vulnerability to and recovery from psychopathology. Methods to measure this process, and its change over time, are therefore important for research in psychopathology and well-being. This study examined the factor structure of a widely used measure, the Forms of Self-Criticising/Attacking & Self-Reassuring Scale (FSCRS), in thirteen nonclinical samples (N = 7510) from twelve different countries: Australia (N = 319), Canada (N = 383), Switzerland (N = 230), Israel (N = 476), Italy (N = 389), Japan (N = 264), the Netherlands (N = 360), Portugal (N = 764), Slovakia (N = 1326), Taiwan (N = 417), the United Kingdom 1 (N = 1570), the United Kingdom 2 (N = 883), and the USA (N = 331). This study used more advanced analyses than prior reports: a bifactor item-response theory model, a two-tier item-response theory model, and a non-parametric item-response theory (Mokken) scale analysis. Although the original three-factor solution for the FSCRS (distinguishing between Inadequate-Self, Hated-Self, and Reassured-Self) had an acceptable fit, two-tier models, with two general factors (Self-criticism and Self-reassurance), demonstrated the best fit across all samples. This study provides preliminary evidence suggesting that this two-factor structure can be used in a range of nonclinical contexts across countries and cultures. Inadequate-Self and Hated-Self might not be distinct factors in nonclinical samples. Future work may benefit from distinguishing between self-correction and shame-based self-criticism. Peer reviewed

    The fitness of African malaria vectors in the presence and limitation of host behaviour

    Background: Host responses are important sources of selection upon the host species range of ectoparasites and phytophagous insects. However, little is known about the role of host responses in defining the host species range of malaria vectors. This study aimed to estimate the relative importance of host behaviour to the feeding success and fitness of African malaria vectors, and to assess its ability to predict their known host species preferences in nature.

    Methods: Paired evaluations of the feeding success and fitness of the African vectors Anopheles arabiensis and Anopheles gambiae s.s. in the presence and limitation of host behaviour were conducted in a semi-field system (SFS) at Ifakara Health Institute, Tanzania. In one set of trials, mosquitoes were released within the SFS and allowed to forage overnight on a host that was free to exhibit natural behaviour in response to insect biting. In the other, mosquitoes were allowed to feed directly from the skin surface of immobile hosts. The feeding success and subsequent fitness of vectors under these conditions were investigated on six host types (humans, calves, chickens, cows, dogs and goats) to assess whether physical movements of preferred host species (cattle for An. arabiensis, humans for An. gambiae s.s.) were less effective at preventing mosquito bites than those of common alternatives.

    Results: Anopheles arabiensis generally had greater feeding success when applied directly to host skin than when foraging on unrestricted hosts (in five of six host species). However, An. gambiae s.s. obtained blood meals from free and restrained hosts with similar success on most host types (four out of six). Overall, the blood meal size, oviposition rate, fecundity and post-feeding survival of mosquito vectors were significantly higher after feeding on hosts free to exhibit behaviour than on those that were immobilized during feeding trials.

    Conclusions: Allowing hosts to move freely during exposure to mosquitoes was associated with moderate reductions in mosquito feeding success, but no detrimental impact on the subsequent fitness of mosquitoes that were able to feed upon them. This suggests that physical defensive behaviours exhibited by common host species, including humans, do not impose substantial fitness costs on African malaria vectors.

    Improving well-being in Higher Education: Adopting a compassionate approach

    © Springer Nature Switzerland AG 2019. This is a post-peer-review, pre-copyedit version of a chapter published in 'Values of the University in a Time of Uncertainty'. The final authenticated version is available online at: https://doi.org/10.1007/978-3-030-15970-2_18. Peer reviewed

    Consensus-based antimicrobial resistance and stewardship competencies for UK undergraduate medical students.

    BACKGROUND: In the UK there is limited coverage of antimicrobial stewardship across postgraduate curricula and evidence that final year medical students have insufficient and inconsistent antimicrobial stewardship teaching. A national undergraduate curriculum for antimicrobial resistance and stewardship is required to standardize an adequate level of understanding for all future doctors. OBJECTIVES: To provide a UK national consensus on competencies for antimicrobial resistance and stewardship for undergraduate medical education. METHODS: Using the modified Delphi method over two online survey rounds, an expert panel comprising leads for infection teaching from 25 UK medical schools reviewed competency descriptors for antimicrobial resistance and stewardship education. RESULTS: There was a response rate of 100% with all 28 experts who agreed to take part completing both survey rounds. Following the first-round survey, of the initial 55 descriptors, 43 reached consensus (78%). The second-round survey included the 12 descriptors from the first round in which agreement had not been reached, four amended descriptors and 12 new descriptors following qualitative feedback from the panel members. Following the second-round survey, a total of 58 consensus-based competency descriptors within six overarching domains were identified. CONCLUSIONS: The consensus-based competency descriptors defined here can be used to inform standards, design curricula, develop assessment tools and direct UK undergraduate medical education

    How should health service organizations respond to diversity? A content analysis of six approaches

    Background Health care organizations need to be responsive to the needs of increasingly diverse patient populations. We compared the contents of six publicly available approaches to organizational responsiveness to diversity. The central questions addressed in this paper are: what are the most consistently recommended issues for health care organizations to address in order to be responsive to the needs of diverse groups that differ from the majority population? How much consensus is there between various approaches? Methods We purposively sampled six approaches from the US, Australia and Europe and used qualitative textual analysis to categorize the content of each approach into domains (conceptually distinct topic areas) and, within each domain, into dimensions (operationalizations). The resulting classification framework was used for comparative analysis of the content of the six approaches. Results We identified seven domains that were represented in most or all approaches: organizational commitment, empirical evidence on inequalities and needs, a competent and diverse workforce, ensuring access for all users, ensuring responsiveness in care provision, fostering patient and community participation, and actively promoting responsiveness. Variations in the operationalization of these domains related to different scopes, contexts and types of diversity. For example, approaches that focus on ethnic diversity mostly provide recommendations to handle cultural and language differences; approaches that take an intersectional approach and broaden their target population to vulnerable groups in a more general sense also pay attention to factors such as socio-economic status and gender. Conclusions Despite differences in labeling, there is a broad consensus about what health care organizations need to do in order to be responsive to patient diversity. 
This opens the way to full-scale implementation of organizational responsiveness in health care and structured evaluation of its effectiveness in improving patient outcomes.

    Colo-Pro: a pilot randomised controlled trial to compare standard bolus-dosed cefuroxime prophylaxis to bolus-continuous infusion–dosed cefuroxime prophylaxis for the prevention of infections after colorectal surgery

    Standard bolus-dosed antibiotic prophylaxis may not inhibit growth of antibiotic-resistant colonic bacteria, a cause of surgical site infections (SSIs) after colorectal surgery. An alternative strategy is continuous administration of antibiotic throughout surgery, maintaining concentrations of antibiotic that inhibit growth of resistant bacteria. This study is a pilot comparing bolus-continuous infusion with bolus-dosed cefuroxime prophylaxis in colorectal surgery. This is a pilot randomised controlled trial in which participants received cefuroxime bolus-infusion (intervention arm) targeting free serum cefuroxime concentrations of 64 mg/L, or 1.5 g cefuroxime as a bolus dose four-hourly (standard arm). Patients in both arms received metronidazole (500 mg intravenously). Eligible participants were adults undergoing colorectal surgery expected to last for over 2 h. Results were analysed on an intention-to-treat basis. The study was successfully piloted, with 46% (90/196) of eligible patients recruited and 89% (80/90) of participants completing all components of the protocol. The trialled bolus-continuous dosing regimen was successful in maintaining free serum cefuroxime concentrations of 64 mg/L. No serious adverse reactions were identified. Rates of SSIs (superficial and deep) were lower in the intervention arm than in the standard treatment arm (24% (10/42) vs. 30% (13/43)), as were rates of infection within 30 days of operation (41% (17/43) vs. 51% (22/43)) and of urinary tract infection (2% (1/42) vs. 9% (4/43)). These infection rates can be used to power future clinical trials. This study demonstrates the feasibility of trials of bolus-continuous cefuroxime infusion prophylaxis, and provides safety data for infusions targeting free serum cefuroxime concentrations of 64 mg/L. Trial registration: NCT02445859
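The abstract notes that the observed infection rates can be used to power future trials. As a rough illustration, a standard two-proportion normal-approximation formula applied to the SSI rates (24% vs. 30%) gives an approximate per-arm sample size; the 5% significance level and 80% power used here are conventional assumptions, not values stated in the study.

```python
import math

def n_per_arm(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Normal-approximation sample size per arm for comparing two proportions.

    z_alpha and z_beta default to a two-sided 5% significance level and
    80% power - assumed conventions, not parameters from the study.
    """
    pooled_var = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * pooled_var / (p1 - p2) ** 2)

# Using the pilot's SSI rates of 24% vs. 30%:
print(n_per_arm(0.24, 0.30))  # 855 participants per arm
```

This is only a back-of-envelope sketch; a definitive trial would use the full observed event-rate data and a prespecified analysis plan.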

    High Prevalence of Malaria in Zambezia, Mozambique: The Protective Effect of IRS versus Increased Risks Due to Pig-Keeping and House Construction

    BACKGROUND: African countries are scaling up malaria interventions, especially insecticide treated nets (ITN) and indoor residual spraying (IRS), for which ambitious coverage targets have been set. In spite of these efforts, infection prevalence remains high in many parts of the continent. This study investigated risk factors for malaria infection in children using three malaria indicator surveys from Zambezia province, Mozambique. The impact of IRS and ITNs, the effects of keeping farm animals and of the construction material of roofs of houses, and other potential risk factors associated with malaria infection in children were assessed. METHODS: Cross-sectional community-based surveys were conducted in October of 2006, 2007 and 2008. A total of 8338 children (ages 1-15 years) from 2748 households were included in the study. All children were screened for malaria by rapid diagnostic tests. Caregiver interviews were used to assess household demographic and wealth characteristics and ITN and IRS coverage. Associations between malaria infection, vector control interventions and potential risk factors were assessed. RESULTS: Overall, the prevalence of malaria infection was 47.8% (95%CI: 38.7%-57.1%) in children 1-15 years of age, less than a quarter of children (23.1%, 95%CI: 19.1%-27.6%) were sleeping under an ITN and almost two thirds were living in IRS treated houses (coverage 65.4%, 95%CI: 51.5%-77.0%). Protective factors that were independently associated with malaria infection were: sleeping in an IRS house without sleeping under an ITN (Odds Ratio (OR) = 0.6; 95%CI: 0.4-0.9); additional protection due to sleeping under an ITN in an IRS treated house (OR = 0.5; 95%CI: 0.3-0.7) versus sleeping in an unsprayed house without an ITN; and parental education (primary/secondary: OR = 0.6; 95%CI: 0.5-0.7) versus parents with no education.
Increased risk of infection was associated with: current fever (OR = 1.2; 95%CI: 1.0-1.5) versus no fever; pig keeping (OR = 3.2; 95%CI: 2.1-4.9) versus not keeping pigs; living in houses with a grass roof (OR = 1.7; 95%CI: 1.3-2.4) versus other roofing materials; and larger household size (8-15 people: OR = 1.6; 95%CI: 1.3-2.1) versus small households (1-4 people). CONCLUSION: Malaria infection among children under 15 years of age in Zambezia remained high, but conventional malaria vector control methods, in particular IRS, provided effective means of protection. Household ownership of farm animals, particularly pigs, and living in houses with a grass roof were independently associated with increased risk of infection, even after allowing for household wealth. To reduce the burden of malaria, national control programs need to ensure high coverage of effective IRS and promote the use of ITNs, particularly in households with elevated risks of infection, such as those keeping farm animals and those with grass roofs.

    Laboratory-based evaluation of legionellosis epidemiology in Ontario, Canada, 1978 to 2006

    BACKGROUND: Legionellosis is a common cause of severe community acquired pneumonia and respiratory disease outbreaks. The Ontario Public Health Laboratory (OPHL) has conducted most testing for Legionella species in the Canadian province of Ontario since 1978, and represents a multi-decade repository of population-based data on legionellosis epidemiology. We sought to provide a laboratory-based review of the epidemiology of legionellosis in Ontario over the past three decades, with a focus on changing rates of disease and species associated with legionellosis during that time period. METHODS: We analyzed cases that were submitted and tested positive for legionellosis from 1978 to 2006 using Poisson regression models incorporating temporal, spatial, and demographic covariates. Predictors of infection with culture-confirmed L. pneumophila serogroup 1 (LP1) were evaluated with logistic regression models. RESULTS: 1,401 cases of legionellosis tested positive from 1978 to 2006. As in other studies, we found a late summer to early autumn seasonality in disease occurrence, with disease risk increasing with age and in males. In contrast to other studies, we found a decreasing trend in cases in the recent decade (IRR 0.93, 95% CI 0.91 to 0.95, P-value = 0.001); only 66% of culture-confirmed isolates were found to be LP1. CONCLUSION: Despite similarities with disease epidemiology in other regions, legionellosis appears to have declined in the past decade in Ontario, in contrast to trends observed in the United States and parts of Europe. Furthermore, a different range of Legionella species is responsible for illness, suggesting a distinctive legionellosis epidemiology in this North American region.
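To make the reported incidence rate ratio concrete: an IRR of 0.93 per year from a Poisson model means expected case counts shrink by about 7% annually. Compounding that over a decade is our illustrative arithmetic, not a result reported by the study.

```python
# Illustrative interpretation of the Poisson-model IRR of 0.93 per year
# quoted in the abstract. The decade-long extrapolation below is a
# back-of-envelope sketch, not a re-analysis of the Ontario data.
IRR_PER_YEAR = 0.93

def relative_incidence(years):
    """Expected incidence relative to baseline after the given number of years."""
    return IRR_PER_YEAR ** years

print(round(relative_incidence(10), 2))  # 0.48, i.e. roughly half the baseline rate
```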

    Fever as a Cause of Hypophosphatemia in Patients with Malaria

    Hypophosphatemia occurs in 40 to 60% of patients with acute malaria, and in many other conditions associated with elevations of body temperature. To determine the prevalence and causes of hypophosphatemia in patients with malaria, we retrospectively studied all adults diagnosed with acute malaria during a 12-year period. To validate our findings, we analyzed a second sample of malaria patients during a subsequent 10-year period. Serum phosphorus correlated inversely with temperature (n = 59, r = −0.62; P<0.0001), such that each 1°C increase in body temperature was associated with a reduction of 0.18 mmol/L (0.56 mg/dL) in the serum phosphorus level (95% confidence interval: −0.12 to −0.24 mmol/L [−0.37 to −0.74 mg/dL] per 1°C). A similar effect was observed among 19 patients who had repeat measurements of serum phosphorus and temperature. In a multiple linear regression analysis, the relation between temperature and serum phosphorus level was independent of blood pH, PCO2, and serum levels of potassium, bicarbonate, calcium, albumin, and glucose. Our study demonstrates a strong inverse linear relation between body temperature and serum phosphorus level that was not explained by other factors known to cause hypophosphatemia. If causal, this association can account for the high prevalence of hypophosphatemia observed in our patients and in previous studies of patients with malaria. Because hypophosphatemia has been observed in other clinical conditions characterized by fever or hyperthermia, this relation may not be unique to malaria. Elevation of body temperature should be added to the list of causes of hypophosphatemia.
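The reported slope lends itself to a simple predictive rule. A minimal sketch: only the slope of −0.18 mmol/L per 1°C comes from the abstract; the baseline temperature and phosphorus level below are hypothetical illustration values, not figures from the study.

```python
SLOPE_MMOL_PER_C = -0.18  # from the abstract: -0.18 mmol/L per 1 deg C rise

def predicted_phosphorus(temp_c, baseline_temp_c=37.0, baseline_p_mmol=1.10):
    """Predicted serum phosphorus (mmol/L) at a given body temperature.

    The baseline temperature and phosphorus level are illustrative
    assumptions, not values reported in the study.
    """
    return baseline_p_mmol + SLOPE_MMOL_PER_C * (temp_c - baseline_temp_c)

# A fever of 40 deg C would be expected to lower phosphorus by ~0.54 mmol/L
print(round(predicted_phosphorus(37.0) - predicted_phosphorus(40.0), 2))  # 0.54
```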

    Unification of New Zealand's local vertical datums: iterative gravimetric quasigeoid computations

    New Zealand uses 13 separate local vertical datums (LVDs) based on normal-orthometric-corrected precise geodetic levelling from 12 different tide-gauges. We describe their unification using a regional gravimetric quasigeoid model and GPS-levelling data on each LVD. A novel application of iterative quasigeoid computation is used, where the LVD offsets computed from earlier models are used to apply additional gravity reductions from each LVD to that model. The solution converges after only three iterations yielding LVD offsets ranging from 0.24 m to 0.58 m with an average standard deviation of 0.08 m. The so-computed LVD offsets agree, within expected data errors, with geodetically levelled height differences at common benchmarks between adjacent LVDs. This shows that iterated quasigeoid models do have a role in vertical datum unification
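The iterative scheme described above can be caricatured as a fixed-point loop: offsets estimated against one quasigeoid model feed the gravity reductions for the next computation, and the loop repeats until the offsets stabilise. This is a structural sketch only; `compute_quasigeoid` and `estimate_offsets` are placeholders for the full gravimetric and GPS-levelling computations, and the numbers in the demonstration are invented.

```python
def unify_datums(compute_quasigeoid, estimate_offsets, lvds,
                 tol=0.001, max_iter=50):
    """Iterate quasigeoid computation until per-LVD offsets converge.

    compute_quasigeoid(offsets) should return a model incorporating gravity
    reductions derived from the previous offsets; estimate_offsets(model)
    should return a dict of LVD offsets from GPS-levelling data. Both are
    stand-ins for the heavy geodetic computations described in the abstract.
    """
    offsets = {lvd: 0.0 for lvd in lvds}
    for iteration in range(1, max_iter + 1):
        model = compute_quasigeoid(offsets)
        new_offsets = estimate_offsets(model)
        change = max(abs(new_offsets[k] - offsets[k]) for k in lvds)
        offsets = new_offsets
        if change < tol:
            break
    return offsets, iteration

# Toy demonstration with invented numbers: the "model" is just the current
# offsets, and each estimate moves halfway toward a contrived fixed point.
target = {"LVD-A": 0.24, "LVD-B": 0.58}
final, n_iter = unify_datums(
    lambda off: off,
    lambda model: {k: 0.5 * (model[k] + target[k]) for k in target},
    target,
)
print(round(final["LVD-B"], 2), n_iter)
```

The real computation converged in three iterations because each quasigeoid recomputation absorbs most of the remaining offset error; the toy contraction above converges more slowly by construction.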