273 research outputs found

    Cyanobacteria blooms cannot be controlled by effective microorganisms (EM) from mud- or Bokashi-balls

    In controlled experiments, we tested the ability of "Effective Microorganisms" (EM, in the form of mudballs or Bokashi-balls) to clear water of cyanobacteria. We found suspensions of EM-mudballs up to 1 g l-1 to be ineffective in reducing cyanobacterial growth. In all controls and EM-mudball treatments up to 1 g l-1, the cyanobacterial chlorophyll-a (Chl-a) concentrations increased within 4 weeks from ~120 to 325-435 µg l-1. When pieces of EM-mudballs (42.5 g) were added to 25 l of lake water containing cyanobacteria, no decrease of cyanobacteria compared with untreated controls was observed. On the contrary, after 4 weeks cyanobacterial Chl-a concentrations were significantly higher in EM-mudball treatments (52 µg l-1) than in controls (20 µg l-1). Only when suspensions with extremely high EM-mudball concentrations were applied (i.e., 5 and 10 g l-1), exceeding the recommended concentrations by orders of magnitude, was cyanobacterial growth inhibited and a bloom-forming concentration strongly reduced. In these high-dosing treatments, the oxygen concentration initially dropped to very low levels of 1.8 mg l-1. The inhibition was most probably caused by the high amount of clay and the resulting high turbidity of the water, which forced strong light limitation on the cyanobacteria. Hence, this study yields no support for the hypothesis that EM is effective in preventing cyanobacterial proliferation or in terminating blooms. We consider EM products ineffective because they neither permanently bind nor remove phosphorus from eutrophicated systems, they have no inhibiting effect on cyanobacteria, and they could even be an extra source of nutrients.
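
    A quick back-of-envelope reading of the control figures above: if Chl-a rose roughly exponentially from ~120 to 325-435 µg l-1 over 4 weeks, the implied specific growth rate is r = ln(Ct/C0)/t. The exponential model is purely an illustrative assumption, not something the paper states. A minimal Python sketch:

        # Specific growth rate implied by the control Chl-a figures,
        # assuming simple exponential growth (illustrative assumption only).
        from math import log

        c0, t_days = 120.0, 28.0      # starting Chl-a (ug/l) and 4 weeks in days
        for ct in (325.0, 435.0):     # range reached in the controls
            r = log(ct / c0) / t_days
            print(f"Chl-a {c0:.0f} -> {ct:.0f} ug/l: r = {r:.3f} per day")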

    The effect of increasing conceptual challenge in primary science lessons on pupils' achievement and engagement

    This paper reports research into the effect on 11-year-old pupils of introducing more cognitively challenging, practical, and interactive science lessons. Our hypothesis was that such lessons would increase the children's enthusiasm for science and their engagement with the scientific process, thereby improving educational performance. Schools in England are under pressure to raise achievement, as measured by the results of national tests. This has an impact on teaching, where revision of subject knowledge often dominates and can be particularly detrimental to more able pupils. The research was a controlled trial in 32 English primary schools, conducted as part of the project 'Conceptual Challenge in Primary Science'. Teachers from the 16 intervention schools participated in continuing professional development (CPD) and developed science lessons that had more practical work, more discussion, more thinking, and less (but more focused) writing. The proportion of pupils achieving the highest level (level 5) in the national science tests at age 11 was compared in the matched school pairs before and after the intervention. Focus group interviews were also held with a group of pupils in each intervention school. There was a 10% (95% confidence interval 2-17%) increase in the proportion of children achieving the top score in the intervention schools. The pupils and teachers reported greater engagement and motivation. These findings suggest that moving from rote revision to cognitively challenging, interactive science could help improve science education. They merit replication in other international settings to test their generalisability.
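
    The headline result is a difference in proportions with a 95% confidence interval. As a rough illustration of how such an interval is computed, here is a Wald interval for a difference in two proportions; the pupil counts below are invented, since the abstract gives no school-level numbers, and the paper's matched-pair analysis is not reproduced here.

        # Wald 95% CI for a difference in two proportions (illustrative only;
        # the counts below are hypothetical, not the trial's data).
        from math import sqrt

        def diff_ci(x1, n1, x2, n2, z=1.96):
            """Return (difference, lower, upper) for p1 - p2."""
            p1, p2 = x1 / n1, x2 / n2
            se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
            d = p1 - p2
            return d, d - z * se, d + z * se

        # Hypothetical: 120/400 pupils at level 5 in intervention schools
        # vs 80/400 in comparison schools.
        d, lo, hi = diff_ci(120, 400, 80, 400)
        print(f"difference = {d:.1%}, 95% CI {lo:.1%} to {hi:.1%}")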

    Should identical CVD risks in young and old patients be managed identically? Results from two models

    OBJECTIVES: To assess whether delaying risk reduction treatment has a different impact on potential life years lost in younger compared with older patients at the same baseline short-term cardiovascular risk. DESIGN: Modelling based on population data. METHODS: Potential years of life lost from a 5-year treatment delay were estimated for patients of different ages but with the same cardiovascular risk (either 5% or 10% 5-year risk). Two models were used: an age-based residual life expectancy model and a Markov simulation model. Age-specific case fatality rates and time preferences were applied to both models, and competing mortality risks were incorporated into the Markov model. RESULTS: Younger patients had more potential life years to lose if untreated, but the maximum difference between ages 35 and 85 was less than 1 year when the models were unadjusted for time preferences or competing risk. When these adjusters were included, the maximum difference fell to about 1 month, although the direction was reversed, with older people having more to lose. CONCLUSIONS: Surprisingly, age at onset of treatment has little impact on the likely benefits of interventions that reduce cardiovascular risk, because of the opposing effects of life expectancy, case fatality, time preferences, and competing risks. These findings challenge the appropriateness of recommendations to use lower risk-based treatment thresholds in younger patients.
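
    To see how discounting (time preference) and competing mortality can offset, and even reverse, the age gradient, here is a deliberately simplified cohort-style Markov sketch comparing treatment started now with treatment delayed 5 years. Every rate, the treatment effect, and the discount rate below are invented placeholders, not the paper's inputs.

        # Toy Markov-style cohort model: expected discounted life years with
        # immediate vs 5-year-delayed treatment. All parameters are invented.

        def discounted_life_years(age0, treat_from, horizon=60,
                                  base_event=0.01, rrr=0.25, discount=0.03):
            alive = 1.0   # probability of being alive at the start of each year
            total = 0.0   # accumulated discounted life years
            for t in range(horizon):
                age = age0 + t
                event = base_event * (1 + 0.04 * (age - 35))  # CVD risk rises with age
                if t >= treat_from:
                    event *= (1 - rrr)                        # treatment cuts event risk
                fatality = 0.3 + 0.005 * (age - 35)           # case fatality rises with age
                other = 0.002 * 1.09 ** (age - 35)            # competing (non-CVD) mortality
                death = min(1.0, event * fatality + other)
                total += alive * (1 - discount) ** t          # credit this year if alive
                alive *= (1 - death)
            return total

        for age in (35, 55, 75):
            loss = discounted_life_years(age, 0) - discounted_life_years(age, 5)
            print(f"age {age}: discounted life years lost to a 5-year delay = {loss:.2f}")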

    Neuraminidase inhibitors for treatment and prophylaxis of influenza in children: systematic review and meta-analysis of randomised controlled trials

    Objective To assess the effects of the neuraminidase inhibitors oseltamivir and zanamivir in the treatment of children with seasonal influenza and the prevention of transmission to children in households.

    Human resources for primary health care in sub-Saharan Africa: progress or stagnation?

    BACKGROUND: The World Health Organization defines a "critical shortage" of health workers as fewer than 2.28 health workers per 1000 population combined with failure to attain 80% coverage of deliveries by skilled birth attendants. We aimed to quantify the number of health workers in five African countries and the proportion of these currently working in primary health care facilities, to compare this with estimates of the numbers needed, and to assess how the situation has changed in recent years. METHODS: This study is a review of published and unpublished "grey" literature on human resources for health in five disparate countries: Mali, Sudan, Uganda, Botswana, and South Africa. RESULTS: Health worker density has increased steadily since 2000 in South Africa and Botswana, which already meet WHO targets, but has not significantly increased since 2004 in Sudan, Mali, and Uganda, which have a critical shortage of health workers. In all five countries, a minority of doctors, nurses, and midwives are working in primary health care, and shortages of qualified staff are greatest in rural areas. In Uganda, shortages are greater in primary health care settings than at higher levels. In Mali, few community health centres have a midwife or a doctor. Even South Africa has a shortage of doctors in primary health care in poorer districts. Although most countries recognize village health workers, traditional healers, and traditional birth attendants, there are insufficient data on their numbers. CONCLUSION: There is an "inverse primary health care law" in the countries studied: staffing is inversely related to poverty and level of need, and health worker density is not increasing in the lowest-income countries. Unless there is money to recruit and retain staff in these areas, training programmes will not improve health worker density, because the trained staff will simply leave to work elsewhere. Information systems need to be improved in a way that informs policy on the health workforce. It may be possible to use existing resources more cost-effectively by using skilled staff to supervise and support the lower-level health care workers who currently provide the front line of primary health care in most of Africa.
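
    The WHO threshold quoted above is straightforward to operationalise: divide the health workforce by the population and compare against 2.28 per 1000. A minimal sketch, with worker and population figures invented for illustration (only the threshold itself comes from the text):

        # Check a (hypothetical) country against the WHO critical-shortage line.
        WHO_THRESHOLD = 2.28  # doctors, nurses and midwives per 1000 population

        def density_per_1000(workers, population):
            return workers * 1000 / population

        # Hypothetical: 18,000 health workers serving 12 million people.
        d = density_per_1000(18_000, 12_000_000)
        status = "critical shortage" if d < WHO_THRESHOLD else "above threshold"
        print(f"density = {d:.2f} per 1000 -> {status}")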

    Security and skills: the two key issues in health worker migration.

    BACKGROUND: Migration of health workers from Africa continues to undermine the universal provision of quality health care. South Africa is an epicentre for migration: it exports more health workers to high-income countries than any other African country and imports health workers from its lower-income neighbours to fill the gap. Although an inter-governmental agreement in 2003 reduced the very high numbers migrating from South Africa to the United Kingdom, migration continues to other high-income English-speaking countries, and few workers seem to return even though the financial incentive to work abroad has lessened. A deeper understanding of the reasons for migration from South Africa, and of post-migration experiences, is therefore needed to underpin policies to improve retention within source countries and encourage return. METHODS: Semi-structured interviews were conducted with 16 South African doctors and nurses who had migrated to the United Kingdom. Interviews explored factors influencing the decision to migrate and post-migration experiences. RESULTS: Salary, career progression, and poor working conditions were not major push factors for migration. Many health workers reported that they had previously overcome these issues within the South African healthcare system by moving to the private sector. Overwhelmingly, the major push factors were insecurity, high levels of crime, and racial tension. Although the wish to work and train in what was perceived to be a first-class care system was a pull factor towards the United Kingdom, many were disappointed by the experience. Instead of obtaining new skills, many (particularly nurses) felt they had become 'de-skilled'. Many also felt that working conditions and opportunities for them in the UK National Health Service (NHS) compared unfavourably with the private sector in South Africa. CONCLUSIONS: Migration from South Africa seems unlikely to diminish until the major concerns over security, crime, and racial tension are resolved. However, good working conditions in the private sector in South Africa provide an occupational incentive to return should security improve. Potential migrants should be made more aware of the risk of losing skills while working abroad, which might prejudice return. In addition, re-skilling initiatives should be encouraged.

    The Evidence Base for Interventions Delivered to Children in Primary Care: An Overview of Cochrane Systematic Reviews

    Background: As a first step in developing a framework to evaluate and improve the quality of care of children in primary care, there is a need to identify the evidence base underpinning interventions relevant to child health. Our objective was to identify all Cochrane systematic reviews relevant to the management of childhood conditions in primary care and to assess the extent to which Cochrane reviews reflect the burden of childhood illness presenting in primary care. Methodology/Principal Findings: We used the Cochrane Child Health Field register of child-relevant systematic reviews to complete an overview of Cochrane reviews related to the management of children in primary care. We compared the proportion of systematic reviews on each topic with the proportion of children's consultations for that topic in Australian, US, Dutch, and UK general practice. We identified 396 relevant systematic reviews; 385 included primary studies on children, while 251 undertook a meta-analysis. Most reviews (n=218, 55%) focused on chronic conditions, and over half (n=216, 57%) evaluated drug interventions. Since 2000, the percentage of pediatric primary care relevant reviews increased by only 2 percentage points (from 7% to 9%), compared with 18 percentage points (from 10% to 28%) for all child-relevant reviews. Almost a quarter of reviews (n=78, 23%) were published on asthma treatments, which account for only 3-5% of consultations. Conversely, 15-23% of consultations are due to skin conditions, yet these represent only 7% (n=23) of reviews. Conclusions/Significance: Although Cochrane systematic reviews focus on clinical trials and do not provide a comprehensive picture of the evidence base underpinning the management of children in primary care, the mismatch between the focus of the published research and the focus of clinical activity is striking. Clinical trials are an important component of the evidence base, and the lack of trial evidence to demonstrate intervention effectiveness in substantial areas of primary care for children should be addressed.
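
    One way to make the mismatch concrete is the ratio of research share to workload share for each topic. This uses the abstract's own figures, with midpoints of the consultation ranges as my simplification:

        # Ratio of review share to consultation share, from the abstract's
        # figures; taking range midpoints is a simplification for illustration.
        topics = {
            "asthma": (0.23, (0.03 + 0.05) / 2),           # 23% of reviews, 3-5% of consultations
            "skin conditions": (0.07, (0.15 + 0.23) / 2),  # 7% of reviews, 15-23% of consultations
        }
        for topic, (reviews, consults) in topics.items():
            print(f"{topic}: {reviews / consults:.1f}x reviews per unit of workload")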

    Self-Screening and Non-Physician Screening for Hypertension in Communities: A Systematic Review.

    BACKGROUND: Community-based self-screening may provide opportunities to increase detection of hypertension and to identify raised blood pressure (BP) in populations who do not access healthcare. This systematic review aimed to evaluate the effectiveness of non-physician screening and self-screening of BP in community settings. METHODS: We searched the Cochrane Central Trials Register, Medline, Embase, CINAHL, and Science Citation Index & Conference Proceedings Citation Index-Science to November 2013 to identify studies reporting community-based self-screening or non-physician screening for hypertension in adults. Results were stratified by study site, screener, and the cut-off used to define high screening BP. RESULTS: We included 73 studies, which described screening in 9 settings, with pharmacies (22%) and public areas/retail (15%) most commonly described. We found high levels of heterogeneity in all analyses, despite stratification. The highest proportions of eligible participants screened were achieved by mobile units (range 21%-88%) and pharmacies (range 40%-90%). Self-screeners had similar median rates of high BP detection (25%-35%) to participants in studies using other screeners. Few (16%) studies reported referral to primary care after screening. However, where participants were referred, a median of 44% (range 17%-100%) received a new hypertension diagnosis or antihypertensive medication. CONCLUSIONS: Community-based non-physician or self-screening for raised BP can detect raised BP, which may lead to the identification of new cases of hypertension. However, current evidence is insufficient to recommend specific approaches or settings. Studies with good follow-up of patients to definitive diagnosis are needed. This article presents independent research funded by a National Institute for Health Research Programme Grant (RP-PG-1209-10051).
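
    To get a feel for the screening cascade these figures imply, the reported medians can be chained into a per-screenee yield. Chaining the medians is my simplification, and the referral fraction below is an invented placeholder (few studies reported referral):

        # Illustrative screening cascade built from the abstract's medians.
        screened        = 1000   # hypothetical number of people screened
        p_high_bp       = 0.30   # midpoint of the 25-35% median detection range
        p_referred      = 0.80   # ASSUMED fraction of high-BP screenees referred
        p_new_diagnosis = 0.44   # median among those referred (range 17-100%)

        new_cases = screened * p_high_bp * p_referred * p_new_diagnosis
        print(f"~{new_cases:.0f} new hypertension diagnoses per {screened} screened")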

    Effect of 3 to 5 years of scheduled CEA and CT follow-up to detect recurrence of colorectal cancer: the FACS Randomized Clinical Trial

    IMPORTANCE Intensive follow-up after surgery for colorectal cancer is common practice but is based on limited evidence. OBJECTIVE To assess the effect of scheduled blood measurement of carcinoembryonic antigen (CEA) and computed tomography (CT) as follow-up to detect recurrent colorectal cancer treatable with curative intent. DESIGN, SETTING, AND PARTICIPANTS Randomized clinical trial in 39 National Health Service hospitals in the United Kingdom; 1202 eligible participants were recruited between January 2003 and August 2009 who had undergone curative surgery for primary colorectal cancer, including adjuvant treatment if indicated, with no evidence of residual disease on investigation. INTERVENTIONS Participants were randomly assigned to 1 of 4 groups: CEA only (n = 300), CT only (n = 299), CEA+CT (n = 302), or minimum follow-up (n = 301). Blood CEA was measured every 3 months for 2 years, then every 6 months for 3 years; CT scans of the chest, abdomen, and pelvis were performed every 6 months for 2 years, then annually for 3 years; and the minimum follow-up group received follow-up if symptoms occurred. MAIN OUTCOMES AND MEASURES The primary outcome was surgical treatment of recurrence with curative intent; secondary outcomes were mortality (total and colorectal cancer), time to detection of recurrence, and survival after treatment of recurrence with curative intent. RESULTS After a mean 4.4 (SD, 0.8) years of observation, cancer recurrence was detected in 199 participants (16.6%; 95% CI, 14.5%-18.7%) overall; 71 of 1202 participants (5.9%; 95% CI, 4.6%-7.2%) were treated for recurrence with curative intent, with little difference according to Dukes staging (stage A, 5.1% [13/254]; stage B, 6.1% [34/553]; stage C, 6.2% [22/354]). Surgical treatment of recurrence with curative intent was 2.3% (7/301) in the minimum follow-up group, 6.7% (20/300) in the CEA group, 8% (24/299) in the CT group, and 6.6% (20/302) in the CEA+CT group. Compared with minimum follow-up, the absolute difference in the percentage of patients treated with curative intent in the CEA group was 4.4% (95% CI, 1.0%-7.9%; adjusted odds ratio [OR], 3.00; 95% CI, 1.23-7.33), in the CT group was 5.7% (95% CI, 2.2%-9.5%; adjusted OR, 3.63; 95% CI, 1.51-8.69), and in the CEA+CT group was 4.3% (95% CI, 1.0%-7.9%; adjusted OR, 3.10; 95% CI, 1.10-8.71). The number of deaths was not significantly different in the combined intensive monitoring groups (CEA, CT, and CEA+CT; 18.2% [164/901]) vs the minimum follow-up group (15.9% [48/301]; difference, 2.3%; 95% CI, −2.6% to 7.1%). CONCLUSIONS AND RELEVANCE Among patients who had undergone curative surgery for primary colorectal cancer, intensive imaging or CEA screening each provided an increased rate of surgical treatment of recurrence with curative intent compared with minimal follow-up; there was no advantage in combining CEA and CT. If there is a survival advantage to any strategy, it is likely to be small. TRIAL REGISTRATION isrctn.org Identifier: 4145854
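
    The abstract reports adjusted odds ratios; from the raw counts one can recover crude odds ratios with a Woolf (log) confidence interval, which land close to, but not exactly on, the adjusted values (e.g., 3.00, 95% CI 1.23-7.33 for CEA). A minimal sketch:

        # Crude odds ratio with Woolf 95% CI from the counts in the abstract;
        # these are unadjusted, so they differ from the paper's adjusted ORs.
        from math import log, exp, sqrt

        def odds_ratio_ci(a, n1, c, n2, z=1.96):
            b, d = n1 - a, n2 - c               # non-events in each group
            or_ = (a * d) / (b * c)
            se = sqrt(1/a + 1/b + 1/c + 1/d)    # SE of log(OR)
            return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

        # Treated-with-curative-intent counts vs minimum follow-up (7/301).
        for name, a, n1 in [("CEA", 20, 300), ("CT", 24, 299), ("CEA+CT", 20, 302)]:
            or_, lo, hi = odds_ratio_ci(a, n1, 7, 301)
            print(f"{name}: crude OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")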