Strongyloidiasis: A Disease of Socioeconomic Disadvantage
Abstract: Strongyloidiasis is a disease caused by soil-transmitted helminths of the genus Strongyloides. Currently, it is predominantly described as a neglected tropical disease. However, this
description is misleading as it focuses on the geographical location of the disease and not the primary
consideration, which is the socioeconomic conditions and poor infrastructure found within endemic
regions. This classification may result in misdiagnosis and mistreatment by physicians, but more
importantly, it influences how the disease is fundamentally viewed. Strongyloidiasis must be first and
foremost considered as a disease of disadvantage, to ensure the correct strategies and control measures
are used to prevent infection. Changing how strongyloidiasis is perceived from a geographic and
clinical issue to an environmental health issue represents the first step in identifying appropriate
long term control measures. This includes emphasis on environmental health controls, such as
better infrastructure, sanitation and living conditions. This review explores the global prevalence
of strongyloidiasis in relation to its presence in subtropical, tropical and temperate climate zones
with mild and cold winters, but also explores the corresponding socioeconomic conditions of these
regions. The evidence shows that strongyloidiasis is primarily determined by the socioeconomic
status of the communities rather than geographic or climatic conditions. It demonstrates that
strongyloidiasis should no longer be referred to as a “tropical” disease but rather a disease of
disadvantage. This philosophical shift will promote the development of correct control strategies for
preventing this disease of disadvantage.
Australian Food Safety Policy Changes from a “Command and Control” to an “Outcomes-Based” Approach: Reflection on the Effectiveness of Its Implementation
Foodborne illness is a global public health burden. Over the past decade in Australia, despite advances in microbiological detection and control methods, there has been an increase in the incidence of foodborne illness. Therefore, improvements in the regulation and implementation of food safety policy are crucial for protecting public health. In 2000, Australia established a national food safety regulatory system, which included the adoption of a mandatory set of food safety standards. These were in line with international standards and moved away from a “command and control” regulatory approach to an “outcomes-based” approach using risk assessment. The aim was to achieve national consistency and reduce foodborne illness without unnecessarily burdening businesses. Evidence demonstrates that a risk-based approach provides better protection for consumers; however, sixteen years after the adoption of the new approach, the rates of foodborne illness are still increasing. Currently, food businesses are responsible for producing safe food and regulatory bodies are responsible for ensuring legislative controls are met. There is therefore co-regulatory responsibility and liability, and implementation strategies need to reflect this. This analysis explores the challenges facing food regulation in Australia and examines the rationale and evidence in support of this new regulatory approach.
The effectiveness of an educational intervention to improve knowledge and perceptions for reducing organophosphate pesticide exposure among Indonesian and South Australian migrant farmworkers
Background
Farmworkers are at risk of exposure to organophosphate pesticides (OPs). Improving knowledge and perceptions about OP exposure may help reduce that exposure.
Purpose
The purpose of this study was to examine the effectiveness of an educational intervention to improve knowledge and perceptions for reducing OP exposure among Indonesian and South Australian (SA) migrant farmworkers.
Methods
This was a quasi-experimental study. The educational intervention was delivered through group communication to 30 Indonesian farmworkers and through individual communication to seven SA migrant farmworkers. Knowledge and perceptions about OP exposure were measured pre-intervention and 3 months after the intervention.
Results
Unadjusted intervention effects at follow-up showed statistically significantly improved scores for knowledge (both adverse effects of OPs and self-protection from OP exposure), perceived susceptibility, and perceived barriers among Indonesian farmworkers compared with SA migrant farmworkers. These four variables remained statistically significant, along with two further variables (perceived severity and perceived benefits), after adjustment for level of education and years working as a farmworker. In contrast, knowledge about adverse effects of OPs was the only variable that showed statistically significant improvement among SA migrant farmworkers. These results suggest that educational interventions delivered through group communication may be more effective than individual interventions.
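As an illustration only (not the authors' analysis code), the adjusted comparison described above amounts to adding the two covariates to the group contrast in a regression model. The minimal sketch below uses Python with pandas and statsmodels; the column names and toy values are entirely hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Toy data: change in knowledge score from baseline to the 3-month follow-up.
# Column names are hypothetical; the study's actual variables are not published here.
df = pd.DataFrame({
    "score_change":  [4, 5, 3, 6, 1, 0, 2, 1],
    "group":         ["Indonesian"] * 4 + ["SA_migrant"] * 4,
    "education_yrs": [6, 9, 6, 12, 10, 12, 9, 11],
    "farming_yrs":   [10, 15, 8, 20, 5, 7, 12, 9],
})

# Unadjusted effect: group contrast only.
unadjusted = smf.ols("score_change ~ group", data=df).fit()

# Adjusted effect: control for education and years working as a farmworker.
adjusted = smf.ols("score_change ~ group + education_yrs + farming_yrs", data=df).fit()

print(unadjusted.params["group[T.SA_migrant]"],
      adjusted.params["group[T.SA_migrant]"])
```

The same pattern would be fitted separately for each outcome (knowledge, perceived susceptibility, severity, benefits and barriers).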
Conclusion
These improvements provide starting points for changing the health behavior of farmworkers, particularly to reduce OP exposure, both at the workplace and at home.
Reducing Risk of Salmonellosis through Egg Decontamination Processes
Eggs have a high nutritional value and are an important ingredient in many food products. Worldwide, foodborne illnesses such as salmonellosis linked to the consumption of eggs and raw egg products are a major public health concern. This review focuses on previous studies that have investigated procedures for the production of microbiologically safe eggs. Studies exploring pasteurization and other decontamination methods were examined. Gamma irradiation, freeze drying, hot air, hot water, infrared, atmospheric steam, microwave heating and radiofrequency heating are the decontamination methods currently being considered for the production of microbiologically safe eggs. However, each decontamination procedure has different effects on the properties and constituents of the egg. Pasteurization processes are the most widely used and best understood; however, they influence the coagulation, foaming and emulsifying properties of the egg. Future studies are needed to explore combinations of different decontamination methods to produce safe eggs without affecting protein structure and usability. Currently, eggs that have undergone decontamination processes are primarily used in food prepared for vulnerable populations. However, the development of a decontamination method that does not affect egg properties and functionality could be applied to food prepared for the general population, providing greater public health protection.
Validation of DESS as a DNA Preservation Method for the Detection of Strongyloides spp. in Canine Feces
Abstract: Strongyloides stercoralis is a gastrointestinal parasitic nematode with a life cycle that includes free-living and parasitic forms. For both clinical (diagnostic) and environmental evaluation, it is important that we can detect Strongyloides spp. in both human and non-human fecal samples. Real-time PCR is the most feasible method for detecting the parasite in preserved clinical and environmental samples. However, one of the biggest challenges with PCR detection is DNA degradation during postage from rural and remote areas to the laboratory. This study included a laboratory assessment and field validation of DESS (dimethyl sulfoxide, disodium EDTA, and saturated NaCl) preservation of Strongyloides spp. DNA in fecal samples. The laboratory study investigated the capacity of 1:1 and 1:3 sample-to-DESS ratios to preserve Strongyloides ratti in spiked canine feces. Both ratios of DESS significantly reduced DNA degradation compared with untreated samples. The method was then validated by applying it to field-collected canine feces and detecting Strongyloides DNA using PCR. A total of 37 canine fecal samples were collected and preserved at the 1:3 ratio (sample:DESS); of these, 17 were positive for Strongyloides spp. The study shows that both 1:1 and 1:3 sample-to-DESS ratios preserved Strongyloides spp. DNA in canine fecal samples stored at room temperature for up to 56 days. This DESS preservation method presents the most applicable and feasible approach to Strongyloides DNA preservation in field-collected feces.
Strongyloidiasis in Ethiopia: systematic review on risk factors, diagnosis, prevalence and clinical outcomes
Background
Strongyloidiasis is a gastrointestinal infection caused by the parasitic nematode Strongyloides stercoralis. It is estimated to infect up to 370 million people globally and is predominantly found in tropical and subtropical areas of socioeconomic disadvantage.
Main body
This systematic literature review identified studies published in the last ten years on the risk factors, diagnosis, prevalence and/or clinical outcomes of strongyloidiasis in Ethiopia. The prevalence of S. stercoralis ranged from 0.2% to 11.1% in adults, 0.3% to 20.7% in children, 1.5% to 17.3% in HIV-positive adults and 5% in HIV-positive children. The identified studies primarily used microscopy-based techniques that potentially underestimated the prevalence fourfold compared with serology and PCR. Strongyloidiasis in children presents a particularly significant issue in Ethiopia, as children often presented with anemia, which is associated with impaired mental and cognitive development. The most significant risk factor for strongyloidiasis was HIV status, and although other risk factors were identified for helminth infections, none were statistically significant for S. stercoralis specifically. Several studies detected S. stercoralis in dogs and non-biting cyclorrhaphan flies; however, future research is needed to explore the role of these reservoirs in disease transmission.
Conclusions
This review demonstrated that strongyloidiasis is an overlooked and neglected disease in Ethiopia. There is a need for a systematic approach using a combination of molecular and serology-based diagnostic methods to ascertain the true incidence and burden of strongyloidiasis in Ethiopia. Further research is also needed to break the cycle of transmission by identifying environmental reservoirs and risk factors and exploring the potential for zoonotic transfer.
Barriers to Effective Municipal Solid Waste Management in a Rapidly Urbanizing Area in Thailand
This study focused on determining the barriers to effective municipal solid waste management (MSWM) in a rapidly urbanizing area in Thailand. The Tha Khon Yang Subdistrict Municipality is a representative example of many local governments in Thailand that have been facing MSWM issues. In-depth individual interviews and focus groups were conducted with key informants including municipality staff, residents, and external organizations. The major influences affecting waste management were categorized into six areas: social-cultural, technical, financial, organizational, and legal-political barriers, and population growth. A SWOT analysis shows that both internal and external factors play a role in MSWM: policy is sound and the budget reasonably sufficient; however, infrastructure is insufficient; strategic planning, registration, staff capacity, information systems and engagement with programs are weak; and waste management and fee collection systems are disorganized. Flood-prone areas have also constrained the location and operation of landfill sites. There is poor communication between the municipality and residents and a lack of participation in waste separation programs. However, external support from government and the nearby university could provide opportunities to improve the situation. These findings will help inform municipal decision makers, leading to better municipal solid waste management in newly urbanized areas.
Dissecting the Shared Genetic Architecture of Suicide Attempt, Psychiatric Disorders, and Known Risk Factors
Background
Suicide is a leading cause of death worldwide, and nonfatal suicide attempts, which occur far more frequently, are a major source of disability and social and economic burden. Both have substantial genetic etiology, which is partially shared and partially distinct from that of related psychiatric disorders.
Methods
We conducted a genome-wide association study (GWAS) of 29,782 suicide attempt (SA) cases and 519,961 controls in the International Suicide Genetics Consortium (ISGC). The GWAS of SA was conditioned on psychiatric disorders using GWAS summary statistics via multitrait-based conditional and joint analysis, to remove genetic effects on SA mediated by psychiatric disorders. We investigated the shared and divergent genetic architectures of SA, psychiatric disorders, and other known risk factors.
Results
Two loci reached genome-wide significance for SA: the major histocompatibility complex and an intergenic locus on chromosome 7, the latter of which remained associated with SA after conditioning on psychiatric disorders and replicated in an independent cohort from the Million Veteran Program. This locus has been implicated in risk-taking behavior, smoking, and insomnia. SA showed strong genetic correlation with psychiatric disorders, particularly major depression, and also with smoking, pain, risk-taking behavior, sleep disturbances, lower educational attainment, reproductive traits, lower socioeconomic status, and poorer general health. After conditioning on psychiatric disorders, the genetic correlations between SA and psychiatric disorders decreased, whereas those with nonpsychiatric traits remained largely unchanged.
Conclusions
Our results identify a risk locus that contributes more strongly to SA than other phenotypes and suggest a shared underlying biology between SA and known risk factors that is not mediated by psychiatric disorders.
Adding 6 months of androgen deprivation therapy to postoperative radiotherapy for prostate cancer: a comparison of short-course versus no androgen deprivation therapy in the RADICALS-HD randomised controlled trial
Background
Previous evidence indicates that adjuvant, short-course androgen deprivation therapy (ADT) improves metastasis-free survival when given with primary radiotherapy for intermediate-risk and high-risk localised prostate cancer. However, the value of ADT with postoperative radiotherapy after radical prostatectomy is unclear.
Methods
RADICALS-HD was an international randomised controlled trial to test the efficacy of ADT used in combination with postoperative radiotherapy for prostate cancer. Key eligibility criteria were indication for radiotherapy after radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to radiotherapy alone (no ADT) or radiotherapy with 6 months of ADT (short-course ADT), using monthly subcutaneous gonadotropin-releasing hormone analogue injections, daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as distant metastasis arising from prostate cancer or death from any cause. Standard survival analysis methods were used, accounting for randomisation stratification factors. The trial had 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 80% to 86% (hazard ratio [HR] 0·67). Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
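As a side note on the design assumption quoted above (not part of the trial report), the target hazard ratio follows directly from the two 10-year survival figures under proportional hazards, where S_treated(t) = S_control(t)^HR. A minimal Python check, with the function name chosen here purely for illustration:

```python
import math

def implied_hazard_ratio(s_control: float, s_treated: float) -> float:
    """Hazard ratio implied by two survival proportions at the same time point,
    assuming proportional hazards: S_treated(t) = S_control(t) ** HR."""
    return math.log(s_treated) / math.log(s_control)

# Design assumption in the Methods: 10-year metastasis-free survival
# improving from 80% to 86%.
print(round(implied_hazard_ratio(0.80, 0.86), 2))  # 0.68, close to the quoted HR of 0.67
```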
Findings
Between Nov 22, 2007, and June 29, 2015, 1480 patients (median age 66 years [IQR 61–69]) were randomly assigned to receive no ADT (n=737) or short-course ADT (n=743) in addition to postoperative radiotherapy at 121 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 9·0 years (IQR 7·1–10·1), metastasis-free survival events were reported for 268 participants (142 in the no ADT group and 126 in the short-course ADT group; HR 0·886 [95% CI 0·688–1·140], p=0·35). 10-year metastasis-free survival was 79·2% (95% CI 75·4–82·5) in the no ADT group and 80·4% (76·6–83·6) in the short-course ADT group. Toxicity of grade 3 or higher was reported for 121 (17%) of 737 participants in the no ADT group and 100 (14%) of 743 in the short-course ADT group (p=0·15), with no treatment-related deaths.
Interpretation
Metastatic disease is uncommon following postoperative bed radiotherapy after radical prostatectomy. Adding 6 months of ADT to this radiotherapy did not improve metastasis-free survival compared with no ADT. These findings do not support the use of short-course ADT with postoperative radiotherapy in this patient population.
Duration of androgen deprivation therapy with postoperative radiotherapy for prostate cancer: a comparison of long-course versus short-course androgen deprivation therapy in the RADICALS-HD randomised trial
Background
Previous evidence supports androgen deprivation therapy (ADT) with primary radiotherapy as initial treatment for intermediate-risk and high-risk localised prostate cancer. However, the use and optimal duration of ADT with postoperative radiotherapy after radical prostatectomy remains uncertain.
Methods
RADICALS-HD was a randomised controlled trial of ADT duration within the RADICALS protocol. Here, we report on the comparison of short-course versus long-course ADT. Key eligibility criteria were indication for radiotherapy after previous radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to add 6 months of ADT (short-course ADT) or 24 months of ADT (long-course ADT) to radiotherapy, using subcutaneous gonadotrophin-releasing hormone analogue (monthly in the short-course ADT group and 3-monthly in the long-course ADT group), daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as metastasis arising from prostate cancer or death from any cause. The comparison had more than 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 75% to 81% (hazard ratio [HR] 0·72). Standard time-to-event analyses were used. Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
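Both RADICALS-HD comparisons allocate treatment "centrally through minimisation with a random element" over the listed stratification factors. The sketch below is a generic Pocock–Simon-style minimisation routine in Python, not the trial's actual allocation software; the arm labels, example factors and the 0.8 probability of picking the less-imbalanced arm are illustrative assumptions only.

```python
import random

ARMS = ("short-course ADT", "long-course ADT")  # illustrative labels

def minimisation_assign(new_strata, previous, arms=ARMS, p_best=0.8):
    """Assign the next participant by minimisation with a random element.

    new_strata : dict mapping each stratification factor to the new participant's
                 level, e.g. {"gleason": "8-10", "margins": "positive"}
    previous   : list of (strata_dict, arm) tuples for participants already allocated
    p_best     : probability of choosing the arm that minimises overall imbalance
    """
    imbalance = {}
    for candidate in arms:
        total = 0
        for factor, level in new_strata.items():
            # Arm counts among earlier participants sharing this factor level,
            # pretending the new participant joins `candidate`.
            counts = {a: sum(1 for strata, arm in previous
                             if strata.get(factor) == level and arm == a)
                      for a in arms}
            counts[candidate] += 1
            total += max(counts.values()) - min(counts.values())
        imbalance[candidate] = total
    ranked = sorted(arms, key=lambda a: imbalance[a])
    if imbalance[ranked[0]] == imbalance[ranked[-1]]:
        return random.choice(arms)                  # fully balanced: pure chance
    return ranked[0] if random.random() < p_best else ranked[-1]
```

In practice the trial also stratified by radiotherapy timing, planned radiotherapy schedule and planned type of ADT; those would simply become extra keys in `new_strata`.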
Findings
Between Jan 30, 2008, and July 7, 2015, 1523 patients (median age 65 years, IQR 60–69) were randomly assigned to receive short-course ADT (n=761) or long-course ADT (n=762) in addition to postoperative radiotherapy at 138 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 8·9 years (7·0–10·0), 313 metastasis-free survival events were reported overall (174 in the short-course ADT group and 139 in the long-course ADT group; HR 0·773 [95% CI 0·612–0·975]; p=0·029). 10-year metastasis-free survival was 71·9% (95% CI 67·6–75·7) in the short-course ADT group and 78·1% (74·2–81·5) in the long-course ADT group. Toxicity of grade 3 or higher was reported for 105 (14%) of 753 participants in the short-course ADT group and 142 (19%) of 757 participants in the long-course ADT group (p=0·025), with no treatment-related deaths.
Interpretation
Compared with adding 6 months of ADT, adding 24 months of ADT improved metastasis-free survival in people receiving postoperative radiotherapy. For individuals who can accept the additional duration of adverse effects, long-course ADT should be offered with postoperative radiotherapy.
Funding
Cancer Research UK, UK Research and Innovation (formerly Medical Research Council), and Canadian Cancer Society.