Low cancer suspicion following experience of a cancer 'warning sign'
© 2015 The Authors. Published by Elsevier Ltd. Aim: Lower socioeconomic status (SES) is associated with a higher risk of late-stage cancer diagnosis. A number of explanations have been advanced for this, but one which has attracted recent attention is lower patient knowledge of cancer warning signs, leading to delay in help-seeking. However, although there is psychometric evidence of SES differences in knowledge of cancer symptoms, no studies have examined differences in 'cancer suspicion' among people who are actually experiencing a classic warning sign. Methods: A 'health survey' was mailed to 9771 adults (≥50 years, no cancer diagnosis) with a symptom list including 10 cancer 'warning signs'. Respondents were asked if they had experienced any of the symptoms in the past 3 months, and if so, were asked 'what do you think caused it?' Any mention of cancer was scored as 'cancer suspicion'. SES was indexed by education. Results: Nearly half the respondents (1732/3756) had experienced a 'warning sign', but only 63/1732 (3.6%) mentioned cancer as a possible cause. Lower education was associated with lower likelihood of cancer suspicion: 2.6% of respondents with school-only education versus 7.3% with university education suspected cancer as a possible cause. In multivariable analysis, low education was the only demographic variable independently associated with lower cancer suspicion (odds ratio (OR) = 0.34, confidence interval (CI): 0.20-0.59). Conclusion: Levels of cancer suspicion were low overall in this community sample, and even lower in people from less educated backgrounds. This may hinder early symptomatic presentation and contribute to inequalities in stage at diagnosis.
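The headline odds ratio in the abstract above can be loosely sanity-checked against the two reported proportions. The sketch below computes a crude (unadjusted) odds ratio from the 2.6% and 7.3% figures only; it is an illustrative back-of-envelope check, not a reproduction of the paper's multivariable model, which adjusted for other demographic variables.

```python
# Crude odds ratio for cancer suspicion by education, using only the
# proportions quoted in the abstract (2.6% school-only vs. 7.3% university).
# Illustrative check only; the paper's OR of 0.34 comes from an adjusted model.

def odds(p):
    """Convert a proportion to odds."""
    return p / (1.0 - p)

p_school = 0.026      # cancer suspicion, school-only education
p_university = 0.073  # cancer suspicion, university education

crude_or = odds(p_school) / odds(p_university)
print(round(crude_or, 2))  # close to the reported adjusted OR of 0.34
```

That the crude value lands near the adjusted estimate suggests the other demographic covariates did little to shift the education effect.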
Training Deer to Avoid Sites Through Negative Reinforcement
Deer frequently visit areas where they may cause damage. Incidents along roadways and runways inflict numerous injuries to animals and humans, and cause considerable economic losses. Concerns are increasing that deer interactions with domestic animals may contribute to the spread of disease. Deer foraging in residential areas, agricultural fields, or plant propagation sites can impede the growth, and possibly the survival, of desirable plants. We conducted a series of trials to determine whether mild electric shock would induce place avoidance in deer. Shock was delivered through a device attached to a collar. A noise cue was emitted as an animal approached a defined area; if the animal failed to retreat, a shock followed. Deer learned to avoid areas associated with shock. We concluded that place avoidance induced through negative reinforcement may be a feasible means to protect valuable resources from resident animals. However, the technological limitations of the tested devices, the costs to implement, and the training required for individual deer reduced the practicality of this approach for highly mobile animals and as a means to protect resources of low economic significance.
Genetic fingerprinting reveals natal origins of male leatherback turtles encountered in the Atlantic Ocean and Mediterranean Sea
This paper is not subject to U.S. copyright. The definitive version was published in Marine Biology 164 (2017): 181, doi:10.1007/s00227-017-3211-0. Understanding population dynamics in broadly distributed marine species with cryptic life history stages is challenging. Information on the population dynamics of sea turtles tends to be biased toward females, due to their accessibility for study on nesting beaches. Males are encountered only at sea; there is little information about their migratory routes, residence areas, foraging zones, and population boundaries. In particular, male leatherbacks (Dermochelys coriacea) are quite elusive; little is known about adult and juvenile male distribution or behavior. The at-sea distribution of male turtles from different breeding populations is not known. Here, 122 captured or stranded male leatherback turtles from the USA, Turkey, France, and Canada (collected 1997–2012) were assigned to one of nine Atlantic basin populations using genetic analysis with microsatellite DNA markers. We found that all turtles originated from western Atlantic nesting beaches (Trinidad 55%, French Guiana 31%, and Costa Rica 14%). Although genetic data for other Atlantic nesting populations were represented in the assignment analysis (St. Croix, Brazil, Florida, and Africa (west and south)), none of the male leatherbacks included in this study were shown to originate from these populations. This was an unexpected result based on estimated source population sizes. One stranded turtle from Turkey was assigned to French Guiana, while others that were stranded in France were from Trinidad or French Guiana breeding populations. For 12 male leatherbacks in our dataset, natal origins determined from the genetic assignment tests were compared to published satellite and flipper tag information to provide evidence of natal homing for male leatherbacks, which corroborated our genetic findings.
Our focused study on male leatherback natal origins provides information not previously known for this cryptic, but essential, component of the breeding population. This method should provide a guideline for future studies, with the ultimate goal of improving management and conservation strategies for threatened and endangered species by taking the male component of the breeding population into account. Sample collection in Nova Scotia, Canada, was supported by funding from Canadian Wildlife Federation, Environment Canada, Fisheries and Oceans Canada, George Cedric Metcalf Foundation, Habitat Stewardship Program for Species at Risk, National Fish and Wildlife Foundation (USA), National Marine Fisheries Service (USA), Natural Sciences and Engineering Research Council of Canada, and World Wildlife Fund Canada. Funding for US samples was provided by National Oceanic and Atmospheric Administration, Massachusetts Division of Marine Fisheries, National Fish and Wildlife Foundation, and Cape Cod Commercial Fisherman's Alliance. Funding support for this analysis and for Kelly R. Stewart was provided by a Lenfest Ocean Program Grant.
Transcriptome Sequencing Reveals Novel Candidate Genes for Cardinium hertigii-Caused Cytoplasmic Incompatibility and Host-Cell Interaction
Cytoplasmic incompatibility (CI) is an intriguing, widespread, symbiont-induced reproductive failure that decreases offspring production of arthropods through crossing incompatibility of infected males with uninfected females or with females infected with a distinct symbiont genotype. For years, the molecular mechanism of CI remained unknown. Recent genomic, proteomic, biochemical, and cell biological studies have contributed to the understanding of CI in the alphaproteobacterium Wolbachia and implicate genes associated with the WO prophage. Besides a recently discovered additional lineage of alphaproteobacterial symbionts only moderately related to Wolbachia, Cardinium (Bacteroidetes) is the only other symbiont known to cause CI, and genomic evidence suggests that it has very little homology with Wolbachia and evolved this phenotype independently. Here, we present the first transcriptomic study of the CI Cardinium strain cEper1, in its natural host, Encarsia suzannae, to detect important CI candidates and genes involved in the insect-Cardinium symbiosis. Highly expressed transcripts included genes involved in manipulating ubiquitination, apoptosis, and host DNA. Female-biased expression of genes encoding ribosomal proteins suggests an increase in the general translational activity of Cardinium in female wasps. The results confirm previous genomic analyses that indicated that Wolbachia and Cardinium utilize different genes to induce CI, and transcriptome patterns further highlight expression of some common pathways that these bacteria use to interact with the host and potentially cause this enigmatic and fundamental manipulation of host reproduction.
Objective Coding of Content and Techniques in Workplace-Based Supervision of an EBT in Public Mental Health
BACKGROUND: Workplace-based clinical supervision as an implementation strategy to support evidence-based treatment (EBT) in public mental health has received limited research attention. A commonly provided infrastructure support, it may offer a relatively cost-neutral implementation strategy for organizations. However, research has not objectively examined workplace-based supervision of EBT and specifically how it might differ from EBT supervision provided in efficacy and effectiveness trials.
METHODS: Data come from a descriptive study of supervision in the context of a state-funded EBT implementation effort. Verbal interactions from audio recordings of 438 supervision sessions between 28 supervisors and 70 clinicians from 17 public mental health organizations (in 23 offices) were objectively coded for presence and intensity coverage of 29 supervision strategies (16 content and 13 technique items), duration, and temporal focus. Random effects mixed models estimated proportion of variance in content and techniques attributable to the supervisor and clinician levels.
RESULTS: Interrater reliability among coders was excellent. EBT cases averaged 12.4 min of supervision per session. Intensity of coverage for EBT content varied, with some discussed frequently at medium or high intensity (exposure) and others infrequently discussed or discussed only at low intensity (behavior management; assigning/reviewing client homework). Other than fidelity assessment, supervision techniques common in treatment trials (e.g., reviewing actual practice, behavioral rehearsal) were used rarely or primarily at low intensity. In general, EBT content clustered more at the clinician level; different techniques clustered at either the clinician or supervisor level.
CONCLUSIONS: Workplace-based clinical supervision may be a feasible implementation strategy for supporting EBT implementation, yet it differs from supervision in treatment trials. Time allotted per case is limited, compressing time for EBT coverage. Techniques that involve observation of clinician skills are rarely used. Workplace-based supervision content appears to be tailored to individual clinicians and driven to some degree by the individual supervisor. Our findings point to areas for intervention to enhance the potential of workplace-based supervision for implementation effectiveness.
TRIAL REGISTRATION: NCT01800266, ClinicalTrials.gov, retrospectively registered (registration for this descriptive study occurred prior to any intervention; the study is part of a phase II RCT, and this manuscript reports only the phase I descriptive results).
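The abstract's random-effects mixed models partition variance in content and technique coverage across the supervisor and clinician levels; that partitioning corresponds to intraclass correlations computed from variance components. The sketch below shows the arithmetic with hypothetical placeholder values, not figures from the study.

```python
# How variance in a supervision-coding score can be partitioned across
# nesting levels via intraclass correlations (ICCs), as in the abstract's
# random-effects mixed models. All variance components here are hypothetical
# placeholders for illustration, not values estimated in the study.

def icc(component, *all_components):
    """Share of total variance attributable to one component."""
    return component / sum(all_components)

var_supervisor = 0.30  # hypothetical between-supervisor variance
var_clinician = 0.50   # hypothetical between-clinician variance
var_residual = 1.20    # hypothetical session-level residual variance

icc_sup = icc(var_supervisor, var_supervisor, var_clinician, var_residual)
icc_cli = icc(var_clinician, var_supervisor, var_clinician, var_residual)
print(f"supervisor ICC = {icc_sup:.2f}, clinician ICC = {icc_cli:.2f}")
```

A larger clinician-level ICC, as in this toy example, would mirror the abstract's finding that EBT content clustered more at the clinician level than at the supervisor level.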
Improving practice in community-based settings: a randomized trial of supervision – study protocol
Background: Evidence-based treatments for child mental health problems are not consistently available in public mental health settings. Expanding availability requires workforce training. However, research has demonstrated that training alone is not sufficient for changing provider behavior, suggesting that ongoing intervention-specific supervision or consultation is required. Supervision is notably under-investigated, particularly as provided in public mental health. The degree to which supervision in this setting includes 'gold standard' supervision elements from efficacy trials (e.g., session review, model fidelity, outcome monitoring, skill-building) is unknown. The current federally-funded investigation leverages the Washington State Trauma-focused Cognitive Behavioral Therapy Initiative to describe usual supervision practices and test the impact of systematic implementation of gold standard supervision strategies on treatment fidelity and clinical outcomes. Methods/Design: The study has two phases. We will conduct an initial descriptive study (Phase I) of supervision practices within public mental health in Washington State followed by a randomized controlled trial of gold standard supervision strategies (Phase II), with randomization at the clinician level (i.e., supervisors provide both conditions). Study participants will be 35 supervisors and 130 clinicians in community mental health centers. We will enroll one child per clinician in Phase I (N = 130) and three children per clinician in Phase II (N = 390). We use a multi-level mixed within- and between-subjects longitudinal design. Audio recordings of supervision and therapy sessions will be collected and coded throughout both phases. Child outcome data will be collected at the beginning of treatment and at three and six months into treatment. Discussion: This study will provide insight into how supervisors can optimally support clinicians delivering evidence-based treatments.
Phase I will provide descriptive information, currently unavailable in the literature, about commonly used supervision strategies in community mental health. The Phase II randomized controlled trial is, to our knowledge, the first experimental study of gold standard supervision strategies in community mental health and will yield needed information about how to leverage supervision to improve clinician fidelity and client outcomes. Trial registration: ClinicalTrials.gov NCT01800266.
Attributions of cancer 'alarm' symptoms in a community sample
© 2014 Whitaker et al. Background: Attribution of early cancer symptoms to a non-serious cause may lead to longer diagnostic intervals. We investigated attributions of potential cancer 'alarm' and non-alarm symptoms experienced in everyday life in a community sample of adults, without mention of a cancer context. Methods: A questionnaire was mailed to 4858 adults (≥50 years old, no cancer diagnosis) through primary care, asking about symptom experiences in the past 3 months. The word cancer was not mentioned. Target 'alarm' symptoms, publicised by Cancer Research UK, were embedded in a longer symptom list. For each symptom experienced, respondents were asked for their attribution ('what do you think caused it'), concern about seriousness ('not at all' to 'extremely'), and help-seeking ('did you contact a doctor about it': Yes/No). Results: The response rate was 35% (n=1724). Over half the respondents (915/1724; 53%) had experienced an 'alarm' symptom, and 20 (2%) cited cancer as a possible cause. Cancer attributions were highest for 'unexplained lump', at 7% (6/87), and lowest for 'unexplained weight loss' (0/47). A higher proportion (375/1638; 23%) were concerned their symptom might be 'serious', ranging from 12% (13/112) for change in a mole to 41% (100/247) for unexplained pain. Just over half (59%) had contacted their doctor about their symptom, although this varied by symptom. Alarm symptoms were appraised as more serious than non-alarm symptoms, and were more likely to trigger help-seeking. Conclusions: Consistent with retrospective reports from cancer patients, 'alarm' symptoms experienced in daily life were rarely attributed to cancer. These results have implications for understanding how people appraise and act on symptoms that could be early warning signs of cancer.
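The percentages quoted in this abstract can be re-derived from the raw counts it reports. The sketch below is a simple consistency check on those figures, using only numbers taken from the abstract itself.

```python
# Re-deriving the percentages reported in the abstract from their raw counts,
# as a consistency check on the figures quoted above.

def pct(numerator, denominator):
    """Percentage rounded to the nearest whole point."""
    return round(100.0 * numerator / denominator)

assert pct(1724, 4858) == 35   # response rate
assert pct(915, 1724) == 53    # experienced an 'alarm' symptom
assert pct(375, 1638) == 23    # concerned symptom might be 'serious'
assert pct(6, 87) == 7         # cancer attribution for 'unexplained lump'
print("all reported percentages check out")
```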
A 12-week longitudinal study of microbial translocation and systemic inflammation in undernourished HIV-infected Zambians initiating antiretroviral therapy.
BACKGROUND: Undernourished, HIV-infected adults in sub-Saharan Africa have high levels of systemic inflammation, which is a risk factor for mortality and other adverse health outcomes. We hypothesized that microbial translocation, due to the deleterious effects of HIV and poor nutrition on intestinal defenses and mucosal integrity, contributes to heightened systemic inflammation in this population, and that reductions in inflammation on antiretroviral therapy (ART) accompany reductions in translocation. METHODS: HIV-infected, Zambian adults with a body mass index <18.5 kg/m2 were recruited for a pilot study to assess the relationships between microbial translocation and systemic inflammation over the first 12 weeks of ART. To assess microbial translocation we measured serum lipopolysaccharide binding protein (LBP), endotoxin core IgG and IgM, and soluble CD14, and to assess intestinal permeability we measured the urinary excretion of an oral lactulose dose normalized to urinary creatinine (Lac/Cr ratio). Linear mixed models were used to assess within-patient changes in these markers relative to serum C-reactive protein (CRP), tumor necrosis factor-α receptor 1 (TNF-α R1), and soluble CD163 over 12 weeks, in addition to relationships between variables independent of time point and adjusted for age, sex, and CD4+ count. RESULTS: Thirty-three participants had data at recruitment and at 12 weeks: 55% were male, median age was 36 years, and median baseline CD4+ count was 224 cells/μl. Over the first 12 weeks of ART, there were significant decreases in serum levels of LBP (median change -8.7 μg/ml, p = 0.01), TNF-α receptor 1 (-0.31 ng/ml, p < 0.01), and CRP (-3.5 mg/l, p = 0.02). The change in soluble CD14 level over 12 weeks was positively associated with the change in CRP (p < 0.01) and soluble CD163 (p < 0.01).
Pooling data at baseline and 12 weeks, serum LBP was positively associated with CRP (p = 0.01), while endotoxin core IgM was inversely associated with CRP (p = 0.01) and TNF-α receptor 1 (p = 0.04). The Lac/Cr ratio was not associated with any serum biomarkers. CONCLUSIONS: In undernourished HIV-infected adults in Zambia, biomarkers of increased microbial translocation are associated with high levels of systemic inflammation before and after initiation of ART, suggesting that impaired gut immune defenses contribute to innate immune activation in this population.
Dietary intake in a randomized-controlled pilot of NOURISH: A parent intervention for overweight children
NOURISH is a community-based treatment program for parents of overweight and obese children (ages 6–11, BMI ≥ 85th percentile). This study examined the impact of NOURISH on child and parent dietary intake, which were secondary trial outcomes.
Host-linked soil viral ecology along a permafrost thaw gradient
Climate change threatens to release abundant carbon that is sequestered at high latitudes, but the constraints on microbial metabolisms that mediate the release of methane and carbon dioxide are poorly understood [1-7]. The role of viruses, which are known to affect microbial dynamics, metabolism and biogeochemistry in the oceans [8-10], remains largely unexplored in soil. Here, we aimed to investigate how viruses influence microbial ecology and carbon metabolism in peatland soils along a permafrost thaw gradient in Sweden. We recovered 1,907 viral populations (genomes and large genome fragments) from 197 bulk soil and size-fractionated metagenomes, 58% of which were detected in metatranscriptomes and presumed to be active. In silico predictions linked 35% of the viruses to microbial host populations, highlighting likely viral predators of key carbon-cycling microorganisms, including methanogens and methanotrophs. Lineage-specific virus/host ratios varied, suggesting that viral infection dynamics may differentially impact microbial responses to a changing climate. Virus-encoded glycoside hydrolases, including an endomannanase with confirmed functional activity, indicated that viruses influence complex carbon degradation, and viral abundances were significant predictors of methane dynamics. These findings suggest that viruses may impact ecosystem function in climate-critical, terrestrial habitats and identify multiple potential viral contributions to soil carbon cycling.