
    Nodular Worm Infection in Wild Chimpanzees in Western Uganda: A Risk for Human Health?

    This study focused on Oesophagostomum sp., and more specifically on O. bifurcum, a parasite that can be lethal to humans and is widespread among humans and monkeys in endemic regions, but has not yet been documented in apes. Its epidemiology and the role played by non-human primates in its transmission are still poorly understood. O. stephanostomum is the only species diagnosed in chimpanzees so far. Until recently, O. bifurcum was assumed to have a high zoonotic potential, but recent findings suggest that O. bifurcum of non-human primates and humans might be genetically distinct. As the closest relative of human beings, and a species living in spatial proximity to humans at the field site studied, Pan troglodytes is thus an interesting host to investigate. Recently, a role for chimpanzees in the emergence of HIV and malaria in humans has been documented. In the framework of our long-term health monitoring of wild chimpanzees from Kibale National Park in Western Uganda, we analysed 311 faecal samples. Coproscopy revealed that high-ranking males are more heavily infected than other individuals; these chimpanzees are also the most frequent crop-raiders. Results from PCR assays conducted on larvae and dried faeces also revealed that both O. stephanostomum and O. bifurcum infect chimpanzees, with the two species co-existing in the same individuals. Because contacts between humans and great apes are increasing with ecotourism and forest fragmentation in areas of high population density, this paper emphasizes that the presence of potential zoonotic parasites should be viewed as a major concern for public health. Investigations of the parasite status of people living around the park or working inside it, as well as of sympatric non-human primates, should be planned, and further research might reveal this as a promising avenue for reinforcing measures against crop-raiding.

    Treatment options for subjective tinnitus: Self reports from a sample of general practitioners and ENT physicians within Europe and the USA

    Background: Tinnitus affects about 10-15% of the general population, and the risk of developing tinnitus is rising with increased exposure to leisure noise, for example through listening to personal music players at high volume. The disorder is considerably heterogeneous, so no single mechanism is likely to explain the presence of tinnitus in all those affected. As such, there is no standardized management pathway nor a single effective treatment for the condition. The choice of clinical intervention is a multi-factorial decision based on many factors, including assessment of patient needs and the healthcare context. The present research surveyed clinicians working in six Westernized countries with the aims: a) to establish the range of referral pathways, b) to evaluate the typical treatment options for categories of subjective tinnitus defined as acute or chronic, and c) to seek clinical opinion about levels of satisfaction with current standards of practice. Methods: A structured online questionnaire was conducted with 712 physicians who reported seeing at least one tinnitus patient in the previous three months. They were 370 general practitioners (GPs) and 365 ear-nose-throat specialists (ENTs) from the US, Germany, UK, France, Italy and Spain. Results: Our international comparison of health systems for tinnitus revealed that although the characteristics of tinnitus appeared broadly similar across countries, patients' experience of clinical services differed widely. GPs and ENTs were always involved in referral and management to some degree, but multi-disciplinary teams engaged either neurology (Germany, Italy and Spain) or audiology (UK and US) professionals. For acute subjective tinnitus, pharmacological prescriptions were common, while audiological and psychological approaches were more typical for chronic subjective tinnitus, with several treatment options being highly country-specific. All therapy options were associated with low levels of satisfaction. Conclusions: Despite a large variety of treatment options, the low success rates of tinnitus therapy lead to frustration of physicians and patients alike. For subjective tinnitus in particular, effective therapeutic options with guidelines about key diagnostic criteria are urgently needed.

    Routine use of ancillary investigations in staging diffuse large B-cell lymphoma improves the International Prognostic Index (IPI)

    Background: The International Prognostic Index (IPI) is used to determine prognosis in diffuse large B-cell lymphoma (DLBCL). One of the determinants of the IPI is the stage of disease, with bone marrow involvement being classified as stage IV. For the IPI, bone marrow involvement is traditionally defined on the basis of histology, with ancillary investigations used only in difficult cases to aid histological diagnosis. This study aimed to determine the effect of the routine use of flow cytometry, immunohistochemistry and molecular studies in bone marrow staging upon the IPI. Results: Bone marrow trephines of 156 histologically proven DLBCL cases at initial diagnosis were assessed on routine histology and immunohistochemistry using two T-cell markers (CD45RO and CD3), two B-cell markers (CD20 and CD79a) and kappa and lambda light chains. Raw flow cytometry data on all samples were reanalysed and reinterpreted blindly. DNA extracted from archived paraffin-embedded trephine biopsy samples was used for immunoglobulin heavy chain and light chain gene rearrangement analysis. Using immunophenotyping (flow cytometry and immunohistochemistry), 30 (19.2%) cases were upstaged to stage IV. A further 8 (5.1%) cases were upstaged using molecular studies. A change in IPI was noted in 18 cases (11.5%) on immunophenotyping alone, and in 22 cases (14.1%) on immunophenotyping and molecular testing. Two revised IPI models, one using immunophenotyping alone and one using immunophenotyping with molecular studies, were compared with the baseline IPI using a Cox regression model. This showed that the revised IPI model using immunophenotyping provides the best differentiation between the IPI categories. Conclusion: Improved bone marrow staging using flow cytometry and immunohistochemistry improves the predictive value of the IPI in patients with DLBCL and should be performed routinely in all cases.
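    As an illustration of the kind of model comparison described above, the sketch below fits Cox proportional-hazards models for a baseline and a revised IPI with the Python lifelines library and compares their discrimination by concordance index. The data frame, column names and values are hypothetical, not taken from the study.

```python
# Illustrative sketch only: compare a baseline IPI with a revised IPI
# (after immunophenotype-based upstaging) as survival predictors.
# All data and column names below are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time":         [12, 34, 5, 48, 22, 60, 9, 41],   # follow-up in months
    "event":        [1, 0, 1, 0, 1, 0, 1, 0],          # 1 = death/progression
    "ipi_baseline": [1, 2, 4, 0, 3, 1, 4, 3],
    "ipi_revised":  [2, 2, 4, 0, 4, 1, 5, 3],          # some cases upstaged
})

def fit_cox(predictor):
    """Fit a Cox proportional-hazards model using one IPI variant."""
    cph = CoxPHFitter()
    cph.fit(df[["time", "event", predictor]],
            duration_col="time", event_col="event")
    return cph

for predictor in ("ipi_baseline", "ipi_revised"):
    model = fit_cox(predictor)
    # Higher concordance indicates better separation of risk groups.
    print(predictor, "concordance:", round(model.concordance_index_, 3))
```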

    Mosaic Convergence of Rodent Dentitions

    BACKGROUND: Understanding the mechanisms responsible for changes in tooth morphology in the course of evolution is an area of investigation common to both paleontology and developmental biology. Detailed analyses of molar tooth crown shape have shown frequent homoplasy in mammalian evolution, which requires accurate investigation of the evolutionary pathways provided by the fossil record. The necessity of preserving an effective occlusion has been hypothesized to functionally constrain crown morphological changes and also to facilitate convergent evolution. The Muroidea superfamily constitutes a relevant model for the study of molar crown diversification because it encompasses one third of extant mammalian biodiversity. METHODOLOGY/PRINCIPAL FINDINGS: Combined microwear and 3D-topographic analyses performed on fossil and extant muroid molars allow a first quantification of the relationships between changes in crown morphology and the functionality of occlusion. Based on an abundant fossil record and on a well-resolved phylogeny, our results show that the most derived functional condition associates longitudinal chewing with non-interlocking cusps. This condition has been reached at least 7 times within muroids via two main types of evolutionary pathways, each respecting functional continuity. In the first type, the flattening of the tooth crown, which removes cusp interlocking, occurs before the rotation of the chewing movement. In the second type, however, flattening follows the rotation of the chewing movement, which can be associated with certain changes in cusp morphology. CONCLUSION/SIGNIFICANCE: The reverse orders of the changes involved in these different pathways reveal a mosaic evolution of the mammalian dentition in which the direction of chewing and crown shape seem to be partly decoupled. Either can change with respect to strong functional constraints affecting occlusion, which thereby limit the number of possible pathways. Because convergent pathways imply distinct ontogenetic trajectories, new Evo/Devo comparative studies on cusp morphogenesis are necessary.

    Epidemiology and Molecular Relationships of Cryptosporidium spp. in People, Primates, and Livestock from Western Uganda

    Cryptosporidium is a common gastrointestinal parasite known for its zoonotic potential. We found Cryptosporidium in 32.4% of people, 11.1% of non-human primates, and 2.2% of livestock in the region of Kibale National Park, Uganda. In people, infection rates were higher in one community than elsewhere, and fetching water from an open water source increased the probability of infection. Phylogenetic analyses identified clusters of Cryptosporidium with mixed host origins in people, primates, and livestock outside the park; however, parasites from primates inside the park were genetically divergent, suggesting a separate sylvatic transmission cycle. Infection was not associated with clinical disease in people, even in the case of co-infection with the gastrointestinal parasite Giardia duodenalis. Parasites such as Cryptosporidium may be maintained through frequent cross-species transmission in tropical settings where people, livestock, and wildlife interact frequently, but the parasite may undergo more host-specific transmission where such interactions do not occur. Persistent low-level shedding and immunity may limit the clinical effects of infection in such settings.
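    A minimal sketch of how a risk-factor association of the kind reported above (fetching water from an open source and infection status) could be estimated with a logistic regression is shown below; the data are simulated and the variable names are hypothetical, not drawn from the study.

```python
# Minimal sketch (hypothetical, simulated data): estimating the association
# between fetching water from an open source and infection status with a
# logistic regression, reported as an odds ratio with a 95% CI.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
open_water = rng.integers(0, 2, size=n)          # 1 = fetches from an open source
# Simulate infection with higher odds for open-water users.
logit_p = -1.5 + 1.0 * open_water
infected = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(open_water.astype(float))    # intercept + exposure
model = sm.Logit(infected, X).fit(disp=False)
odds_ratio = np.exp(model.params[1])
ci_low, ci_high = np.exp(model.conf_int()[1])
print(f"Odds ratio for open-water fetching: {odds_ratio:.2f} "
      f"(95% CI {ci_low:.2f}-{ci_high:.2f})")
```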

    Evidence and morality in harm-reduction debates: can we use value-neutral arguments to achieve value-driven goals?

    It is common to argue that politicians make selective use of evidence to tacitly reinforce their moral positions, but all stakeholders combine facts and values when producing and using research for policy. The drug policy debate has largely been framed in terms of an opposition between evidence and politics. Focusing on harm reduction provides useful ground on which to discuss a further opposition proposed by evidence advocates: that between evidence and morality. Can evidence sway individuals from their existing moral positions, so as to “neutralise” morality? And if not, should evidence advocates change the way in which they frame their arguments? To address these questions, we analysed 27 interviews with stakeholders involved in drug policy and harm reduction research, advocacy, lobbying, implementation and decision-making in England, UK, and New South Wales, Australia. Participants’ accounts suggest that although evidence can help focus discussions away from values and principles, exposure to evidence does not necessarily change deeply held views. Whether stakeholders decide to go with the evidence or not seems contingent on whether they embrace a view of evidence as secular faith, a view that is shaped by experience, politics, training, and role. And yet morality, values, and emotions underpin all stakeholders’ views, motivating their commitment to drug policy and harm reduction. Evidence advocates might thus benefit from morally and emotionally engaging their audiences. This paper aims to develop better tools for analysing the role of morality in decision-making, starting with moral foundations theory. Using tools from disciplines such as moral psychology is relevant to the study of the politics of evidence-based policymaking.

    The Gender Congruency Effect across languages in bilinguals: A meta-analysis

    In the study of gender representation and processing in bilinguals, two contrasting perspectives exist: the integrated view vs. the autonomous view (Costa, Kovacic, Fedorenko, & Caramazza, 2003). In the former, cross-linguistic interactions during the selection of grammatical gender values are expected; in the latter, they are not. To address this issue, authors have typically explored the cross-linguistic Gender Congruency Effect (GCE: a facilitation in the naming or translation of second language [L2] nouns when their first language [L1] translations are of the same gender, compared with those of a different gender). However, the literature suggests that this effect is sometimes difficult to observe and might vary as a function of variables such as the syntactic structure produced to translate or name the target (bare nouns vs. noun phrases), the phonological gender transparency of both languages (whether or not they have phonological gender cues associated with the ending letter [e.g., “–a” for feminine words and “–o” for masculine words in Romance languages]), the degree of L2 proficiency, and task requirements (naming vs. translation). The aim of the present quantitative meta-analysis is to examine the robustness of the cross-linguistic GCE obtained during language production. It involves 25 experiments from 11 studies. The results support a bilingual gender-integrated view, in that they show a small but significant GCE regardless of the variables mentioned above. This paper was funded by the FCT (Foundation for Science and Technology, Portugal) through the state budget with reference IF/00784/2013/CP1158/CT0013. The study has also been partially supported by the FCT and the Portuguese Ministry of Science, Technology and Higher Education through national funds and co-financed by FEDER through COMPETE2020 under the PT2020 Partnership Agreement (POCI-01-0145-FEDER-007653), by the Government of Spain (Ministry of Education, Culture and Sports) through the Training Program for Academic Staff (Ayudas para la Formación del Profesorado Universitario, FPU grant BOE-B-2017-2646), by the research project granted by the Spanish Ministry of Economy and Competitiveness (reference PSI2015-65116-P), and by the grant for research groups from the Galician Government (reference ED431B 2019/2020).
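    For readers unfamiliar with how a pooled effect of this kind is typically computed, the following is a minimal, self-contained sketch of a DerSimonian-Laird random-effects meta-analysis, a standard approach for quantitative syntheses like this one; the per-experiment effect sizes and variances are invented for illustration and are not the values analysed in the paper.

```python
# Illustrative sketch of a DerSimonian-Laird random-effects meta-analysis,
# the kind of pooling typically used to estimate an overall effect across
# experiments. Effect sizes and variances below are invented.
import numpy as np

def random_effects_pool(effects, variances):
    """Return pooled effect, its standard error, and tau^2 (between-study variance)."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                                  # fixed-effect weights
    fixed_mean = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed_mean) ** 2)          # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                        # DerSimonian-Laird estimate
    w_star = 1.0 / (variances + tau2)                    # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical per-experiment effect sizes (e.g., Hedges' g) and variances.
g = [0.12, 0.30, 0.05, 0.22, 0.18]
v = [0.02, 0.04, 0.03, 0.05, 0.02]
pooled, se, tau2 = random_effects_pool(g, v)
print(f"pooled g = {pooled:.3f} +/- {1.96 * se:.3f} (tau^2 = {tau2:.3f})")
```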

    Effects of Wolves on Elk and Cattle Behaviors: Implications for Livestock Production and Wolf Conservation

    BACKGROUND: In many areas, livestock are grazed within wolf (Canis lupus) range. Predation on and harassment of livestock by wolves create conflict and are a significant challenge for wolf conservation. Wild prey, such as elk (Cervus elaphus), perform anti-predator behaviors. Artificial selection of cattle (Bos taurus) might have resulted in the attenuation or absence of anti-predator responses, or in erratic and inconsistent responses. Regardless, such responses might have implications for stress and fitness. METHODOLOGY/PRINCIPAL FINDINGS: We compared elk and cattle anti-predator responses to wolves in southwest Alberta, Canada, within home ranges and livestock pastures, respectively. We deployed satellite- and GPS-telemetry collars on wolves, elk, and cattle (n = 16, 10 and 78, respectively) and measured seven prey response variables during periods of wolf presence and absence (speed, path sinuosity, time spent head-up, distance to neighboring animals, terrain ruggedness, slope and distance to forest). During independent periods of wolf presence (n = 72), individual elk increased path sinuosity (Z = -2.720, P = 0.007) and used more rugged terrain (Z = -2.856, P = 0.004) and steeper slopes (Z = -3.065, P = 0.002). For cattle, individual as well as group behavioral analyses were feasible, and these indicated increased path sinuosity (Z = -2.720, P = 0.007) and decreased distance to neighbors (Z = -2.551, P = 0.011). In addition, cattle groups showed a number of behavioral changes concomitant with wolf visits, with variable direction of change. CONCLUSIONS/SIGNIFICANCE: Our results suggest that both elk and cattle modify their behavior in relation to wolf presence, with potential energetic costs. Our study does not allow us to evaluate the efficacy of anti-predator behaviors, but it indicates that artificial selection did not result in their absence in cattle. The costs of wolf predation on livestock are often compensated on the basis of just the market value of the animal killed. However, society might consider compensating for some additional costs (e.g., weight loss and reduced reproduction) that might be associated with the changes in cattle behavior that we documented.
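    The paired Z statistics above are the kind produced by a nonparametric paired test such as the Wilcoxon signed-rank test; the sketch below shows how one such comparison (a behavioral metric for the same individuals during wolf presence vs. absence) could be run in Python. The measurements are invented for illustration, and this is not necessarily the exact procedure used in the study.

```python
# Sketch of a paired comparison of a behavioral metric (e.g., path sinuosity)
# for the same individuals during wolf presence vs. absence, using a
# Wilcoxon signed-rank test. The measurements below are invented.
from scipy.stats import wilcoxon

sinuosity_wolf_absent  = [1.10, 1.25, 1.05, 1.40, 1.18, 1.30, 1.22, 1.15, 1.35, 1.12]
sinuosity_wolf_present = [1.32, 1.41, 1.20, 1.55, 1.25, 1.52, 1.30, 1.28, 1.60, 1.19]

# Paired, two-sided test: does sinuosity differ when wolves are present?
stat, p_value = wilcoxon(sinuosity_wolf_absent, sinuosity_wolf_present)
print(f"Wilcoxon statistic = {stat}, p = {p_value:.3f}")
```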

    Secondary Endoleak Management Following TEVAR and EVAR.

    Endovascular abdominal and thoracic aortic aneurysm repair (EVAR and TEVAR) are widely used to treat increasingly complex aneurysms. Secondary endoleaks, defined as those detected more than 30 days after the procedure and after previous negative imaging, remain a challenge for aortic specialists, conferring a need for long-term surveillance and reintervention. Endoleaks are classified on the basis of their anatomic site and aetiology. Type 1 and type 2 endoleaks (EL1 and EL2) are the most common endoleaks necessitating intervention. Their management requires an understanding of their mechanics and of the risk of sac enlargement and rupture due to increased sac pressure. Endovascular techniques are the main treatment approach for managing secondary endoleaks; however, surgery should be considered where endovascular treatment fails to arrest aneurysm growth. This chapter reviews the aetiology, significance, management strategies and techniques for the different endoleak types.