    Seeing two faces together: preference formation in humans and rhesus macaques

    Humans, great apes and Old World monkeys show selective attention to faces depending on conspecificity, familiarity, and social status, supporting the view that primates share similar face processing mechanisms. Although many studies have examined face scanning strategies in monkeys and humans, the mechanisms influencing viewing preference have received little attention. To determine how face categories influence viewing preference in humans and rhesus macaques (Macaca mulatta), we performed two eye-tracking experiments using a visual preference task in which pairs of faces from different species were presented simultaneously. The results indicated that viewing time was significantly influenced by the pairing of the face categories. Humans showed a strong bias towards an own-race face in an Asian–Caucasian condition. Rhesus macaques directed more attention towards non-human primate faces when they were paired with human faces, regardless of the species. When rhesus faces were paired with faces from Barbary macaques (Macaca sylvanus) or chimpanzees (Pan troglodytes), the novel species' faces attracted more attention. These results indicate that monkeys' viewing preferences, as assessed by a visual preference task, are modulated by several factors, species and dominance being the most influential.

    Ethanol reversal of tolerance to the respiratory depressant effects of morphine

    Opioids are the most common drugs associated with unintentional drug overdose. Death results from respiratory depression. Prolonged use of opioids results in the development of tolerance, but the degree of tolerance is thought to vary between different effects of the drugs. Many opioid addicts regularly consume alcohol (ethanol), and post-mortem analyses of opioid overdose deaths have revealed an inverse correlation between blood morphine and ethanol levels. In the present study, we determined whether ethanol reduced tolerance to the respiratory depressant effects of opioids. Mice were treated with opioids (morphine, methadone, or buprenorphine) for up to 6 days. Respiration was measured in freely moving animals breathing 5% CO2 in air in plethysmograph chambers. Antinociception (analgesia) was measured as the latency to remove the tail from a thermal stimulus. Opioid tolerance was assessed by measuring the response to a challenge dose of morphine (10 mg/kg i.p.). Tolerance developed to the respiratory depressant effect of morphine, but at a slower rate than tolerance to its antinociceptive effect. A low dose of ethanol (0.3 mg/kg) alone did not depress respiration, but in animals treated with morphine for a prolonged period, respiratory depression was observed when ethanol was co-administered with the morphine challenge. Ethanol did not alter the brain levels of morphine. In contrast, in methadone- or buprenorphine-treated animals no respiratory depression was observed when ethanol was co-administered along with the morphine challenge. As heroin is converted to morphine in man, selective reversal of morphine tolerance by ethanol may be a contributory factor in heroin overdose deaths.

    Elevated hemostasis markers after pneumonia increase one-year risk of all-cause and cardiovascular deaths

    Background: Acceleration of chronic diseases, particularly cardiovascular disease, may increase long-term mortality after community-acquired pneumonia (CAP), but the underlying mechanisms are unknown. Persistence of the prothrombotic state that occurs during an acute infection may increase the risk of subsequent atherothrombosis in patients with pre-existing cardiovascular disease and increase the subsequent risk of death. We hypothesized that circulating hemostasis markers activated during CAP persist at hospital discharge, when patients appear to have recovered clinically, and are associated with higher mortality, particularly due to cardiovascular causes. Methods: In a cohort of survivors of CAP hospitalization from 28 US sites, we measured D-dimer, thrombin-antithrombin complexes (TAT), Factor IX, antithrombin, and plasminogen activator inhibitor-1 at hospital discharge, and determined 1-year all-cause and cardiovascular mortality. Results: Of 893 subjects, most did not have severe pneumonia (70.6% never developed severe sepsis) and only 13.4% required intensive care unit admission. At discharge, 88.4% of subjects had normal vital signs and appeared to have clinically recovered. D-dimer and TAT levels were elevated at discharge in 78.8% and 30.1% of all subjects, and in 51.3% and 25.3% of those without severe sepsis. Higher D-dimer and TAT levels were associated with higher risk of all-cause mortality (hazard ratio ranges 1.66–1.17, p = 0.0001, and 1.46–1.04, p = 0.001, after adjusting for demographics and comorbid illnesses) and cardiovascular mortality (p = 0.009 and 0.003 in competing risk analyses). Conclusions: Elevations of TAT and D-dimer levels are common at hospital discharge in patients who appear to have recovered clinically from pneumonia and are associated with higher risk of subsequent death, particularly due to cardiovascular disease. © 2011 Yende et al.
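    The adjusted hazard ratios quoted above come from survival models of the discharge biomarker levels. As a purely illustrative sketch, not the authors' code, with hypothetical column names and data file, a covariate-adjusted Cox proportional hazards fit of this kind could be run in Python with the lifelines package:

```python
# Illustrative only: Cox model relating discharge D-dimer to 1-year all-cause
# mortality, adjusted for demographics and comorbidity. All column names and
# the input file are assumptions, not taken from the study.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cap_discharge_cohort.csv")  # hypothetical cohort extract

cph = CoxPHFitter()
cph.fit(
    df[["time_to_death_days", "died_1yr", "d_dimer_discharge",
        "age", "male", "comorbidity_count"]],
    duration_col="time_to_death_days",   # follow-up time in days
    event_col="died_1yr",                # 1 if the patient died within a year
)
cph.print_summary()  # hazard ratios (exp(coef)) with 95% CIs
```

    Cardiovascular mortality, analysed in the paper with competing risk methods, would require a dedicated competing risks model rather than this all-cause fit.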

    Injury morbidity in an urban and a rural area in Tanzania: an epidemiological survey

    BACKGROUND: Injuries are becoming a major health problem in developing countries. Few population-based studies have been carried out in African countries. We examined the pattern of nonfatal injuries and associated risk factors in an urban and a rural setting of Tanzania. METHODS: A population-based household survey was conducted in 2002. Participants were selected by cluster sampling. A total of 8,188 urban and 7,035 rural residents of all ages participated in the survey. All injuries reported among all household members in the year preceding the interview and resulting in one or more days of restricted activity were included in the analysis. RESULTS: A total of 206 (2.5%) and 303 (4.3%) persons reported having been injured in the urban and rural area, respectively. Although the overall incidence was higher in the rural area, the incidence of major injuries (≥ 30 disability days) was similar in both areas. Males were at a higher risk of having an injury than females. Rural residents were more likely to experience injuries due to falls (OR = 1.6; 95% CI = 1.1–2.3) and cuts (OR = 4.3; 95% CI = 3.0–6.2) but had a lower risk of transport injuries. The most common causes of injury in the urban area were transport injuries and falls. In the rural area, cuts and stabs, of which two thirds were related to agriculture, formed the most common cause. Age was an important risk factor for certain types of injuries. Poverty levels were not significantly associated with experiencing a nonfatal injury. CONCLUSION: The patterns of injury differ in urban and rural areas, partly as a reflection of livelihoods and infrastructure. Rural residents are at a higher overall injury risk than urban residents. This may be important in the development of injury prevention strategies.
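    The odds ratios with 95% confidence intervals reported above (e.g. OR = 1.6, 95% CI 1.1–2.3 for falls among rural residents) are the kind of estimate produced by a logistic regression on the survey data. A minimal, purely hypothetical sketch in Python, with assumed variable names and ignoring the cluster sampling, which a full analysis would address with cluster-robust standard errors:

```python
# Illustrative only: logistic regression giving an odds ratio for fall injuries
# in rural vs. urban residents. File and column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("injury_survey.csv")  # hypothetical: one row per respondent

# fall_injury: 1 if a fall injury was reported; rural: 1 for rural residence
model = smf.logit("fall_injury ~ rural + C(age_group) + male", data=df).fit()

or_point = np.exp(model.params["rural"])        # odds ratio, rural vs. urban
or_ci = np.exp(model.conf_int().loc["rural"])   # 95% confidence interval
print(f"OR: {or_point:.2f}, 95% CI: {or_ci[0]:.2f}-{or_ci[1]:.2f}")
```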

    Evaluation of alternative mosquito sampling methods for malaria vectors in Lowland South - East Zambia.

    Sampling malaria vectors and measuring their biting density is of paramount importance for entomological surveys of malaria transmission. Human landing catch (HLC) has traditionally been regarded as the gold standard method for surveying human exposure to mosquito bites. However, due to the risk of human participants being exposed to mosquito-borne parasites and viruses, a variety of alternative, exposure-free trapping methods were compared in lowland, south-east Zambia. The Centers for Disease Control and Prevention miniature light trap (CDC-LT), Ifakara Tent Trap model C (ITT-C), resting boxes (RB) and window exit traps (WET) were all compared with HLC using a 3 × 3 Latin square design replicated in 4 blocks of 3 houses with long-lasting insecticidal nets, half of which were also sprayed with a residual deltamethrin formulation; this was repeated for 10 rounds of 3 nights of rotation each during both the dry and wet seasons. The mean catches of HLC indoor, HLC outdoor, CDC-LT, ITT-C, WET, RB indoor and RB outdoor were 1.687, 1.004, 3.267, 0.088, 0.004, 0.000 and 0.008 for Anopheles quadriannulatus Theobald, respectively, and 7.287, 6.784, 10.958, 5.875, 0.296, 0.158 and 0.458 for An. funestus Giles, respectively. Indoor CDC-LT was more efficient in sampling An. quadriannulatus and An. funestus than indoor HLC (relative rate [95% confidence interval] = 1.873 [1.653, 2.122] and 1.532 [1.441, 1.628], respectively; P < 0.001 for both). ITT-C was the only alternative other than CDC-LT with sensitivity comparable to indoor HLC for sampling An. funestus (RR = 0.821 [0.765, 0.881], P < 0.001). While the two most sensitive exposure-free techniques primarily capture host-seeking mosquitoes, both have substantial disadvantages for routine community-based surveillance applications: the CDC-LT requires regular recharging of batteries, while the bulkiness of ITT-C makes it difficult to move between sampling locations. RB placed indoors or outdoors and WET had consistently poor sensitivity, so it may be useful to evaluate additional alternative methods, such as pyrethrum spray catches and backpack aspirators, for catching resting mosquitoes.
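    The relative rates with 95% confidence intervals quoted above compare mean nightly catches of each trap against indoor HLC. A hedged sketch of how such estimates could be obtained, not the study's actual analysis; the input file, column names and the choice of a simple Poisson model are all assumptions:

```python
# Illustrative only: Poisson GLM of nightly An. funestus counts by trap method,
# with indoor HLC as the reference level. exp(coefficient) gives the relative
# rate of each method versus indoor HLC.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("trap_catches.csv")  # hypothetical: one row per trap-night

model = smf.glm(
    "funestus_count ~ C(trap, Treatment(reference='HLC_indoor'))"
    " + C(house) + C(rotation)",      # house and rotation round as fixed effects
    data=df,
    family=sm.families.Poisson(),
).fit()

rr = np.exp(model.params)            # relative rates vs. indoor HLC
rr_ci = np.exp(model.conf_int())     # 95% confidence intervals
print(pd.concat([rr, rr_ci], axis=1))
```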

    Long-Term Results of External Upper Esophageal Sphincter Myotomy for Oropharyngeal Dysphagia

    The aim of this work was to assess the efficacy of external myotomy of the upper esophageal sphincter (UES) for oropharyngeal dysphagia. In the period 1991–2006, 28 patients with longstanding dysphagia and/or aspiration problems of different etiologies underwent UES myotomy as a single surgical treatment. The main symptoms were difficulties in swallowing a solid-food bolus, aspiration, and recurrent incidents of solid-food blockages. Pre- and postoperative manometry and videofluoroscopy were used to assess deglutition and aspiration. Outcome was defined as success in the case of complete relief or marked improvement of dysphagia and aspiration, and as failure in the case of partial or no improvement. Initial results showed success in 21 and failure in 7 patients. The best outcomes were observed in patients with dysphagia of unknown origin, non-cancer-related iatrogenic etiology, and neuromuscular disease. No correlation was found between preoperative pharyngeal constrictor muscle activity and success rate. After follow-up of more than 1 year, 20 patients were classified as success and 3 as failure. All successful patients had full oral intake with a normal bolus consistency without clinically significant aspiration. We conclude that in selected cases of oropharyngeal dysphagia, success may be achieved by UES myotomy, with restoration of oral intake of normal bolus consistency.

    Drivers of increased soil erosion in East Africa’s agro-pastoral systems: changing interactions between the social, economic and natural domains

    Increased soil erosion is one of the main drivers of land degradation in East Africa's agricultural and pastoral landscapes. This wicked problem is rooted in historic disruptions to co-adapted agro-pastoral systems. The introduction of agricultural growth policies by centralised governance resulted in temporal and spatial scale mismatches with the complex and dynamic East African environment, which subsequently contributed to soil exhaustion, declining fertility and increased soil erosion. Coercive policies of land use, privatisation, sedentarisation, exclusion and marginalisation led to a gradual erosion of indigenous social and economic structures. Combined with the inability of the new nation-states to provide many of the services necessary for (re)developing the social and economic domains, many communities lack key components enabling sustainable adaptation to changing internal and external shocks and pressures. Exemplary is the absence of growth in agricultural productivity and of livelihood options outside agriculture, which prevents the absorption of an increasing population and pushes communities towards overexploitation of natural resources. This further increases social and economic pressures on ecosystems, locking agro-pastoral systems in a downward spiral of degradation. For land management plans to be developed and implemented sustainably, authorities need to take the complex drivers of increased soil erosion into consideration. Examples of sustainable intensification in response to the demands of population increase demonstrate that the integrity of locally adapted systems needs to be protected, but not isolated, from external pressures. Communities have to increase productivity and diversify their economy by building upon, not abandoning, existing linkages between the social, economic and natural domains. Locally adapted management practices need to be integrated into regional, national and supra-national institutions. A nested political and economic framework, wherein local communities are able to access agricultural technologies and state services, is a key prerequisite for regional development of sustainable agro-pastoral systems that safeguard soil health, food and livelihood security.

    Association of Fidaxomicin with C. difficile spores: Effects of Persistence on Subsequent Spore Recovery, Outgrowth and Toxin Production.

    Background: We have previously shown that fidaxomicin instillation prevents spore recovery in an in vitro gut model, whereas vancomycin does not. The reasons for this are unclear. Here, we have investigated the persistence of fidaxomicin and vancomycin on C. difficile spores, and examined post-antibiotic-exposure spore recovery, outgrowth and toxin production. Methods: Prevalent UK C. difficile ribotypes (n = 10) were incubated with 200 mg/L fidaxomicin, vancomycin or a non-antimicrobial-containing control for 1 h in faecal filtrate or phosphate-buffered saline (PBS). Spores were washed three times with faecal filtrate or PBS, and residual spore-associated antimicrobial activity was determined by bioassay. For three ribotypes (027, 078, 015), antimicrobial-exposed, faecal filtrate-washed spores and controls were inoculated into broth. Viable vegetative and spore counts were enumerated on CCEYL agar. Percentages of phase-bright spores, phase-dark spores and vegetative cells were enumerated by phase-contrast microscopy at 0, 3, 6, 24 and 48 h post-inoculation. Toxin levels (24 and 48 h) were determined by cell cytotoxicity assay. Results: Fidaxomicin, but not vancomycin, persisted on spores of all ribotypes following washing in PBS (mean = 10.1 mg/L; range 4.0–14 mg/L) and faecal filtrate (mean = 17.4 mg/L; range 8.4–22.1 mg/L). Outgrowth and proliferation rates of vancomycin-exposed spores were similar to controls, whereas fidaxomicin-exposed spores showed no vegetative cell growth after 24 and 48 h. At 48 h, toxin levels averaged 3.7 and 3.3 relative units (RU) in control and vancomycin-exposed samples, respectively, but were undetectable in fidaxomicin-exposed samples. Conclusion: Fidaxomicin persists on C. difficile spores, whereas vancomycin does not. This persistence prevents subsequent growth and toxin production in vitro. This may have implications for spore viability, thereby impacting CDI recurrence and transmission rates.

    The impact of the introduction of fidaxomicin on the management of Clostridium difficile infection in seven NHS secondary care hospitals in England: a series of local service evaluations.

    Clostridium difficile infection (CDI) is associated with high mortality. Reducing its incidence is a priority for patients, clinicians, the National Health Service (NHS) and Public Health England alike. In June 2012, fidaxomicin (FDX) was launched for the treatment of adults with CDI. The objective of this evaluation was to collect robust real-world data to understand the effectiveness of FDX in routine practice. In seven hospitals introducing FDX between July 2012 and July 2013, data were collected retrospectively from medical records on CDI episodes occurring in the 12 months before and after the introduction of FDX. All hospitalised patients aged ≥18 years with primary CDI (diarrhoea with presence of toxin A/B and no CDI in the previous 3 months) were included. Recurrence was defined as in-patient diarrhoea re-emergence requiring treatment at any time within 3 months after the first episode. Each hospital had a different protocol for the use of FDX. In hospitals A and B, where FDX was used first line for all primary and recurrent episodes, the recurrence rate fell from 10.6% to 3.1% and from 16.3% to 3.1%, respectively, with a significant reduction in 28-day mortality from 18.2% to 3.1% (p < 0.05) and from 17.3% to 6.3% (p < 0.05) for hospitals A and B, respectively. In hospitals using FDX in selected patients only, the changes in recurrence rates and mortality were less marked. The pattern of adoption of FDX appears to affect its impact on CDI outcomes, with the maximum reduction in recurrence and all-cause mortality where it is used as first-line treatment.
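    The before/after changes in recurrence rate reported above (e.g. 10.6% to 3.1% in hospital A) can be checked as a difference of two proportions. A minimal sketch, using hypothetical counts chosen only to match those percentages rather than the paper's actual data:

```python
# Illustrative only: two-proportion z-test for the drop in recurrence rate.
# The episode denominators are invented for the example.
from statsmodels.stats.proportion import proportions_ztest

recurrences = [17, 5]     # recurrences before / after FDX (hypothetical counts)
episodes = [160, 160]     # primary CDI episodes per 12-month period (hypothetical)

stat, p_value = proportions_ztest(count=recurrences, nobs=episodes)
print(f"z = {stat:.2f}, p = {p_value:.3f}")
```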