
    Does aspirin or non-aspirin non-steroidal anti-inflammatory drug use prevent colorectal cancer in inflammatory bowel disease?

    AIM: To determine whether aspirin or non-aspirin non-steroidal anti-inflammatory drugs (NA-NSAIDs) prevent colorectal cancer (CRC) in patients with inflammatory bowel disease (IBD). METHODS: We performed a systematic review and meta-analysis. We searched for articles reporting the risk of CRC in patients with IBD related to aspirin or NA-NSAID use. Pooled odds ratios (OR) and 95%CIs were determined using a random-effects model. Publication bias was assessed using funnel plots and Egger's test. Heterogeneity was assessed using Cochran's Q and the I² statistic. RESULTS: Eight studies involving 14917 patients and three studies involving 1282 patients provided data on the risk of CRC in patients with IBD taking NA-NSAIDs and aspirin, respectively. The pooled OR of developing CRC after exposure to NA-NSAIDs in patients with IBD was 0.80 (95%CI: 0.39-1.21) and after exposure to aspirin it was 0.66 (95%CI: 0.06-1.39). There was significant heterogeneity (I² > 50%) between the studies. The effect estimates did not change in subgroup analyses by population studied or by whether adjustment or matching was performed. CONCLUSION: There is a lack of high-quality evidence on this important clinical topic. From the available evidence, NA-NSAID or aspirin use does not appear to be chemopreventive for CRC in patients with IBD.
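The pooled ORs above come from a random-effects model, with heterogeneity assessed via Cochran's Q and I². A minimal sketch of DerSimonian–Laird random-effects pooling in Python (the study data below are illustrative placeholders, not the studies included in this review):

```python
import math

def pool_random_effects(log_ors, variances):
    """DerSimonian-Laird random-effects pooling of per-study log odds ratios.

    Returns the pooled OR, its 95% CI bounds, and the I^2 statistic (%).
    """
    w = [1.0 / v for v in variances]                         # inverse-variance weights
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))
    df = len(log_ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights incorporate tau^2
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0      # I^2 heterogeneity (%)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se),
            i2)
```

A dedicated package (e.g. `metafor` in R or `statsmodels` in Python) would be used in practice; this sketch only shows where Q, tau², and I² enter the pooled estimate.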

    Participation and Learning Relationships: A Service-Learning Case Study


    Colonic lesion characterization in inflammatory bowel disease: A systematic review and meta-analysis

    Aim: To perform a systematic review and meta-analysis of the diagnostic accuracy of in vivo lesion characterization in colonic inflammatory bowel disease (IBD) using optical imaging techniques, including virtual chromoendoscopy (VCE), dye-based chromoendoscopy (DBC), magnification endoscopy and confocal laser endomicroscopy (CLE). Methods: We searched Medline, Embase and the Cochrane Library. We performed a bivariate meta-analysis to calculate the pooled estimates of sensitivity, specificity, positive and negative likelihood ratios (+LHR, -LHR), diagnostic odds ratios (DOR), and area under the SROC curve (AUSROC) for each technology group. A subgroup analysis was performed to investigate differences between real-time non-magnified Kudo pit patterns (with VCE and DBC) and real-time CLE. Results: We included 22 studies [1491 patients; 4674 polyps, of which 539 (11.5%) were neoplastic]. Real-time CLE had a pooled sensitivity of 91% (95%CI: 66%-98%), specificity of 97% (95%CI: 94%-98%), and an AUSROC of 0.98 (95%CI: 0.97-0.99). Magnification endoscopy had a pooled sensitivity of 90% (95%CI: 77%-96%) and specificity of 87% (95%CI: 81%-91%). VCE had a pooled sensitivity of 86% (95%CI: 62%-95%) and specificity of 87% (95%CI: 72%-95%). DBC had a pooled sensitivity of 67% (95%CI: 44%-84%) and specificity of 86% (95%CI: 72%-94%). Conclusion: Real-time CLE is a highly accurate technology for differentiating neoplastic from non-neoplastic lesions in patients with colonic IBD. However, most CLE studies were performed by single expert users within tertiary centres, potentially confounding these results.
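The sensitivity, specificity, likelihood ratios, and DOR reported above are all derived from per-study 2×2 tables of index-test calls against the histology reference. A hedged illustration of those definitions (the function name and counts are invented for the example, not taken from the included studies):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Per-study diagnostic accuracy metrics from a 2x2 table
    (e.g. neoplastic vs non-neoplastic lesion calls against histology)."""
    sens = tp / (tp + fn)        # sensitivity: true positive rate
    spec = tn / (tn + fp)        # specificity: true negative rate
    plr = sens / (1 - spec)      # positive likelihood ratio (+LHR)
    nlr = (1 - sens) / spec      # negative likelihood ratio (-LHR)
    dor = plr / nlr              # diagnostic odds ratio, equals (tp*tn)/(fp*fn)
    return sens, spec, plr, nlr, dor
```

The bivariate meta-analysis in the review then pools these per-study pairs while modelling the correlation between sensitivity and specificity, which a simple per-study calculation like this does not capture.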

    Relativity Theory and Time Perception: Single or Multiple Clocks?

    BACKGROUND: Current theories of interval timing assume that humans and other animals time as if using a single, absolute stopwatch that can be stopped or reset on command. Here we evaluate the alternative view that psychological time is represented by multiple clocks, and that these clocks create separate temporal contexts by which duration is judged in a relative manner. Two predictions of the multiple-clock hypothesis were tested. First, that the multiple clocks can be manipulated (stopped and/or reset) independently. Second, that an event of a given physical duration would be perceived as having different durations in different temporal contexts, i.e., would be judged differently by each clock. METHODOLOGY/PRINCIPAL FINDINGS: Rats were trained to time three durations (e.g., 10, 30, and 90 s). When timing was interrupted by an unexpected gap in the signal, rats reset the clock used to time the "short" duration, stopped the "medium" duration clock, and continued to run the "long" duration clock. When the duration of the gap was manipulated, the rats reset these clocks in a hierarchical order, first the "short", then the "medium", and finally the "long" clock. Quantitative modeling assuming re-allocation of cognitive resources in proportion to the relative duration of the gap to the multiple, simultaneously timed event durations was used to account for the results. CONCLUSIONS/SIGNIFICANCE: These results indicate that the three event durations were effectively timed by separate clocks operated independently, and that the same gap duration was judged relative to these three temporal contexts. Results suggest that the brain processes the duration of an event in a manner similar to Einstein's special relativity theory: a given time interval is registered differently by independent clocks dependent upon the context.

    Risk of gastrointestinal bleeding with direct oral anticoagulants: a systematic review and network meta-analysis

    Background: Direct oral anticoagulants are increasingly used for a wide range of indications. However, data are conflicting about the risk of major gastrointestinal bleeding with these drugs. We compared the risk of gastrointestinal bleeding with direct oral anticoagulants, warfarin, and low-molecular-weight heparin. Methods: For this systematic review and meta-analysis, we searched MEDLINE and Embase from database inception to April 1, 2016, for prospective and retrospective studies that reported the risk of gastrointestinal bleeding with use of a direct oral anticoagulant compared with warfarin or low-molecular-weight heparin for all indications. We also searched the Cochrane Library for systematic reviews and assessment evaluations, the National Health Service (UK) Economic Evaluation Database, and ISI Web of Science for conference abstracts and proceedings (up to April 1, 2016). The primary outcome was the incidence of major gastrointestinal bleeding, with all gastrointestinal bleeding as a secondary outcome. We did a Bayesian network meta-analysis to produce incidence rate ratios (IRRs) with 95% credible intervals (CrIs). Findings: We identified 38 eligible articles, of which 31 were included in the primary analysis, including 287 692 patients exposed to 230 090 years of anticoagulant drugs. The risk of major gastrointestinal bleeding with direct oral anticoagulants did not differ from that with warfarin or low-molecular-weight heparin (factor Xa vs warfarin IRR 0·78 [95% CrI 0·47−1·08]; warfarin vs dabigatran 0·88 [0·59−1·36]; factor Xa vs low-molecular-weight heparin 1·02 [0·42−2·70]; and low-molecular-weight heparin vs dabigatran 0·67 [0·20−1·82]). In the secondary analysis, factor Xa inhibitors were associated with a reduced risk of all severities of gastrointestinal bleeding compared with warfarin (0·25 [0·07−0·76]) or dabigatran (0·24 [0·07−0·77]).
    Interpretation: Our findings show no increase in risk of major gastrointestinal bleeding with direct oral anticoagulants compared with warfarin or low-molecular-weight heparin. These findings support the continued use of direct oral anticoagulants. Funding: Leeds Teaching Hospitals Charitable Foundation.
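The abstract reports incidence rate ratios with Bayesian credible intervals; a full network meta-analysis is well beyond a snippet, but the underlying rate-ratio calculation for a single pairwise comparison can be sketched as follows (a frequentist approximation with made-up event counts and person-years, not the trial data above):

```python
import math

def incidence_rate_ratio(events_a, py_a, events_b, py_b):
    """Incidence rate ratio of group A vs group B, where each group
    contributes a count of bleeding events over person-years of exposure.

    The 95% CI uses the standard large-sample approximation on the
    log scale (a frequentist analogue of the credible intervals above).
    """
    irr = (events_a / py_a) / (events_b / py_b)
    se = math.sqrt(1 / events_a + 1 / events_b)   # SE of log(IRR)
    lo = math.exp(math.log(irr) - 1.96 * se)
    hi = math.exp(math.log(irr) + 1.96 * se)
    return irr, lo, hi
```

A Bayesian network meta-analysis additionally links all drug comparisons through a common model so that indirect evidence (e.g. factor Xa vs dabigatran via warfarin) informs each estimate.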

    The learning experiences of health and social care paraprofessionals on a foundation degree

    Foundation degrees have been developed in the UK as a cost-effective means of meeting the learning needs of paraprofessionals in health and social care and of the services within which they work. Workplace learning is an intrinsic component of these degrees. Taking a socio-cultural perspective, this paper examines how the students' workplaces, life circumstances and sense of career trajectory shaped their learning experience and motivation. A small-scale evaluation study, using semi-structured interviews, focused on the learning experiences of a group of paraprofessionals enrolled in a foundation degree in health and social care. Data revealed fragmented employment patterns underpinned by consistent vocational drives. Although vocational commitment was a consistent theme, participants were ambivalent about, or lacked information on, career progression. Workplace conditions, relationships and limited time shaped learning and coping strategies. A strategic and focused approach to student learning is required, including attention to career pathways, workforce development strategy, the requirements of a range of stakeholders, workplace supervision and support for learning.

    Independent adaptation mechanisms for numerosity and size perception provide evidence against a common sense of magnitude

    How numerical quantity is processed is a central issue for cognition. On the one hand, the "number sense theory" claims that numerosity is perceived directly and may represent an early precursor for the acquisition of mathematical skills. On the other, the "theory of magnitude" notes that numerosity correlates with many continuous properties such as size and density, and may therefore not exist as an independent feature but be part of a more general system of magnitude. In this study we examined interactions in sensitivity between numerosity and size perception. In a group of children, we measured psychophysically two sensory parameters: perceptual adaptation and discrimination thresholds for both size and numerosity. Neither discrimination thresholds nor adaptation strength for numerosity and size correlated across participants. This clear lack of correlation (confirmed by Bayesian analyses) suggests that numerosity and size interference effects are unlikely to reflect a shared sensory representation. We suggest these small interference effects may instead result from top-down phenomena occurring at late decisional levels rather than from a primary "sense of magnitude".

    Field Pea Response to Seeding Rate, Depth, and Inoculant in West-Central Nebraska

    Increased market demand and greater adoption of field pea (Pisum sativum L.) in semiarid west-central Nebraska have provided opportunities to replace summer fallow and diversify crop rotations. As a relatively new crop, its response to different seeding practices has not been evaluated in this eco-region. Field pea grain yield response to seeding depth (25, 50, and 75 mm), inoculation with Rhizobium leguminosarum bv. viciae (with and without rhizobia inoculant), and seeding rate (35, 50, 65, 75, 90, 105, and 120 plants m–2) was investigated in 2015 and 2016 at five sites in Perkins County, NE. There were no differences in yield for field pea planted at depths of 25, 50, and 75 mm. Yield differences between inoculated and noninoculated field pea were not observed; however, a lack of nodules on noninoculated plants suggests that carryover of rhizobia in soil with a history of field pea grown 2 to 3 yr previously was not sufficient to initiate nodulation. Seeding rates resulting in plant populations of 45 to 60 plants m–2 provided the highest economic return; an economic penalty (~$1.05 ha–1) may occur for each additional plant per square meter above this population. Increasing the seeding rate, however, may help farmers manage the risk of hail injury, enhance weed suppression, and increase harvest efficiency. Therefore, field pea grown in semiarid west-central Nebraska should be properly inoculated with rhizobia at every planting, seeded into good moisture at depths ranging from 25 to 75 mm, and established at a final plant population of at least 60 plants m–2.
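The ~$1.05 ha⁻¹ penalty per extra plant above the economically optimal population can be turned into a quick planning estimate. This sketch assumes, as a simplification, that the penalty is linear above the optimum; the 60 plants m⁻² optimum and the per-plant rate come from the abstract, the function itself is hypothetical:

```python
def over_seeding_penalty(plants_per_m2, optimum=60, penalty_per_plant=1.05):
    """Approximate economic penalty (USD/ha) for establishing a final
    plant population above the economically optimal density."""
    return max(0, plants_per_m2 - optimum) * penalty_per_plant
```

For example, establishing 90 plants m⁻² instead of 60 would cost roughly $31.50 ha⁻¹ under this linear assumption, a trade-off a grower might accept for the hail-risk and weed-suppression benefits the abstract mentions.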

    Harold M. Frost, William F. Neuman Awardee 2001 (J Musculoskel Neuron Interact 2001; 2(2):117-119)

    Tribute to Harold M. Frost, honorary president of ISMNI, who received the William F. Neuman Award from the American Society for Bone and Mineral Research in October 2001.