94 research outputs found

    Study of university donor recognition event

    Festivals, fairs, conventions, ballgames, concerts and fundraising activities are types of special events that individuals typically attend to take part in the festivities and spend time with friends, family and colleagues (Causin & McCarthy, 2017; Causin et al., 2010). Fundraising in higher education has existed for hundreds of years; in the early 1900s its driving motivation was to provide students with funds for housing, books and food (Cook & Lasher, 1996). Research has shown that donors are more likely to give when their contributions are made known to the public (Sheremeta & Samek, 2017). To create awareness and express appreciation, many institutions have incorporated events that honor donors. However, there is little research on which forms of donor recognition are most effective at encouraging giving. It remains to be determined what motivates a donor to make large contributions and whether publicly recognizing donors at a major event motivates others to give. Although research has shown that donor recognition is important in cultivating relationships, the type of recognition that is most effective is unknown (Sheremeta & Samek, 2017).

    Blockage of saline intrusions in restricted, two-layer exchange flows across a submerged sill obstruction

    This work was supported by the European Community's Seventh Framework Programme through the grant to the budget of the Integrating Activity HYDRALAB IV within the Transnational Access Activities, Contract No. 261520.
    Results are presented from a series of large-scale experiments investigating the internal and near-bed dynamics of bi-directional stratified flows with a net-barotropic component across a submerged, trapezoidal sill obstruction. High-resolution velocity and density profiles are obtained in the vicinity of the obstruction to observe internal-flow dynamics under a range of parametric forcing conditions (i.e. variable saline and fresh water volume fluxes, density differences, and sill submergence depths). Detailed synoptic velocity fields are measured across the sill crest using 2D particle image velocimetry, while the density structure of the two-layer exchange flows is measured using micro-conductivity probes at several sill locations. These measurements are designed to aid qualitative and quantitative interpretation of the internal-flow processes associated with blockage of the lower saline intrusion layer, and indicate that the primary mechanism for this blockage is mass exchange from the saline intrusion layer due to significant interfacial mixing and entrainment under dominant, net-barotropic flow conditions in the upper freshwater layer. This interfacial mixing is quantified by considering both the isopycnal separation of vertically-sorted density profiles across the sill and the corresponding Thorpe overturning length scales. Analysis of the synoptic velocity fields and density profiles also indicates that the net exchange flow remains subcritical (G < 1) across the sill for all parametric conditions tested. An analytical two-layer exchange flow model is then developed to include frictional and entrainment effects, both of which are needed to account for turbulent stresses and saline entrainment into the upper freshwater layer. The experimental results are used to validate two key model parameters: (1) the internal-flow head loss associated with boundary friction and interfacial shear; and (2) the mass exchange from the lower saline layer into the upper fresh layer due to entrainment.
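
    For context on the subcritical condition (G < 1) quoted above, the composite Froude number of a two-layer exchange flow is conventionally defined as follows. This is the standard textbook form for two-layer hydraulics, not necessarily the exact formulation adopted in the paper, and the layer symbols are introduced here purely for illustration.

        % Composite Froude number for a two-layer exchange flow (standard form).
        % u_i and h_i are the velocity and thickness of layer i, and g' is the
        % reduced gravity based on the density difference between the layers.
        % The internal flow is subcritical when G < 1.
        \[
          G^{2} = F_{1}^{2} + F_{2}^{2}, \qquad
          F_{i}^{2} = \frac{u_{i}^{2}}{g' h_{i}}, \qquad
          g' = \frac{g\,(\rho_{2} - \rho_{1})}{\rho_{2}}.
        \]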

    Method for evaluating prediction models that apply the results of randomized trials to individual patients

    Introduction: The clinical significance of a treatment effect demonstrated in a randomized trial is typically assessed by reference to differences in event rates at the group level. An alternative is to make individualized predictions for each patient based on a prediction model. This approach is growing in popularity, particularly for cancer. Despite its intuitive advantages, it remains plausible that some prediction models may do more harm than good. Here we present a novel method for determining whether predictions from a model should be used to apply the results of a randomized trial to individual patients, as opposed to using group-level results.
    Methods: We propose applying the prediction model to a data set from a randomized trial and examining the results of patients for whom the treatment arm recommended by the prediction model is congruent with their allocation. These results are compared with the strategy of treating all patients, through use of a net benefit function that incorporates both the number of patients treated and the outcome. We examined models developed using data sets regarding adjuvant chemotherapy for colorectal cancer and dutasteride for benign prostatic hypertrophy.
    Results: For adjuvant chemotherapy, we found that patients who would opt for chemotherapy even for small risk reductions, and, conversely, those who would require a very large risk reduction, would on average be harmed by using a prediction model; those with intermediate preferences would on average benefit from allowing such information to inform their decision making. Use of prediction could, at worst, lead to the equivalent of an additional death or recurrence per 143 patients; at best, it could lead to the equivalent of a 25% reduction in the number of treatments without an increase in event rates. In the dutasteride case, where the average benefit of treatment is more modest, there is a small benefit of prediction modelling, equivalent to a reduction of one event for every 100 patients given an individualized prediction.
    Conclusion: The size of the benefit associated with appropriate clinical implementation of a good prediction model is sufficient to warrant development of further models. However, care is advised in the implementation of prediction modelling, especially for patients who would opt for treatment even if it were of relatively little benefit.
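
    To make the comparison above concrete, the sketch below shows one standard decision-analytic way to compute a net benefit that weighs the number of patients treated against outcomes. It is an illustration only: the threshold p_t, the functional form, and all the counts are assumptions for the example, not the net benefit function or data used in the study.

        # Minimal sketch of a net benefit comparison, assuming the standard
        # decision-curve formulation. p_t is the risk threshold at which a
        # patient would opt for treatment; all counts below are made up.
        def net_benefit(n_true_pos, n_false_pos, n_total, p_t):
            """Net benefit of a treatment strategy at risk threshold p_t."""
            return (n_true_pos / n_total) - (n_false_pos / n_total) * (p_t / (1.0 - p_t))

        # Hypothetical cohort of 1000 patients, 150 of whom have the event.
        nb_model = net_benefit(n_true_pos=120, n_false_pos=300, n_total=1000, p_t=0.10)
        nb_treat_all = net_benefit(n_true_pos=150, n_false_pos=850, n_total=1000, p_t=0.10)
        print(f"treat per model: {nb_model:.3f}, treat all: {nb_treat_all:.3f}")

    A model-guided strategy is preferred at a given threshold when its net benefit exceeds that of treating everyone; repeating the calculation over a range of thresholds mirrors the finding above that patients with very low or very high thresholds may not benefit from the model.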

    Population-based type-specific prevalence of high-risk human papillomavirus infection in Estonia

    Background: Effective prophylactic vaccines are available against human papillomavirus (HPV) types 6, 11, 16, and 18; these vaccines are licensed for routine use among young women. Monitoring is needed to demonstrate protection against cervical cancer, to verify duration of protection, and to assess the frequency of replacement by non-vaccine types among vaccinated cohorts.
    Methods: Data from a population-based study were used to assess the type-specific prevalence of HPV in a non-vaccinated population in Estonia: 845 self-administered surveys and self-collected vaginal swabs were distributed to female participants 18-35 years of age; 346 swabs were returned by mail and tested for HPV DNA.
    Results: The overall HPV prevalence in the study population (unvaccinated women aged 18-35), weighted to account for the sampling method, was estimated at 38% (95% CI 31-45%), with estimated prevalences of high- and low-risk HPV types of 21% (95% CI 16-26%) and 10% (95% CI 7-14%), respectively. Of the high-risk HPV types, HPV 16 was detected most frequently (6.4%; 95% CI 4.0-9.8%), followed by HPV 53 (4.3%; 95% CI 2.3-7.2%) and HPV 66 (2.8%; 95% CI 1.3-5.2%).
    Conclusions: We observed a high prevalence of overall and high-risk HPV infection in an Eastern European country. The most common high-risk HPV types detected were HPV 16, 53, and 66.
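
    As a purely illustrative companion to the weighted estimates reported above, the sketch below shows one way a design-weighted prevalence and an approximate 95% confidence interval can be computed; the test results and sampling weights are simulated placeholders, not the study data.

        import numpy as np

        # Placeholder data: 346 tested swabs, simulated HPV results and
        # sampling weights (the real study weights are not reproduced here).
        rng = np.random.default_rng(0)
        n = 346
        hpv_positive = rng.random(n) < 0.38
        weights = rng.uniform(0.5, 2.0, size=n)

        # Design-weighted prevalence and a normal-approximation CI using the
        # Kish effective sample size to account for unequal weighting.
        p_hat = np.average(hpv_positive, weights=weights)
        n_eff = weights.sum() ** 2 / (weights ** 2).sum()
        se = np.sqrt(p_hat * (1 - p_hat) / n_eff)
        print(f"weighted prevalence: {p_hat:.2f} "
              f"(95% CI {p_hat - 1.96 * se:.2f}-{p_hat + 1.96 * se:.2f})")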

    Prognostic markers in cancer: the evolution of evidence from single studies to meta-analysis, and beyond

    In oncology, prognostic markers are clinical measures used to help elicit an individual patient's risk of a future outcome, such as recurrence of disease after primary treatment. They thus facilitate individual treatment choice and aid in patient counselling. Evidence-based results regarding prognostic markers are therefore very important to both clinicians and their patients. However, there is increasing awareness that prognostic marker studies have been neglected in the drive to improve medical research. Large protocol-driven, prospective studies are the ideal, with appropriate statistical analysis and clear, unbiased reporting of the methods used and the results obtained. Unfortunately, published prognostic studies rarely meet such standards, and systematic reviews and meta-analyses are often only able to draw attention to the paucity of good-quality evidence. We discuss how better-quality prognostic marker evidence can evolve over time from initial exploratory studies, to large protocol-driven primary studies, and then to meta-analysis or even beyond, to large prospectively planned pooled analyses and the initiation of tumour banks. We highlight articles that facilitate each stage of this process and that promote current guidelines aimed at improving the design, analysis, and reporting of prognostic marker research. We also outline why collaborative, multi-centre, and multi-disciplinary teams should be an essential part of future studies.

    Comparative Phylogeography of a Coevolved Community: Concerted Population Expansions in Joshua Trees and Four Yucca Moths

    Comparative phylogeographic studies have had mixed success in identifying common phylogeographic patterns among co-distributed organisms. Whereas some have found broadly similar patterns across a diverse array of taxa, others have found that the histories of different species are more idiosyncratic than congruent. The variation in the results of comparative phylogeographic studies could indicate that the extent to which sympatrically distributed organisms share common biogeographic histories varies depending on the strength and specificity of the ecological interactions between them. To test this hypothesis, we examined demographic and phylogeographic patterns in a highly specialized, coevolved community: Joshua trees (Yucca brevifolia) and their associated yucca moths. This tightly integrated, mutually interdependent community is known to have experienced significant range changes at the end of the last glacial period, so there is a strong a priori expectation that these organisms will show common signatures of demographic and distributional change over time. Using a database of >5000 GPS records for Joshua trees and multi-locus DNA sequence data from the Joshua tree and four species of yucca moth, we combined palaeodistribution modeling with coalescent-based analyses of demographic and phylogeographic history. We extensively evaluated the power of our methods to infer past population size and distributional changes by evaluating the effect of different inference procedures on our results, comparing our palaeodistribution models to Pleistocene-aged packrat midden records, and simulating DNA sequence data under a variety of alternative demographic histories. Together the results indicate that these organisms have shared a common history of population expansion and that these expansions were broadly coincident in time. However, contrary to our expectations, none of our analyses indicated significant range or population size reductions at the end of the last glacial period, and the inferred demographic changes substantially predate Holocene climate changes.
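
    The power checks described above rely on simulating sequence data under alternative demographic histories. The sketch below illustrates that kind of simulation for a single population that expanded in the past; it uses the msprime coalescent simulator, and every parameter value (sizes, timing, rates) is an arbitrary placeholder rather than anything estimated in the study.

        import msprime

        # One alternative history: a ten-fold expansion 5,000 generations ago.
        demography = msprime.Demography()
        demography.add_population(name="pop", initial_size=50_000)
        demography.add_population_parameters_change(time=5_000, initial_size=5_000,
                                                    population="pop")

        # Simulate a single non-recombining locus for 20 sampled individuals,
        # then overlay mutations to obtain sequence variation.
        ts = msprime.sim_ancestry(samples={"pop": 20}, demography=demography,
                                  sequence_length=1_000, recombination_rate=0,
                                  random_seed=1)
        mts = msprime.sim_mutations(ts, rate=1e-8, random_seed=1)
        print("segregating sites:", mts.num_sites)

    Repeating such simulations across a grid of candidate histories, and re-running the inference on each simulated data set, is the usual way to ask whether a given method has the power to detect (or falsely infer) a population expansion.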

    Iron Behaving Badly: Inappropriate Iron Chelation as a Major Contributor to the Aetiology of Vascular and Other Progressive Inflammatory and Degenerative Diseases

    The production of peroxide and superoxide is an inevitable consequence of aerobic metabolism, and while these particular "reactive oxygen species" (ROSs) can exhibit a number of biological effects, they are not of themselves excessively reactive and thus are not especially damaging at physiological concentrations. However, their reactions with poorly liganded iron species can lead to the catalytic production of the very reactive and dangerous hydroxyl radical, which is exceptionally damaging and a major cause of chronic inflammation. We review the considerable and wide-ranging evidence for the involvement of this combination of (su)peroxide and poorly liganded iron in a large number of physiological and indeed pathological processes and inflammatory disorders, especially those involving the progressive degradation of cellular and organismal performance. These diseases share a great many similarities and thus might be considered to have a common cause, namely iron-catalysed free radical and especially hydroxyl radical generation. The studies reviewed include those focused on a series of cardiovascular, metabolic and neurological diseases, where iron can be found at the sites of plaques and lesions, as well as studies showing the significance of iron to aging and longevity. The effective chelation of iron by natural or synthetic ligands is thus of major physiological (and potentially therapeutic) importance. Viewed as systems properties, physiological observables have multiple molecular causes, and studying them in isolation leads to inconsistent patterns of apparent causality when it is the simultaneous combination of multiple factors that is responsible. This explains, for instance, the decidedly mixed effects of antioxidants that have been observed.

    Criteria for the use of omics-based predictors in clinical trials: Explanation and elaboration

    High-throughput 'omics' technologies that generate molecular profiles for biospecimens have been extensively used in preclinical studies to reveal molecular subtypes and elucidate the biological mechanisms of disease, and in retrospective studies on clinical specimens to develop mathematical models to predict clinical endpoints. Nevertheless, the translation of these technologies into clinical tests that are useful for guiding management decisions for patients has been relatively slow. It can be difficult to determine when the body of evidence for an omics-based test is sufficiently comprehensive and reliable to support claims that it is ready for clinical use, or even that it is ready for definitive evaluation in a clinical trial in which it may be used to direct patient therapy. Reasons for this difficulty include the exploratory and retrospective nature of many of these studies, the complexity of these assays and their application to clinical specimens, and the many potential pitfalls inherent in the development of mathematical predictor models from the very high-dimensional data generated by these omics technologies. Here we present a checklist of criteria to consider when evaluating the body of evidence supporting the clinical use of a predictor to guide patient therapy. Included are issues pertaining to specimen and assay requirements, the soundness of the process for developing predictor models, expectations regarding clinical study design and conduct, and attention to regulatory, ethical, and legal issues. The proposed checklist should serve as a useful guide to investigators preparing proposals for studies involving the use of omics-based tests. The US National Cancer Institute plans to refer to these guidelines for review of proposals for studies involving omics tests, and it is hoped that other sponsors will adopt the checklist as well. © 2013 McShane et al.; licensee BioMed Central Ltd