
    The Institution of Well-Being: Embodying a Culture of Flourishing at the Shawnee Institute

    We apply principles of positive organizational psychology to a hospitality inn and resort that seeks to focus on “positive hospitality” – the provision of immersive positive education and well-being to guests and employees alike. The Shawnee Institute aspires to serve as a bridge, linking the science of well-being with organizations across the globe. Embedding well-being throughout the Institute’s workforce is intended to boost both employee and visitor experiences and to distinguish the Institute from other resorts in the region, both as a destination and as an employer. We propose an approach to broad engagement with organizational well-being, discussing the role of cultural change and the needs of both full-time and seasonal employees. We recommend the use of the psychological capital framework to measure and improve well-being across all employees, and provide an implementation plan that includes immersive education for managers, a holistic appreciative inquiry kick-off for all employees, and well-being implementation exercises for on-boarding new employees. This work can assist other organizations, particularly those in the hospitality industry, that seek to improve the well-being of a diverse employee base.

    New OH Zeeman measurements of magnetic field strengths in molecular clouds

    We present the results of a new survey of 23 molecular clouds for the Zeeman effect in OH undertaken with the ATNF Parkes 64-m radio telescope and the NRAO Green Bank 43-m radio telescope. The Zeeman effect was clearly detected in the cloud associated with the HII region RCW 38, with a field strength of 38+/-3 micro-Gauss, and possibly detected in a cloud associated with the HII region RCW 57, with a field strength of -203+/-24 micro-Gauss. The remaining 21 measurements give formal upper limits to the magnetic field strength, with typical 1-sigma sensitivities <20 micro-Gauss. For 22 of the molecular clouds we are also able to determine the column density of the gas in which we have made a sensitive search for the Zeeman effect. We combine these results with previous Zeeman studies of 29 molecular clouds, most of which were compiled by Crutcher (1999), for a comparison of theoretical models with the data. This comparison implies that if the clouds can be modeled as initially spherical with uniform magnetic fields and densities that evolve to their final equilibrium state assuming flux-freezing, then the typical cloud is magnetically supercritical, as was found by Crutcher (1999). If the clouds can be modeled as highly flattened sheets threaded by uniform perpendicular fields, then the typical cloud is approximately magnetically critical, in agreement with Shu et al. (1999), but only if the true values of the field for the non-detections are close to the 3-sigma upper limits. If instead these values are significantly lower (for example, similar to the 1-sigma limits), then the typical cloud is generally magnetically supercritical. Comment: 39 pages, 7 figures. Accepted for publication in ApJ.
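
    Background on the criticality language used above (not stated in the abstract itself): a cloud's measured mass-to-flux ratio is compared with the critical value from ideal-MHD models, and the numerical coefficient below is the convention commonly used in this literature (for example in Crutcher's compilations); it is quoted here as a hedged aide-memoire, not as part of the paper.

    \[
    \left(\frac{M}{\Phi}\right)_{\rm crit} = \frac{1}{2\pi\sqrt{G}},
    \qquad
    \lambda \equiv \frac{(M/\Phi)_{\rm obs}}{(M/\Phi)_{\rm crit}} \approx 7.6\times10^{-21}\,\frac{N({\rm H_2})/{\rm cm^{-2}}}{B_{\rm los}/\mu{\rm G}}.
    \]

    A cloud with lambda > 1 is magnetically supercritical (gravity can overcome magnetic support) and one with lambda < 1 is subcritical; the sphere-versus-flattened-sheet geometries contrasted in the abstract change lambda only by factors of order a few.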

    A complete view of the broad-line radio galaxy 4C+74.26 with XMM-Newton

    This paper presents a timing study and broadband spectral analysis of the broad-line radio galaxy 4C+74.26 based on a 35 ks XMM-Newton observation. As found in previous datasets, the source exhibits no evidence for rapid variability, and its 0.2-10 keV lightcurve is well fit by a constant. An excellent fit to the pn 0.3-12 keV spectrum was found using a continuum that combines an ionized and a neutral reflector, augmented by both cold and warm absorption. There is no evidence for a soft excess. The column of cold absorption was greater than the Galactic value, with an intrinsic column of ~1.9 × 10^21 cm^-2. Evidence for the warm absorber was found from O VII and O VIII absorption edges with maximum optical depths of tau(O VII) = 0.3 and tau(O VIII) = 0.03, respectively. A joint pn-MOS fit increased the O VIII optical depth to tau(O VIII) = 0.1. A simple, one-zone warm absorber model yielded a column of ~9 × 10^20 cm^-2 and an ionization parameter of ~60. Partial covering models provide significantly worse fits than ones including a relativistically broadened Fe K line, strengthening the case for the existence of such a line. On the whole, the X-ray spectrum of 4C+74.26 exhibits many features typical of both a radio-loud quasar (excess absorption) and radio-quiet Seyfert 1 galaxies (Fe K emission and warm absorption). We also show that a spurious absorption line at ~8 keV can be created by the subtraction of an instrumental Cu Kα emission line. Comment: 6 pages, 4 figures, accepted by MNRAS.
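
    As a rough guide to the diagnostics quoted above (the abstract does not define them): the depth of a photoelectric absorption edge scales with the column of the absorbing ion, and the ionization parameter relates the ionizing luminosity to the gas density and distance. The definition of xi below is the common convention and is an assumption here, since the paper's exact choice is not quoted in the abstract.

    \[
    \tau_{\rm edge} \simeq \sigma_{\rm edge}\, N_{\rm ion},
    \qquad
    \xi = \frac{L_{\rm ion}}{n\, r^{2}} \ \ [{\rm erg\ cm\ s^{-1}}],
    \]

    so the O VII and O VIII edge depths constrain the ionic columns of the warm gas, which the single-zone model above summarizes with a total column and an ionization parameter.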

    On the Identification of High Mass Star Forming Regions using IRAS: Contamination by Low-Mass Protostars

    We present the results of a survey of a small sample (14) of low-mass protostars (L_IR < 10^3 Lsun) for 6.7 GHz methanol maser emission performed using the ATNF Parkes radio telescope. No new masers were discovered. By comparing the sources in our sample with previously detected methanol maser sources, we find that the lower luminosity limit for maser emission is near 10^3 Lsun. We examine the IRAS properties of our sample and compare them with sources previously observed for methanol maser emission, almost all of which satisfy the Wood & Churchwell criterion for selecting candidate UCHII regions. We find that about half of our sample satisfy this criterion, and in addition almost all of this subgroup have integrated fluxes between 25 and 60 microns that are similar to sources with detectable methanol maser emission. By identifying a number of low-mass protostars in this work and from the literature that satisfy the Wood & Churchwell criterion for candidate UCHII regions, we show conclusively for the first time that the fainter flux end of their sample is contaminated by lower-mass non-ionizing sources, confirming the suggestion by van der Walt and Ramesh & Sridharan. Comment: 8 pages with 2 figures. Accepted by Ap
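
    For reference, the Wood & Churchwell selection mentioned above is a cut in IRAS colours; the thresholds below are the values commonly quoted for the 1989 criterion and should be checked against the original paper rather than read as part of this abstract.

    \[
    \log_{10}\!\left(\frac{F_{60}}{F_{12}}\right) \ge 1.30,
    \qquad
    \log_{10}\!\left(\frac{F_{25}}{F_{12}}\right) \ge 0.57,
    \]

    where F_12, F_25 and F_60 are the IRAS 12, 25 and 60 micron flux densities. The point made above is that a source need not be ionizing to pass this colour cut, so low-mass protostars can contaminate samples of candidate UCHII regions selected this way.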

    Dynamical Patterns of Cattle Trade Movements

    Despite their importance for the spread of zoonotic diseases, our understanding of the dynamical aspects characterizing the movements of farmed animal populations remains limited, as these systems are traditionally studied as static objects and through simplified approximations. By leveraging the network science approach, here we are able for the first time to fully analyze the longitudinal dataset of Italian cattle movements, which reports the mobility of individual animals among farms on a daily basis. The complexity and inter-relations between the topology, function and dynamical nature of the system are characterized at different spatial and time resolutions, in order to uncover patterns and vulnerabilities fundamental for the definition of targeted prevention and control measures for zoonotic diseases. Results show how the stationarity of statistical distributions coexists with strong and non-trivial evolutionary dynamics at the node and link levels, on all timescales. Traditional static views of the displacement network hide important patterns of structural change affecting nodes' centrality and farms' spreading potential, thus limiting the efficiency of interventions based on partial longitudinal information. By fully taking into account the longitudinal dimension, we propose a novel definition of dynamical motifs that is able to uncover the presence of a temporal arrow describing the evolution of the system and the causality patterns of its displacements, shedding light on mechanisms that may play a crucial role in the definition of preventive actions.
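
    To make the static-versus-temporal distinction concrete, the sketch below (not the authors' pipeline; the column names and records are hypothetical) contrasts a single aggregated directed network with daily snapshots of the same movement records, which is the level at which a farm's centrality can change over time.

```python
# Illustrative sketch only (not the authors' pipeline): contrasts a static,
# aggregated view of a movement dataset with daily snapshots, which is the
# distinction drawn in the abstract. Column names and records are hypothetical.
import pandas as pd
import networkx as nx

# Hypothetical movement records: one row per animal displacement.
moves = pd.DataFrame({
    "date": ["2024-01-01", "2024-01-01", "2024-01-02", "2024-01-02"],
    "origin_farm": ["A", "B", "B", "C"],
    "dest_farm": ["B", "C", "A", "A"],
})
moves["date"] = pd.to_datetime(moves["date"])

# Static view: a single directed network aggregated over the whole period.
static_net = nx.from_pandas_edgelist(
    moves, source="origin_farm", target="dest_farm", create_using=nx.DiGraph
)
print("aggregated out-degree:", dict(static_net.out_degree()))

# Temporal view: one snapshot per day, so a farm's centrality (and hence its
# spreading potential) can be followed as it changes in time.
for day, day_moves in moves.groupby("date"):
    snapshot = nx.from_pandas_edgelist(
        day_moves, source="origin_farm", target="dest_farm", create_using=nx.DiGraph
    )
    print(day.date(), "out-degree:", dict(snapshot.out_degree()))
```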

    Cancer diagnostic tools to aid decision-making in primary care: mixed-methods systematic reviews and cost-effectiveness analysis

    This is the final version, available on open access from the NIHR Journals Library via the DOI in this record. Background: Tools based on diagnostic prediction models are available to help general practitioners diagnose cancer. It is unclear whether or not tools expedite diagnosis or affect patient quality of life and/or survival. Objectives: The objectives were to evaluate the evidence on the validation, clinical effectiveness, cost-effectiveness, and availability and use of cancer diagnostic tools in primary care. Methods: Two systematic reviews were conducted to examine the clinical effectiveness (review 1) and the development, validation and accuracy (review 2) of diagnostic prediction models for aiding general practitioners in cancer diagnosis. Bibliographic searches were conducted in MEDLINE, MEDLINE In-Process, EMBASE, the Cochrane Library and Web of Science in May 2017, with updated searches conducted in November 2018. A decision-analytic model explored the tools’ clinical effectiveness and cost-effectiveness in colorectal cancer. The model compared patient outcomes and costs between strategies that included the use of the tools and those that did not, from the NHS perspective. We surveyed 4600 general practitioners in randomly selected UK practices to determine the proportions of general practices and general practitioners with access to, and using, cancer decision support tools. The association between access to these tools and practice-level cancer diagnostic indicators was explored. Results: Systematic review 1 – five studies, of different design and quality, reporting on three diagnostic tools, were included. We found no evidence that using the tools was associated with better outcomes. Systematic review 2 – 43 studies were included, reporting on prediction models, in various stages of development, for 14 cancer sites (including multiple cancers). Most studies relate to QCancer® (ClinRisk Ltd, Leeds, UK) and risk assessment tools. Decision model: In the absence of studies reporting their clinical outcomes, QCancer and risk assessment tools were evaluated against faecal immunochemical testing. A linked-data approach was used, which translates diagnostic accuracy into time to diagnosis and treatment, and stage at diagnosis. Given the current lack of evidence, the model showed that the cost-effectiveness of diagnostic tools in colorectal cancer relies on demonstrating patient survival benefits. The sensitivity of faecal immunochemical testing and the specificity of QCancer and risk assessment tools in a low-risk population were the key uncertain parameters. Survey: Practitioner- and practice-level response rates were 10.3% (476/4600) and 23.3% (227/975), respectively. Cancer decision support tools were available in 83 out of 227 practices (36.6%, 95% confidence interval 30.3% to 43.1%), and were likely to be used in 38 out of 227 practices (16.7%, 95% confidence interval 12.1% to 22.2%). The mean 2-week-wait referral rate did not differ between practices that do and practices that do not have access to QCancer or risk assessment tools (mean difference of 1.8 referrals per 100,000 referrals, 95% confidence interval –6.7 to 10.3 referrals per 100,000 referrals). Limitations: There is little good-quality evidence on the clinical effectiveness and cost-effectiveness of diagnostic tools. Many diagnostic prediction models are limited by a lack of external validation. There are limited data on current UK practice and clinical outcomes of diagnostic strategies, and there is no evidence on the quality-of-life outcomes of diagnostic results. The survey was limited by low response rates. Conclusion: The evidence base on the tools is limited. Research on how general practitioners interact with the tools may help to identify barriers to implementation and uptake, and the potential for clinical effectiveness. Future work: Continued model validation is recommended, especially for risk assessment tools. Assessment of the tools’ impact on time to diagnosis and treatment, stage at diagnosis, and health outcomes is also recommended, as is further work to understand how tools are used in general practitioner consultations. Study registration: This study is registered as PROSPERO CRD42017068373 and CRD42017068375. Funding: National Institute for Health Research (NIHR).
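
    As a quick arithmetic check of the practice-level availability figure quoted above (83 of 227 practices), the sketch below recomputes a 95% confidence interval; the interval method used in the report is not stated in the abstract, so a Clopper-Pearson (exact) interval is assumed, which lands close to the quoted 30.3% to 43.1%.

```python
# Recompute the proportion and an exact (Clopper-Pearson) 95% CI for
# 83 of 227 practices. The report's actual interval method is not stated
# in the abstract, so this is an assumed reconstruction, not the authors' code.
from scipy.stats import beta

x, n, alpha = 83, 227, 0.05
point = x / n
lower = beta.ppf(alpha / 2, x, n - x + 1)        # exact lower bound
upper = beta.ppf(1 - alpha / 2, x + 1, n - x)    # exact upper bound
print(f"{point:.1%} (95% CI {lower:.1%} to {upper:.1%})")
```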

    The Support for Economic Inequality Scale: Development and Adjudication

    Past research has documented myriad pernicious psychological effects of high economic inequality, prompting interest in how people perceive, evaluate, and react to inequality. Here we propose, refine, and validate the Support for Economic Inequality Scale (SEIS), a novel measure of attitudes towards economic inequality. In Study 1, we distill eighteen items down to five, providing evidence for unidimensionality and reliability. In Study 2, we replicate the scale’s unidimensionality and reliability and demonstrate its validity. In Study 3, we evaluate a United States version of the SEIS. Finally, in Studies 4–5, we demonstrate the SEIS’s convergent and predictive validity, as well as evidence for the SEIS being distinct from other conceptually similar measures. The SEIS is a valid and reliable instrument for assessing perceptions of and reactions to economic inequality, and provides a useful tool for researchers investigating the psychological underpinnings of economic inequality.
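
    The reliability evidence mentioned above is typically summarized with a statistic such as Cronbach's alpha; the sketch below shows the standard formula applied to synthetic responses to a hypothetical five-item scale and is purely illustrative, not the authors' analysis.

```python
# Purely illustrative (not the authors' analysis): the standard Cronbach's
# alpha computation for a five-item scale, run on synthetic responses that
# share one underlying attitude factor.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                          # one underlying attitude
responses = latent + rng.normal(scale=0.8, size=(200, 5))   # five correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```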

    Standardized NEON organismal data for biodiversity research

    Understanding patterns and drivers of species distribution and abundance, and thus biodiversity, is a core goal of ecology. Despite advances in recent decades, research into these patterns and processes is currently limited by a lack of standardized, high-quality, empirical data that span large spatial scales and long time periods. The National Ecological Observatory Network (NEON) fills this gap by providing freely available observational data that are generated during robust and consistent organismal sampling of several sentinel taxonomic groups within 81 sites distributed across the United States and will be collected for at least 30 years. The breadth and scope of these data provide a unique resource for advancing biodiversity research. To maximize the potential of this opportunity, however, it is critical that NEON data be maximally accessible and easily integrated into investigators' workflows and analyses. To facilitate their use for biodiversity research and synthesis, we created a workflow to process and format NEON organismal data into the ecocomDP (ecological community data design pattern) format, which is available through the ecocomDP R package; we then provided the standardized data as an R data package (neonDivData). We briefly summarize sampling designs and data wrangling decisions for the major taxonomic groups included in this effort. Our workflows are open source so that the biodiversity community may: add additional taxonomic groups; modify the workflow to produce datasets appropriate for their own analytical needs; and regularly update the data packages as more observations become available. Finally, we provide two simple examples of how the standardized data may be used for biodiversity research. By providing a standardized data package, we hope to enhance the utility of NEON organismal data in advancing biodiversity research and encourage the use of the harmonized ecocomDP data design pattern for community ecology data from other ecological observatory networks.
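
    As a minimal illustration of working with long-format community data organized along the lines of the ecocomDP design pattern (an observation table linked to taxon and location tables), the sketch below joins an observation table to a taxon table and computes richness per site; the table and column names are illustrative assumptions, not the exact schema shipped by ecocomDP or neonDivData.

```python
# Minimal sketch of long-format community data arranged along the lines of the
# ecocomDP design pattern (an observation table keyed to a taxon table). The
# table and column names here are illustrative assumptions, not the exact
# schema shipped by ecocomDP or neonDivData.
import pandas as pd

observation = pd.DataFrame({
    "site_id": ["S1", "S1", "S2", "S2", "S2"],
    "taxon_id": ["t1", "t2", "t1", "t3", "t4"],
    "value": [3, 1, 5, 2, 1],          # e.g. counts from a standardized sample
})
taxon = pd.DataFrame({
    "taxon_id": ["t1", "t2", "t3", "t4"],
    "taxon_name": ["sp. A", "sp. B", "sp. C", "sp. D"],
})

# Join observations to the taxonomy and compute species richness per site.
obs = observation.merge(taxon, on="taxon_id")
richness = obs.groupby("site_id")["taxon_name"].nunique()
print(richness)
```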