131 research outputs found

    Social Media Policy to Support Employee Productivity in the Finance Industry

    Business leaders may see social media as a distraction for their workers; however, blocking access could lead to a reduction in productivity. Using social media technologies with knowledge workers could achieve payroll cost reductions of 30% to 35%. The purpose of this multiple case study was to explore how business leaders used a social media policy to support employee productivity. The conceptual framework for this study was social exchange theory, which supports the notion that dyad and small group interactions make up most interactions, and that such interactions enhance employees' productivity. The research question addressed how finance industry leaders use a social media policy to enhance productivity. The target population for this study was leaders from financial companies in Charlotte, North Carolina, who have experience in using social media policies to increase employee productivity. Data collection included semistructured interviews with 9 technology leaders and company documents at two companies related to the research phenomenon. Yin's 5-step data analysis approach resulted in 3 themes: employee productivity, communication, and open company culture. Business leaders should consider using a social media policy to engage employees to support productivity, enhance communication both externally and internally, and enrich company culture in a way that is visible to employees. Employee engagement in a social media platform to connect and communicate with people could lead to a happier workplace and encourage employees to volunteer more frequently for social good.

    Achieving change in primary care—causes of the evidence to practice gap : systematic reviews of reviews

    Acknowledgements The Evidence to Practice Project (SPCR FR4 project number: 122) is funded by the National Institute for Health Research (NIHR) School for Primary Care Research (SPCR). KD is part-funded by the National Institute for Health Research (NIHR) Collaborations for Leadership in Applied Research and Care West Midlands and by a Knowledge Mobilisation Research Fellowship (KMRF-2014-03-002) from the NIHR. This paper presents independent research funded by the National Institute for Health Research (NIHR). The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health. Funding This study is funded by the National Institute for Health Research (NIHR) School for Primary Care Research (SPCR).

    Does Simplicity Compromise Accuracy in ACS Risk Prediction? A Retrospective Analysis of the TIMI and GRACE Risk Scores

    BACKGROUND: The Thrombolysis in Myocardial Infarction (TIMI) risk scores for Unstable Angina/Non-ST-elevation myocardial infarction (UA/NSTEMI) and ST-elevation myocardial infarction (STEMI) and the Global Registry of Acute Coronary Events (GRACE) risk scores for in-hospital and 6-month mortality are established tools for assessing risk in Acute Coronary Syndrome (ACS) patients. The objective of our study was to compare the discriminative abilities of the TIMI and GRACE risk scores in a broad-spectrum, unselected ACS population and to assess the relative contributions of model simplicity and model composition to any observed differences between the two scoring systems. METHODOLOGY/PRINCIPAL FINDINGS: ACS patients admitted to the University of Michigan between 1999 and 2005 were divided into UA/NSTEMI (n = 2753) and STEMI (n = 698) subpopulations. The predictive abilities of the TIMI and GRACE scores for in-hospital and 6-month mortality were assessed by calibration and discrimination. There were 137 in-hospital deaths (4%), and among the survivors, 234 (7.4%) died by 6 months post-discharge. In the UA/NSTEMI population, the GRACE risk scores demonstrated better discrimination than the TIMI UA/NSTEMI score for in-hospital (C = 0.85, 95% CI: 0.81-0.89, versus 0.54, 95% CI: 0.48-0.60; p<0.01) and 6-month (C = 0.79, 95% CI: 0.76-0.83, versus 0.56, 95% CI: 0.52-0.60; p<0.01) mortality. Among STEMI patients, the GRACE and TIMI STEMI scores demonstrated comparably excellent discrimination for in-hospital (C = 0.84, 95% CI: 0.78-0.90 versus 0.83, 95% CI: 0.78-0.89; p = 0.83) and 6-month (C = 0.72, 95% CI: 0.63-0.81, versus 0.71, 95% CI: 0.64-0.79; p = 0.79) mortality. An analysis of refitted multivariate models demonstrated a marked improvement in the discriminative power of the TIMI UA/NSTEMI model with the incorporation of heart failure and hemodynamic variables. 
Study limitations included unaccounted-for confounders inherent to observational, single-institution studies with moderate sample sizes. CONCLUSIONS/SIGNIFICANCE: The GRACE scores provided superior discrimination as compared with the TIMI UA/NSTEMI score in predicting in-hospital and 6-month mortality in UA/NSTEMI patients, although the GRACE and TIMI STEMI scores performed equally well in STEMI patients. The observed discriminative deficit of the TIMI UA/NSTEMI score likely results from the omission of key risk factors rather than from the relative simplicity of the scoring system.
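The discrimination comparison in the abstract above rests on the C-statistic (area under the ROC curve): the probability that a randomly chosen patient who died received a higher risk score than one who survived. A minimal sketch of that calculation, using hypothetical scores and outcomes (not the study's data or code):

```python
def c_statistic(scores, outcomes):
    """C-statistic: probability that a random case (outcome=1) scores
    higher than a random non-case (outcome=0); ties count as 0.5."""
    cases = [s for s, o in zip(scores, outcomes) if o == 1]
    controls = [s for s, o in zip(scores, outcomes) if o == 0]
    if not cases or not controls:
        raise ValueError("need at least one case and one control")
    concordant = sum(
        1.0 if c > k else 0.5 if c == k else 0.0
        for c in cases for k in controls
    )
    return concordant / (len(cases) * len(controls))

# Hypothetical risk scores for six patients and observed mortality (1 = died)
scores = [0.9, 0.8, 0.7, 0.3, 0.2, 0.1]
deaths = [1,   1,   0,   1,   0,   0]
print(c_statistic(scores, deaths))      # 8/9 ~ 0.89: good discrimination
print(c_statistic([0.5] * 6, deaths))   # 0.5: a constant score discriminates no better than chance
```

A C-statistic near 0.85 (as reported for GRACE in UA/NSTEMI) indicates strong discrimination, while one near 0.54 (TIMI UA/NSTEMI, in-hospital) is barely better than chance, which is the gap the abstract attributes to omitted risk factors.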

    Improving the normalization of complex interventions: measure development based on normalization process theory (NoMAD): study protocol

    Background: Understanding implementation processes is key to ensuring that complex interventions in healthcare are taken up in practice and thus maximize intended benefits for service provision and (ultimately) care to patients. Normalization Process Theory (NPT) provides a framework for understanding how a new intervention becomes part of normal practice. This study aims to develop and validate simple generic tools derived from NPT, to be used to improve the implementation of complex healthcare interventions. Objectives: The objectives of this study are to: develop a set of NPT-based measures and formatively evaluate their use for identifying implementation problems and monitoring progress; conduct preliminary evaluation of these measures across a range of interventions and contexts, and identify factors that affect this process; explore the utility of these measures for predicting outcomes; and develop an online users' manual for the measures. Methods: A combination of qualitative (workshops, item development, user feedback, cognitive interviews) and quantitative (survey) methods will be used to develop NPT measures, and to test the utility of the measures in six healthcare intervention settings. Discussion: The measures developed in the study will be available for use by those involved in planning, implementing, and evaluating complex interventions in healthcare and have the potential to enhance the chances of their implementation, leading to sustained changes in working practices.

    Controlling the passage of light through metal microchannels by nanocoatings of phospholipids

    The flow of polarized light through a metal film with an array of microchannels is controlled by the phase of an optically active, phospholipid nanocoating, even though the coating does not cover the open area of the microchannels. The molecular details of the assembly (DPPC phospholipid monolayer/bilayer on a hexadecanethiol monolayer on a copper- or nickel-coated microarray) were determined using the infrared, surface-plasmon-mediated, extraordinary transmission of the metal microarrays. Infrared absorption spectra with greatly enhanced absorptions by comparison to the literature were recorded and used as a diagnostic for the phase, composition, and molecular geometry of these nanocoatings. This approach presents new tools for nanoscale construction in constricted microspaces, which may ultimately be useful with individual microchannels.

    Reconstructing an Ancestral Mammalian Immune Supercomplex from a Marsupial Major Histocompatibility Complex

    The first sequenced marsupial genome promises to reveal unparalleled insights into mammalian evolution. We have used the Monodelphis domestica (gray short-tailed opossum) sequence to construct the first map of a marsupial major histocompatibility complex (MHC). The MHC is the most gene-dense region of the mammalian genome and is critical to immunity and reproductive success. The marsupial MHC bridges the phylogenetic gap between the complex MHC of eutherian mammals and the minimal essential MHC of birds. Here we show that the opossum MHC is gene dense and complex, as in humans, but shares more organizational features with non-mammals. The Class I genes have amplified within the Class II region, resulting in a unique Class I/II region. We present a model of the organization of the MHC in ancestral mammals and its elaboration during mammalian evolution. The opossum genome, together with other extant genomes, reveals the existence of an ancestral "immune supercomplex" that contained genes of both types of natural killer receptors together with antigen processing genes and MHC genes.

    The European Space Agency BIOMASS mission: Measuring forest above-ground biomass from space

    The primary objective of the European Space Agency's 7th Earth Explorer mission, BIOMASS, is to determine the worldwide distribution of forest above-ground biomass (AGB) in order to reduce the major uncertainties in calculations of carbon stocks and fluxes associated with the terrestrial biosphere, including carbon fluxes associated with Land Use Change, forest degradation and forest regrowth. To meet this objective it will carry, for the first time in space, a fully polarimetric P-band synthetic aperture radar (SAR). Three main products will be provided: global maps of both AGB and forest height, with a spatial resolution of 200 m, and maps of severe forest disturbance at 50 m resolution (where “global” is to be understood as subject to Space Object tracking radar restrictions). After launch in 2022, there will be a 3-month commissioning phase, followed by a 14-month phase during which there will be global coverage by SAR tomography. In the succeeding interferometric phase, global polarimetric interferometry Pol-InSAR coverage will be achieved every 7 months up to the end of the 5-year mission. Both Pol-InSAR and TomoSAR will be used to eliminate scattering from the ground (both direct and double bounce backscatter) in forests. In dense tropical forests AGB can then be estimated from the remaining volume scattering using non-linear inversion of a backscattering model. Airborne campaigns in the tropics also indicate that AGB is highly correlated with the backscatter from around 30 m above the ground, as measured by tomography. In contrast, double bounce scattering appears to carry important information about the AGB of boreal forests, so ground cancellation may not be appropriate and the best approach for such forests remains to be finalized. Several methods to exploit these new data in carbon cycle calculations have already been demonstrated. 
In addition, major mutual gains will be made by combining BIOMASS data with data from other missions that will measure forest biomass, structure, height and change, including the NASA Global Ecosystem Dynamics Investigation lidar, deployed on the International Space Station after its launch in December 2018, and the NASA-ISRO NISAR L- and S-band SAR, due for launch in 2022. More generally, space-based measurements of biomass are a core component of a carbon cycle observation and modelling strategy developed by the Group on Earth Observations. Secondary objectives of the mission include imaging of sub-surface geological structures in arid environments, generation of a true Digital Terrain Model without biases caused by forest cover, and measurement of glacier and ice-sheet velocities. In addition, the operations needed for ionospheric correction of the data will allow very sensitive estimates of ionospheric Total Electron Content and its changes along the dawn-dusk orbit of the mission.
