11 research outputs found

    Characterizing Hospital Workers' Willingness to Respond to a Radiological Event

    Terrorist use of a radiological dispersal device (RDD, or "dirty bomb"), which combines a conventional explosive device with radiological materials, is among the National Planning Scenarios of the United States government. Understanding employee willingness to respond is critical for planning experts. Previous research has demonstrated that perception of threat and efficacy is key in assessing willingness to respond to an RDD event. An anonymous online survey was used to evaluate the willingness of hospital employees to respond to an RDD event. Agreement with a series of belief statements was assessed, following a methodology validated in previous work. The survey was available online to all 18,612 employees of the Johns Hopkins Hospital from January to March 2009. Surveys were completed by 3426 employees (18.4%), whose demographic distribution was similar to that of the overall hospital staff. 39% of hospital workers were not willing to respond to an RDD scenario if asked but not required to do so; only 11% more were willing if required. Workers who were hesitant to agree to work additional hours when required were 20 times less likely to report during an RDD emergency. Respondents who perceived their peers as likely to report to work in an RDD emergency were 17 times more likely to respond during an RDD event if asked. Only 27.9% of the hospital employees with a perception of low efficacy declared willingness to respond to a severe RDD event. Perception of threat had little impact on willingness to respond among hospital workers. Radiological scenarios such as RDDs are among the most dreaded emergency events yet studied. Several attitudinal indicators can help to identify hospital employees unlikely to respond. These risk-perception modifiers must then be addressed through training to enable effective hospital response to an RDD event.

    Characterizing hospital workers' willingness to report to duty in an influenza pandemic through threat- and efficacy-based assessment

    Background: Hospital-based providers' willingness to report to work during an influenza pandemic is a critical yet under-studied phenomenon. Witte's Extended Parallel Process Model (EPPM) has been shown to be useful for understanding adaptive behavior of public health workers to an unknown risk, and thus offers a framework for examining scenario-specific willingness to respond among hospital staff.
    Methods: We administered an anonymous online EPPM-based survey about attitudes and beliefs toward emergency response to all 18,612 employees of the Johns Hopkins Hospital from January to March 2009. Surveys were completed by 3426 employees (18.4%), approximately one third of whom were health professionals.
    Results: The demographic and professional distribution of respondents was similar to that of all hospital staff. Overall, more than one in four (28%) hospital workers indicated they were not willing to respond to an influenza pandemic scenario if asked but not required to do so; only an additional 10% were willing if required. One-third (32%) of participants reported they would be unwilling to respond in the event of a more severe pandemic influenza scenario. These response rates were consistent across departments, and were one-third lower among nurses than among physicians. Respondents who were hesitant to agree to work additional hours when required were 17 times less likely to respond during a pandemic if asked. Sixty percent of the workers perceived their peers as likely to report to work in such an emergency, and these respondents were ten times more likely than others to do so themselves. Hospital employees with a perception of high efficacy had 5.8 times higher declared rates of willingness to respond to an influenza pandemic.
    Conclusions: Significant gaps exist in hospital workers' willingness to respond, and the EPPM is a useful framework for assessing these gaps. Several attitudinal indicators can help to identify hospital employees unlikely to respond. The findings point to hospital-based communication and training strategies to boost employees' response willingness, including promoting pre-event plans for home-based dependents; ensuring adequate supplies of personal protective equipment, vaccines, and antiviral drugs for all hospital employees; and establishing a subjective norm of awareness and preparedness.

    Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis

    In this crowdsourced initiative, independent analysts used the same dataset to test two hypotheses regarding the effects of scientists' gender and professional status on verbosity during group meetings. Not only the analytic approach but also the operationalizations of key variables were left unconstrained and up to individual analysts. For instance, analysts could choose to operationalize status as job title, institutional ranking, citation counts, or some combination. To maximize transparency regarding the process by which analytic choices are made, the analysts used a platform we developed called DataExplained to justify both preferred and rejected analytic paths in real time. Analyses lacking sufficient detail, reproducible code, or with statistical errors were excluded, resulting in 29 analyses in the final sample. Researchers reported radically different analyses and dispersed empirical outcomes, in a number of cases obtaining significant effects in opposite directions for the same research question. A Boba multiverse analysis demonstrates that decisions about how to operationalize variables explain variability in outcomes above and beyond statistical choices (e.g., covariates). Subjective researcher decisions play a critical role in driving the reported empirical results, underscoring the need for open data, systematic robustness checks, and transparency regarding both analytic paths taken and not taken. Implications for organizations and leaders, whose decision making relies in part on scientific findings, consulting reports, and internal analyses by data scientists, are discussed.
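    The multiverse idea described in this abstract, where many defensible combinations of variable operationalization and analytic choice each yield their own estimate, can be sketched in a few lines. The sketch below is an illustrative toy, not the study's actual data, variables, or the Boba tool: the synthetic dataset, effect sizes, operationalizations, and exclusion rules are all invented for the example.

```python
import itertools
import random

random.seed(0)

# Synthetic data: one record per scientist, with two hypothetical
# "status" measures and a verbosity outcome (all values invented).
people = []
for _ in range(200):
    senior_title = random.choice([0, 1])                 # junior vs senior job title
    citations = random.gauss(50 + 40 * senior_title, 20)  # correlated with title
    words = random.gauss(100 + 5 * senior_title, 30)      # words spoken per meeting
    people.append({"title": senior_title, "citations": citations, "words": words})

def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# The "multiverse": every combination of operationalization and exclusion rule
# is a separate, individually defensible analysis of the same question.
operationalizations = {
    "job title": lambda p: p["title"],
    "citation count": lambda p: p["citations"],
}
exclusions = {
    "all rows": lambda p: True,
    "drop low-verbosity rows": lambda p: p["words"] > 60,
}

results = {}
for (op_name, op), (ex_name, ex) in itertools.product(
        operationalizations.items(), exclusions.items()):
    sample = [p for p in people if ex(p)]
    effect = slope([op(p) for p in sample], [p["words"] for p in sample])
    results[(op_name, ex_name)] = effect

for spec, effect in sorted(results.items()):
    print(spec, round(effect, 2))
```

    Even in this toy, the estimated "effect of status" differs across the four specifications because job title and citation count are different (if correlated) measurements; the dispersion the paper reports is this phenomenon at the scale of 29 real analysis teams.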
