
    'To live and die [for] Dixie': Irish civilians and the Confederate States of America

    Around 20,000 Irishmen served in the Confederate army in the Civil War. As a result, they left behind, in various Southern towns and cities, large numbers of friends, family, and community leaders. As with native-born Confederates, Irish civilian support was crucial to Irish participation in the Confederate military effort. Irish civilians also served in various supporting roles: in factories and hospitals, on railroads and diplomatic missions, and as boosters for the cause. They also, however, suffered in bombardments, sieges, and the blockade. Usually poorer than their native neighbours, they could not afford to become 'refugees' and move away from the centres of conflict. This essay, based on research from manuscript collections, contemporary newspapers, British Consular records, and Federal military records, will examine the role of Irish civilians in the Confederacy, and assess the effect this activity had on their integration into Southern communities. It will also look at Irish civilians during the defeat of the Confederacy, particularly when they came under Union occupation. Initial research shows that Irish civilians were not as upset as other whites in the South about Union victory. They welcomed a return to normalcy, and often 'collaborated' with Union authorities. Irish desertion rates in the Confederate army were also particularly high, and I will attempt to gauge whether Irish civilians played a role in this. All of the research in this paper will thus be put in the context of the Drew Gilpin Faust/Gary Gallagher debate on the influence of the Confederate homefront on military performance. By studying the Irish civilian experience, one can assess how strong the Confederate national experiment was. Was it a nation without a nationalism?

    Sequential application of hyperspectral indices for delineation of stripe rust infection and nitrogen deficiency in wheat

    Nitrogen (N) fertilization is crucial for the growth and development of wheat crops, yet increased use of N can also result in increased stripe rust severity. Stripe rust infection and N deficiency both cause changes in foliar physiological activity and a reduction in plant pigments that result in chlorosis. Furthermore, stripe rust produces pustules on the leaf surface which, like chlorotic regions, have a yellow color. Quantifying the severity of each factor is critical for adopting appropriate management practices. Eleven widely used vegetation indices, based on mathematical combinations of narrow-band optical reflectance measurements in the visible/near-infrared wavelength range, were evaluated for their ability to discriminate and quantify stripe rust severity and N deficiency in a rust-susceptible wheat variety (H45) under varying conditions of nitrogen status. The physiological reflectance index (PhRI) and the leaf and canopy chlorophyll index (LCCI) provided the strongest correlations with levels of rust infection and N deficiency, respectively. When PhRI and LCCI were used in sequence, both N deficiency and rust infection levels were correctly classified in 82.5% and 55% of the plots at Zadoks growth stages 47 and 75, respectively. In misclassified plots, an overestimation of N deficiency was accompanied by an underestimation of the rust infection level, or vice versa. In 18% of the plots, there was a tendency to underestimate the severity of stripe rust infection even though the N-deficiency level was correctly predicted. The contrasting responses of the PhRI and LCCI to stripe rust infection and N deficiency, respectively, and the relative insensitivity of these indices to the other parameter make their use in combination suitable for quantifying levels of stripe rust infection and N deficiency in wheat crops under field conditions.
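
    The abstract reports which indices worked but not their band formulas. As a rough, hedged illustration of the sequential two-index idea, the sketch below computes two normalised-difference indices from narrow-band reflectance and applies thresholds in sequence; the band positions (531/570 nm for a PhRI-like index, red-edge bands for an LCCI-like index) and the threshold values are assumptions for illustration, not values taken from the study.

```python
# A minimal sketch of the sequential two-index idea, not the authors' code.
# Band positions and thresholds are illustrative assumptions, not values
# reported in the paper.

def narrow_band_index(reflectance, band_a, band_b):
    """Normalised-difference index from two narrow bands (wavelengths in nm)."""
    ra, rb = reflectance[band_a], reflectance[band_b]
    return (ra - rb) / (ra + rb)

def classify_plot(reflectance, n_threshold=0.40, rust_threshold=0.0):
    # Step 1: a chlorophyll-sensitive, LCCI-like index (assumed red-edge bands)
    # to flag nitrogen deficiency.
    lcci_like = narrow_band_index(reflectance, 750, 710)
    n_deficient = lcci_like < n_threshold
    # Step 2: a PhRI/PRI-like index (classic 531/570 nm bands) to flag
    # rust-related physiological stress.
    phri_like = narrow_band_index(reflectance, 531, 570)
    rust_infected = phri_like < rust_threshold
    return {"n_deficient": n_deficient, "rust_infected": rust_infected}

# Example: narrow-band reflectance for one plot, keyed by wavelength (nm).
plot = {531: 0.06, 570: 0.07, 710: 0.18, 750: 0.45}
print(classify_plot(plot))
```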

    The duty cycle of local radio galaxies

    We use a volume- and flux-limited sample of local (0.03 ≤ z ≤ 0.1) radio galaxies with optical counterparts to address the question of how long a typical galaxy spends in radio-active and quiescent states. The length of the active phase has a strong dependence on the stellar mass of the host galaxy. Radio sources in the most massive hosts are also retriggered more frequently. The time spent in the active phase has the same dependence on stellar mass as does the gas cooling rate, suggesting the onset of the quiescent phase is due to fuel depletion. We find radio and emission-line AGN activity to be independent, consistent with these corresponding to different accretion states. Comment: accepted for publication in MNRAS; 15 pages, 14 figures.

    Comparison of Rx-defined morbidity groups and diagnosis-based risk adjusters for predicting healthcare costs in Taiwan

    Background: Medication claims are commonly used to calculate the risk adjustment for measuring healthcare cost. The Rx-defined Morbidity Groups (Rx-MG), which combine the use of medication to indicate morbidity, have been incorporated into the Adjusted Clinical Groups (ACG) Case Mix System developed by the Johns Hopkins University. This study aims to verify that the Rx-MG can be used for adjusting risk and for explaining the variations in healthcare cost in Taiwan. Methods: The Longitudinal Health Insurance Database 2005 (LHID2005) was used in this study. The year 2006 was chosen as the baseline to predict healthcare cost (medication and total cost) in 2007. The final sample amounted to 793 239 (81%) enrolees, after excluding any cases with discontinued enrolment. Two different kinds of models were built to predict cost: the concurrent model and the prospective model. The predictors used in the predictive models included age, gender, Aggregated Diagnosis Groups (ADG, diagnosis-defined morbidity groups), and Rx-defined Morbidity Groups. Multivariate OLS regression was used in the cost prediction modelling. Results: The concurrent model adjusted for Rx-defined Morbidity Groups for total cost, controlling for age and gender, had better predictive power (R² = 0.618) than the model adjusted for ADGs (R² = 0.411). The model combining Rx-MGs and ADGs performed the best for concurrently predicting total cost (R² = 0.650). For prospectively predicting total cost, the model combining Rx-MGs and ADGs (R² = 0.382) performed better than the models adjusted by Rx-MGs (R² = 0.360) or ADGs (R² = 0.252) only. Similarly, the concurrent model adjusted for Rx-MGs predicting pharmacy cost performed better (R² = 0.615) than the model adjusted for ADGs (R² = 0.431). The model combining Rx-MGs and ADGs performed the best in concurrently as well as prospectively predicting pharmacy cost (R² = 0.638 and 0.505, respectively). The prospective models showed a remarkable improvement when adjusted by prior cost. Conclusions: The medication-based Rx-defined Morbidity Groups were useful in predicting pharmacy cost as well as total cost in Taiwan. Combining information on medication and diagnosis as adjusters could arguably be the best method for explaining variations in healthcare cost.
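
    As a rough sketch of the modelling strategy described above (OLS regression of cost on age, gender, and morbidity-group adjusters, compared by R-squared), the snippet below fits a concurrent and a prospective model. The file name and column names are hypothetical, and simple group counts stand in for the full sets of ADG and Rx-MG indicator variables used in the study.

```python
# Illustrative sketch of the modelling strategy, not the authors' code.
# The file and column names are hypothetical; simple ADG/Rx-MG counts stand
# in for the full sets of morbidity-group indicator variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("claims_2006_2007.csv")  # hypothetical person-level file

# Concurrent model: year-2006 predictors explaining year-2006 total cost.
concurrent = smf.ols(
    "total_cost_2006 ~ age + gender + adg_count + rxmg_count", data=df
).fit()

# Prospective model: year-2006 predictors predicting year-2007 total cost.
prospective = smf.ols(
    "total_cost_2007 ~ age + gender + adg_count + rxmg_count", data=df
).fit()

print("concurrent  R-squared:", round(concurrent.rsquared, 3))
print("prospective R-squared:", round(prospective.rsquared, 3))
```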

    Living Well with Diabetes: a randomized controlled trial of a telephone-delivered intervention for maintenance of weight loss, physical activity and glycaemic control in adults with type 2 diabetes

    Background: By 2025, it is estimated that approximately 1.8 million Australian adults (approximately 8.4% of the adult population) will have diabetes, with the majority having type 2 diabetes. Weight management via improved physical activity and diet is the cornerstone of type 2 diabetes management. However, the majority of weight loss trials in diabetes have evaluated short-term, intensive clinic-based interventions that, while producing short-term outcomes, have failed to address issues of maintenance and broad population reach. Telephone-delivered interventions have the potential to address these gaps. Methods/Design: Using a two-arm randomised controlled design, this study will evaluate an 18-month, telephone-delivered, behavioural weight loss intervention focussing on physical activity, diet and behavioural therapy, versus usual care, with follow-up at 24 months. Three hundred adult participants, aged 20-75 years, with type 2 diabetes, will be recruited from 10 general practices via electronic medical records search. The Social-Cognitive Theory driven intervention involves a six-month intensive phase (4 weekly calls and 11 fortnightly calls) and a 12-month maintenance phase (one call per month). Primary outcomes, assessed at 6, 18 and 24 months, are: weight loss, physical activity, and glycaemic control (HbA1c), with weight loss and physical activity also measured at 12 months. Incremental cost-effectiveness will also be examined. Study recruitment began in February 2009, with final data collection expected by February 2013. Discussion: This is the first study to evaluate the telephone as the primary method of delivering a behavioural weight loss intervention in type 2 diabetes. The evaluation of maintenance outcomes (6 months following the end of intervention), the use of accelerometers to objectively measure physical activity, and the inclusion of a cost-effectiveness analysis will advance the science of broad-reach approaches to weight control and health behaviour change, and will build the evidence base needed to advocate for the translation of this work into population health practice.

    Micronutrient fortification of food and its impact on woman and child health: A systematic review

    Background: Vitamins and minerals are essential for growth and metabolism. The World Health Organization estimates that more than 2 billion people are deficient in key vitamins and minerals. The groups most vulnerable to these micronutrient deficiencies are pregnant and lactating women and young children, given their increased demands. Food fortification is one of the strategies that has been used safely and effectively to prevent vitamin and mineral deficiencies. Methods: A comprehensive search was done to identify all available evidence for the impact of fortification interventions. Studies were included if food was fortified with a single, dual or multiple micronutrients and the impact of fortification was analyzed on the health outcomes and relevant biochemical indicators of women and children. We performed a meta-analysis of outcomes using Review Manager Software version 5.1. Results: Our systematic review identified 201 studies that we reviewed for outcomes of relevance. Fortification for children showed significant impacts on increasing serum micronutrient concentrations. Hematologic markers also improved, including hemoglobin concentrations, which showed a significant rise when food was fortified with vitamin A, iron and multiple micronutrients. Fortification with zinc had no significant adverse impact on hemoglobin levels. Multiple micronutrient fortification showed non-significant impacts on height-for-age, weight-for-age and weight-for-height Z-scores, although they showed positive trends. The results for fortification in women showed that calcium and vitamin D fortification had significant impacts in the post-menopausal age group. Iron fortification led to a significant increase in serum ferritin and hemoglobin levels in women of reproductive age and pregnant women. Folate fortification significantly reduced the incidence of congenital abnormalities like neural tube defects without increasing the incidence of twinning. The number of studies pooled for zinc and multiple micronutrients for women was small, though the evidence suggested benefit. There was a dearth of evidence for the impact of fortification strategies on morbidity and mortality outcomes in women and children. Conclusion: Fortification is potentially an effective strategy, but evidence from the developing world is scarce. Programs need to assess the direct impact of fortification on morbidity and mortality.
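
    The pooling itself follows standard meta-analytic practice. As a minimal sketch of what Review Manager computes for a continuous outcome such as a haemoglobin mean difference, the snippet below implements inverse-variance random-effects pooling (DerSimonian-Laird); the study effects and variances are invented purely to show the mechanics and are not data from the review.

```python
# Sketch of inverse-variance random-effects pooling (DerSimonian-Laird),
# the kind of calculation Review Manager performs. The study effects and
# variances below are invented to show the mechanics only.
import numpy as np

def dersimonian_laird(effects, variances):
    effects, variances = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / variances                              # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)          # fixed-effect pooled estimate
    q = np.sum(w * (effects - fixed) ** 2)           # Cochran's Q heterogeneity
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)    # between-study variance
    w_star = 1.0 / (variances + tau2)                # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical mean differences in haemoglobin (g/dL) and their variances.
print(dersimonian_laird([0.4, 0.6, 0.2], [0.010, 0.020, 0.015]))
```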

    Linking electronic medical records use to physicians’ performance: a contextual analysis

    Electronic Medical Records (EMR) studies have broadly tested EMR use and outcomes, producing mixed and inconclusive results. This study carefully considers the healthcare delivery context and examines relevant mediating variables. We consider key characteristics of: 1) interdependence in healthcare delivery processes, 2) physician autonomy, and 3) the trend of hospital employment of physicians, and draw on theoretical perspectives in coordination, shared values, and agency to explain how the use of EMR can improve physicians’ performance. In order to examine the effects of physician employment on work practices in the hospital, we collected 583 data points from 302 hospitals in 47 states in the USA to test two models: one for employed and another for non-employed physicians. Results show that information sharing and shared values among healthcare delivery professionals fully mediate the relationship between EMR use and physicians’ performance. Next, physician employment determines which mediating variable constitutes the pathway from EMR use to physicians’ performance. Finally, we highlight the impact of shared values between the hospital and physicians in enhancing information sharing and physicians’ performance, extending studies of these behaviors among network partners in industrial settings. Overall, our study shows that EMR use should be complemented by processual (information sharing), social (shared values) and structural (physician employment) mechanisms to yield positive effects on physicians’ performance.
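
    The abstract reports a mediation result (information sharing and shared values fully mediating the EMR-performance link). As a simplified, hypothetical illustration of how such a claim can be checked, the snippet below runs a single-mediator, Baron-Kenny style comparison of total and direct effects; the data file and variable names are invented, and the study itself tests several mediators, most likely with structural equation modelling rather than this stripped-down approach.

```python
# Simplified single-mediator illustration (Baron-Kenny style comparison of
# total and direct effects), not the authors' model. The data file and
# variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hospital_survey.csv")  # hypothetical hospital-level data

total  = smf.ols("physician_performance ~ emr_use", data=df).fit()
path_a = smf.ols("information_sharing ~ emr_use", data=df).fit()
direct = smf.ols("physician_performance ~ emr_use + information_sharing", data=df).fit()

# Full mediation is suggested when the EMR-use coefficient shrinks towards
# zero (and loses significance) once the mediator enters the model.
print("total effect of EMR use:  ", total.params["emr_use"])
print("direct effect of EMR use: ", direct.params["emr_use"])
print("indirect effect (a x b):  ", path_a.params["emr_use"] * direct.params["information_sharing"])
```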

    Effectiveness of a national quality improvement programme to improve survival after emergency abdominal surgery (EPOCH): a stepped-wedge cluster-randomised trial

    Background: Emergency abdominal surgery is associated with poor patient outcomes. We studied the effectiveness of a national quality improvement (QI) programme to implement a care pathway to improve survival for these patients. Methods: We did a stepped-wedge cluster-randomised trial of patients aged 40 years or older undergoing emergency open major abdominal surgery. Eligible UK National Health Service (NHS) hospitals (those that had an emergency general surgical service, a substantial volume of emergency abdominal surgery cases, and contributed data to the National Emergency Laparotomy Audit) were organised into 15 geographical clusters and commenced the QI programme in a random order, based on a computer-generated random sequence, over an 85-week period, with one geographical cluster commencing the intervention every 5 weeks from the second to the 16th time period. Patients were masked to the study group, but it was not possible to mask hospital staff or investigators. The primary outcome measure was mortality within 90 days of surgery. Analyses were done on an intention-to-treat basis. This study is registered with the ISRCTN registry, number ISRCTN80682973. Findings: Treatment took place between March 3, 2014, and Oct 19, 2015. 22 754 patients were assessed for eligibility. Of 15 873 eligible patients from 93 NHS hospitals, primary outcome data were analysed for 8482 patients in the usual care group and 7374 in the QI group. Eight patients in the usual care group and nine patients in the QI group were not included in the analysis because of missing primary outcome data. The primary outcome of 90-day mortality occurred in 1210 (16%) patients in the QI group compared with 1393 (16%) patients in the usual care group (HR 1·11, 0·96–1·28). Interpretation: No survival benefit was observed from this QI programme to implement a care pathway for patients undergoing emergency abdominal surgery. Future QI programmes should ensure that teams have both the time and resources needed to improve patient care. Funding: National Institute for Health Research Health Services and Delivery Research Programme.
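
    As a small, hypothetical illustration of the stepped-wedge design described above (15 geographical clusters crossing to the intervention in a computer-generated random order, one every 5 weeks from the second to the 16th time period), the snippet below generates such an activation schedule. It is not the trial's randomisation code; the seed and cluster labels are arbitrary.

```python
# Hypothetical illustration of a stepped-wedge activation schedule like the
# one described: 15 clusters, one crossing to the QI intervention every
# 5 weeks from period 2 to period 16. Not the trial's randomisation code.
import random

random.seed(2014)  # arbitrary seed, chosen only for reproducibility
clusters = [f"cluster_{i:02d}" for i in range(1, 16)]
order = random.sample(clusters, k=len(clusters))  # computer-generated random order

for period, cluster in enumerate(order, start=2):  # periods 2..16
    start_week = (period - 1) * 5                   # weeks 5, 10, ..., 75
    print(f"period {period:2d} (week {start_week:2d}): {cluster} starts the QI programme")
```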
