Organisational response to the 2007 Ruapehu Crater Lake breakout lahar in New Zealand: Use of communication in creating an effective response
When Mt. Ruapehu erupted in 1995-1996 in New Zealand, a tephra barrier was created alongside Crater Lake at the top of Mt. Ruapehu. This barrier acted as a dam, with Crater Lake rising behind it over time. In 2007 the lake breached the dam and a lahar flowed down the Whangaehu Valley and across the volcano's broad alluvial ring-plain. Given the lahar history of Ruapehu, the risk from the 2007 event was identified beforehand and steps were taken to reduce the risks to life and infrastructure. An early warning system was set up to give notice when the dam had broken and the lahar had occurred. Physical works to mitigate the risk were put in place. A planning group was also formed and emergency management plans were put in place to respond to the risk. To assess the effectiveness of planning for and responding to the lahar, semi-structured interviews were undertaken with personnel from key organisations both before and after the lahar event. This chapter discusses the findings from the interviews in the context of communications, and highlights how good communications contributed to an effective emergency management response. As the potential for a lahar was identifiable, approximately 10 years of lead-up time was available to install warning system hardware, implement physical mitigation measures, create emergency management plans, and practise exercises for the lahar. The planning and exercising developed effective internal communications, engendered relationships, and moved individuals towards a shared mental model of how to respond to the event. Consequently, the response played out largely as planned, with only minor communication issues occurring on the day of the lahar. These minor communication issues were due to strong personal connections leading to at least one case of the plan being bypassed.
Communication levels during the lahar event itself also differed from those experienced in exercises, and in some instances communications were seen to increase almost three-fold. This increase in the level of communication led to some difficulty in getting through to the main Incident Control Point. A final thought regarding public communications prior to the event was that more effort could have been given to developing and integrating public information about the lahar, to allow for ease of understanding about the event and integration of information across agencies.
Community responses to communication campaigns for influenza A (H1N1): a focus group study
Background: This research was part of a contestable rapid-response initiative launched by the Health Research Council of New Zealand and the Ministry of Health in response to the 2009 influenza A pandemic. The aim was to provide health authorities in New Zealand with evidence-based practical information to guide the development and delivery of effective health messages for H1N1 and other health campaigns. This study contributed to the initiative by providing qualitative data about community responses to key health messages in the 2009 and 2010 H1N1 campaigns, the impact of messages on behavioural change and the differential impact on vulnerable groups in New Zealand. Methods: Qualitative data were collected on community responses to key health messages in the 2009 and 2010 Ministry of Health H1N1 campaigns, the impact of messages on behaviour and the differential impact on vulnerable groups. Eight focus groups were held in the winter of 2010 with 80 participants from groups identified by the Ministry of Health as vulnerable to the H1N1 virus, such as people with chronic health conditions, pregnant women, children, Pacific Peoples and Māori. Because this study was part of a rapid response initiative, focus groups were selected as the most efficient means of data collection in the time available. For Māori, focus group discussion (hui) is a culturally appropriate methodology. Results: Thematic analysis of the data identified four major themes: personal and community risk, building community strategies, responsibility, and information sources. People wanted messages about specific actions that they could take to protect themselves and their families and to mitigate any consequences.
They wanted transparent and factual communication in which both good and bad news are conveyed by people they could trust. Conclusions: The responses from all groups endorsed the need for community-based risk management, including information dissemination. Engaging with communities will be essential to facilitate preparedness and build community resilience to future pandemic events. This research illustrates the complexities of how people understand and respond to health messages related to the H1N1 pandemic. The importance of the differences identified in the analysis lies not in the differences per se but in the problems they highlight with a "one size fits all" pandemic warning strategy.
Neurocognitive function in HIV infected patients on antiretroviral therapy
OBJECTIVE
To describe factors associated with neurocognitive (NC) function in HIV-positive patients on stable combination antiretroviral therapy.
DESIGN
We undertook a cross-sectional analysis assessing NC data obtained at baseline in patients entering the Protease-Inhibitor-Monotherapy-Versus-Ongoing-Triple therapy (PIVOT) trial.
MAIN OUTCOME MEASURE
NC testing comprised 5 domains. Raw results were z-transformed using standard and demographically adjusted normative datasets (ND). Global z-scores (NPZ-5) were derived by averaging the 5 domains, and the percentage of subjects with test scores >1 standard deviation (SD) below population means in at least two domains (an abnormal Frascati score) was calculated. Patient characteristics associated with NC results were assessed using multivariable linear regression.
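The scoring procedure described above can be sketched in a few lines. This is an illustrative reconstruction, not the PIVOT analysis code, and the function name and example values are hypothetical:

```python
def npz5_and_frascati(raw_scores, norm_means, norm_sds):
    """Illustrative sketch of the scoring described above: z-transform
    five domain scores against a normative dataset, average them into a
    global NPZ-5 score, and flag an 'abnormal Frascati' profile when
    scores in at least two domains fall more than 1 SD below the
    normative mean. Not the study's actual analysis code."""
    z = [(x - m) / s for x, m, s in zip(raw_scores, norm_means, norm_sds)]
    npz5 = sum(z) / len(z)                         # global score: mean of domain z-scores
    abnormal = sum(1 for v in z if v < -1.0) >= 2  # >1 SD below the mean in >=2 domains
    return npz5, abnormal

# Made-up example: five domain scores against a norm of mean 50, SD 10
npz5, abnormal = npz5_and_frascati([45, 50, 38, 52, 41], [50] * 5, [10] * 5)
```

Note that, as the abstract's results show, the choice of normative dataset (standard vs. demographically adjusted) changes which profiles are flagged as abnormal, even though the scoring rule itself is fixed.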
RESULTS
Of the 587 patients in PIVOT, 557 had full NC results and were included. 77% were male, 68% were Caucasian and 28% were of Black ethnicity. Mean (SD) baseline and nadir CD4+ lymphocyte counts were 553 (217) and 177 (117) cells/µL, respectively, and HIV RNA was <50 copies/mL in all. The median (IQR) NPZ-5 score was -0.5 (-1.2/-0) overall, and -0.3 (-0.7/0.1) and -1.4 (-2/-0.8) in subjects of Caucasian and Black ethnicity, respectively. Abnormal Frascati scores using the standard-ND were observed in 51% of subjects overall, and in 38% and 81% of subjects of Caucasian and Black ethnicity, respectively (p<0.001), but in 62% and 69% of Caucasian and Black subjects using the demographically adjusted-ND (p = 0.20). In the multivariate analysis, only Black ethnicity was associated with poorer NPZ-5 scores (p<0.001).
CONCLUSIONS
In this large group of HIV-infected subjects with viral load suppression, ethnicity, but not HIV-disease factors, was closely associated with NC results. The prevalence of abnormal results is highly dependent on the control datasets utilised.
TRIAL REGISTRY
ClinicalTrials.gov, NCT01230580
Strong mitochondrial DNA support for a Cretaceous origin of modern avian lineages
Background: Determining an absolute timescale for avian evolutionary history has proven contentious. The two sources of information available, paleontological data and inference from extant molecular genetic sequences (colloquially, 'rocks' and 'clocks'), have appeared irreconcilable; the fossil record supports a Cenozoic origin for most modern lineages, whereas molecular genetic estimates suggest that these same lineages originated deep within the Cretaceous and survived the K-Pg (Cretaceous-Paleogene; formerly Cretaceous-Tertiary or K-T) mass-extinction event. These two sources of data therefore appear to support fundamentally different models of avian evolution. The paradox has been speculated to reflect deficiencies in the fossil record, unrecognized biases in the treatment of genetic data, or both. Here we attempt to explore uncertainty and limit bias entering into molecular divergence time estimates through: (i) improved taxon (n = 135) and character (n = 4594 bp mtDNA) sampling; (ii) inclusion of multiple cladistically tested internal fossil calibration points (n = 18); (iii) correction for lineage-specific rate heterogeneity using a variety of methods (n = 5); (iv) accommodation of uncertainty in tree topology; and (v) testing for possible effects of episodic evolution. Results: The various 'relaxed clock' methods all indicate that the major (basal) lineages of modern birds originated deep within the Cretaceous, although temporal intraordinal diversification patterns differ across methods. We find that topological uncertainty had a systematic but minor influence on date estimates for the origins of major clades, and Bayesian analyses assuming fixed topologies deliver similar results to analyses with unconstrained topologies. We also find that, contrary to expectation, rates of substitution are not autocorrelated across the tree in an ancestor-descendant fashion.
Finally, we find no signature of episodic molecular evolution related to either speciation events or the K-Pg boundary that could systematically mislead inferences from genetic data. Conclusion: The 'rock-clock' gap has been interpreted by some as a result of the vagaries of molecular genetic divergence time estimates. However, despite measures to explore different forms of uncertainty in several key parameters, we fail to reconcile molecular genetic divergence time estimates with dates taken from the fossil record; instead, we find strong support for an ancient origin of modern bird lineages, with many extant orders and families arising in the mid-Cretaceous, consistent with previous molecular estimates. Although there is ample room for improvement on both sides of the 'rock-clock' divide (e.g. accounting for 'ghost' lineages in the fossil record and developing more realistic models of rate evolution for molecular genetic sequences), the consistent and conspicuous disagreement between these two sources of data more likely reflects a genuine difference between estimated ages of (i) stem-group origins and (ii) crown-group morphological diversifications, respectively. Further progress on this problem will benefit from greater communication between paleontologists and molecular phylogeneticists in accounting for error in avian lineage age estimates.
Why Can't Rodents Vomit? A Comparative Behavioral, Anatomical, and Physiological Study
The vomiting (emetic) reflex is documented in numerous mammalian species, including primates and carnivores, yet laboratory rats and mice appear to lack this response. It is unclear whether these rodents do not vomit because of anatomical constraints (e.g., a relatively long abdominal esophagus) or a lack of key neural circuits. Moreover, it is unknown whether laboratory rodents are representative of Rodentia with regard to this reflex. Here we conducted behavioral testing of members of all three major groups of Rodentia: mouse-related (rat, mouse, vole, beaver), Ctenohystrica (guinea pig, nutria), and squirrel-related (mountain beaver) species. Prototypical emetic agents, apomorphine (sc), veratrine (sc), and copper sulfate (ig), failed to produce either retching or vomiting in these species (although other behavioral effects, e.g., locomotion, were noted). These rodents also had anatomical constraints that could limit the efficiency of vomiting should it be attempted, including reduced muscularity of the diaphragm and stomach geometry that is not well structured for moving contents towards the esophagus, compared to species that can vomit (cat, ferret, and musk shrew). Lastly, an in situ brainstem preparation was used to make sensitive measures of mouth, esophagus, and shoulder muscular movements, and of phrenic nerve activity, key features of emetic episodes. Laboratory mice and rats failed to display any of the common coordinated actions of these indices after typical emetic stimulation (resiniferatoxin and vagal afferent stimulation) compared to musk shrews. Overall, the results suggest that the inability to vomit is a general property of Rodentia and that an absent brainstem neurological component is the most likely cause. The implications of these findings for the utility of rodents as models in the area of emesis research are discussed. © 2013 Horn et al.
Hypothermia in a surgical intensive care unit
BACKGROUND: Inadvertent hypothermia is not uncommon in the immediate postoperative period and is associated with impairment and abnormalities in various organs and systems that can lead to adverse outcomes. The aim of this study was to estimate the prevalence, predictive factors and outcome of core hypothermia on admission to a surgical ICU. METHODS: All 185 consecutive adult patients who underwent scheduled or emergency noncardiac surgery and were admitted to a surgical ICU between April and July 2004 were included in the study. Tympanic membrane core temperature (Tc) was measured before surgery, on arrival at the ICU and every two hours until 6 hours after admission. The following variables were also recorded: age, sex, body weight and height, ASA physical status, type of surgery, magnitude of surgical procedure, anesthesia technique, amount of intravenous fluids administered during anesthesia, use of temperature monitoring and warming techniques, duration of anesthesia, ICU length of stay, hospital length of stay and SAPS II score. Patients were classified as either hypothermic (Tc ≤ 35°C) or normothermic (Tc > 35°C). Univariate analysis and binary logistic regression with an odds ratio (OR) and its 95% confidence interval (95% CI) were used to compare the two groups of patients and to assess the relationship between each clinical predictor and hypothermia. Outcome, measured as ICU length of stay and mortality, was also assessed. RESULTS: The prevalence of hypothermia on ICU admission was 57.8%. In univariate analysis, temperature monitoring, use of warming techniques and higher previous body temperature were significant protective factors against core hypothermia.
In this analysis, predictors of hypothermia on admission to the ICU were: magnitude of surgery, use of general anesthesia or combined epidural and general anesthesia, total intravenous crystalloids administered, total packed erythrocytes administered, anesthesia longer than 3 hours and SAPS II score. In multiple logistic regression analysis, significant independent predictors of hypothermia on admission to the ICU were magnitude of surgery (OR 3.9, 95% CI 1.4–10.6, p = 0.008 for major surgery; OR 3.6, 95% CI 1.5–9.0, p = 0.005 for medium surgery), intravenous administration of crystalloids (in litres) (OR 1.4, 95% CI 1.1–1.7, p = 0.012) and SAPS II score (OR 1.0, 95% CI 1.0–1.7, p = 0.014); a higher previous temperature on the ward was a significant protective factor (OR 0.3, 95% CI 0.1–0.7, p = 0.003). Hypothermia was neither a risk factor for hospital mortality nor a predictive factor for a longer ICU stay. CONCLUSION: The prevalence of patient hypothermia on ICU arrival was high. Hypothermia at the time of admission to the ICU was not an independent factor for mortality or for a longer ICU stay.
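The odds ratios and confidence intervals quoted above follow a standard transformation of logistic-regression output: OR = exp(β), with 95% CI bounds exp(β ± 1.96·SE). A minimal sketch, using illustrative values rather than the study's actual coefficients (the function name is hypothetical):

```python
import math

def odds_ratio_ci(beta, se, z_crit=1.96):
    """Convert a fitted logistic-regression coefficient (beta) and its
    standard error (se) into an odds ratio with a 95% confidence
    interval: OR = exp(beta), CI = exp(beta +/- z_crit * se)."""
    or_ = math.exp(beta)
    lo = math.exp(beta - z_crit * se)
    hi = math.exp(beta + z_crit * se)
    return or_, lo, hi

# Illustrative only: a coefficient of 1.36 (SE 0.51) yields an OR near 3.9
# with a wide CI, similar in form to the 'major surgery' estimate above
or_, lo, hi = odds_ratio_ci(1.36, 0.51)
```

Because the CI is exponentiated from a symmetric interval on the log-odds scale, it is asymmetric around the OR, which is why intervals like 1.4–10.6 around an OR of 3.9 are expected.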
The Primary Prevention of PTSD in Firefighters: Preliminary Results of an RCT with 12-Month Follow-Up
AIM: To develop and evaluate an evidence-based, theory-driven program for the primary prevention of Post-traumatic Stress Disorder (PTSD). DESIGN: A pre-intervention / post-intervention / follow-up control group design with clustered random allocation of participants to groups was used. The control group received "Training as Usual" (TAU). METHOD: Participants were 45 career recruits within the recruit school at the Department of Fire and Emergency Services (DFES) in Western Australia. The intervention group received a four-hour resilience training intervention (Mental Agility and Psychological Strength, MAPS, training) as part of their recruit school curriculum. Data were collected at baseline and at 6 and 12 months post intervention. RESULTS: We found no evidence that the intervention was effective in the primary prevention of mental health issues, nor did we find any significant impact of MAPS training on social support or coping strategies. A significant difference across conditions in trauma knowledge is indicative of some impact of the MAPS program. CONCLUSION: While the key hypotheses were not supported, this study is the first randomised controlled trial investigating the primary prevention of PTSD. Practical barriers around the implementation of this program, including constraints within the recruit school, may inform the design and implementation of similar programs in the future. TRIAL REGISTRATION: Australian New Zealand Clinical Trials Registry (ANZCTR) ACTRN12615001362583
Persistent Gastric Colonization with Burkholderia pseudomallei and Dissemination from the Gastrointestinal Tract following Mucosal Inoculation of Mice
Melioidosis is a disease of humans caused by opportunistic infection with the soil and water bacterium Burkholderia pseudomallei. Melioidosis can manifest as an acute, overwhelming infection or as a chronic, recurrent infection. At present, it is not clear where B. pseudomallei resides in the mammalian host during the chronic, recurrent phase of infection. To address this question, we developed a mouse low-dose mucosal challenge model of chronic B. pseudomallei infection and investigated sites of bacterial persistence over 60 days. Sensitive culture techniques and selective media were used to quantitate bacterial burden in major organs, including the gastrointestinal (GI) tract. We found that the GI tract was the primary site of bacterial persistence during the chronic infection phase, and was the only site from which the organism could be consistently cultured during a 60-day infection period. The organism could be repeatedly recovered from all levels of the GI tract, and chronic infection was accompanied by sustained low-level fecal shedding. The stomach was identified as the primary site of GI colonization as determined by fluorescent in situ hybridization. Organisms in the stomach were associated with the gastric mucosal surface, and the propensity to colonize the gastric mucosa was observed with 4 different B. pseudomallei isolates. In contrast, B. pseudomallei organisms were present at low numbers within luminal contents in the small and large intestine and cecum relative to the stomach. Notably, inflammatory lesions were not detected in any GI tissue examined in chronically infected mice. Only low-dose oral or intranasal inoculation led to GI colonization and development of chronic infection of the spleen and liver. Thus, we concluded that in a mouse model of melioidosis B. pseudomallei preferentially colonizes the stomach following oral inoculation, and that the chronically colonized GI tract likely serves as a reservoir for dissemination of infection to extra-intestinal sites.
Quantitative modeling of the physiology of ascites in portal hypertension
Although the factors involved in cirrhotic ascites have been studied for a century, a number of observations are not understood, including the action of diuretics in the treatment of ascites and the ability of the plasma-ascitic albumin gradient to diagnose portal hypertension. This communication presents an explanation of ascites based solely on pathophysiological alterations within the peritoneal cavity. A quantitative model is described based on experimental vascular and intraperitoneal pressures, lymph flow, and peritoneal space compliance. The model's predictions accurately mimic clinical observations in ascites, including the magnitude and time course of changes observed following paracentesis or diuretic therapy.
Comparative Genome Analysis of Filamentous Fungi Reveals Gene Family Expansions Associated with Fungal Pathogenesis
Fungi and oomycetes are the causal agents of many of the most serious diseases of plants. Here we report a detailed comparative analysis of the genome sequences of thirty-six species of fungi and oomycetes, including seven plant pathogenic species, that aims to explore the common genetic features associated with plant disease-causing species. The predicted translational products of each genome have been clustered into groups of potential orthologues using Markov Chain Clustering and the data integrated into the e-Fungi object-oriented data warehouse (http://www.e-fungi.org.uk/). Analysis of the species distribution of members of these clusters has identified proteins that are specific to filamentous fungal species and a group of proteins found only in plant pathogens. By comparing the gene inventories of filamentous, ascomycetous phytopathogenic and free-living species of fungi, we have identified a set of gene families that appear to have expanded during the evolution of phytopathogens and may therefore serve important roles in plant disease. We have also characterised the predicted set of secreted proteins encoded by each genome and identified a set of protein families which are significantly over-represented in the secretomes of plant pathogenic fungi, including putative effector proteins that might perturb host cell biology during plant infection. The results demonstrate the potential of comparative genome analysis for exploring the evolution of eukaryotic microbial pathogenesis.