Fighting Ebola with novel spore decontamination technologies for the military
Abstract: Recently, global public health organizations such as Doctors without Borders (MSF), the World Health Organization (WHO), Public Health Canada, National Institutes of Health (NIH), and the U.S. government developed and deployed Field Decontamination Kits (FDKs), a novel, lightweight, compact, reusable decontamination technology to sterilize Ebola-contaminated medical devices at remote clinical sites lacking infrastructure in crisis-stricken regions of West Africa (medical waste materials are placed in bags and burned). The basis for effectuating sterilization with FDKs is chlorine dioxide (ClO2) produced from a patented invention developed by researchers at the US Army – Natick Soldier RD&E Center (NSRDEC) and commercialized as a dry mixed-chemical for bacterial spore decontamination. In fact, the NSRDEC research scientists developed an ensemble of ClO2 technologies designed for different applications in decontaminating fresh produce; food contact and handling surfaces; personal protective equipment; textiles used in clothing, uniforms, tents, and shelters; graywater recycling; airplanes; surgical instruments; and hard surfaces in latrines, laundries, and deployable medical facilities. These examples demonstrate the far-reaching impact, adaptability, and versatility of these innovative technologies. We present herein the unique attributes of NSRDEC’s novel decontamination technologies and a case study of the development of FDKs that were deployed in West Africa by international public health organizations to sterilize Ebola-contaminated medical equipment. FDKs use bacterial spores as indicators of sterility. We review the properties and structures of spores and the mechanisms of bacterial spore inactivation by ClO2. We also review mechanisms of bacterial spore inactivation by novel, emerging, and established nonthermal technologies for food preservation, such as high pressure processing, irradiation, cold plasma, and chemical sanitizers, using an array of
‘Trial and error…’, ‘…happy patients’ and ‘…an old toy in the cupboard’: a qualitative investigation of factors that influence practitioners in their prescription of foot orthoses
Background: Foot orthoses are used to manage a plethora of lower limb conditions. However, whilst the theoretical
foundations might be relatively consistent, actual practices, and therefore the experience of patients, are likely to be less so.
The factors that affect the prescription decisions that practitioners make about individual patients are unknown, and hence
the way in which clinical experience interacts with knowledge from training is not understood. Further, other influences
on orthotic practice may include the adoption (or not) of technology. Hence the aim of this study was to explore, for
the first time, the influences on orthotic practice.
Methods: A qualitative approach was adopted utilising two focus groups (16 consenting participants in total; 15
podiatrists and 1 orthotist) in order to collect the data. An opening question “What factors influence your orthotic
practice?” was followed with trigger questions, which were used to maintain focus. The dialogue was recorded
digitally, transcribed verbatim and a thematic framework was used to analyse the data.
Results: There were five themes: (i) influences on current practice, (ii) components of current practice, (iii) barriers
to technology being used in clinical practice, (iv) how technology could enhance foot orthoses prescription and
measurement of outcomes, and (v) how technology could provide information for practitioners and patients. A final
global theme was agreed by the researchers and the participants: ‘Current orthotic practice is variable and does not
embrace technology as it is perceived as being not fit for purpose in the clinical environment. However, practitioners
do have a desire for technology that is usable and enhances patient focussed assessment, the interventions, the clinical
outcomes and the patient’s engagement throughout these processes’.
Conclusions: In relation to prescribing foot orthoses, practice varies considerably due to multiple influences.
Measurement of outcomes from orthotic practice is a priority but there are no current norms for achieving this.
There have been attempts by practitioners to integrate technology into their practice, but with largely negative
experiences. The process of technology development needs to improve and to focus more on practice than on
technology.
Diagnosis and prevalence of two new species of haplosporidians infecting shore crabs Carcinus maenas: Haplosporidium carcini n. sp., and H. cranc n. sp.
This study provides a morphological and phylogenetic characterization of two novel species of the order Haplosporida (Haplosporidium carcini n. sp., and H. cranc n. sp.) infecting the common shore crab Carcinus maenas collected at one location in Swansea Bay, South Wales, UK. Both parasites were observed in the haemolymph, gills and hepatopancreas. The prevalence of clinical infections (i.e. parasites seen directly in fresh haemolymph preparations) was low, at ~1%, whereas subclinical levels, detected by polymerase chain reaction, were slightly higher at ~2%. Although no spores were found in any of the infected crabs examined histologically (n = 334), the morphology of monokaryotic and dikaryotic unicellular stages of the parasites enabled differentiation between the two new species. Phylogenetic analyses of the new species based on the small subunit (SSU) rDNA gene placed H. cranc in a clade of otherwise uncharacterized environmental sequences from marine samples, and H. carcini in a clade with other crustacean-associated lineages
One-year outcomes after low-dose intracoronary alteplase during primary percutaneous coronary intervention. The T-TIME randomized trial
No abstract available
The impact of trans-catheter aortic valve replacement induced left bundle branch block on cardiac reverse remodeling
Background Left bundle branch block (LBBB) is common following trans-catheter aortic valve replacement (TAVR) and has been linked to increased mortality, although whether this is related to less favourable cardiac reverse remodeling is unclear. The aim of the study was to investigate the impact of TAVR induced LBBB on cardiac reverse remodeling. Methods 48 patients undergoing TAVR for severe aortic stenosis were evaluated. 24 patients with new LBBB (LBBB-T) following TAVR were matched with 24 patients with a narrow post-procedure QRS (nQRS). Patients underwent cardiovascular magnetic resonance (CMR) prior to and 6 months post-TAVR. Measured cardiac reverse remodeling parameters included left ventricular (LV) size, ejection fraction (LVEF) and global longitudinal strain (GLS). Inter- and intra-ventricular dyssynchrony were determined using time to peak radial strain derived from CMR Feature Tracking. Results In the LBBB-T group there was an increase in QRS duration from 96 ± 14 to 151 ± 12 ms (P < 0.001) leading to inter- and intra-ventricular dyssynchrony (inter: LBBB-T 130 ± 73 vs nQRS 23 ± 86 ms, p < 0.001; intra: LBBB-T 118 ± 103 vs. nQRS 13 ± 106 ms, p = 0.001). Change in indexed LV end-systolic volume (LVESVi), LVEF and GLS was significantly different between the two groups (LVESVi: nQRS -7.9 ± 14.0 vs. LBBB-T -0.6 ± 10.2 ml/m2, p = 0.02; LVEF: nQRS +4.6 ± 7.8 vs LBBB-T -2.1 ± 6.9%, p = 0.002; GLS: nQRS -2.1 ± 3.6 vs. LBBB-T +0.2 ± 3.2%, p = 0.024). There was a significant correlation between change in QRS and change in LVEF (r = -0.434, p = 0.002) and between change in QRS and change in GLS (r = 0.462, p = 0.001). Post-procedure QRS duration was an independent predictor of change in LVEF and GLS at 6 months. Conclusion TAVR-induced LBBB is associated with less favourable cardiac reverse remodeling at medium term follow up.
In view of this, every effort should be made to prevent TAVR-induced LBBB, especially as TAVR is now being extended to a younger, lower risk population
A Phase 1 Trial of MSP2-C1, a Blood-Stage Malaria Vaccine Containing 2 Isoforms of MSP2 Formulated with Montanide® ISA 720
Background: In a previous Phase 1/2b malaria vaccine trial testing the 3D7 isoform of the malaria vaccine candidate Merozoite surface protein 2 (MSP2), parasite densities in children were reduced by 62%. However, breakthrough parasitemias were disproportionately of the alternate dimorphic form of MSP2, the FC27 genotype. We therefore undertook a dose-escalating, double-blinded, placebo-controlled Phase 1 trial in healthy, malaria-naïve adults of MSP2-C1, a vaccine containing recombinant forms of the two families of msp2 alleles, 3D7 and FC27 (EcMSP2-3D7 and EcMSP2-FC27), formulated in equal amounts with Montanide® ISA 720 as a water-in-oil emulsion. Methodology/Principal Findings: The trial was designed to include three dose cohorts (10, 40, and 80 μg), each with twelve subjects receiving the vaccine and three control subjects receiving Montanide® ISA 720 adjuvant emulsion alone, in a schedule of three doses at 12-week intervals. Due to unexpected local reactogenicity and concern regarding vaccine stability, the trial was terminated after the second immunisation of the cohort receiving the 40 μg dose; no subjects received the 80 μg dose. Immunization induced significant IgG responses to both isoforms of MSP2 in the 10 μg and 40 μg dose cohorts, with antibody levels by ELISA higher in the 40 μg cohort. Vaccine-induced antibodies recognised native protein by Western blots of parasite protein extracts and by immunofluorescence microscopy. Although the induced anti-MSP2 antibodies did not directly inhibit parasite growth in vitro, IgG from the majority of individuals tested caused significant antibody-dependent cellular inhibition (ADCI) of parasite growth. Conclusions/Significance: The findings that the majority of subjects vaccinated with MSP2-C1 developed antibody responses to both forms of MSP2, and that these antibodies mediated ADCI, provide further support for MSP2 as a malaria vaccine candidate.
However, in view of the reactogenicity of this formulation, further clinical development of MSP2-C1 will require formulation of MSP2 in an alternative adjuvant. Trial Registration: Australian New Zealand Clinical Trials Registry 12607000552482
Risk stratification guided by the index of microcirculatory resistance and left ventricular end-diastolic pressure in acute myocardial infarction
Background:
The index of microcirculatory resistance (IMR) of the infarct-related artery and left ventricular end-diastolic pressure (LVEDP) are acute, prognostic biomarkers in patients undergoing primary percutaneous coronary intervention. The clinical significance of IMR and LVEDP in combination is unknown.
Methods:
IMR and LVEDP were prospectively measured in a prespecified substudy of the T-TIME clinical trial (Trial of Low Dose Adjunctive Alteplase During Primary PCI). IMR was measured using a pressure- and temperature-sensing guidewire following percutaneous coronary intervention. Prognostically established thresholds for IMR (>32) and LVEDP (>18 mm Hg) were predefined. Contrast-enhanced cardiovascular magnetic resonance imaging (1.5 Tesla) was acquired 2 to 7 days and 3 months postmyocardial infarction. The primary end point was major adverse cardiac events, defined as cardiac death/nonfatal myocardial infarction/heart failure hospitalization at 1 year.
Results:
IMR and LVEDP were both measured in 131 patients (mean age 59±10.7 years, 103 [78.6%] male, 48 [36.6%] with anterior myocardial infarction). The median IMR was 29 (interquartile range, 17–55), the median LVEDP was 17 mm Hg (interquartile range, 12–21), and the correlation between them was not statistically significant (r=0.15; P=0.087). Fifty-three patients (40%) had low IMR (≤32) and low LVEDP (≤18), 18 (14%) had low IMR and high LVEDP, 31 (24%) had high IMR and low LVEDP, while 29 (22%) had high IMR and high LVEDP. Infarct size (% LV mass), LV ejection fraction, final myocardial perfusion grade ≤1, TIMI (Thrombolysis In Myocardial Infarction) flow grade ≤2, and coronary flow reserve were associated with LVEDP/IMR group, as was hospitalization for heart failure (n=18 events; P=0.045) and major adverse cardiac events (n=21 events; P=0.051). LVEDP>18 and IMR>32 combined was associated with major adverse cardiac events, independent of age, estimated glomerular filtration rate, and infarct-related artery (odds ratio, 5.80 [95% CI, 1.60–21.22] P=0.008). The net reclassification improvement for detecting major adverse cardiac events was 50.6% (95% CI, 2.7–98.2; P=0.033) when LVEDP>18 was added to IMR>32.
Conclusions:
IMR and LVEDP in combination have incremental value for risk stratification following primary percutaneous coronary intervention.
Registration:
URL: https://www.clinicaltrials.gov. Unique identifier: NCT02257294
Clinical Outcomes With a Repositionable Self-Expanding Transcatheter Aortic Valve Prosthesis: The International FORWARD Study
Background Clinical outcomes in large patient populations from real-world clinical practice with a next-generation self-expanding transcatheter aortic valve are lacking. Objectives This study sought to document the clinical and device performance outcomes of transcatheter aortic valve replacement (TAVR) with a next-generation, self-expanding transcatheter heart valve (THV) system in patients with severe symptomatic aortic stenosis (AS) in routine clinical practice. Methods The FORWARD (CoreValve Evolut R FORWARD) study is a prospective, single-arm, multinational, multicenter, observational study. An independent clinical events committee adjudicated safety endpoints based on Valve Academic Research Consortium-2 definitions. An independent echocardiographic core laboratory evaluated all echocardiograms. From January 2016 to December 2016, TAVR with the next-generation self-expanding THV was attempted in 1,038 patients with symptomatic, severe AS at 53 centers on 4 continents. Results Mean age was 81.8 ± 6.2 years, 64.9% were women, the mean Society of Thoracic Surgeons Predicted Risk of Mortality was 5.5 ± 4.5%, and 33.9% of patients were deemed frail. The repositioning feature of the THV was applied in 25.8% of patients. A single valve was implanted in the proper anatomic location in 98.9% of patients. The mean aortic valve gradient was 8.5 ± 5.6 mm Hg, and moderate or severe aortic regurgitation was 1.9% at discharge. All-cause mortality was 1.9%, and disabling stroke occurred in 1.8% at 30 days. The expected-to-observed early surgical mortality ratio was 0.35. A pacemaker was implanted in 17.5% of patients. Conclusions TAVR using the next-generation THV is clinically safe and effective for treating older patients with severe AS at increased operative risk. (CoreValve Evolut R FORWARD Study [FORWARD]; NCT02592369)
Safety and Immunogenicity Following Administration of a Live, Attenuated Monovalent 2009 H1N1 Influenza Vaccine to Children and Adults in Two Randomized Controlled Trials
BACKGROUND: The safety, tolerability, and immunogenicity of a monovalent intranasal 2009 A/H1N1 live attenuated influenza vaccine (LAIV) were evaluated in children and adults. METHODS/PRINCIPAL FINDINGS: Two randomized, double-blind, placebo-controlled studies were completed in children (2-17 y) and adults (18-49 y). Subjects were assigned 4:1 to receive 2 doses of H1N1 LAIV or placebo 28 days apart. The primary safety endpoint was fever ≥38.3°C during days 1-8 after the first dose; the primary immunogenicity endpoint was the proportion of subjects experiencing a postdose seroresponse. Solicited symptoms and adverse events were recorded for 14 days after each dose and safety data were collected for 180 days post-final dose. In total, 326 children (H1N1 LAIV, n = 261; placebo, n = 65) and 300 adults (H1N1 LAIV, n = 240; placebo, n = 60) were enrolled. After dose 1, fever ≥38.3°C occurred in 4 (1.5%) pediatric vaccine recipients and 1 (1.5%) placebo recipient (rate difference, 0%; 95% CI: -6.4%, 3.1%). No adults experienced fever following dose 1. Seroresponse rates in children (H1N1 LAIV vs. placebo) were 11.1% vs. 6.3% after dose 1 (rate difference, 4.8%; 95% CI: -9.6%, 13.8%) and 32.0% vs. 14.5% after dose 2 (rate difference, 17.5%; 95% CI: 5.5%, 27.1%). Seroresponse rates in adults were 6.1% vs. 0% (rate difference, 6.1%; 95% CI: -5.6%, 12.6%) and 14.9% vs. 5.6% (rate difference, 9.3%; 95% CI: -0.8%, 16.3%) after dose 1 and dose 2, respectively. Solicited symptoms after dose 1 (H1N1 LAIV vs. placebo) occurred in 37.5% vs. 32.3% of children and 41.7% vs. 31.7% of adults. Solicited symptoms occurred less frequently after dose 2 in adults and children. No vaccine-related serious adverse events occurred. CONCLUSIONS/SIGNIFICANCE: In subjects aged 2 to 49 years, two doses of H1N1 LAIV have a safety and immunogenicity profile similar to other previously studied and efficacious formulations of seasonal trivalent LAIV. 
TRIAL REGISTRATION: ClinicalTrials.gov NCT00946101, NCT00945893