
    Prolonged high-dose intravenous magnesium therapy for severe tetanus in the intensive care unit: a case series

    Introduction: Tetanus rarely occurs in developed countries, but it can result in fatal complications, including respiratory failure due to generalized muscle spasms. Magnesium infusion has been used to treat spasticity in tetanus, and its effectiveness is supported by several case reports and a recent randomized controlled trial. Case presentations: Three Caucasian Greek men aged 30, 50 and 77 years were diagnosed with tetanus and admitted to a general 12-bed intensive care unit in 2006 and 2007 for respiratory failure due to generalized spasticity. Intensive care unit treatment included antibiotics, hydration, enteral nutrition, early tracheostomy and mechanical ventilation. Intravenous magnesium therapy controlled spasticity without the need for additional muscle relaxants. Magnesium therapy was continued for up to 26 days and adjusted as needed to control spasticity. Plasma magnesium levels, measured twice a day, remained in the 3 to 4.5 mmol/L range. We did not observe hemodynamic instability, arrhythmias or other complications related to magnesium therapy in these patients. All patients improved, were weaned from mechanical ventilation, and were discharged from the intensive care unit in a stable condition. Conclusion: In comparison with previous reports, our case series contributes the following additional information: intravenous magnesium therapy was used in patients who already required mechanical ventilation and, in two patients, remained effective for up to 26 days (considerably longer than in previous reports) without significant toxicity. The overall outcome was good in all our patients. However, the optimal dose, optimal duration and maximum safe duration of intravenous magnesium therapy are unknown. Until more data on the safety and efficacy of magnesium therapy are available, its use should therefore be limited to carefully selected tetanus cases.
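
    Plasma magnesium above is reported in mmol/L, while many laboratories report mg/dL. The conversion is simple arithmetic (molar mass of Mg is about 24.305 g/mol, so 1 mmol/L is about 2.43 mg/dL); a minimal sketch, not part of the case series itself:

```python
# Convert a magnesium concentration from mmol/L to mg/dL.
# Molar mass of Mg is ~24.305 g/mol, so 1 mmol/L ~= 2.43 mg/dL.
def mg_mmol_per_l_to_mg_per_dl(mmol_per_l: float) -> float:
    return mmol_per_l * 24.305 / 10.0  # mg/L -> mg/dL

for level in (3.0, 4.5):  # the range maintained in these patients
    print(f"{level} mmol/L = {mg_mmol_per_l_to_mg_per_dl(level):.1f} mg/dL")
# prints 7.3 mg/dL and 10.9 mg/dL
```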

    Variability and Diversity of Nasopharyngeal Microbiota in Children: A Metagenomic Analysis

    The nasopharynx is the ecological niche for many commensal bacteria and for potential respiratory or invasive pathogens like Streptococcus pneumoniae, Haemophilus influenzae, and Neisseria meningitidis. Disturbance of a balanced nasopharyngeal (NP) microbiome might be involved in the onset of symptomatic infections with these pathogens, which occur primarily in fall and winter. It is unknown whether seasonal infection patterns are associated with concomitant changes in NP microbiota. As young children are generally prone to respiratory and invasive infections, we characterized the NP microbiota of 96 healthy children by barcoded pyrosequencing of the V5–V6 hypervariable region of the 16S rRNA gene, and compared microbiota composition between children sampled in fall/winter and children sampled in spring. The approximately 1,000,000 sequences generated represented 13 taxonomic phyla and approximately 250 species-level phylotypes (OTUs). The 5 most predominant phyla were Proteobacteria (64%), Firmicutes (21%), Bacteroidetes (11%), Actinobacteria (3%) and Fusobacteria (1.4%), with Moraxella, Haemophilus, Streptococcus, Flavobacteria, Dolosigranulum, Corynebacterium and Neisseria as predominant genera. Inter-individual variability was so high that a core microbiome could not be defined at the OTU level. Microbiota profiles varied strongly with season: fall/winter samples showed a predominance of Proteobacteria (relative abundance, as % of all sequences: 75% versus 51% in spring) and Fusobacteria (absolute abundance, as % of children: 14% versus 2% in spring), whereas spring samples showed a predominance of Bacteroidetes (relative abundance: 19% versus 3% in fall/winter; absolute abundance: 91% versus 54% in fall/winter) and Firmicutes. The latter increase was mainly due to (Brevi)bacillus and Lactobacillus species (absolute abundance: 96% versus 10% in fall/winter), which, like Bacteroidetes species, are generally associated with healthy ecosystems. The observed seasonal effects could not be attributed to recent antibiotic use or viral co-infection.
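
    The seasonal comparison above distinguishes relative abundance (share of sequences within a sample) from absolute abundance (share of children carrying a taxon). A minimal sketch of the relative-abundance step, assuming a hypothetical OTU count table and metadata file (all file, column and label names are assumptions, not the study's actual pipeline):

```python
# Collapse an OTU count table to phylum level and compare mean relative
# abundance between seasons. rows = samples, columns = OTUs.
import pandas as pd

counts = pd.read_csv("otu_counts.csv", index_col=0)          # hypothetical
taxonomy = pd.read_csv("otu_taxonomy.csv", index_col=0)      # column: phylum
season = pd.read_csv("metadata.csv", index_col=0)["season"]  # 'fall_winter' / 'spring'

# sum OTU counts within each phylum, per sample
phylum_counts = counts.T.join(taxonomy).groupby("phylum").sum().T

# relative abundance: divide each sample's phylum counts by its total reads
rel_abund = phylum_counts.div(phylum_counts.sum(axis=1), axis=0)

# mean relative abundance per phylum within each season
print(rel_abund.groupby(season).mean().T.round(3))
```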

    Development of appropriateness explicit criteria for cataract extraction by phacoemulsification

    BACKGROUND: Consensus development techniques were used in the late 1980s to create explicit criteria for the appropriateness of cataract extraction. We developed a new appropriateness-of-indications tool for cataract extraction following the RAND method, and tested the validity of our panel results. METHODS: Criteria were developed using a modified Delphi panel judgment process. A panel of 12 ophthalmologists was assembled. Ratings were analyzed regarding the level of agreement among panelists. We studied the influence of all variables on the final panel score using linear and logistic regression models. The explicit criteria developed were summarized by classification and regression tree analysis. RESULTS: Of the 765 indications evaluated by the main panel in the second round, 32.9% were found appropriate, 30.1% uncertain, and 37% inappropriate. Agreement was found in 53% of the indications and disagreement in 0.9%. Seven variables were used to construct the indications, which were divided into three groups: simple cataract, cataract with diabetic retinopathy, and cataract with other ocular pathologies. The preoperative visual acuity in the cataractous eye and visual function were the variables that best explained the panel scores. The panel results were synthesized and presented in three decision trees. The misclassification error of the decision trees, as compared with the panel's original criteria, was 5.3%. CONCLUSION: The parameters tested showed acceptable validity for an evaluation tool. These results support the use of this indication algorithm as a screening tool for assessing the appropriateness of cataract extraction in field studies and for the development of practice guidelines.
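
    A minimal sketch of the tree-summarization step described above, using scikit-learn's CART implementation on a hypothetical export of the rated indications (the predictor and outcome column names are assumptions, not the study's variables):

```python
# Fit a classification tree to the panel ratings and report how often the
# tree disagrees with the panel's original criteria.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

ratings = pd.read_csv("panel_indications.csv")  # hypothetical: one row per rated indication
X = ratings[["visual_acuity", "visual_function", "cataract_group"]]  # assumed, integer-coded
y = ratings["panel_rating"]  # "appropriate" / "uncertain" / "inappropriate"

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

error = 1.0 - tree.score(X, y)  # misclassification against the panel's criteria
print(f"misclassification error: {error:.1%}")  # the paper reports 5.3%
```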

    Expression of Distal-less, dachshund, and optomotor blind in Neanthes arenaceodentata (Annelida, Nereididae) does not support homology of appendage-forming mechanisms across the Bilateria

    The similarity in the genetic regulation of arthropod and vertebrate appendage formation has been interpreted as the product of a plesiomorphic gene network that was primitively involved in bilaterian appendage development and was co-opted to build appendages (in modern phyla) that are not historically related as structures. Data from lophotrochozoans are needed to clarify the pervasiveness of such plesiomorphic appendage-forming mechanisms. We assayed the expression of three arthropod and vertebrate limb gene orthologs, Distal-less (Dll), dachshund (dac), and optomotor blind (omb), in direct-developing juveniles of the polychaete Neanthes arenaceodentata. Parapodial Dll expression marks premorphogenetic notopodia and neuropodia, becoming restricted to the bases of notopodial cirri and to ventral portions of neuropodia. In outgrowing cephalic appendages, Dll activity is primarily restricted to proximal domains. Dll expression is also prominent in the brain. dac expression occurs in the brain, nerve cord ganglia, a pair of pharyngeal ganglia, presumed interneurons linking a pair of segmental nerves, and newly differentiating mesoderm. Domains of omb expression include the brain, nerve cord ganglia, one pair of anterior cirri, presumed precursors of dorsal musculature, and the same pharyngeal ganglia and presumed interneurons that express dac. Contrary to their roles in outgrowing arthropod and vertebrate appendages, Dll, dac, and omb lack comparable expression in Neanthes appendages, implying independent evolution of annelid appendage development. We infer that parapodia and arthropodia are not structurally or mechanistically homologous (though their primordia might be), that Dll's ancestral bilaterian function was in sensory and central nervous system differentiation, and that locomotory appendages possibly evolved from sensory outgrowths.

    Quantitative methods to monitor RNA biomarkers in myotonic dystrophy

    Myotonic dystrophy type 1 (DM1) and type 2 (DM2) are human neuromuscular disorders associated with mutations of simple repetitive sequences in affected genes. The abnormal expansion of CTG repeats in the 3′-UTR of the DMPK gene elicits DM1, whereas elongated CCTG repeats in intron 1 of ZNF9/CNBP trigger DM2. Pathogenesis of both disorders is manifested by nuclear retention of expanded-repeat-containing RNAs and aberrant alternative splicing. Precise determination of the absolute number of mutant RNA molecules is important for a better understanding of disease complexity and for accurate evaluation of the efficacy of therapeutic drugs. We present two quantitative methods, Multiplex Ligation-Dependent Probe Amplification and droplet digital PCR, for studying the mutant DMPK transcript (DMPKexpRNA) and the aberrant alternative splicing in DM1 and DM2 human tissues and cells. We demonstrate that in DM1, DMPKexpRNA is detected in higher copy number than its normal counterpart. Moreover, the absolute number of mutant transcripts indicates their low abundance, with only a few copies per cell in DM1 fibroblasts. Most importantly, in conjunction with fluorescence in situ hybridization experiments, our results suggest that in DM1 fibroblasts, the vast majority of nuclear RNA foci consist of a few molecules of DMPKexpRNA.
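
    The absolute quantification reported above relies on the standard droplet digital PCR calculation: the fraction of positive droplets yields the mean copies per droplet through the Poisson zero term. A generic sketch of that calculation (the droplet volume and counts below are illustrative, not the authors' data):

```python
# Estimate absolute target concentration from ddPCR droplet counts.
import math

def ddpcr_copies_per_ul(positive: int, total: int, droplet_volume_nl: float = 0.85) -> float:
    """Copies per microliter from the Poisson zero term: lambda = -ln(1 - p)."""
    p = positive / total
    copies_per_droplet = -math.log(1.0 - p)                 # mean copies per droplet
    return copies_per_droplet / (droplet_volume_nl * 1e-3)  # nL -> uL

# e.g. 1,200 positive droplets out of 15,000 accepted droplets
print(f"{ddpcr_copies_per_ul(1200, 15000):.0f} copies/uL")  # ~98 copies/uL
```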

    Distribution and Extinction of Ungulates during the Holocene of the Southern Levant

    BACKGROUND: The southern Levant (Israel, the Palestinian Authority and Jordan) has been continuously and extensively populated by succeeding phases of human cultures for the past 15,000 years. The long human impact on the ancient landscape has had great ecological consequences, and has caused continuous and accelerating damage to the natural environment. The rich zooarchaeological data gathered in this area provide a unique opportunity to reconstruct spatial and temporal changes in wild species distribution and to correlate them with human demographic changes. METHODOLOGY: Zooarchaeological data (382 animal bone assemblages from 190 archaeological sites) from various time periods, habitats and landscapes were compared. The bone assemblages were sorted into 12 major cultural periods, and distribution maps showing the presence of each ungulate species were established for each period. CONCLUSIONS: The first major ungulate extinction occurred during the local Iron Age (1,200-586 BCE), a period characterized by significant human population growth. During that time the last of the largest wild ungulates, the hartebeest (Alcelaphus buselaphus), the aurochs (Bos primigenius) and the hippopotamus (Hippopotamus amphibius), became extinct, followed by a shrinking distribution of forest-dwelling cervids. A second major wave of extinction occurred in the 19th and 20th centuries CE. Furthermore, a negative relationship was found between the average body mass of ungulate species that became extinct during the Holocene and their extinction date. It is thus very likely that intensified human activity, through habitat destruction and uncontrolled hunting, was responsible for the two major waves of ungulate extinction in the southern Levant during the late Holocene.
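
    The body-mass/extinction-date relationship mentioned above is a straightforward regression; a sketch with placeholder values (the masses and dates below are illustrative, not the paper's dataset):

```python
# Regress extinction year on log body mass; a negative slope reproduces the
# reported pattern that heavier ungulates disappeared earlier.
import numpy as np
from scipy import stats

body_mass_kg = np.array([1500.0, 180.0, 120.0, 70.0, 25.0])  # illustrative
extinction_year = np.array([-2500, -900, -600, 1850, 1930])  # negative = BCE

res = stats.linregress(np.log10(body_mass_kg), extinction_year)
print(f"slope = {res.slope:.0f} yr per log10(kg), r = {res.rvalue:.2f}")
```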

    Survey of childhood empyema in Asia: Implications for detecting the unmeasured burden of culture-negative bacterial disease

    Background: Parapneumonic empyema continues to be a disease of significant morbidity and mortality among children despite recent advances in medical management. To date, only a limited number of studies have assessed the burden of empyema in Asia. Methods: We surveyed medical records of four representative large pediatric hospitals in China, Korea, Taiwan and Vietnam using ICD-10 diagnostic codes to identify children <16 years of age hospitalized with empyema or pleural effusion from 1995 to 2005. We also accessed microbiology records of cultured empyema and pleural effusion specimens to describe trends in the epidemiology and microbiology of empyema. Results: During the study period, we identified 1,379 children diagnosed with empyema or pleural effusion (China, n = 461; Korea, n = 134; Taiwan, n = 119; Vietnam, n = 665). Diagnoses of pleural effusion (n = 1,074) were 3.5 times more common than of empyema (n = 305), although the relative proportions of empyema and pleural effusion noted in hospital records varied widely between the four sites, most likely because of marked differences in coding practices. Although pleural effusions were reported more often than empyema, children with empyema were more likely to have a cultured pathogen. In addition, we found that the median age and gender distribution of children with these conditions were similar across the four countries. Among 1,379 empyema and pleural effusion specimens, 401 (29%) were culture positive. Staphylococcus aureus (n = 126) was the most common organism isolated, followed by Streptococcus pneumoniae (n = 83), Pseudomonas aeruginosa (n = 37), Klebsiella (n = 35) and Acinetobacter species (n = 34). Conclusion: The age and gender distribution of empyema and pleural effusion in children in these countries are similar to the US and Western Europe. S. pneumoniae was the second leading bacterial cause of empyema and pleural effusion among Asian children. The high proportion of culture-negative specimens among patients with pleural effusion or empyema suggests that culture may not be a sufficiently sensitive diagnostic method to determine etiology in the majority of cases. Future prospective studies in different countries would benefit from standardized case definitions and coding practices for empyema. In addition, more sensitive diagnostic methods would improve detection of pathogens and could result in better prevention, treatment and outcomes of this severe disease.
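
    A minimal sketch of the record-selection step, assuming a flat export of admission records. The ICD-10 category codes used here (J86 pyothorax/empyema, J90-J91 pleural effusion) are the standard ones, but the paper does not list its exact code set, so treat both the codes and the column names as assumptions:

```python
# Select paediatric admissions coded for empyema or pleural effusion.
import pandas as pd

records = pd.read_csv("admissions.csv")  # assumed columns: age_years, icd10, culture_isolate

children = records[records["age_years"] < 16].copy()
code3 = children["icd10"].str[:3]  # three-character ICD-10 category

children["diagnosis"] = None
children.loc[code3 == "J86", "diagnosis"] = "empyema"
children.loc[code3.isin(["J90", "J91"]), "diagnosis"] = "pleural effusion"

cases = children.dropna(subset=["diagnosis"])
print(cases["diagnosis"].value_counts())
print(f"culture positive: {cases['culture_isolate'].notna().mean():.0%}")
```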

    Sociodemographic differences in linkage error: An examination of four large-scale datasets

    Background: Record linkage is an important tool for epidemiologists and health planners. Record linkage studies will generally contain some level of residual record linkage error, where individual records are either incorrectly marked as belonging to the same individual or incorrectly marked as belonging to separate individuals. A key question is whether errors in linkage quality are distributed evenly throughout the population, or whether certain subgroups exhibit higher rates of error. Previous investigations of this issue have typically compared linked and un-linked records, which can conflate bias caused by record linkage error with bias caused by missing records (data capture errors). Methods: Four large administrative datasets were individually de-duplicated, and the results were compared to an available 'gold-standard' benchmark, allowing us to avoid the methodological issues of comparing linked and un-linked records. Results were compared by gender, age, geographic remoteness (major cities, regional or remote) and socioeconomic status. Results: Results varied between datasets and by sociodemographic characteristic. The most consistent findings were worse linkage quality for younger individuals (seen in all four datasets) and for those living in remote areas (seen in three of four datasets). Linkage quality within sociodemographic categories varied between datasets, with some associations with linkage error reversed across datasets owing to quirks of the specific data collection mechanisms and data sharing practices. Conclusions: These results suggest caution should be taken both when linking younger individuals and those in remote areas, and when analysing linked data from these subgroups. Further research is required to determine the ramifications of worse linkage quality in these subpopulations on research outcomes.
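
    A minimal sketch of the evaluation described above: flag each record whose de-duplication cluster disagrees with the gold-standard identifier, then summarize error rates within sociodemographic subgroups (all column names are assumptions):

```python
# Record-level linkage error against a gold standard, by subgroup.
import pandas as pd

df = pd.read_csv("dedup_results.csv")  # assumed: linked_id, true_id, age_group, remoteness

# false-match involvement: the assigned cluster mixes more than one true person
false_match = df.groupby("linked_id")["true_id"].transform("nunique") > 1
# missed-match involvement: one true person is split across several clusters
missed_match = df.groupby("true_id")["linked_id"].transform("nunique") > 1
df["linkage_error"] = false_match | missed_match

for col in ["age_group", "remoteness"]:
    print(df.groupby(col)["linkage_error"].mean().round(3))
```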

    Apparent temperature and acute myocardial infarction hospital admissions in Copenhagen, Denmark: a case-crossover study

    Background: The influence of temperature on acute myocardial infarction (AMI) has not been investigated as extensively as its effects on broader morbidity and mortality outcomes. Sixteen previous studies reported inconsistent results, and only two considered confounding by air pollution. We addressed some of the methodological limitations of these previous studies. Methods: This is the first study of the association between the daily 3-hour maximum apparent temperature (Tappmax) and AMI hospital admissions in Copenhagen. The study period covered 1 January 1999 to 31 December 2006, stratified into warm (April-September) and cold (October-March) periods. A case-crossover epidemiological study design was applied. Models were adjusted for public holidays and influenza; confounding by PM10, NO2 and CO was investigated; the lag and non-linear effects of Tappmax were examined; effect modification by age, sex and socioeconomic status (SES) was explored; and the results of the case-crossover models were compared to those of generalised additive Poisson time-series and generalised estimating equation models. Results: 14,456 AMI hospital admissions (12,995 people) occurred during the study period. For an inter-quartile range (6 or 7°C) increase in the 5-day cumulative average of Tappmax, a 4% (95% CI: -2%; 10%) and 9% (95% CI: 3%; 14%) decrease in the AMI admission rate was observed in the warm and cold periods, respectively. The 19-65-year-old group, men and the highest SES group appeared more susceptible in the cold period. Conclusion: An increase in Tappmax is associated with a decrease in AMI admissions during the colder months.
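
    Two calculations sit behind the estimates above: the apparent-temperature index itself and the conversion of a regression coefficient to the quoted per-IQR percent change. The Tappmax formula shown is a commonly used Steadman-based formulation and the coefficient is illustrative; the paper may use variants of both, so treat them as assumptions:

```python
import math

def apparent_temperature(air_c: float, dew_point_c: float) -> float:
    """Steadman-based apparent temperature (degC) from air and dew-point temperature."""
    return -2.653 + 0.994 * air_c + 0.0153 * dew_point_c ** 2

def pct_change_per_iqr(beta: float, iqr_c: float) -> float:
    """Percent change in admission rate for an IQR increase in exposure."""
    return (math.exp(beta * iqr_c) - 1.0) * 100.0

# an illustrative coefficient of -0.0144 per degC with a 6 degC IQR gives
# roughly -8.3%, the same order as the cold-period decrease reported above
print(f"{pct_change_per_iqr(-0.0144, 6.0):+.1f}%")
```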