
    Relation of Sources of Systemic Fluoride to Prevalence of Dental Fluorosis

    The prevalence of dental fluorosis in a nonfluoridated area was determined and related to the reported fluoride ingestion histories of the children examined. A convenience sample of 543 schoolchildren in rural areas of Michigan was examined for fluorosis using the Tooth Surface Index of Fluorosis. Questionnaires asking about previous fluoride use were sent to the parents of all children examined. The response rate was 76 percent (412 usable questionnaires). A criterion for inclusion in the data analysis stipulated that only fluorosed surfaces occurring bilaterally would be included. Fluorosis was found on 7 percent of all tooth surfaces, and only in the mild form. Twenty-two percent of the subjects were classified as having fluorosis. Dietary supplements were the only fluoride source found to be significantly related to the occurrence of fluorosis. A greater proportion of the subjects with fluorosis listed physicians, rather than dentists, as the source of fluoride prescriptions. The results demonstrate similarities to the fluorosis reported in other studies in non-fluoridated areas, but also suggest the need to minimize the occurrence of fluorosis through proper assessment of a child's fluoride exposure and the judicious use of additional fluoride.
    Peer Reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/65695/1/j.1752-7325.1989.tb02030.x.pd

    What is the real impact of acute kidney injury?

    Background: Acute kidney injury (AKI) is a common clinical problem. Studies have documented the incidence of AKI in a variety of populations, but to date we do not believe the real incidence of AKI has been accurately documented in a district general hospital setting. The aim here was to describe the detected incidence of AKI in a typical general hospital setting in an unselected population, and to describe associated short- and long-term outcomes. Methods: A retrospective observational database study from secondary care in East Kent (adult catchment population of 582,300). All adult patients (18 years or over) admitted between 1st February 2009 and 31st July 2009 were included. Patients receiving chronic renal replacement therapy (RRT), maternity and day case admissions were excluded. AKI was defined by the Acute Kidney Injury Network (AKIN) criteria. A time-dependent risk analysis with logistic regression and Cox regression was used for the analysis of in-hospital mortality and survival. Results: The incidence of AKI in the 6 month period was 15,325 pmp/yr (adults) (69% AKIN1, 18% AKIN2 and 13% AKIN3). In-hospital mortality, length of stay and ITU utilisation all increased with severity of AKI. Patients with AKI had an increase in care on discharge and an increase in hospital readmission within 30 days. Conclusions: These data come closer to the real incidence and outcomes of AKI managed in-hospital than any study published in the literature to date. Fifteen percent of all admissions sustained an episode of AKI, with increased subsequent short- and long-term morbidity and mortality, even in those with AKIN1. This confers an increased burden and cost to the healthcare economy, which can now be quantified. These results will furnish a baseline for quality improvement projects aimed at early identification, improved management, and where possible prevention, of AKI.

    Do acute elevations of serum creatinine in primary care engender an increased mortality risk?

    Background: The significant impact Acute Kidney Injury (AKI) has on patient morbidity and mortality emphasizes the need for early recognition and effective treatment. AKI presenting to or occurring during hospitalisation has been widely studied, but little is known about the incidence and outcomes of patients experiencing acute elevations in serum creatinine in the primary care setting who are not subsequently admitted to hospital. The aim of this study was to define this incidence and explore its impact on mortality. Methods: The study cohort was identified using hospital databases over a six month period. Inclusion criteria: people aged 18 or over with a serum creatinine request during the study period and not on renal replacement therapy. The patients were stratified by a rise in serum creatinine corresponding to the Acute Kidney Injury Network (AKIN) criteria for comparison purposes. Descriptive and survival data were then analysed. Ethical approval was granted from the National Research Ethics Service (NRES) Committee South East Coast and from the National Information Governance Board. Results: The total study population was 61,432, including 57,300 subjects with ‘no AKI’, mean age 64. The numbers (mean ages) of acute serum creatinine rises were ‘AKI 1’ 3,798 (72), ‘AKI 2’ 232 (73), and ‘AKI 3’ 102 (68), which equates to an overall incidence of 14,192 pmp/year (adult). Unadjusted 30-day survival was 99.9% in subjects with ‘no AKI’, compared to 98.6%, 90.1% and 82.3% in those with ‘AKI 1’, ‘AKI 2’ and ‘AKI 3’ respectively. After multivariable analysis adjusting for age, gender, baseline kidney function and co-morbidity, the odds ratio of 30-day mortality was 5.3 (95% CI 3.6, 7.7), 36.8 (95% CI 21.6, 62.7) and 123 (95% CI 64.8, 235) respectively, compared to those without acute serum creatinine rises as defined.
Conclusions: People who develop acute elevations of serum creatinine in primary care without being admitted to hospital have significantly worse outcomes than those with stable kidney function.
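The overall incidence figure above can be reproduced from the reported AKIN stratum counts. A minimal sketch, assuming the East Kent adult catchment of 582,300 from the companion study and annualising the six-month count:

```python
# Hedged sketch: reproduce the annualised incidence (per million population
# per year, pmp/yr) from the AKIN stratum counts reported in the abstract.
catchment = 582_300          # adult catchment population (companion study)
cases = {"AKI 1": 3_798, "AKI 2": 232, "AKI 3": 102}

total_cases = sum(cases.values())      # 4,132 acute serum creatinine rises
# Scale the six-month count to a full year and to one million population.
incidence_pmp_yr = total_cases / catchment * 1_000_000 * 2

print(total_cases)                # 4132
print(round(incidence_pmp_yr))    # 14192, matching the reported figure
```

The agreement with the published 14,192 pmp/year confirms the denominator used is the adult catchment, not the tested cohort of 61,432.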

    High Effective Coverage of Vector Control Interventions in Children After Achieving Low Malaria Transmission in Zanzibar, Tanzania.

    Formerly a high malaria transmission area, Zanzibar is now targeting malaria elimination. A major challenge is to avoid resurgence of malaria, the success of which includes maintaining high effective coverage of vector control interventions such as bed nets and indoor residual spraying (IRS). In this study, caretakers' continued use of preventive measures for their children is evaluated, following a sharp reduction in malaria transmission. A cross-sectional community-based survey was conducted in June 2009 in North A and Micheweni districts in Zanzibar. Households were randomly selected using two-stage cluster sampling. Interviews were conducted with 560 caretakers of under-five-year-old children, who were asked about perceptions on the malaria situation, vector control, household assets, and intention for continued use of vector control as malaria burden further decreases. Effective coverage of vector control interventions for under-five children remains high, although most caretakers (65%; 363/560) did not perceive malaria as presently being a major health issue. Seventy percent (447/643) of the under-five children slept under a long-lasting insecticidal net (LLIN) and 94% (607/643) were living in houses targeted with IRS. In total, 98% (628/643) of the children were covered by at least one of the vector control interventions. Seasonal bed-net use for children was reported by 25% (125/508) of caretakers of children who used bed nets. A high proportion of caretakers (95%; 500/524) stated that they intended to continue using preventive measures for their under-five children as malaria burden further reduces. Malaria risk perceptions and different perceptions of vector control were not found to be significantly associated with LLIN effective coverage. While the majority of caretakers felt that malaria had been reduced in Zanzibar, effective coverage of vector control interventions remained high.
Caretakers appreciated the interventions and recognized the value of sustaining their use. Thus, sustaining high effective coverage of vector control interventions, which is crucial for reaching malaria elimination in Zanzibar, can be achieved by maintaining effective delivery of these interventions.
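The coverage percentages quoted above follow directly from the counts in the abstract; a minimal sketch recomputing them for the 643 surveyed under-five children:

```python
# Hedged sketch: recompute the reported coverage proportions for the 643
# under-five children surveyed. All counts come from the abstract.
n_children = 643
llin = 447      # slept under a long-lasting insecticidal net (LLIN)
irs = 607       # lived in a house targeted with indoor residual spraying
either = 628    # covered by at least one of the two interventions

def pct(count):
    """Coverage as a whole-number percentage of surveyed children."""
    return round(100 * count / n_children)

print(pct(llin), pct(irs), pct(either))   # 70 94 98
```

Note that "either" (628) is smaller than llin + irs (1,054) because most children were covered by both interventions; the union count must be taken from the data, not derived from the two marginal counts.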

    Analysis of human immune responses in quasi-experimental settings: tutorial in biostatistics

    Background: Human immunology is a growing field of research in which experimental, clinical, and analytical methods of many life science disciplines are utilized. Classic epidemiological study designs, including observational longitudinal birth cohort studies, offer strong potential for gaining new knowledge and insights into immune response to pathogens in humans. However, rigorous discussion of methodological issues related to designs and statistical analysis that are appropriate for longitudinal studies is lacking. Methods: In this communication we address key questions of quality and validity of traditional and recently developed statistical tools applied to measures of immune responses. For this purpose we use data on humoral immune response (IR) associated with the first cryptosporidial diarrhea in a birth cohort of children residing in an urban slum in south India. The main objective is to detect the difference and derive inferences for a change in IR measured at two time points, before (pre) and after (post) an event of interest. We illustrate the use and interpretation of analytical and data visualization techniques including generalized linear and additive models, data-driven smoothing, and combinations of box-, scatter-, and needle-plots. Results: We provide step-by-step instructions for conducting a thorough and relatively simple analytical investigation, describe the challenges and pitfalls, and offer practical solutions for comprehensive examination of data. We illustrate how the assumption of time irrelevance can be handled in a study with a pre-post design. We demonstrate how one can study the dynamics of IR in humans by considering the timing of response following an event of interest and seasonal fluctuation of exposure by proper alignment of time of measurements.
This alignment of calendar time of measurements and a child's age at the event of interest allows us to explore interactions between IR, seasonal exposures and age at first infection. Conclusions: The use of traditional statistical techniques to analyze immunological data derived from observational human studies can result in loss of important information. Detailed analysis using well-tailored techniques allows the depiction of new features of immune response to a pathogen in longitudinal studies in humans. The proposed staged approach has prominent implications for future study designs and analyses.
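The core pre-post comparison the tutorial describes can be sketched in a few lines: each child contributes an IR measurement before and after the event of interest, and inference targets the within-child change. The values below are illustrative only, not study data:

```python
# Hedged sketch of a pre-post paired analysis: immune response (IR) is
# measured in each child before and after the first diarrheal episode,
# and a paired t-statistic tests the mean within-child change.
# The IR values are illustrative, not data from the study.
import math
from statistics import mean, stdev

pre  = [1.2, 0.8, 1.5, 1.1, 0.9, 1.3, 1.0, 1.4]   # IR before the episode
post = [1.9, 1.1, 2.2, 1.6, 1.4, 1.8, 1.5, 2.0]   # IR after the episode

diffs = [b - a for a, b in zip(pre, post)]          # within-child change
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / math.sqrt(n))     # paired t-statistic

print(round(t, 2))   # ~11.67 for these illustrative values
```

Pairing is what makes this design powerful: between-child variability in baseline IR is removed, so only the within-child change enters the test. The tutorial's generalized additive models extend this idea by additionally adjusting for calendar time and age at the event.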

    Low pH immobilizes and kills human leukocytes and prevents transmission of cell-associated HIV in a mouse model

    BACKGROUND: Both cell-associated and cell-free HIV virions are present in semen and cervical secretions of HIV-infected individuals. Thus, topical microbicides may need to inactivate both cell-associated and cell-free HIV to prevent sexual transmission of HIV/AIDS. To determine if the mild acidity of the healthy vagina and acid-buffering microbicides would prevent transmission by HIV-infected leukocytes, we measured the effect of pH on leukocyte motility, viability and intracellular pH, and tested the ability of an acidic buffering microbicide (BufferGel®) to prevent the transmission of cell-associated HIV in a HuPBL-SCID mouse model. METHODS: Human lymphocyte, monocyte, and macrophage motilities were measured as a function of time and pH using various acidifying agents. Lymphocyte and macrophage motilities were measured using video microscopy. Monocyte motility was measured using video microscopy and chemotactic chambers. Peripheral blood mononuclear cell (PBMC) viability and intracellular pH were determined as a function of time and pH using fluorescent dyes. HuPBL-SCID mice were pretreated with BufferGel, saline, or a control gel and challenged with HIV-1-infected human PBMCs. RESULTS: Progressive motility was completely abolished in all cell types between pH 5.5 and 6.0. Concomitantly, at and below pH 5.5, the intracellular pH of PBMCs dropped precipitously to match the extracellular medium and did not recover. After acidification with hydrochloric acid to pH 4.5 for 60 min, although completely immotile, 58% of PBMCs excluded ethidium homodimer-1 (dead-cell dye). In contrast, when acidified to this pH with BufferGel, a microbicide designed to maintain vaginal acidity in the presence of semen, only 4% excluded dye at 10 min and none excluded dye after 30 min. BufferGel significantly reduced transmission of HIV-1 in HuPBL-SCID mice (1 of 12 infected) compared to saline (12 of 12 infected) and a control gel (5 of 7 infected).
CONCLUSION: These results suggest that physiologic or microbicide-induced acid immobilization and killing of infected white blood cells may be effective in preventing sexual transmission of cell-associated HIV.

    Characteristics of transposable element exonization within human and mouse

    Insertion of transposed elements within mammalian genes is thought to be an important contributor to mammalian evolution and speciation. Insertion of transposed elements into introns can lead to their activation as alternatively spliced cassette exons, an event called exonization. Elucidation of the evolutionary constraints that have shaped fixation of transposed elements within human and mouse protein-coding genes and subsequent exonization is important for understanding how the exonization process has affected transcriptome and proteome complexities. Here we show that exonization of transposed elements is biased towards the beginning of the coding sequence in both human and mouse genes. Analysis of single nucleotide polymorphisms (SNPs) revealed that exonization of transposed elements can be population-specific, implying that exonizations may enhance divergence and lead to speciation. SNP density analysis revealed differences between Alu and other transposed elements. Finally, we identified cases of primate-specific Alu elements that depend on RNA editing for their exonization. These results shed light on TE fixation and the exonization process within human and mouse genes.