Atovaquone Compared with Dapsone for the Prevention of Pneumocystis carinii Pneumonia in Patients with HIV Infection Who Cannot Tolerate Trimethoprim, Sulfonamides, or Both
BACKGROUND
Although trimethoprim–sulfamethoxazole is the drug of choice for the prevention of Pneumocystis carinii pneumonia, many patients cannot tolerate it and must switch to an alternative agent.
METHODS
We conducted a multicenter, open-label, randomized trial comparing daily atovaquone (1500-mg suspension) with daily dapsone (100 mg) for the prevention of P. carinii pneumonia among patients infected with the human immunodeficiency virus who could not tolerate trimethoprim–sulfamethoxazole. The median follow-up period was 27 months.
RESULTS
Of 1057 patients enrolled, 298 had a history of P. carinii pneumonia. P. carinii pneumonia developed in 122 of 536 patients assigned to atovaquone (15.7 cases per 100 person-years), as compared with 135 of 521 in the dapsone group (18.4 cases per 100 person-years; relative risk for atovaquone vs. dapsone, 0.85; 95 percent confidence interval, 0.67 to 1.09; P=0.20). The relative risk of death was 1.07 (95 percent confidence interval, 0.89 to 1.30; P=0.45), and the relative risk of discontinuation of the assigned medication because of adverse events was 0.94 (95 percent confidence interval, 0.74 to 1.19; P=0.59). Among the 546 patients who were receiving dapsone at baseline, the relative risk of discontinuation because of adverse events was 3.78 for atovaquone as compared with dapsone (95 percent confidence interval, 2.37 to 6.01; P
CONCLUSIONS
Among patients who cannot tolerate trimethoprim–sulfamethoxazole, atovaquone and dapsone are similarly effective for the prevention of P. carinii pneumonia. Our results support the continuation of dapsone prophylaxis among patients who are already receiving it. However, among those not receiving dapsone, atovaquone is better tolerated and may be the preferred choice for prophylaxis against P. carinii pneumonia.
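The reported relative risk of 0.85 can be reproduced directly from the two incidence rates quoted above; a minimal sketch (using only the published per-100-person-year figures, not patient-level data):

```python
# Incidence rates reported in the trial (cases per 100 person-years)
rate_atovaquone = 15.7
rate_dapsone = 18.4

# Relative risk (incidence rate ratio) of P. carinii pneumonia,
# atovaquone vs. dapsone
relative_risk = rate_atovaquone / rate_dapsone
print(round(relative_risk, 2))  # 0.85, matching the reported value
```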
A controlled trial of two nucleoside analogues plus indinavir in persons with human immunodeficiency virus infection and CD4 cell counts of 200 per cubic millimeter or less
Background: The efficacy and safety of adding a protease inhibitor to two nucleoside analogues to treat human immunodeficiency virus type 1 (HIV-1) infection are not clear. We compared treatment with the protease inhibitor indinavir in addition to zidovudine and lamivudine with treatment with the two nucleosides alone in HIV-infected adults previously treated with zidovudine.
Methods: A total of 1156 patients not previously treated with lamivudine or protease inhibitors were stratified according to CD4 cell count (50 or fewer vs. 51 to 200 cells per cubic millimeter) and randomly assigned to one of two daily regimens: 600 mg of zidovudine and 300 mg of lamivudine, or that regimen with 2400 mg of indinavir. Stavudine could be substituted for zidovudine. The primary end point was the time to the development of the acquired immunodeficiency syndrome (AIDS) or death.
Results: The proportion of patients whose disease progressed to AIDS or death was lower with indinavir, zidovudine (or stavudine), and lamivudine (6 percent) than with zidovudine (or stavudine) and lamivudine alone (11 percent; estimated hazard ratio, 0.50; 95 percent confidence interval, 0.33 to 0.76; P<0.001). Mortality in the two groups was 1.4 percent and 3.1 percent, respectively (estimated hazard ratio, 0.43; 95 percent confidence interval, 0.19 to 0.99; P=0.04). The effects of treatment were similar in both CD4 cell strata. The responses of CD4 cells and plasma HIV-1 RNA paralleled the clinical results.
Conclusions: Treatment with indinavir, zidovudine, and lamivudine as compared with zidovudine and lamivudine alone significantly slows the progression of HIV-1 disease in patients with 200 CD4 cells or fewer per cubic millimeter and prior exposure to zidovudine. (N Engl J Med 1997;337:725-33.)
Incident type 2 diabetes attributable to suboptimal diet in 184 countries
The global burden of diet-attributable type 2 diabetes (T2D) is not well established. This risk assessment model estimated T2D incidence among adults attributable to direct and body weight-mediated effects of 11 dietary factors in 184 countries in 1990 and 2018. In 2018, 14.1 million (95% uncertainty interval (UI), 13.8–14.4 million) incident T2D cases were estimated to be attributable to suboptimal intake of these dietary factors, representing 70.3% (68.8–71.8%) of new cases globally. The largest T2D burdens were attributable to insufficient whole-grain intake (26.1% (25.0–27.1%)), excess refined rice and wheat intake (24.6% (22.3–27.2%)) and excess processed meat intake (20.3% (18.3–23.5%)). Across regions, the highest proportional burdens were in central and eastern Europe and central Asia (85.6% (83.4–87.7%)) and Latin America and the Caribbean (81.8% (80.1–83.4%)), and the lowest proportional burdens were in South Asia (55.4% (52.1–60.7%)). Proportions of diet-attributable T2D were generally larger in men than in women and were inversely correlated with age. Diet-attributable T2D was generally larger among urban versus rural residents and higher versus lower educated individuals, except in high-income countries and in central and eastern Europe and central Asia, where burdens were larger in rural residents and in lower educated individuals. Compared with 1990, global diet-attributable T2D increased by 2.6 absolute percentage points (8.6 million more cases) in 2018, with variation in these trends by world region and dietary factor. These findings inform nutritional priorities and clinical and public health planning to improve dietary quality and reduce T2D globally.
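The proportional burdens above come from a comparative risk assessment. A toy sketch of the underlying attributable-fraction logic, using Levin's formula with made-up exposure prevalence and relative risk (these numbers are illustrative assumptions, not the study's actual inputs):

```python
def attributable_fraction(prevalence, relative_risk):
    """Levin's population attributable fraction: the share of incident
    cases that would be avoided if the exposed group had baseline risk."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

# Hypothetical example: 60% of a population has insufficient whole-grain
# intake, with an assumed relative risk of 1.5 for incident T2D
paf = attributable_fraction(0.60, 1.5)
print(f"{paf:.1%}")  # 23.1% of incident cases attributable to the exposure
```

The full model additionally integrates over continuous intake distributions and body weight-mediated pathways, but the same "excess risk over total risk" structure drives the reported percentages.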
Children's and adolescents' rising animal-source food intakes in 1990-2018 were impacted by age, region, parental education and urbanicity
Animal-source foods (ASF) provide nutrition for children's and adolescents' physical and cognitive development. Here, we use data from the Global Dietary Database and Bayesian hierarchical models to quantify global, regional and national ASF intakes between 1990 and 2018 by age group across 185 countries, representing 93% of the world's child population. Mean ASF intake was 1.9 servings per day, with 16% of children consuming at least three daily servings. Intake was similar between boys and girls, but higher among urban children with educated parents. Consumption varied by age, from 0.6 servings per day at <1 year to 2.5 servings per day at 15–19 years. Between 1990 and 2018, mean ASF intake increased by 0.5 servings per week, with increases in all regions except sub-Saharan Africa. In 2018, total ASF consumption was highest in Russia, Brazil, Mexico and Turkey, and lowest in Uganda, India, Kenya and Bangladesh. These findings can inform policy to address malnutrition through targeted ASF consumption programmes.
Specific Detection of Human BK Polyomavirus in Urine Samples of Immunocompromised Patients
A semiquantitative PCR assay for the detection of BK virus in urine was developed using primers for BK virus that specifically amplified BK but not JC virus. DNA was extracted from urine through treatment with proteinase K followed by DNA precipitation with sodium acetate. Semiquantitation was achieved by amplifying serial dilutions (1:1, 1:10, 1:100, and 1:1,000) of the urine specimens. Each assay included both positive (stock BK virus and previously positive patient urine) and negative (no template) controls. A urine sample was interpreted as positive if any of the serial dilutions showed amplification of the DNA fragment of the expected size. For some patient-derived samples, amplification of the expected-size fragment was achieved with a dilute template whereas no amplification was achieved with a concentrated template; this was attributed to interfering substances in the urine. PCR results were compared with urine cytology and shown to be more sensitive. Validation studies were performed at the University of Nebraska Medical Center, utilizing a separate qualitative PCR assay that detects both BK and JC virus and distinguishes between them by restriction enzyme digestion patterns. Of 46 urine samples analyzed using both methods, 22 were positive by both assays, 18 were negative by both assays, 5 were positive only by the Nebraska method, and 1 was positive only by our method. In comparison with the Nebraska PCR, our PCR assay had a sensitivity of 81% and a specificity of 95%. For twenty-one (43%) of 49 immunocompromised patients, tests were positive when specimens were submitted because of clinical suspicion of BK virus infection.
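The reported sensitivity and specificity follow directly from the 2x2 agreement counts given above, treating the Nebraska assay as the reference; a minimal sketch:

```python
# 2x2 agreement with the Nebraska reference PCR (counts from the text)
both_positive = 22    # positive by both assays
both_negative = 18    # negative by both assays
reference_only = 5    # positive only by the Nebraska method
index_only = 1        # positive only by our assay

# Sensitivity: fraction of reference-positive samples our assay detected
sensitivity = both_positive / (both_positive + reference_only)
# Specificity: fraction of reference-negative samples our assay called negative
specificity = both_negative / (both_negative + index_only)

print(round(sensitivity * 100))  # 81
print(round(specificity * 100))  # 95
```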
The Incubation Period of Primary Epstein-Barr Virus Infection: Viral Dynamics and Immunologic Events
<div><p>Epstein-Barr virus (EBV) is a human herpesvirus that causes acute infectious mononucleosis and is associated with cancer and autoimmune disease. While many studies have been performed examining acute disease in adults following primary infection, little is known about the virological and immunological events during EBV's lengthy 6-week incubation period owing to the challenge of collecting samples from this stage of infection. We conducted a prospective study in college students with special emphasis on frequent screening to capture blood and oral wash samples during the incubation period. Here we describe the viral dissemination and immune response in the 6 weeks prior to onset of acute infectious mononucleosis symptoms. While virus is presumed to be present in the oral cavity from the time of transmission, we did not detect viral genomes in the oral wash until one week before symptom onset, at which time viral genomes were present in high copy numbers, suggesting loss of initial viral replication control. In contrast, using a sensitive nested PCR method, we detected viral genomes at low levels in blood about 3 weeks before symptoms. However, high levels of EBV in the blood were only observed close to symptom onset, coincident with or just after increased viral detection in the oral cavity. These data imply that B cells are the major reservoir of virus in the oral cavity prior to infectious mononucleosis. The early presence of viral genomes in the blood, even at low levels, correlated with a striking decrease in the number of circulating plasmacytoid dendritic cells well before symptom onset, which remained depressed throughout convalescence. On the other hand, natural killer cells expanded only after symptom onset. Likewise, CD4+ Foxp3+ regulatory T cells decreased twofold, but only after symptom onset. We observed no substantial virus-specific CD8 T cell expansion during the incubation period, although polyclonal CD8 activation was detected in concert with viral genomes increasing in the blood and oral cavity, possibly due to a systemic type I interferon response. This study provides the first description of events during the incubation period of natural EBV infection in humans and definitive data upon which to formulate theories of viral control and disease pathogenesis.</p></div>
Plasmacytoid DC declined in the circulation during the incubation period and remained depressed through convalescence.
<p>(A) Representative flow cytometry plots of pDC frequencies amongst non-lymphoid cells (CD3, CD56, CD14, CD20 negative) from samples collected at multiple timepoints for one subject (5524). (B) The percentage of pDC from subject 5524 over time. (C) Frequencies of pDC over time are shown for all subjects. (D) Frequencies of conventional DC (cDC) (CD11c<sup>+</sup>, HLA-DR<sup>+</sup> cells) are shown over time for all subjects. (E) Numbers of pDC per mL of whole blood are shown for all subjects. (F) The percentage of pDC in samples in which viral genomes were detected in the blood by nested PCR (Blood lo) or qPCR (Blood hi). Statistical analysis was performed using a one-way ANOVA with multiple-comparison testing. Light pink symbols indicate a significant difference (p<0.05) compared to pre-infection; darker pink symbols (p<0.001); red symbols (p<0.0001). Gray symbols indicate no statistical difference.</p>
Viral genome detection during the incubation period.
<p>Quantitative viral load was determined by qPCR using DNA from oral wash cell pellets (A) or blood (B). Data are expressed as Log<sub>10</sub> viral genome copies/mL of sample. The dashed gray line represents the limit of detection. (C) and (D) show the time to the first positive measurement for each subject for viral genomes detected in the blood (D) by non-quantitative nested PCR (filled squares) or qPCR (filled inverted triangles), or in the oral cavity (C) by nested PCR or qPCR (the same results were obtained with both assays) (open circles). The theoretical presence of virus shown in (C) is the estimated time period in which study participants were initially exposed to oral virus. (E) In sequential samples from the incubation period, subjects were scored for the compartment in which viral genomes were first detected: blood by nested PCR (blood (lo)), blood by qPCR (blood (hi)), oral, or a simultaneous positive in both compartments. (F) An inset comparing blood and oral cavity for the time period close to symptom onset. The results for twenty-six subjects who had a sample collected within the first two weeks of symptom onset are shown.</p>
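The log-scale transformation used in the viral-load panels can be sketched as follows; the limit-of-detection value here is a hypothetical placeholder, since the legend does not state it numerically:

```python
import math

# Hypothetical limit of detection (copies/mL); the legend does not give a value
LOD_COPIES_PER_ML = 100.0

def log10_viral_load(copies_per_ml):
    """Return log10 viral genome copies/mL, flooring values below the
    limit of detection at the LOD so they sit on the dashed LOD line."""
    return math.log10(max(copies_per_ml, LOD_COPIES_PER_ML))

print(log10_viral_load(1_000_000))  # 6.0
print(log10_viral_load(10))         # floored at the assumed LOD -> 2.0
```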