34 research outputs found

    Point-of-care C-reactive protein testing to reduce inappropriate use of antibiotics for non-severe acute respiratory infections in Vietnamese primary health care: a randomised controlled trial

    Background Inappropriate antibiotic use for acute respiratory tract infections is common in primary health care, but distinguishing serious from self-limiting infections is difficult, particularly in low-resource settings. We assessed whether C-reactive protein point-of-care testing can safely reduce antibiotic use in patients with non-severe acute respiratory tract infections in Vietnam.

    Method We did a multicentre open-label randomised controlled trial in ten primary health-care centres in northern Vietnam. Patients aged 1–65 years with at least one focal and one systemic symptom of acute respiratory tract infection were assigned 1:1 to receive either C-reactive protein point-of-care testing or routine care, following which antibiotic prescribing decisions were made. Patients with severe acute respiratory tract infection were excluded. Enrolled patients were reassessed on day 3, 4, or 5, and on day 14 a structured telephone interview was done blind to the intervention. Randomised assignments were concealed from prescribers and patients but not masked, as the test result was used to assist treatment decisions. The primary outcome was antibiotic use within 14 days of follow-up. All analyses were prespecified in the protocol and the statistical analysis plan. All analyses were done on the intention-to-treat population, and the analysis of the primary endpoint was repeated in the per-protocol population. This trial is registered under number NCT01918579.

    Findings Between March 17, 2014, and July 3, 2015, 2037 patients (1028 children and 1009 adults) were enrolled and randomised. One adult patient withdrew immediately after randomisation. 1017 patients were assigned to receive C-reactive protein point-of-care testing, and 1019 patients were assigned to receive routine care. 115 patients in the C-reactive protein point-of-care group and 72 patients in the routine care group were excluded from the intention-to-treat analysis because of a missing primary endpoint. The number of patients who used antibiotics within 14 days was 581 (64%) of 902 patients in the C-reactive protein group versus 738 (78%) of 947 patients in the control group (odds ratio [OR] 0·49, 95% CI 0·40–0·61; p<0·0001). Highly significant differences were seen in both children and adults, with substantial heterogeneity of the intervention effect across the 10 sites (I²=84%, 95% CI 66–96). 140 patients in the C-reactive protein group and 137 patients in the routine care group missed the urine test on day 3, 4, or 5. Antibiotic activity in urine on day 3, 4, or 5 was found in 267 (30%) of 877 patients in the C-reactive protein group versus 314 (36%) of 882 patients in the routine treatment group (OR 0·78, 95% CI 0·63–0·95; p=0·015). Time to resolution of symptoms was similar in both groups. Adverse events were rare, with no deaths and a total of 14 hospital admissions (six in the C-reactive protein group and eight in the control group).

    Interpretation C-reactive protein point-of-care testing reduced antibiotic use for non-severe acute respiratory tract infection without compromising patients’ recovery in primary health care in Vietnam. Health-care providers might have become familiar with the clinical picture of low C-reactive protein, leading to a reduction in antibiotic prescribing in both groups, but this would have led to a reduction in the observed effect rather than an overestimation. Qualitative analysis is needed to address differences in context in order to implement this strategy to improve rational antibiotic use for patients with acute respiratory infection in low-income and middle-income countries.
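    The crude odds ratio can be reproduced directly from the counts quoted above (581/902 antibiotic users with C-reactive protein testing versus 738/947 with routine care). The following minimal Python sketch uses a Woolf (log-OR) confidence interval; the crude value (~0.51) differs slightly from the reported 0.49, which presumably came from the trial's prespecified (e.g. adjusted or stratified) analysis model:

        import math

        # 2x2 table from the abstract: antibiotic use within 14 days
        crp_yes, crp_no = 581, 902 - 581   # C-reactive protein group
        ctl_yes, ctl_no = 738, 947 - 738   # routine care group

        # Crude odds ratio with a Woolf 95% CI on the log scale
        or_ = (crp_yes * ctl_no) / (crp_no * ctl_yes)
        se = math.sqrt(1/crp_yes + 1/crp_no + 1/ctl_yes + 1/ctl_no)
        lo = math.exp(math.log(or_) - 1.96 * se)
        hi = math.exp(math.log(or_) + 1.96 * se)
        print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # ~0.51 (0.42-0.63)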

    Intensified treatment with high dose Rifampicin and Levofloxacin compared to standard treatment for adult patients with Tuberculous Meningitis (TBM-IT): protocol for a randomized controlled trial

    Background: Tuberculous meningitis is the most severe form of tuberculosis. Mortality for untreated tuberculous meningitis is 100%. Despite the introduction of antibiotic treatment for tuberculosis, the mortality rate for tuberculous meningitis remains high: approximately 25% for HIV-negative and 67% for HIV-positive patients, with most deaths occurring within one month of starting therapy. The high mortality rate in tuberculous meningitis reflects the severity of the condition but also the poor antibacterial activity of current treatment regimens and the relatively poor penetration of these drugs into the central nervous system. Improving the antitubercular activity of current therapy in the central nervous system may help improve outcomes. Increasing the dose of rifampicin, a key drug with known poor cerebrospinal fluid penetration, may lead to higher drug levels at the site of infection and may improve survival. Of the second-generation fluoroquinolones, levofloxacin may have the optimal pharmacological features, including cerebrospinal fluid penetration, with a ratio of area under the curve (AUC) in cerebrospinal fluid to AUC in plasma of >75% and strong bactericidal activity against Mycobacterium tuberculosis. We propose a randomized controlled trial to assess the efficacy of an intensified anti-tubercular treatment regimen in tuberculous meningitis patients, comparing current standard tuberculous meningitis treatment regimens with standard treatment intensified with high-dose rifampicin and additional levofloxacin.

    Methods/Design: A randomized, double-blind, placebo-controlled trial with two parallel arms, comparing standard Vietnamese national guideline treatment for tuberculous meningitis with standard treatment plus an increased dose of rifampicin (to 15 mg/kg/day total) and additional levofloxacin. The study will include 750 patients (375 per treatment group), including a minimum of 350 HIV-positive patients. The calculation assumes an overall mortality of 40% vs. 30% in the two arms, respectively (corresponding to a target hazard ratio of 0.7), a power of 80%, and a two-sided significance level of 5%. The randomization ratio is 1:1. The primary endpoint is overall survival, i.e. time from randomization to death during a follow-up period of 9 months. Secondary endpoints are: neurological disability at 9 months, time to new neurological event or death, time to new or recurrent AIDS-defining illness or death (in HIV-positive patients only), severe adverse events, and rate of treatment interruption for adverse events.

    Discussion: Currently very few options are available for the treatment of TBM, and the mortality rate remains unacceptably high, with severe disabilities seen in many of the survivors. This trial is based on the hypothesis that current anti-mycobacterial treatment schedules for TBM are not potent enough and that outcomes will be improved by increasing the CSF-penetrating power of the regimen by optimising dosage and using additional drugs with better CSF penetration.

    Trial registration: International Standard Randomised Controlled Trial Number ISRCTN61649292 (http://www.controlled-trials.com/ISRCTN61649292)
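    The stated sample size can be roughly reconstructed from the design parameters above (target hazard ratio 0.7, 80% power, two-sided 5% significance, 1:1 randomization). A Python sketch using Schoenfeld's events formula for the log-rank test; the protocol's own calculation may differ in details such as accrual, follow-up, and drop-out assumptions:

        import math
        from scipy.stats import norm

        hr, power, alpha = 0.7, 0.80, 0.05  # design parameters from the protocol
        z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)

        # Schoenfeld's formula: deaths needed for a 1:1 log-rank comparison
        events = 4 * (z_a + z_b) ** 2 / math.log(hr) ** 2

        # Overall death rate assumed here as the midpoint of the 40% and 30% arms
        n_total = events / 0.35
        print(f"deaths needed ~{events:.0f}, patients ~{n_total:.0f}")
        # ~247 deaths, ~705 patients; the planned 750 leaves a margin for drop-out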

    A common variant near TGFBR3 is associated with primary open angle glaucoma

    Primary open angle glaucoma (POAG), a major cause of blindness worldwide, is a complex disease with a significant genetic contribution. We performed Exome Array (Illumina) analysis on 3504 POAG cases and 9746 controls, with replication of the most significant findings in 9173 POAG cases and 26 780 controls across 18 collections of Asian, African and European descent. Apart from confirming strong evidence of association at CDKN2B-AS1 (rs2157719 [G], odds ratio [OR] = 0.71, P = 2.81 × 10⁻³³), we observed one SNP showing significant association to POAG (CDC7–TGFBR3 rs1192415, OR for the G allele = 1.13, P_meta = 1.60 × 10⁻⁸). This particular SNP has previously been shown to be strongly associated with optic disc area and vertical cup-to-disc ratio, which are regarded as glaucoma-related quantitative traits. Our study now extends this by directly implicating it in POAG disease pathogenesis.
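    For context, P_meta = 1.60 × 10⁻⁸ clears the conventional genome-wide significance threshold of 5 × 10⁻⁸ (roughly a Bonferroni correction for one million independent common variants). Below is a sketch of the per-variant allelic test underlying such scans, using hypothetical allele counts chosen only to illustrate an OR near 1.13 at a sample size large enough to reach that threshold; these are not the study's data:

        from scipy.stats import chi2_contingency

        # Hypothetical G vs other allele counts (illustration only, not study data)
        table = [[6300, 14724],    # cases:    G alleles, other alleles
                 [15600, 41196]]   # controls: G alleles, other alleles

        chi2, p, dof, _ = chi2_contingency(table, correction=False)
        odds_ratio = (table[0][0] * table[1][1]) / (table[0][1] * table[1][0])
        print(f"OR = {odds_ratio:.2f}, p = {p:.1e}, genome-wide: {p < 5e-8}")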

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000–25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.

    Socializing One Health: an innovative strategy to investigate social and behavioral risks of emerging viral threats

    In an effort to strengthen global capacity to prevent, detect, and control infectious diseases in animals and people, the United States Agency for International Development's (USAID) Emerging Pandemic Threats (EPT) PREDICT project funded development of regional, national, and local One Health capacities for early disease detection, rapid response, disease control, and risk reduction. From the outset, the EPT approach was inclusive of social science research methods designed to understand the contexts and behaviors of communities living and working at human-animal-environment interfaces considered high-risk for virus emergence. Using qualitative and quantitative approaches, PREDICT behavioral research aimed to identify and assess a range of socio-cultural behaviors that could be influential in zoonotic disease emergence, amplification, and transmission. This broad approach to behavioral risk characterization enabled us to identify and characterize human activities that could be linked to the transmission dynamics of new and emerging viruses. This paper discusses the implementation of a social science approach within a zoonotic surveillance framework. We conducted in-depth ethnographic interviews and focus groups to better understand the individual- and community-level knowledge, attitudes, and practices that potentially put participants at risk for zoonotic disease transmission from the animals they live and work with, across six interface domains. When we asked highly exposed individuals (i.e., bushmeat hunters and wildlife or guano farmers) about the risk they perceived in their occupational activities, most did not perceive them to be risky, whether because the activity had been normalized by years (or generations) of practice or because of a lack of information about potential risks. Integrating the social sciences allows investigation of the specific human activities that are hypothesized to drive disease emergence, amplification, and transmission, in order to better substantiate behavioral disease drivers, along with the social dimensions of infection and transmission dynamics. Understanding these dynamics is critical to achieving health security (the protection from threats to health), which requires investments in both collective and individual health security. Incorporating the behavioral sciences into zoonotic disease surveillance allowed us to push toward fuller community integration and engagement and toward dialogue and implementation of recommendations for disease prevention and improved health security.

    Improvement of precision of numerical calculations using “multiple precision computation” package

    In this work, the “Multiple Precision Computation” (MPC) package proposed by Smith in 2003 is introduced and embedded into conventional codes to considerably improve the precision of numerical calculations. The procedure is validated by comparing calculations that incorporate MPC against both calculations using regular double-precision declarations and results obtained with the well-known software Mathematica. Several representative fundamental problems are used for illustration.
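    The underlying idea, replacing fixed double-precision arithmetic with a software multiple-precision library where round-off error dominates, can be illustrated in a few lines. The sketch below uses Python's mpmath rather than Smith's Fortran MPC package, so it demonstrates the general technique, not the specific package evaluated in the paper:

        import math
        import mpmath

        # Cancellation-prone expression: (1 - cos x) / x**2; the exact limit
        # as x -> 0 is 0.5, but subtraction destroys the leading digits.
        x = 1e-8

        # Double precision: cos(1e-8) rounds to exactly 1.0, so we get 0.0
        print("double precision :", (1 - math.cos(x)) / x ** 2)

        # Multiple precision: carry 50 significant digits through the formula
        mpmath.mp.dps = 50
        xm = mpmath.mpf("1e-8")
        print("50-digit precision:", (1 - mpmath.cos(xm)) / xm ** 2)  # ~0.5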

    Doppler-Free Spectroscopy Measurements of Isotope Shifts and Hyperfine Components of Near-IR Xenon Lines

    Xenon is currently used as propellant gas in electric thrusters, in which ejection of the corresponding ions produces the desired thrust. As such a gas contains nine stable isotopes, a non-intrusive determination of the velocity distribution function of atoms and ions in the thruster plasma plume, by means of absorption or fluorescence techniques, requires a precise knowledge of the line structure. We used Doppler-free Lamb-dip spectroscopy to determine isotopic shifts and hyperfine components of odd isotopes of several spectral lines of the Xe atom and Xe+ ion in the 825–835 nm range.

    XENON AS A PROPELLANT GAS FOR ELECTRIC THRUSTERS

    Xenon is currently used as propellant gas in electric thrusters such as gridded ion engines and Hall effect thrusters. A Hall Effect Thruster (HET) can be seen as a hollow annular ceramic channel confining a magnetized low-pressure DC discharge generated between an external hollow cathode and an anode. Electrons emitted from the cathode are trapped on Larmor orbits and experience an azimuthal drift. The resulting low electron mobility induces a strong electric field that is concentrated in the vicinity of the thruster exhaust. The propellant gas, which is fed through the anode, is efficiently ionized by the trapped electrons in the downstream region. The ions created are then instantly accelerated along the thruster axis by the local electric field. A profound understanding of the physical phenomena governing the plasma dynamics inside a HET requires accurate measurement of physical parameters such as the gas temperature, the magnetic field magnitude during thrusters' operation, and especially the ion velocity distribution function. In order not to modify thruster plasma properties, these three quantities must be determined by means of laser-aided diagnostics providing information about the Doppler broadening, Doppler shift, and Zeeman splitting of xenon atomic and ionic lines. However, due to the existence of numerous Xe isotopes, some with non-zero nuclear spin, such measurements require a precise knowledge of the isotopic and hyperfine structure (HFS) of the optical transitions considered.

    DOPPLER-FREE SPECTROSCOPY

    The hyperfine structure of a spectral line can be resolved only by overcoming the limitation set by Doppler broadening. In this contribution, a non-linear absorption technique based on selective saturation of individual atomic transitions, the so-called Doppler-free, or saturation, spectroscopy method, was applied with the use of a tunable single-mode diode laser.
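    The need for a Doppler-free technique can be made quantitative: at room temperature the Doppler width of a xenon line near 830 nm is a few hundred MHz, comparable to or larger than the isotope shifts and hyperfine intervals being measured, so these features are buried inside the Doppler profile of a linear absorption spectrum. A quick estimate in Python under stated assumptions (Xe-132, 300 K, λ = 830 nm):

        import math

        # Doppler FWHM: dnu = (nu0 / c) * sqrt(8 * k * T * ln 2 / m)
        k = 1.380649e-23      # Boltzmann constant, J/K
        c = 2.99792458e8      # speed of light, m/s
        amu = 1.66053907e-27  # atomic mass unit, kg

        wavelength = 830e-9   # m, middle of the 825-835 nm range studied
        T = 300.0             # K, assumed room-temperature gas
        m = 132 * amu         # Xe-132, the most abundant isotope

        nu0 = c / wavelength
        fwhm = (nu0 / c) * math.sqrt(8 * k * T * math.log(2) / m)
        print(f"Doppler FWHM ~ {fwhm / 1e6:.0f} MHz")  # ~390 MHz at 300 K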

    Genetic Profiling and Individualized Prognosis of Fracture

    Fragility fracture is a serious public health problem worldwide. The risk of fracture is determined by genetic and nongenetic clinical risk factors. This study sought to quantify the contribution of genetic profiling to fracture prognosis. The study was built on the ongoing Dubbo Osteoporosis Epidemiology Study, in which fractures and risk factors of 858 men and 1358 women had been monitored continuously from 1989 to 2008. Fragility fracture was ascertained from radiologic reports. Bone mineral density at the femoral neck was measured by dual-energy X-ray absorptiometry (DXA). Fifty independent genes with allele frequencies ranging from 0.01 to 0.60 and relative risks (RRs) ranging from 1.01 to 3.0 were simulated. Three predictive models were fitted to the data, in which fracture was a function of (1) clinical risk factors only, (2) genes only, and (3) clinical risk factors and 50 genes. The area under the curve (AUC) for model 1 was 0.77, which was lower than that of model 2 (AUC = 0.82). Adding genes to the clinical risk factors model (model 3) increased the AUC to 0.88 and improved the accuracy of fracture classification by 45%, with most (41%) of the improvement in specificity. In the presence of clinical risk factors, the number of genes required to achieve an AUC of 0.85 was around 25. These results suggest that genetic profiling could enhance the predictive accuracy of fracture prognosis and help to identify high-risk individuals for appropriate management of osteoporosis or intervention.
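    The simulation design described above (50 independent variants spanning the stated allele-frequency and relative-risk ranges, compared with and without a clinical score via AUC) can be sketched as follows. This is a schematic re-implementation under simplifying assumptions (a single standardized clinical score, a logistic outcome model, in-sample AUC), so the numbers it prints will differ from the reported 0.77/0.82/0.88:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n, n_snps = 5000, 50

        # 50 independent variants spanning the abstract's frequency and RR ranges
        freqs = rng.uniform(0.01, 0.60, n_snps)
        log_rr = np.log(rng.uniform(1.01, 3.0, n_snps))
        genotypes = rng.binomial(2, freqs, size=(n, n_snps))  # 0/1/2 allele counts

        # One standardized score standing in for the clinical risk factors
        clinical = rng.normal(size=n)

        # Fracture outcome: logistic in the clinical score plus a centred
        # genetic burden (risk-allele counts weighted by log relative risk)
        burden = (genotypes - 2 * freqs) @ log_rr
        lin = -2.0 + 0.8 * clinical + 0.3 * burden
        y = rng.binomial(1, 1 / (1 + np.exp(-lin)))

        for label, X in [("clinical only", clinical[:, None]),
                         ("genes only", genotypes),
                         ("clinical + genes", np.column_stack([clinical, genotypes]))]:
            model = LogisticRegression(max_iter=1000).fit(X, y)
            auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
            print(f"{label:>16}: AUC = {auc:.2f}")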