
    Accelerated in vivo proliferation of memory phenotype CD4+ T-cells in human HIV-1 infection irrespective of viral chemokine co-receptor tropism.

    CD4(+) T-cell loss is the hallmark of HIV-1 infection. CD4 counts fall more rapidly in advanced disease, when CCR5-tropic viral strains tend to be replaced by X4-tropic viruses. We hypothesized: (i) that the early dominance of CCR5-tropic viruses results from faster turnover rates of CCR5(+) cells, and (ii) that X4-tropic strains exert greater pathogenicity by preferentially increasing turnover rates within the CXCR4(+) compartment. To test these hypotheses we measured in vivo turnover rates of CD4(+) T-cell subpopulations sorted by chemokine receptor expression, using in vivo deuterium-glucose labeling. Deuterium enrichment was modeled to derive in vivo proliferation (p) and disappearance (d*) rates, which were related to viral tropism data. Thirteen healthy controls and 13 treatment-naive HIV-1-infected subjects (CD4 count 143-569 cells/µl) participated. CCR5 expression defined a CD4(+) subpopulation of predominantly CD45R0(+) memory cells with accelerated in vivo proliferation (p = 2.50 vs 1.60%/d, CCR5(+) vs CCR5(-); healthy controls; P<0.01). Conversely, CXCR4 expression defined CD4(+) T-cells (predominantly CD45RA(+) naive cells) with low turnover rates. The dominant effect of HIV infection was accelerated turnover of CCR5(+)CD45R0(+)CD4(+) memory T-cells (p = 5.16 vs 2.50%/d, HIV vs controls; P<0.05), naive cells being relatively unaffected. Similar patterns were observed whether the dominant circulating HIV-1 strain was R5-tropic (n = 9) or X4-tropic (n = 4). Although numbers were small, X4-tropic viruses did not appear to specifically drive turnover of CXCR4-expressing cells (p = 0.54 vs 0.72 vs 0.44%/d in the control, R5-tropic and X4-tropic groups, respectively). Our data are most consistent with models in which CD4(+) T-cell loss is primarily driven by non-specific immune activation.
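
    For illustration, a minimal Python sketch of the kind of single-compartment labeling model commonly used to derive p and d* from deuterium-enrichment curves. The functional form, labeling duration and data values here are assumptions for illustration, not the study's exact model or measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

TAU = 1.0  # labeling duration in days (an assumed value for this sketch)

def enrichment(t, p, d_star, tau=TAU):
    """Simple one-compartment model: label accrues at proliferation rate p
    while label is given (t <= tau); labeled cells are then lost at rate d*."""
    t = np.asarray(t, dtype=float)
    up = (p / d_star) * (1.0 - np.exp(-d_star * t))       # accrual phase
    peak = (p / d_star) * (1.0 - np.exp(-d_star * tau))   # enrichment at end of labeling
    down = peak * np.exp(-d_star * (t - tau))             # decay phase
    return np.where(t <= tau, up, down)

# Illustrative sampling times (days) and measured enrichment fractions.
t_obs = np.array([0.5, 1.0, 2.0, 4.0, 7.0, 10.0])
f_obs = np.array([0.010, 0.018, 0.014, 0.009, 0.005, 0.003])

(p_hat, d_hat), _ = curve_fit(enrichment, t_obs, f_obs, p0=[0.02, 0.3])
print(f"p = {100 * p_hat:.2f} %/day, d* = {100 * d_hat:.2f} %/day")
```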

    IL-4-secreting CD4+ T cells are crucial to the development of CD8+ T-cell responses against malaria liver stages.

    CD4+ T cells are crucial to the development of CD8+ T-cell responses against hepatocytes infected with malaria parasites. In the absence of CD4+ T cells, CD8+ T cells initiate seemingly normal differentiation and proliferation during the first few days after immunization. However, this response fails to develop further and is reduced by more than 90% compared with that observed in the presence of CD4+ T cells. We report here that interleukin-4 (IL-4) secreted by CD4+ T cells is essential to the full development of this CD8+ T-cell response. This is the first demonstration that IL-4 is a mediator of CD4/CD8 cross-talk leading to the development of immunity against an infectious pathogen.

    Serial counts of Mycobacterium tuberculosis in sputum as surrogate markers of the sterilising activity of rifampicin and pyrazinamide in treating pulmonary tuberculosis

    BACKGROUND: Since the sterilising activity of new antituberculosis drugs is difficult to assess in conventional phase III studies, surrogate methods related to eventual relapse rates are required. METHODS: A suitable method is suggested by a retrospective analysis of viable counts of Mycobacterium tuberculosis in 12-hour sputum collections from 122 newly diagnosed patients with pulmonary tuberculosis in Nairobi, performed pretreatment and at 2, 7, 14 and 28 days. Treatment was with isoniazid and streptomycin, supplemented with either thiacetazone (SHT) or rifampicin + pyrazinamide (SHRZ). RESULTS: During days 0–2 a large kill due to isoniazid occurred, unrelated to regimen or HIV status; thereafter the kill rate decreased exponentially. SHRZ appeared to have greater sterilising activity than SHT during days 2–7 (p = 0.044), attributable to rifampicin, and during days 14–28, probably due mainly to pyrazinamide. The greatest discrimination between the SHRZ and SHT treatments was found between regression estimates of kill over days 2–28 (p = 0.0005) in patients who remained positive up to 28 days with homogeneous kill rates. No associations were found between the regression estimates and age, sex, extent of disease or cavitation. An increased kill in HIV-seropositive patients, unrelated to the treatment effect, was evident during days 2–28 (p = 0.007), mainly during days 2–7. CONCLUSIONS: Surrogate marker studies should either use small groups treated with monotherapy from day 2 to about day 7, or use add-ons or replacements in isoniazid-containing standard regimens from day 2 to day 28 in large groups.
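
    As a sketch of the kind of regression used here, the exponential kill rate can be estimated as the slope of log10 viable counts against time over days 2–28. The data values below are illustrative, not the study's.

```python
import numpy as np
from scipy.stats import linregress

# Illustrative serial viable counts (CFU/ml sputum) for one patient;
# days 2-28 capture sterilising activity after the early isoniazid kill.
days = np.array([2, 7, 14, 28])
cfu = np.array([3.2e6, 8.1e5, 1.5e5, 9.0e3])

# Fit log10 counts against time: the negative slope is the kill rate.
fit = linregress(days, np.log10(cfu))
print(f"kill rate = {-fit.slope:.3f} log10 CFU/day (r^2 = {fit.rvalue**2:.2f})")
```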

    Rectal Transmission of Transmitted/Founder HIV-1 Is Efficiently Prevented by Topical 1% Tenofovir in BLT Humanized Mice

    Rectal microbicides are being developed to prevent new HIV infections in both men and women. We focused our in vivo preclinical efficacy study on rectally applied tenofovir. BLT humanized mice (n = 43) were rectally inoculated with either the primary isolate HIV-1(JRCSF) or the MSM-derived transmitted/founder (T/F) virus HIV-1(THRO) within 30 minutes of treatment with topical 1% tenofovir or vehicle. Under our experimental conditions, in the absence of drug treatment we observed 50% and 60% rectal transmission for HIV-1(JRCSF) and HIV-1(THRO), respectively. Topical tenofovir reduced rectal transmission to 8% (1/12; log-rank p = 0.03) for HIV-1(JRCSF) and 0% (0/6; log-rank p = 0.02) for HIV-1(THRO). This is the first demonstration that a human T/F HIV-1 rectally infects humanized mice and that transmission of the T/F virus can be efficiently blocked by rectally applied 1% tenofovir. These results in BLT mice, together with recent ex vivo, phase 1 trial and non-human primate reports, provide a critically important step forward in the development of tenofovir-based rectal microbicides.
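
    The study reports log-rank comparisons; as a simpler illustration of the contrast in transmission proportions, a Fisher's exact test (a different test, used here only for illustration) on the HIV-1(THRO) arms. The abstract gives the treated-arm counts (0/6) but only a percentage (60%) for vehicle, so the 6/10 vehicle counts below are an assumption.

```python
from scipy.stats import fisher_exact

# Reported: 0/6 transmissions with topical tenofovir for HIV-1(THRO);
# the 60% vehicle rate is represented as 6/10 (an assumption -- the
# abstract gives a percentage, not the vehicle-arm denominator).
table = [[6, 4],   # vehicle:   infected, uninfected
         [0, 6]]   # tenofovir: infected, uninfected
odds_ratio, p = fisher_exact(table)
print(f"Fisher's exact p = {p:.3f}")
```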

    Modelling imperfect adherence to HIV induction therapy

    Background Induction-maintenance therapy is a treatment regimen in which patients are prescribed an intense course of treatment for a short period (the induction phase), followed by a simplified long-term regimen (maintenance). Since induction therapy carries a significantly higher risk of pill fatigue than maintenance therapy, patients may take drug holidays during this period. Without guidance, patients who choose to stop therapy will each be making individual decisions with no scientific basis. Methods We use mathematical modelling to investigate the effect of imperfect adherence during the induction phase. We address the following research questions: 1. Can we theoretically determine the maximal length of a possible drug holiday and the minimal number of doses that must subsequently be taken while still avoiding resistance? 2. How many drug holidays can be taken during the induction phase? Results For a 180-day therapeutic program, a patient can take several drug holidays but must then follow each drug holiday with a strict, though fairly straightforward, drug-taking regimen. Since the results depend on the drug regimen, we calculated the length and number of drug holidays for all fifteen protease-sparing triple-drug cocktails that have been approved by the US Food and Drug Administration. Conclusions Induction therapy with partial adherence is tolerable, but the outcome depends on the drug cocktail. Our theoretical predictions are in line with recent results from pilot studies of short-cycle treatment interruption strategies and may be useful in guiding the design of future clinical trials.
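
    A minimal sketch of the standard target-cell-limited HIV dynamics model with time-varying drug efficacy, the kind of model used to explore drug holidays. The parameter values and the holiday window are textbook-style illustrations, not the paper's calibrated regimen-specific values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Standard target-cell-limited model; drug efficacy eps(t) drops to
# zero during a holiday (all parameter values are illustrative).
lam, d, k, delta, prod, c = 1e4, 0.01, 2.4e-8, 1.0, 3000.0, 23.0

def eps(t):
    # Full efficacy except during an assumed holiday on days 30-37.
    return 0.0 if 30.0 <= t <= 37.0 else 0.9

def rhs(t, y):
    T, I, V = y  # target cells, infected cells, free virus
    infection = (1.0 - eps(t)) * k * V * T  # new infections, reduced by drug
    return [lam - d * T - infection,
            infection - delta * I,
            prod * I - c * V]

sol = solve_ivp(rhs, (0, 90), [1e6, 1e4, 1e5], max_step=0.1)
print(f"viral load at day 90: {sol.y[2, -1]:.2e} copies/ml")
```

    Under these illustrative parameters viral load decays on therapy, rebounds during the holiday, and is suppressed again once dosing resumes, which is the qualitative behaviour the adherence analysis exploits.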

    Burning in Banksia Woodlands: How Does the Fire-Free Period Influence Reptile Communities?

    Fire is an important management tool for both hazard-reduction burning and the maintenance of biodiversity. The impact of time since last fire on fauna is an important factor to understand, as land managers often aim for prescribed burning regimes with specific fire-free intervals. However, our current understanding of this impact is limited and likely dependent on vegetation type. We examined the responses of reptiles to fire age in banksia woodlands, and the interspersed melaleuca damplands among them, north of Perth, Western Australia, where the current prescribed burning regime targets a fire-free period of 8–12 years. The response of reptiles to fire was dependent on vegetation type. Reptiles were generally more abundant (e.g. Lerista elegans and Ctenophorus adelaidensis) and more speciose in banksia sites. Several species (e.g. Menetia greyii, Cryptoblepharus buchananii) preferred long-unburnt melaleuca sites (>16 years since last fire, YSLF) over recently burnt sites (<12 YSLF). Several of the small elapids (e.g. the WA priority-listed species Neelaps calonotus) were detected only in older-aged banksia sites (>16 YSLF). The terrestrial dragon C. adelaidensis and the skink Morethia obscura displayed a strong response to fire in banksia woodlands only: the highest abundances of the dragon were detected in recently burnt (<7 YSLF) and long-unburnt (>35 YSLF) banksia woodlands, while the skink was more abundant in older sites. Habitats spanning a range of fire ages are required to support the reptiles we detected, especially the longer-unburnt (>16 YSLF) melaleuca habitat. Current burning prescriptions are reducing the availability of these older habitats.
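
    For illustration, a sketch of the kind of analysis abundance-by-fire-age comparisons typically use: a Poisson GLM of site-level counts against fire-age class. The column names, class labels and data values are assumptions, not the study's.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative site-level counts of one species by fire-age class
# (YSLF bins and values are assumptions, not the study's data).
df = pd.DataFrame({
    "count":    [12, 9, 3, 2, 4, 11, 14, 1, 2, 10],
    "fire_age": ["<7", "<7", "8-12", "8-12", "8-12",
                 ">16", ">16", "8-12", "8-12", ">16"],
})

# Poisson GLM: does abundance differ among fire-age classes?
model = smf.glm("count ~ C(fire_age)", data=df,
                family=sm.families.Poisson()).fit()
print(model.summary())
```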

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (the Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series of 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades, facilitating audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be used in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
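
    A minimal sketch of how an AUROC for an ordinal difficulty grade predicting a dichotomous outcome (here, conversion to open surgery) can be computed; the grades and outcomes below are illustrative, not the CholeS data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Illustrative data: Nassar difficulty grade (1-5) per operation and
# whether each case converted to open surgery (values are assumptions).
grade = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 2, 3])
converted = np.array([0, 0, 0, 0, 0, 1, 0, 1, 1, 1, 0, 0])

# AUROC quantifies how well the ordinal grade discriminates converters.
print(f"AUROC = {roc_auc_score(converted, grade):.3f}")
```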

    Reliability and validity of ultrasound imaging of features of knee osteoarthritis in the community

    Background Radiographs are the main outcome measure in epidemiological studies of osteoarthritis (OA). Ultrasound imaging has unique advantages in that it involves no ionising radiation, is easy to use and visualises soft tissue structures. Our objective was to measure the inter-rater reliability and validity of ultrasound imaging in the detection of features of knee OA. Methods Eighteen participants from a community cohort had both knees scanned by two trained musculoskeletal sonographers, up to six weeks apart. Inter-rater reliability for osteophytes, effusion size and cartilage thickness was calculated by estimating kappa (κ) and intraclass correlation coefficients (ICC), as appropriate. A measure of construct validity was determined by estimating κ between the two imaging modalities in the detection of osteophytes. Results Reliability: κ for osteophyte presence was 0.77 (right femur), 0.65 (left femur) and 0.88 for both tibiae. ICCs for effusion size were 0.70 (right) and 0.85 (left). Moderate to substantial agreement was found in cartilage thickness measurements. Validity: for osteophytes, κ was moderate to excellent at 0.52 (right) and 0.75 (left). Conclusion Substantial to excellent agreement was found between ultrasound observers for the presence of osteophytes and measurement of effusion size; agreement was moderate to substantial for femoral cartilage thickness. Moderate to substantial agreement was observed between ultrasound and radiographs for osteophyte presence.
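
    For illustration, a sketch of the two agreement statistics used here: Cohen's kappa for a binary feature (osteophyte present/absent) and an ICC for a continuous one (effusion size). Ratings, knee IDs and values are assumptions, not the study's data.

```python
import pandas as pd
from sklearn.metrics import cohen_kappa_score
import pingouin as pg

# Illustrative binary ratings: osteophyte present (1) / absent (0) for
# the same knees scored by two sonographers (values are assumptions).
rater_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
rater_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(f"Cohen's kappa = {cohen_kappa_score(rater_a, rater_b):.2f}")

# ICC for a continuous measure (effusion size, mm) across the two raters.
long = pd.DataFrame({
    "knee":  list(range(5)) * 2,
    "rater": ["A"] * 5 + ["B"] * 5,
    "size":  [4.1, 2.0, 5.3, 3.2, 1.8, 4.4, 2.2, 5.0, 3.5, 2.0],
})
icc = pg.intraclass_corr(data=long, targets="knee",
                         raters="rater", ratings="size")
print(icc[["Type", "ICC"]])
```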

    Emerging New Crop Pests: Ecological Modelling and Analysis of the South American Potato Psyllid Russelliana solanicola (Hemiptera: Psylloidea) and Its Wild Relatives


    Interspecies interactions and potential Influenza A virus risk in small swine farms in Peru

    Background The recent avian influenza epidemic in Asia and the H1N1 pandemic demonstrated that influenza A viruses pose a threat to global public health. The animal origins of the viruses confirmed the potential for interspecies transmission. Swine are hypothesized to be prime "mixing vessels" due to the dual receptivity of their trachea to human and avian strains. Additionally, avian and human influenza viruses have previously been isolated in swine. Therefore, understanding interspecies contact on smallholder swine farms and its potential role in the transmission of pathogens such as influenza virus is very important. Methods This qualitative study aimed to determine swine-associated interspecies contacts in two coastal areas of Peru. Direct observations were conducted at both small-scale confined and low-investment swine farms (n = 36) and in open areas where swine range freely during the day (n = 4). Interviews were also conducted with key stakeholders in swine farming. Results In both locations, the intermingling of swine and domestic birds was common. An unexpected contact with avian species was that swine were fed poultry mortality on 6/20 of the farms in Chancay. Human-swine contacts were common, with a higher frequency on the confined farms. Mixed farming of swine with chickens or ducks was observed on 36% of all farms. Human-avian interactions were less frequent overall. Use of adequate biosecurity and hygiene practices by farmers was suboptimal at both locations. Conclusions Close human-animal interaction, frequent interspecies contacts and suboptimal biosecurity and hygiene practices pose significant risks of interspecies influenza virus transmission. Farmers in small-scale swine production systems constitute a high-risk population and need to be recognized as key to preventing interspecies pathogen transfer. A two-pronged prevention approach is recommended: educational activities for swine farmers about sound hygiene and biosecurity practices, and guidelines and education for poultry farmers about alternative approaches for processing poultry mortality. Virological and serological surveillance for influenza viruses will also be critical for these human and animal populations.