
    Xenograft models of head and neck cancers

    Head and neck cancers are among the most prevalent tumors in the world. Despite advances in treatment, the survival of patients with these cancers has not markedly improved over the past several decades, largely because of our poor understanding of, and inability to control, the regional and distant spread of this disease. One factor contributing to this poor understanding may be the lack of reliable animal models of head and neck cancer metastasis. The earliest xenograft models, in which human tumor cells were grown in immunosuppressed mice, involved subcutaneous implantation of human head and neck cancer cell lines. Subcutaneous xenograft models have been popular because they are easy to establish, easy to manage, and lend themselves to ready quantitation of the tumor burden. More recently, orthotopic xenograft models, in which the tumor cells are implanted in the tumor site of origin, have been used with greater frequency in animal studies of head and neck cancers. Orthotopic xenograft models are advantageous for their ability to mimic local tumor growth and recapitulate the pathways of metastasis seen in human head and neck cancers. In addition, recent innovations in cell labeling techniques and small-animal imaging have enabled investigators to monitor the metastatic process and quantitate the growth and spread of orthotopically implanted tumors. This review summarizes progress in the development of murine xenograft models of head and neck cancers. We then discuss the advantages and disadvantages of each type of xenograft model, as well as the potential for these models to help elucidate the mechanisms of regional and distant metastasis, which could improve our ability to treat head and neck cancers.
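    The ready quantitation of tumor burden in subcutaneous models typically comes from simple caliper measurements. As an illustration only (the review does not prescribe a specific formula), the sketch below uses the widely cited modified-ellipsoid estimate, V = (length x width^2) / 2; the measurement values are invented.

    ```python
    def tumor_volume_mm3(length_mm: float, width_mm: float) -> float:
        """Approximate subcutaneous tumor volume via the modified ellipsoid
        formula V = (L * W^2) / 2, with W taken as the shorter dimension."""
        length_mm, width_mm = max(length_mm, width_mm), min(length_mm, width_mm)
        return 0.5 * length_mm * width_mm ** 2

    # Hypothetical caliper reading: 10 mm x 7 mm
    print(f"{tumor_volume_mm3(10.0, 7.0):.0f} mm^3")  # ~245 mm^3
    ```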

    Analysis of infectious virus clones from two HIV-1 superinfection cases suggests that the primary strains have lower fitness

    Background: Two HIV-1 positive patients, L and P, participating in the Amsterdam Cohort studies acquired an HIV-1 superinfection within half a year of their primary HIV-1 infection (Jurriaans et al., JAIDS 2008, 47:69-73). The aim of this study was to compare the replicative fitness of the primary and superinfecting HIV-1 strains of both patients. The use of isolate-specific primer sets indicated that the primary and secondary strains co-existed in plasma at all time points after the moment of superinfection. Results: Biological HIV-1 clones were derived from peripheral blood CD4+ T cells at different time points and identified as the primary or secondary virus through sequence analysis. Replication competition assays were performed with selected virus pairs in PHA/IL-2-activated peripheral blood mononuclear cells (PBMCs) and analyzed with the heteroduplex tracking assay (HTA) and isolate-specific PCR amplification. In both cases, we found a replicative advantage of the secondary HIV-1 strain over the primary virus. Full-length HIV-1 genomes were sequenced to find possible explanations for the difference in replication capacity. Mutations that could negatively affect viral replication were identified in the primary infecting strains. In patient L, the primary strain has two insertions in the LTR promoter, combined with a mutation in the tat gene that has been associated with decreased replication capacity. The primary HIV-1 strain isolated from patient P has two mutations in the LTR that have been associated with a reduced replication rate. In a luciferase assay, only the LTR from the primary virus of patient P had lower transcriptional activity compared with the superinfecting virus. Conclusions: These preliminary findings suggest the interesting scenario that superinfection occurs preferentially in patients infected with a relatively attenuated HIV-1 isolate.
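    The abstract does not detail how LTR transcriptional activity was compared in the luciferase assay. The sketch below shows one conventional approach (reporter signal normalised to a co-transfected control construct), with hypothetical replicate readings, purely for illustration.

    ```python
    from statistics import mean

    def relative_ltr_activity(reporter: list[float], control: list[float]) -> float:
        """Mean reporter/control luminescence ratio across replicate wells."""
        return mean(rep / ctl for rep, ctl in zip(reporter, control))

    # Hypothetical replicate readings (arbitrary luminescence units)
    primary_ltr = relative_ltr_activity([1200.0, 1100.0, 1250.0], [980.0, 1010.0, 995.0])
    superinfecting_ltr = relative_ltr_activity([2400.0, 2550.0, 2300.0], [1005.0, 990.0, 1020.0])
    print(f"primary / superinfecting LTR activity: {primary_ltr / superinfecting_ltr:.2f}")
    ```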

    High Viral Fitness during Acute HIV-1 Infection

    Several clinical studies have shown that, with respect to disease progression, HIV-1 isolates that are less fit are also less pathogenic. The aim of the present study was to investigate the relationship between viral fitness and control of viral load (VL) in acute and early HIV-1 infection. Samples were obtained from subjects participating in two clinical studies. In the PULSE study, antiretroviral therapy (ART) was initiated before, or no later than six months following, seroconversion; subjects then underwent multiple structured treatment interruptions (STIs). The PHAEDRA study enrolled and monitored a cohort of individuals with documented evidence of primary infection; the subset chosen comprised individuals identified no later than 12 months following seroconversion to HIV-1 who were not receiving ART. The relative fitness of primary isolates obtained from study participants was investigated ex vivo, and viral DNA production was quantified using a novel real-time PCR assay. Following intermittent ART, the fitness of isolates obtained from 5 of 6 PULSE subjects decreased over time. In contrast, in the absence of ART, the fitness of paired isolates obtained from 7 of 9 PHAEDRA subjects increased over time. However, viral fitness did not correlate with plasma VL. Most unexpected was the high relative fitness of isolates obtained at baseline from PULSE subjects, before ART was initiated. It is widely thought that the fitness of strains present during the acute phase is low relative to strains present during chronic HIV-1 infection, owing to the bottleneck imposed upon transmission. The results of this study provide evidence that the relative fitness of strains present during acute HIV-1 infection may be higher than previously thought, and that viral fitness may represent an important clinical parameter to consider when deciding whether to initiate ART during early HIV-1 infection.
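    The abstract does not specify the fitness metric derived from the real-time PCR data. One common ex vivo approach is to estimate an exponential growth rate from the slope of ln(viral DNA copies) over time in culture and express it relative to a reference isolate grown in parallel; the sketch below illustrates that calculation with invented numbers.

    ```python
    import math

    def growth_rate(days: list[float], copies: list[float]) -> float:
        """Least-squares slope of ln(copies) vs. time (per day)."""
        logs = [math.log(c) for c in copies]
        n = len(days)
        mean_x, mean_y = sum(days) / n, sum(logs) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, logs))
        den = sum((x - mean_x) ** 2 for x in days)
        return num / den

    # Hypothetical culture time course (days) and viral DNA copy numbers
    days = [0.0, 2.0, 4.0, 7.0]
    isolate = growth_rate(days, [1e3, 8e3, 6e4, 2e6])
    reference = growth_rate(days, [1e3, 5e3, 2.5e4, 4e5])
    print(f"relative fitness vs. reference: {isolate / reference:.2f}")
    ```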

    Tuberculosis among people living with HIV/AIDS in the German ClinSurv HIV Cohort: long-term incidence and risk factors

    BACKGROUND: Tuberculosis (TB) remains a leading cause of morbidity and mortality among people living with HIV/AIDS (PLWHA), including those on antiretroviral therapy. In this study, we aimed to determine the long-term incidence density rate (IDR) of TB and risk factors among PLWHA in relation to combination antiretroviral therapy (cART) status. METHODS: Data on PLWHA enrolled from 2001 through 2011 in the German ClinSurv HIV Cohort were investigated using survival analysis and Cox regression. RESULTS: TB was diagnosed in 233 of 11,693 PLWHA, either at enrollment (N = 62) or during follow-up (N = 171). The TB IDR during follow-up was 0.37 cases per 100 person-years (PY) overall [95% CI, 0.32-0.43] and was higher among patients who never started cART and among patients originating from Sub-Saharan Africa (1.23 and 1.20 per 100 PY, respectively). In two multivariable analyses, both patients (I) who never started cART and (II) those on cART shared the same risk factors for TB: originating from Sub-Saharan Africa compared to Germany (I, hazard ratio (HR) 4.05 [95% CI, 1.87-8.78]; II, HR 5.15 [2.76-9.60]), CD4+ cell count <200 cells/μl (I, HR 8.22 [4.36-15.51]; II, HR 1.90 [1.14-3.15]), and viral load >5 log10 copies/ml (I, HR 2.51 [1.33-4.75]; II, HR 1.77 [1.11-2.82]). Gender, age, and HIV-transmission risk group were not independently associated with TB. CONCLUSION: In the German ClinSurv HIV Cohort, patients originating from Sub-Saharan Africa or with a low CD4+ cell count or high viral load at enrollment were at increased risk of TB even after cART initiation. As such patients might be latently infected with Mycobacterium tuberculosis complex, early screening for latent TB infection and implementation of isoniazid preventive therapy in line with available recommendations are crucial.
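    As a reading aid only: the incidence density rate reported above is simply the number of incident cases divided by the person-years at risk, scaled to 100 PY. The sketch below reproduces that arithmetic; the person-year denominator is back-calculated from the reported rate and is not a figure taken from the paper.

    ```python
    def idr_per_100py(cases: int, person_years: float) -> float:
        """Incidence density rate expressed per 100 person-years."""
        return 100.0 * cases / person_years

    # 171 incident TB cases during follow-up at ~0.37 per 100 PY implies
    # on the order of ~46,000 PY of observation (illustrative, not reported).
    print(f"{idr_per_100py(171, 46000):.2f} cases per 100 PY")  # ~0.37
    ```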

    Effects of storm events on mobilisation and in-stream processing of dissolved organic matter (DOM) in a Welsh peatland catchment

    Peatlands are important contributors of dissolved organic matter (DOM) to downstream aquatic systems. We investigated the effects of storm events on dissolved organic carbon (DOC) concentrations and DOM quality in a stream draining a Welsh peatland catchment. Intensive stream samples were collected and analysed for pH, DOC, dissolved organic nitrogen (DON), absorbance and fluorescence. Soil water and Sphagnum pore water samples were also collected, and a simple end-member mixing model was applied to account for changes occurring during the events. Fluorescence data were interpreted using parallel factor analysis (PARAFAC). DOC concentrations increased and pH decreased during the storm events. The soil water data and the mixing model indicated that this was due to a change of flow paths and drainage of the DOC-rich acrotelm. Absorbance indices and the DOC/DON ratio suggested that the DOM released during events was less degraded. There was a striking, inversely related diurnal pattern in absorbance and fluorescence after the discharge peak. This diurnal pattern, and a lack of fit with the mixing model, suggested that fluorescing DOM was mainly produced in-stream. Fluorescence has previously been found to peak in the morning and decline during the daytime owing to photo-bleaching. We hypothesise that the input of additional DOM during events changes the diurnal pattern, giving a peak at midday, when the processing of the additional DOM is highest.
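    The abstract names a "simple end-member mixing model" without giving its form. A common two-component version infers the fraction of streamflow derived from each source from concentrations measured in the stream and in the two end-members; the sketch below shows that calculation with invented concentrations.

    ```python
    def end_member_fraction(c_stream: float, c_source_a: float, c_source_b: float) -> float:
        """Fraction of streamflow attributable to source A (0-1),
        assuming simple two-component mixing of a measured tracer."""
        return (c_stream - c_source_b) / (c_source_a - c_source_b)

    # Hypothetical DOC concentrations (mg/L): acrotelm soil water = 40,
    # deeper baseflow water = 8, stream sample during the event = 25
    f_acrotelm = end_member_fraction(25.0, 40.0, 8.0)
    print(f"acrotelm contribution: {f_acrotelm:.0%}")  # ~53%
    ```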

    Imaging ultrafast electron dynamics
