Predicting the outcomes of treatment to eradicate the latent reservoir for HIV-1
Massive research efforts are now underway to develop a cure for HIV
infection, allowing patients to discontinue lifelong combination antiretroviral
therapy (ART). New latency-reversing agents (LRAs) may be able to purge the
persistent reservoir of latent virus in resting memory CD4+ T cells, but the
degree of reservoir reduction needed for cure remains unknown. Here we use a
stochastic model of infection dynamics to estimate the efficacy of LRA needed
to prevent viral rebound after ART interruption. We incorporate clinical data
to estimate population-level parameter distributions and outcomes. Our findings
suggest that approximately 2,000-fold reductions are required to permit a
majority of patients to interrupt ART for one year without rebound and that
rebound may occur suddenly after multiple years. Greater than 10,000-fold
reductions may be required to prevent rebound altogether. Our results predict
large variation in rebound times following LRA therapy, which will complicate
clinical management. This model provides benchmarks for moving LRAs from the
lab to the clinic and can aid in the design and interpretation of clinical
trials. These results also apply to other interventions to reduce the latent
reservoir and can explain the observed return of viremia after months of
apparent cure in recent bone marrow transplant recipients and an
immediately treated neonate.

Comment: 8 pages main text (4 figures). In PNAS Early Edition: http://www.pnas.org/content/early/2014/08/05/1406663111. Ancillary files: 24-page SI (7 figures). The .htm file opens a browser-based application to calculate rebound times (see SI); alternatively, the .cdf file can be run with Mathematica. The most up-to-date version of the code is available at http://www.danielrosenbloom.com/reboundtimes
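As a rough illustration of the kind of stochastic model the abstract describes, the sketch below treats successful reactivations from the latent reservoir as a Poisson process whose rate scales inversely with the fold-reduction achieved by LRA therapy. The baseline rate `LAM0` and all other numbers are invented for illustration, not the paper's fitted parameters.

```python
import random

# Assumed baseline: successful reactivations per day with an unreduced
# reservoir. This is an illustrative value, not a fitted one.
LAM0 = 4.0

def rebound_time(fold_reduction, rng):
    """Draw one patient's time to viral rebound (days) after ART interruption.

    Rebound is the first event of a Poisson process whose rate shrinks in
    proportion to the reservoir reduction, so the waiting time is exponential.
    """
    rate = LAM0 / fold_reduction  # successful reactivations per day
    return rng.expovariate(rate)

rng = random.Random(1)
for fold in (1, 100, 2000, 10000):
    times = [rebound_time(fold, rng) for _ in range(10000)]
    frac = sum(t > 365 for t in times) / len(times)
    print(f"{fold:>6}-fold reduction: {100 * frac:.0f}% rebound-free at 1 year")
```

With these assumed numbers, only reductions in the thousands-fold range leave a substantial fraction of simulated patients rebound-free at one year, and the exponential waiting time also reproduces the wide patient-to-patient spread in rebound times that the abstract highlights.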
Counselling in primary care: a systematic review of the evidence
Primary objective: To undertake a systematic review to locate, appraise, and synthesise evidence, providing a reliable overview of the clinical effectiveness, cost-effectiveness, and user perspectives regarding counselling in primary care.
Main results: Evidence from 26 studies was presented as a narrative synthesis and demonstrated that counselling is effective in the short term, is as effective as CBT in typical heterogeneous primary care populations, and is more effective than routine primary care for the treatment of non-specific generic psychological problems, anxiety, and depression. Counselling may reduce levels of referral to psychiatric services, but does not appear to reduce medication use, the number of GP consultations, or overall costs. Patients are highly satisfied with the counselling they have received in primary care and prefer counselling to medication for depression.
Conclusions and implications for future research: This review demonstrates the value of counselling as a valid choice for primary care patients and as a broadly effective therapeutic intervention for a wide range of generic psychological conditions presenting in the primary care setting. More rigorous clinical and cost-effectiveness trials are needed, together with surveys of more typical users of primary care services.
Feasibility of omega-3 fatty acid supplementation as an adjunct therapy for people with chronic obstructive pulmonary disease: study protocol for a randomized controlled trial
There is evidence to support the use of supplementation with long-chain omega-3 polyunsaturated fatty acids (LCn-3PUFA) from oily fish or fish oil for the treatment of various inflammatory diseases such as rheumatoid arthritis. Chronic obstructive pulmonary disease (COPD) is a progressive, terminal disease characterized by persistent airflow limitation and lung and systemic inflammation. To date, only one randomized controlled trial has been published that assessed the efficacy of LCn-3PUFA in people with this condition. The aim of this article is to discuss the feasibility of conducting a trial to evaluate fish oil supplementation as adjunct therapy in people with COPD. The study is supported by a University of South Australia, Division of Health Sciences grant (DRDG 2011, round 2).
Life cycle synchronization is a viral drug resistance mechanism
Viral infections are one of the major causes of death worldwide, with HIV infection alone resulting in over 1.2 million deaths per year. Antiviral drugs are now being administered for a variety of viral infections, including HIV, hepatitis B and C, and influenza. These therapies target a specific phase of the virus's life cycle, yet their ultimate success depends on a variety of factors, such as adherence to a prescribed regimen and the emergence of viral drug resistance. The epidemiology and evolution of drug resistance have been extensively characterized, and it is generally assumed that drug resistance arises from mutations that alter the virus's susceptibility to the direct action of the drug. In this paper, we consider the possibility that a virus population can evolve towards synchronizing its life cycle with the pattern of drug therapy. The periodicity of the drug treatment could then allow a virus strain whose life cycle length is a multiple of the dosing interval to replicate only when the concentration of the drug is lowest. This process, referred to as "drug tolerance by synchronization", could allow the virus population to maximize its overall fitness without having to alter drug binding or complete its life cycle in the drug's presence. We use mathematical models and stochastic simulations to show that life cycle synchronization can indeed be a mechanism of viral drug tolerance. We show that this effect is more likely to occur when the variability in both the viral life cycle and the drug dose timing is low. More generally, we find that in the presence of periodic drug levels, time-averaged calculations of viral fitness do not accurately predict the drug levels needed to eradicate infection, even if there is no synchronization. We derive an analytical expression for viral fitness that is sufficient to explain the drug-pattern-dependent survival of strains with any life cycle length.
We discuss the implications of these findings for clinically relevant antiviral strategies.
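The claim that time-averaged fitness calculations can mislead under periodic drug levels can be sketched with a toy pharmacokinetic model. The dosing interval, peak concentration, half-life, and Hill-type inhibition curve below are all assumptions made for illustration, not values from the paper.

```python
# Toy pharmacokinetics: all numbers are illustrative assumptions.
DOSE_INTERVAL = 24.0   # hours between doses
C_PEAK = 10.0          # concentration just after a dose, in IC50 units
HALF_LIFE = 6.0        # drug half-life, hours

def concentration(t):
    """Drug level at time t (hours): peaks at each dose, then decays exponentially."""
    return C_PEAK * 0.5 ** ((t % DOSE_INTERVAL) / HALF_LIFE)

def inhibition(c):
    """Fraction of replication blocked at concentration c (Hill curve, slope 1)."""
    return c / (1.0 + c)

# What a naive, time-averaged fitness calculation sees over one dosing cycle:
n = 10000
avg_inh = sum(inhibition(concentration(i * DOSE_INTERVAL / n)) for i in range(n)) / n

# What a strain synchronized to complete its life cycle at the trough,
# just before the next dose, actually experiences:
trough_inh = inhibition(concentration(DOSE_INTERVAL - 1e-9))

print(f"time-averaged inhibition: {avg_inh:.2f}")
print(f"inhibition at trough (synchronized strain): {trough_inh:.2f}")
```

With these numbers the average-based calculation predicts roughly 0.69 inhibition, while a trough-synchronized strain experiences only about 0.38, so averaging over the dosing cycle overstates suppression of a strain that times its replication to the drug trough.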
Real-Time Predictions of Reservoir Size and Rebound Time during Antiretroviral Therapy Interruption Trials for HIV
Monitoring the efficacy of novel reservoir-reducing treatments for HIV is challenging. The limited ability to sample and quantify latent infection means that supervised antiretroviral therapy (ART) interruption studies are generally required. Here we introduce a set of mathematical and statistical modeling tools to aid in the design and interpretation of ART-interruption trials. We show how the likely size of the remaining reservoir can be updated in real time as patients continue off treatment, by combining the output of laboratory assays with insights from models of reservoir dynamics and rebound. We design an optimal schedule for viral load sampling during interruption, whereby the frequency of follow-up can be decreased as patients continue off ART without rebound. While this scheme can minimize costs when the chance of rebound between visits is low, we find that the reservoir will be almost completely reseeded before rebound is detected unless sampling occurs at least every two weeks and the most sensitive viral load assays are used. We use simulated data to predict the clinical trial size needed to estimate treatment effects in the face of highly variable patient outcomes and imperfect reservoir assays. Our findings suggest that large numbers of patients, between 40 and 150, will be necessary to reliably estimate the reservoir-reducing potential of a new therapy and to compare this across interventions. As an example, we apply these methods to the two "Boston patients", recipients of allogeneic hematopoietic stem cell transplants who experienced large reductions in latent infection and underwent ART interruption. We argue that the timing of viral rebound was not particularly surprising given the information available before treatment cessation.
Additionally, we show how other clinical data can be used to estimate the relative contributions that remaining HIV+ cells in the recipient and newly infected cells from the donor made to the residual reservoir that eventually caused rebound. Together, these tools will aid HIV researchers in evaluating new potentially curative strategies that target the latent reservoir.
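The real-time updating idea can be sketched as a simple Bayesian grid calculation: the longer a patient stays off ART without rebound, the more posterior weight shifts toward large reservoir reductions. The baseline reactivation rate `LAM0` and the uniform prior below are illustrative assumptions, not the paper's model.

```python
import math

# Assumed baseline: successful reactivations/day with an unreduced reservoir.
LAM0 = 4.0

# Discrete uniform prior over the log10 fold-reduction of the reservoir:
grid = [i / 10 for i in range(61)]            # 0.0 .. 6.0 logs
prior = [1.0 / len(grid)] * len(grid)

def posterior_given_no_rebound(days_off_art):
    """P(log10 reduction | no rebound observed through day `days_off_art`).

    The likelihood of surviving rebound-free is exponential in the remaining
    reservoir fraction 10**(-g) times the time off treatment.
    """
    lik = [math.exp(-LAM0 * 10 ** (-g) * days_off_art) for g in grid]
    unnorm = [p * l for p, l in zip(prior, lik)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

for days in (30, 180, 365):
    post = posterior_given_no_rebound(days)
    mean_logs = sum(g * p for g, p in zip(grid, post))
    print(f"after {days:>3} rebound-free days: posterior mean ~ {mean_logs:.1f} logs")
```

Each additional rebound-free interval rules out the smallest reductions, so the posterior mean reduction grows with time off ART, which is the sense in which reservoir estimates can be "updated in real time" during an interruption trial.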
Primary hemiarthroplasty for treatment of proximal humeral fractures
Background: Primary hemiarthroplasty of the shoulder is used to treat complex proximal humeral fractures, although the reported functional results following this method of treatment have varied widely. The aim of this study was to prospectively assess the prosthetic survival and functional outcomes in a large series of patients treated with shoulder hemiarthroplasty for a proximal humeral fracture. By determining the factors that affected the outcome, we also aimed to produce models that could be used clinically to estimate the functional outcome at one year following surgery.
Methods: A thirteen-year observational cohort study of 163 consecutive patients treated with hemiarthroplasty for a proximal humeral fracture was performed. Twenty-five patients died or were lost to follow-up in the first year after treatment, leaving 138 patients who had assessment of shoulder function with use of the modified Constant score at one year postinjury.
Results: The overall rate of prosthetic survival was 96.9% at one year, 95.3% at five years, and 93.9% at ten years. The overall median modified Constant score was 64 points at one year, with a typically good score for pain relief (median, 15 points) and poorer scores, with a greater scatter of values, for function (median, 12 points), range of motion (median, 24 points), and muscle power (median, 14 points). Of the factors that were assessed immediately after the injury, only patient age, the presence of a neurological deficit, tobacco usage, and alcohol consumption were significantly predictive of the one-year Constant score (p < 0.05).
Of the factors that were assessed at six weeks postinjury, those that predicted the one-year Constant score included the age of the patient, the presence of a persistent neurological deficit, the need for an early reoperation, the degree of displacement of the prosthetic head from the central axis of the glenoid seen radiographically, and the degree of displacement of the tuberosities seen radiographically.
Conclusions: Primary shoulder hemiarthroplasty performed for the treatment of a proximal humeral fracture in medically fit and cooperative adults is associated with satisfactory prosthetic survival at an average of 6.3 years. Although the shoulder is usually free of pain following this procedure, the overall functional result, in terms of range of motion, function, and power, at one year varies. A good functional outcome can be anticipated for a younger individual who has no preoperative neurological deficit, no postoperative complications, and a satisfactory radiographic appearance of the shoulder at six weeks. The results are poorer in the larger group of elderly patients who undergo this procedure, especially if they have a neurological deficit, a postoperative complication requiring a reoperation, or an eccentrically located prosthesis with retracted tuberosities.
Evolution and emergence of infectious diseases in theoretical and real-world networks
One of the most important advancements in theoretical epidemiology has been the development of methods that account for realistic host population structure. The central finding is that heterogeneity in contact networks, such as the presence of "superspreaders", accelerates infectious disease spread in real epidemics. Disease control is also complicated by the continuous evolution of pathogens in response to changing environments and medical interventions. It remains unclear, however, how population structure influences these adaptive processes. Here we examine the evolution of infectious disease in empirical and theoretical networks. We show that the heterogeneity in contact structure, which facilitates the spread of a single disease, surprisingly renders a resident strain more resilient to invasion by new variants. Our results suggest that many host contact structures suppress the invasion of new strains and may slow disease adaptation. These findings are important to the natural history of disease evolution and the spread of drug-resistant strains.
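The role of "superspreaders" in accelerating spread can be illustrated with a standard result from network epidemiology: early epidemic growth is governed by the mean excess degree, <k^2>/<k> - 1, rather than the mean degree. A minimal sketch, using two made-up degree distributions with the same mean:

```python
# Two toy degree distributions with the same mean degree (4 contacts each):
homogeneous = [4] * 100                   # everyone identical
heterogeneous = [1] * 90 + [31] * 10      # a few 'superspreaders'; mean is still 4

def mean(xs):
    return sum(xs) / len(xs)

for name, degrees in (("homogeneous", homogeneous), ("heterogeneous", heterogeneous)):
    k = mean(degrees)
    k2 = mean([d * d for d in degrees])
    # Early growth per infection generation scales with <k^2>/<k> - 1, because
    # a randomly reached contact is disproportionately likely to be high-degree:
    print(f"{name:>13}: mean degree {k:.1f}, mean excess degree {k2 / k - 1:.1f}")
```

Both networks have the same average number of contacts, but the heterogeneous one has a far larger excess degree (about 23 versus 3 here), which is the sense in which superspreaders accelerate spread in the abstract's opening claim.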
Cattle transport network predicts endemic and epidemic foot-and-mouth disease risk on farms in Turkey
The structure of contact networks affects the likelihood of disease spread at the population scale and the risk of infection at any given node. Though this has been well characterized for both theoretical and empirical networks for the spread of epidemics on completely susceptible networks, the long-term impact of network structure on the risk of infection with an endemic pathogen, where nodes can be infected more than once, has been less well characterized. Here, we analyze detailed records of the transportation of cattle among farms in Turkey to characterize the global and local attributes of the directed, weighted shipment network between 2007 and 2012. We then study the correlations between network properties and the likelihood of infection with, or exposure to, foot-and-mouth disease (FMD) over the same time period using recorded outbreaks. The shipment network shows a complex combination of local and global features that have not been previously reported in other shipment networks, including small-worldness, scale-freeness, and modular structure. We find that nodes that were either infected or at high risk of infection with FMD (within one link from an infected farm) had disproportionately higher degree, were more central (eigenvector centrality and coreness), and were more likely to be net recipients of shipments compared with those that were always more than two links away from an infected farm. High in-degree (i.e., many shipments received) was the best univariate predictor of infection. Low in-coreness (i.e., peripheral nodes) was the best univariate predictor of nodes always more than two links away from an infected farm. These results are robust across the three different FMD serotypes observed in Turkey and during both periods of low endemic prevalence and high-prevalence outbreaks.
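As a minimal illustration of the degree-based risk measures used above, the sketch below computes in-degree, out-degree, and net-recipient status on a made-up shipment edge list; the farm names and edges are invented for the example.

```python
from collections import defaultdict

# Made-up directed shipment edge list (source farm -> destination farm):
shipments = [
    ("A", "B"), ("A", "C"), ("B", "C"), ("D", "C"),
    ("C", "E"), ("D", "E"), ("B", "E"), ("F", "A"),
]

in_deg = defaultdict(int)    # shipments received: best univariate risk predictor
out_deg = defaultdict(int)   # shipments sent
for src, dst in shipments:
    out_deg[src] += 1
    in_deg[dst] += 1

for farm in sorted(set(in_deg) | set(out_deg)):
    net = in_deg[farm] - out_deg[farm]
    role = "net recipient" if net > 0 else "net source or balanced"
    print(f"farm {farm}: in={in_deg[farm]} out={out_deg[farm]} ({role})")
```

In this toy network, farms C and E, which mostly accumulate incoming animals, come out as high in-degree net recipients, the profile the study links to elevated FMD risk.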