
    Demonstration of safety of intravenous immunoglobulin in geriatric patients in a long-term, placebo-controlled study of Alzheimer's disease.

    INTRODUCTION: We present safety results from a study of Gammagard Liquid intravenous immunoglobulin (IGIV) in patients with probable Alzheimer's disease. METHODS: This was a placebo-controlled, double-blind study. Subjects were randomized to 400 mg/kg (n = 127) or 200 mg/kg (n = 135) IGIV, or to 0.25% human albumin (n = 121), administered every 2 weeks ± 7 days for 18 months. RESULTS: Elevated risk ratios of IGIV versus placebo included chills (3.85), in 9.5% of IGIV-treated subjects (all doses) compared to 2.5% of placebo-treated subjects, and rash (3.08), in 15.3% of IGIV-treated subjects versus 5.0% of subjects treated with placebo. Subjects in the highest IGIV dose group had the lowest proportion of SAEs considered related to product (2 of 127 [1.6%]). Subjects treated with IGIV experienced a lower rate of respiratory and all other infections compared to placebo. DISCUSSION: IGIV-treated subjects did not experience higher rates of renal failure, lung injury, or thrombotic events than the placebo group. There were no unexpected safety findings. IGIV was well tolerated throughout 18 months of treatment in subjects aged 50-89 years.

    Soils in ancient irrigated agricultural terraces in the Atacama Desert, Chile

    The Atacama Desert is among the driest places on Earth, yet ancient agricultural systems are present in the region. Here, we present a study of terraced agricultural soils in the high-altitude eastern margin of the Atacama Desert in northern Chile, mainly dating to the Late Intermediate Period (ca. 950-1400 AD) and Inka period (ca. 1400-1536 AD). Terraced fields were compartmentalized to distribute limited irrigation water originating mainly from springs. Natural soils used for agriculture are mostly Aridisols developed on Pleistocene alluvial fan terraces and hillslopes underlain by volcanic bedrock. One research objective is to evaluate long-term soil change from agriculture. In this hyperarid climate, agriculture is only possible with irrigation, so natural soils on the same geomorphic surface adjacent to irrigated soils provide baseline data for assessing anthropogenic soil change. Data from soil profiles and surface transects indicate intentional soil change through terracing, removal of soil rock fragments, and probable fertilization. Agricultural soils have anthropogenic horizons ranging from 16 to 54 cm thick. Most agricultural soils have higher phosphorus levels, suggesting enrichment from fertilization. Changes in soil organic carbon and nitrogen are also evident. Unintentional anthropogenic soil change resulted from CaCO3 input through irrigation with calcareous spring water. Initial studies suggest that agriculture here was sustainable in the sense of conserving soils, and maintaining and possibly improving soil productivity over centuries.

    A Landscape Perspective on Climate-Driven Risks to Food Security: Exploring the Relationship between Climate and Social Transformation in the Prehispanic U.S. Southwest

    Spatially and temporally unpredictable rainfall patterns presented food production challenges to small-scale agricultural communities, requiring multiple risk-mitigating strategies to increase food security. Although site-based investigations of the relationship between climate and agricultural production offer insights into how individual communities may have created long-term adaptations to manage risk, the inherent spatial variability of climate-driven risk makes a landscape-scale perspective valuable. In this article, we model risk by evaluating how the spatial structure of ancient climate conditions may have affected the reliability of three major strategies used to reduce risk: drawing upon social networks in time of need, hunting and gathering of wild resources, and storing surplus food. We then explore how climate-driven changes to this reliability may relate to archaeologically observed social transformations. We demonstrate the utility of this methodology by comparing the Salinas and Cibola regions in the prehispanic U.S. Southwest to understand the complex relationship among climate-driven threats to food security, risk-mitigation strategies, and social transformations. Our results suggest key differences in how communities buffered against risk in the Cibola and Salinas study regions, with the structure of precipitation influencing the range of strategies to which communities had access through time.

    Community Preferences for the Allocation & Donation of Organs - The PAraDOx Study

    Background: Transplantation is the treatment of choice for people with severe organ failure. However, demand substantially exceeds the supply of suitable organs; consequently, many people wait months or years to receive an organ. Reasons for the chronic shortage of deceased organ donations are unclear; there appears to be no lack of 'in principle' public support for organ donation. Methods/Design: The PAraDOx Study examines community preferences for organ donation policy in Australia. The aims are to 1) determine which factors influence decisions by individuals to offer their organs for donation and 2) determine the criteria by which the community deems the allocation of donor organs to be fair and equitable. Qualitative and quantitative methods will be used to assess community preferences for organ donation and allocation. Focus group participants from the general community, aged between 18 and 80, will be purposively sampled to ensure a variety of cultural backgrounds and views on organ donation. Each focus group will include a ranking exercise using a modified nominal group technique. Focus groups of organ recipients, their families, and individuals on a transplant waiting list will also be conducted. Using the qualitative work, a discrete choice study will be designed to quantitatively assess community preferences. Discrete choice methods are based on the premise that goods and services can be described in terms of a number of separate attributes. Respondents are presented with a series of choices in which levels of attributes are varied, and a mathematical function is estimated to describe numerically the value respondents attach to different options. Two community surveys will be conducted, with approximately 1,000 respondents each, to assess community preferences for organ donation and allocation. A mixed logit model will be used; model results will be expressed as parameter estimates (β) and the odds of choosing one option over an alternative. Trade-offs between attributes will also be calculated. Discussion: By providing a better understanding of current community preferences in relation to organ donation and allocation, the PAraDOx study will highlight options for, firstly, increasing the rate of organ donation and, secondly, making policies on organ allocation more transparent and equitable.
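As an illustration of the discrete-choice estimation this abstract describes, the sketch below fits a plain binary conditional logit (a simplification of the mixed logit the study plans to use) on simulated choice tasks. All data, attribute names, and coefficient values are hypothetical, not taken from the study:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical choice tasks: each respondent picks between allocation
# options A and B, each described by two attributes (illustrative names:
# waiting time in years, survival gain in years).
n = 1000
x_a = rng.uniform(0, 5, size=(n, 2))
x_b = rng.uniform(0, 5, size=(n, 2))

beta_true = np.array([-0.8, 0.6])  # dislike waiting, value survival gain
u_diff = (x_a - x_b) @ beta_true
choose_a = rng.random(n) < 1.0 / (1.0 + np.exp(-u_diff))

def neg_log_lik(beta):
    # Logit choice probability of option A; clipped for numerical safety
    p_a = 1.0 / (1.0 + np.exp(-(x_a - x_b) @ beta))
    p_a = np.clip(p_a, 1e-12, 1 - 1e-12)
    return -np.sum(np.where(choose_a, np.log(p_a), np.log(1.0 - p_a)))

fit = minimize(neg_log_lik, x0=np.zeros(2), method="BFGS")
beta_hat = fit.x                         # parameter estimates (β)
odds = np.exp(beta_hat)                  # odds multiplier per unit attribute change
trade_off = beta_hat[1] / -beta_hat[0]   # waiting years traded per survival year
```

The recovered β values carry the interpretation the abstract mentions: exponentiating them gives the odds of choosing one option over an alternative, and their ratio gives the trade-off between attributes.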

    Histone Deacetylase Inhibitor Romidepsin Induces HIV Expression in CD4 T Cells from Patients on Suppressive Antiretroviral Therapy at Concentrations Achieved by Clinical Dosing

    The persistent latent reservoir of replication-competent proviruses in memory CD4 T cells is a major obstacle to curing HIV infection. Pharmacological activation of HIV expression in latently infected cells is being explored as one of the strategies to deplete the latent HIV reservoir. In this study, we characterized the ability of romidepsin (RMD), a histone deacetylase inhibitor approved for the treatment of T-cell lymphomas, to activate the expression of latent HIV. In an in vitro T-cell model of HIV latency, RMD was the most potent inducer of HIV (EC50 = 4.5 nM) compared with vorinostat (VOR; EC50 = 3,950 nM) and other histone deacetylase (HDAC) inhibitors in clinical development, including panobinostat (PNB; EC50 = 10 nM). The HIV induction potencies of RMD, VOR, and PNB paralleled their inhibitory activities against multiple human HDAC isoenzymes. In both resting and memory CD4 T cells isolated from HIV-infected patients on suppressive combination antiretroviral therapy (cART), a 4-hour exposure to 40 nM RMD induced a mean 6-fold increase in intracellular HIV RNA levels, whereas a 24-hour treatment with 1 μM VOR resulted in 2- to 3-fold increases. RMD-induced intracellular HIV RNA expression persisted for 48 hours and correlated with sustained inhibition of cell-associated HDAC activity. By comparison, the induction of HIV RNA by VOR and PNB was transient and diminished after 24 hours. RMD also increased levels of extracellular HIV RNA and virions from both memory and resting CD4 T-cell cultures. The activation of HIV expression was observed at RMD concentrations below the drug plasma levels achieved by doses used in patients treated for T-cell lymphomas. In conclusion, RMD induces HIV expression ex vivo at concentrations that can be achieved clinically, indicating that the drug may reactivate latent HIV in patients on suppressive cART.

    Machine learning algorithms performed no better than regression models for prognostication in traumatic brain injury

    Objective: We aimed to explore the added value of common machine learning (ML) algorithms for prediction of outcome for moderate and severe traumatic brain injury. Study Design and Setting: We performed logistic regression (LR), lasso regression, and ridge regression with key baseline predictors in the IMPACT-II database (15 studies, n = 11,022). ML algorithms included support vector machines, random forests, gradient boosting machines, and artificial neural networks and were trained using the same predictors. To assess generalizability of predictions, we performed internal, internal-external, and external validation on the recent CENTER-TBI study (patients with Glasgow Coma Scale <13, n = 1,554). Both calibration (calibration slope/intercept) and discrimination (area under the curve) were quantified. Results: In the IMPACT-II database, 3,332/11,022 (30%) died and 5,233 (48%) had unfavorable outcome (Glasgow Outcome Scale less than 4). In the CENTER-TBI study, 348/1,554 (29%) died and 651 (54%) had unfavorable outcome. Discrimination and calibration varied widely between the studies and less so between the studied algorithms. The mean area under the curve was 0.82 for mortality and 0.77 for unfavorable outcomes in the CENTER-TBI study. Conclusion: ML algorithms may not outperform traditional regression approaches in a low-dimensional setting for outcome prediction after moderate or severe traumatic brain injury. Similar to regression-based prediction models, ML algorithms should be rigorously validated to ensure applicability to new populations.
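The comparison this abstract reports (regression versus ML, judged on discrimination and calibration) can be sketched on synthetic data; the predictors and sample below are simulated stand-ins, not the IMPACT-II or CENTER-TBI data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic low-dimensional data standing in for key baseline predictors
X, y = make_classification(n_samples=5000, n_features=8, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
aucs, slopes = {}, {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    p = np.clip(model.predict_proba(X_te)[:, 1], 1e-6, 1 - 1e-6)
    aucs[name] = roc_auc_score(y_te, p)          # discrimination
    # Calibration slope: logistic refit of outcome on predicted log-odds
    logit = np.log(p / (1 - p)).reshape(-1, 1)
    slopes[name] = LogisticRegression().fit(logit, y_te).coef_[0, 0]
    print(f"{name}: AUC={aucs[name]:.3f}, calibration slope={slopes[name]:.2f}")
```

In a low-dimensional setting like this, the flexible model typically gains little over the regression baseline, which is the pattern the study found.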

    Variation in Structure and Process of Care in Traumatic Brain Injury: Provider Profiles of European Neurotrauma Centers Participating in the CENTER-TBI Study.

    INTRODUCTION: The strength of evidence underpinning care and treatment recommendations in traumatic brain injury (TBI) is low. Comparative effectiveness research (CER) has been proposed as a framework to provide evidence for optimal care for TBI patients. The first step in CER is to map the existing variation. The aim of the current study is to quantify variation in general structural and process characteristics among centers participating in the Collaborative European NeuroTrauma Effectiveness Research in Traumatic Brain Injury (CENTER-TBI) study. METHODS: We designed a set of 11 provider profiling questionnaires with 321 questions about various aspects of TBI care, chosen based on literature and expert opinion. After pilot testing, questionnaires were disseminated to 71 centers from 20 countries participating in the CENTER-TBI study. Reliability of the questionnaires was estimated by calculating a concordance rate among 5% duplicate questions. RESULTS: All 71 centers completed the questionnaires. The median concordance rate among duplicate questions was 0.85. The majority of centers were academic hospitals (n = 65, 92%), designated as a level I trauma center (n = 48, 68%), and situated in an urban location (n = 70, 99%). The availability of facilities for neurotrauma care varied across centers; e.g., 40 (57%) had a dedicated neurointensive care unit (ICU), 36 (51%) had an in-hospital rehabilitation unit, and the organization of the ICU was closed in 64% (n = 45) of the centers. In addition, we found wide variation in processes of care, such as the ICU admission policy and intracranial pressure monitoring policy, among centers. CONCLUSION: Even among high-volume, specialized neurotrauma centers there is substantial variation in structures and processes of TBI care. This variation provides an opportunity to study the effectiveness of specific aspects of TBI care and to identify best practices with CER approaches.

    The Fifteenth Data Release of the Sloan Digital Sky Surveys: First Release of MaNGA-derived Quantities, Data Visualization Tools, and Stellar Library

    Twenty years have passed since first light for the Sloan Digital Sky Survey (SDSS). Here, we release data taken by the fourth phase of SDSS (SDSS-IV) across its first three years of operation (2014 July–2017 July). This is the third data release for SDSS-IV, and the 15th from SDSS (Data Release Fifteen; DR15). New data come from MaNGA—we release 4824 data cubes, as well as the first stellar spectra in the MaNGA Stellar Library (MaStar), the first set of survey-supported analysis products (e.g., stellar and gas kinematics, emission-line and other maps) from the MaNGA Data Analysis Pipeline, and a new data visualization and access tool we call "Marvin." The next data release, DR16, will include new data from both APOGEE-2 and eBOSS; those surveys release no new data here, but we document updates and corrections to their data processing pipelines. The release is cumulative; it also includes the most recent reductions and calibrations of all data taken by SDSS since first light. In this paper, we describe the location and format of the data and tools and cite the technical references describing how the data were obtained and processed. The SDSS website (www.sdss.org) has also been updated, providing links to data downloads, tutorials, and examples of data use. Although SDSS-IV will continue to collect astronomical data until 2020, and will be followed by SDSS-V (2020–2025), we end this paper by describing plans to ensure the sustainability of the SDSS data archive for many years beyond the collection of data.

    Angiotensin receptor blockers and β blockers in Marfan syndrome: an individual patient data meta-analysis of randomised trials

    Background: Angiotensin receptor blockers (ARBs) and β blockers are widely used in the treatment of Marfan syndrome to try to reduce the rate of progressive aortic root enlargement characteristic of this condition, but their separate and joint effects are uncertain. We aimed to determine these effects in a collaborative individual patient data meta-analysis of randomised trials of these treatments. Methods: In this meta-analysis, we identified relevant trials of patients with Marfan syndrome by systematically searching MEDLINE, Embase, and CENTRAL from database inception to Nov 2, 2021. Trials were eligible if they involved a randomised comparison of an ARB versus control or an ARB versus β blocker. We used individual patient data from patients with no prior aortic surgery to estimate the effects of: ARB versus control (placebo or open control); ARB versus β blocker; and indirectly, β blocker versus control. The primary endpoint was the annual rate of change of body surface area-adjusted aortic root dimension Z score, measured at the sinuses of Valsalva. Findings: We identified ten potentially eligible trials including 1836 patients from our search, from which seven trials and 1442 patients were eligible for inclusion in our main analyses. Four trials involving 676 eligible participants compared ARB with control. During a median follow-up of 3 years, allocation to ARB approximately halved the annual rate of change in the aortic root Z score (mean annual increase 0·07 [SE 0·02] ARB vs 0·13 [SE 0·02] control; absolute difference –0·07 [95% CI –0·12 to –0·01]; p=0·012). Prespecified secondary subgroup analyses showed that the effects of ARB were particularly large in those with pathogenic variants in fibrillin-1, compared with those without such variants (heterogeneity p=0·0050), and there was no evidence to suggest that the effect of ARB varied with β-blocker use (heterogeneity p=0·54). 
Three trials involving 766 eligible participants compared ARBs with β blockers. During a median follow-up of 3 years, the annual change in the aortic root Z score was similar in the two groups (annual increase –0·08 [SE 0·03] in ARB groups vs –0·11 [SE 0·02] in β-blocker groups; absolute difference 0·03 [95% CI –0·05 to 0·10]; p=0·48). Thus, indirectly, the difference in the annual change in the aortic root Z score between β blockers and control was –0·09 (95% CI –0·18 to 0·00; p=0·042). Interpretation: In people with Marfan syndrome and no previous aortic surgery, ARBs reduced the rate of increase of the aortic root Z score by about one half, including among those taking a β blocker. The effects of β blockers were similar to those of ARBs. Assuming additivity, combination therapy with both ARBs and β blockers from the time of diagnosis would provide even greater reductions in the rate of aortic enlargement than either treatment alone, which, if maintained over a number of years, would be expected to delay the need for aortic surgery. Funding: Marfan Foundation, the Oxford British Heart Foundation Centre for Research Excellence, and the UK Medical Research Council.
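The "indirect" β-blocker-versus-control estimate quoted in this abstract follows the standard Bucher calculation: subtract the two direct effects and add their variances. A sketch, with effects and standard errors read off the abstract's rounded 95% CIs (so the result differs slightly from the published –0·09):

```python
import math

# Direct comparisons from the abstract (Z score units per year);
# SEs back-calculated from the rounded 95% CIs, so values are approximate.
d_arb_ctrl = -0.07                               # ARB vs control
se_arb_ctrl = (-0.01 - -0.12) / (2 * 1.96)       # ~0.028
d_arb_bb = 0.03                                  # ARB vs beta blocker
se_arb_bb = (0.10 - -0.05) / (2 * 1.96)          # ~0.038

# Indirect comparison: beta blocker vs control
d_bb_ctrl = d_arb_ctrl - d_arb_bb                # -0.10
se_bb_ctrl = math.sqrt(se_arb_ctrl**2 + se_arb_bb**2)
ci = (d_bb_ctrl - 1.96 * se_bb_ctrl, d_bb_ctrl + 1.96 * se_bb_ctrl)
```

Working from rounded inputs gives –0·10 (95% CI about –0·19 to –0·01), consistent with the published –0·09 (–0·18 to 0·00) computed from unrounded data.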