254 research outputs found
Obeticholic acid for the treatment of non-alcoholic steatohepatitis: interim analysis from a multicentre, randomised, placebo-controlled phase 3 trial
Background Non-alcoholic steatohepatitis (NASH) is a common type of chronic liver disease that can lead to cirrhosis. Obeticholic acid, a farnesoid X receptor agonist, has been shown to improve the histological features of NASH. Here we report results from a planned interim analysis of an ongoing, phase 3 study of obeticholic acid for NASH.

Methods In this multicentre, randomised, double-blind, placebo-controlled study, adult patients with definite NASH, a non-alcoholic fatty liver disease (NAFLD) activity score of at least 4, and fibrosis stage F2–F3, or F1 with at least one accompanying comorbidity, were randomly assigned using an interactive web response system in a 1:1:1 ratio to receive oral placebo, obeticholic acid 10 mg, or obeticholic acid 25 mg daily. Patients were excluded if cirrhosis, other chronic liver disease, elevated alcohol consumption, or confounding conditions were present. The primary endpoints for the month-18 interim analysis were fibrosis improvement (≥1 stage) with no worsening of NASH, or NASH resolution with no worsening of fibrosis; the study was considered successful if either primary endpoint was met. Primary analyses were done by intention to treat, in patients with fibrosis stage F2–F3 who received at least one dose of treatment and reached, or would have reached, the month-18 visit by the prespecified interim analysis cutoff date. The study also evaluated other histological and biochemical markers of NASH and fibrosis, and safety. This study is ongoing, and registered with ClinicalTrials.gov, NCT02548351, and EudraCT, 20150-025601-6.

Findings Between Dec 9, 2015, and Oct 26, 2018, 1968 patients with stage F1–F3 fibrosis were enrolled and received at least one dose of study treatment; 931 patients with stage F2–F3 fibrosis were included in the primary analysis (311 in the placebo group, 312 in the obeticholic acid 10 mg group, and 308 in the obeticholic acid 25 mg group). The fibrosis improvement endpoint was achieved by 37 (12%) patients in the placebo group, 55 (18%) in the obeticholic acid 10 mg group (p=0·045), and 71 (23%) in the obeticholic acid 25 mg group (p=0·0002). The NASH resolution endpoint was not met (25 [8%] patients in the placebo group, 35 [11%] in the obeticholic acid 10 mg group [p=0·18], and 36 [12%] in the obeticholic acid 25 mg group [p=0·13]). In the safety population (1968 patients with fibrosis stages F1–F3), the most common adverse event was pruritus (123 [19%] in the placebo group, 183 [28%] in the obeticholic acid 10 mg group, and 336 [51%] in the obeticholic acid 25 mg group); events were generally mild to moderate in severity. The overall safety profile was similar to that in previous studies, and the incidence of serious adverse events was similar across treatment groups (75 [11%] patients in the placebo group, 72 [11%] in the obeticholic acid 10 mg group, and 93 [14%] in the obeticholic acid 25 mg group).

Interpretation Obeticholic acid 25 mg significantly improved fibrosis and key components of NASH disease activity among patients with NASH. The results from this planned interim analysis show clinically significant histological improvement that is reasonably likely to predict clinical benefit. This study is ongoing to assess clinical outcomes.
Author Correction: Drivers of seedling establishment success in dryland restoration efforts
In the version of this Article originally published, the surname of author Tina Parkhurst was incorrectly written as Schroeder. This has now been corrected.
Using Light Attenuation to Estimate Leafy Spurge Impacts on Forage Production
Rangeland managers often must decide whether to suppress dicotyledonous weed populations with expensive and time-consuming management strategies. Often, the underlying goal of weed suppression efforts is to increase production of native forage plants. Many managers suppress weeds only when they feel the unwanted plants are substantially impacting their forage base. Currently, intuition and guesswork are used to determine whether weed impacts are severe enough to warrant action. We believe scientific impact assessments could be more effective than these casual approaches to decision making. Scientific approaches will necessitate data on weed abundances because the severity of a weed’s impact is highly correlated with its abundance. The need for weed abundance data poses major obstacles because gathering these data with readily available techniques is time consuming. Most managers cannot or will not spend a lot of time gathering vegetation data. In this paper, we explore a rapidly measured index (<2 minutes per sample location) that is highly correlated with weed (i.e., leafy spurge Euphorbia esula L.) abundance per unit area. This index is based on the light attenuation leafy spurge causes. After measuring light attenuation in plots planted to leafy spurge and grasses, we developed a probabilistic model that predicts leafy spurge impacts on forage production. Data from experiments where herbicides suppressed leafy spurge provided an opportunity to evaluate prediction accuracy of the model. In each case herbicide experiment data fell within the range of values (i.e., credibility intervals) the model predicted, even though the model development experiments were separated from the herbicide experiments by several hundred kilometers in space and 4 years in time. Therefore, we conclude that the model successfully accounts for spatial and temporal variation. We believe light attenuation could help natural resource managers quickly quantify some kinds of weed impacts.  
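The index described above can be thought of as a calibration problem: light attenuation under the weed canopy serves as a fast proxy for leafy spurge abundance, which in turn predicts forage impacts. The sketch below illustrates that idea only; the calibration numbers, variable names, and the simple linear form are hypothetical stand-ins, not the authors' fitted probabilistic model.

```python
import numpy as np

# Hypothetical calibration data (illustrative values only, not from the study):
# fraction of incoming light attenuated by the canopy vs leafy spurge
# stem density (stems per m^2) measured at the same sample locations.
attenuation = np.array([0.05, 0.15, 0.30, 0.45, 0.60, 0.75])
stem_density = np.array([10.0, 35.0, 70.0, 110.0, 150.0, 190.0])

# Ordinary least-squares fit: density ~ slope * attenuation + intercept.
# np.polyfit returns coefficients from highest degree down.
slope, intercept = np.polyfit(attenuation, stem_density, 1)

def predict_density(att_fraction):
    """Predict leafy spurge stem density from a light attenuation reading."""
    return slope * att_fraction + intercept
```

In practice the authors fit a probabilistic model yielding credibility intervals rather than point predictions; the point here is only that a <2-minute attenuation reading can substitute for slow direct abundance sampling once such a relationship is calibrated.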
The Rangeland Ecology & Management archives are made available by the Society for Range Management and the University of Arizona Libraries. Contact [email protected] for further information.
Data from: Studying long-term, large-scale grassland restoration outcomes to improve seeding methods and reveal knowledge gaps
Studies are increasingly investigating effects of large-scale management activities on grassland restoration outcomes. These studies are providing useful comparisons among currently used management strategies, but not the novel strategies needed to rapidly improve restoration efforts. Here we illustrate how managing restoration projects adaptively can allow promising management innovations to be identified and tested.
We studied 327 Great Plains fields seeded after coal mining. We modelled plant responses to management strategies to identify the most effective previously used strategies for constraining weeds and establishing desired plants. Then, we used the model to predict responses to new strategies our analysis identified as potentially more effective.
Where established, the weed crested wheatgrass (Agropyron cristatum L.) increased through time, indicating a need to manage establishment of this grass. Seeding particular grasses reduced annual weed cover, and because these grasses appeared to become similarly abundant whether sown at low or high rates, low rates could likely be used safely to reduce seeding costs. More importantly, lower than average grass seed rates increased cover of shrubs, the plants most difficult to restore to many grassland ecosystems. After identifying grass seed rates as a driver, we formulated model predictions for rates below the range managers typically use. These predictions require testing but indicate that atypically low grass seed rates would further increase shrubs without hindering long-term grass stand development.
Synthesis and applications. Designing management around empirically based predictions is a logical next step towards improving ecological restoration efforts. We predict that reducing grass seed rates to atypically low levels will boost shrubs without compromising grasses. Because these predictions derive from the fitted model, they represent quantitative hypotheses based on current understanding of the system. Generating the data needed to test and update these hypotheses will require monitoring responses to shifts in management, specifically shifts to lower grass seed rates. A paucity of data for confronting hypotheses has been a major sticking point hindering adaptive management of most natural resources, but this need not be the case with degraded grasslands, because ongoing restoration efforts around the globe are providing continuous opportunities to monitor and manage processes regulating grassland restoration outcomes.
Growth Regulator Herbicides Prevent Invasive Annual Grass Seed Production Under Field Conditions
Growth regulator herbicides, such as 2,4-D, dicamba, picloram, and aminopyralid, are commonly used to control broadleaf weeds in rangelands, noncroplands, and cereal crops. If applied to cereals at late growth stages, while the grasses are developing reproductive parts, the herbicides often reduce cereal seed production. We are researching methods for using this injury response to control invasive annual grasses in rangelands by depleting their short-lived seed banks. In a previous greenhouse study, we found picloram and dicamba reduced seed production of the invasive annual grass Japanese brome (Bromus japonicus Thunb.) by nearly 100%. However, this promising greenhouse finding needs to be corroborated in the field before growth regulators can be confidently recommended for invasive annual grass control. This research note describes a study conducted in eastern Montana suggesting growth regulators may provide excellent control of invasive annual grasses. Specifically, we found typical use rates of aminopyralid and picloram reduced Japanese brome seed production by more than 95% (based on sample means) when applied at three different plant growth stages. This promising result contributes to the accumulating body of evidence suggesting growth regulators may control invasive annual grasses.
Mowing: An important part of integrated weed management
The Rangelands archives are made available by the Society for Range Management and the University of Arizona Libraries. Contact [email protected] for further information.
Primary Productivity and Precipitation-Use Efficiency in Mixed-Grass Prairie: A Comparison of Northern and Southern US Sites
Precipitation-use efficiency (PUE) is a key determinant of aboveground net primary production (ANPP). We used long-term datasets to contrast ANPP and PUE estimates between northern (southeast Montana) and southern (north Texas) mixed-grass prairies. Effects of varying amounts and temporal distribution of precipitation on PUE were examined at the Montana site, using a rainout shelter and irrigation. Results show that 1) ANPP was 21% less in Montana than Texas (188 g m-2 vs. 237 g m-2); 2) plant function type (PFT) composition varied between the two study locations, with cool-season perennial grasses (CSPG) dominating in Montana (52%) and warm-season perennial grasses (WSPG) dominating in Texas (47%); 3) production dynamics varied between the two sites with 90% of ANPP completed by 1 July in Montana as compared to 31 August in Texas; 4) average PUE estimates were greater in Montana (0.56 g dry matter m-2 mm-1 of precipitation) than Texas (0.40 g m-2 mm-1); and 5) contributions to PUE estimates varied among PFT and location, with CSPG estimates being greater in Montana than Texas (52% vs. 31%) and WSPG estimates being greater in Texas than Montana (47% vs. 27%). Seasonal droughts and supplemental irrigations at the Montana site substantially altered ANPP, PFT biomass composition, and PUE. Results show PUE was responsive to PFT composition relative to amount and seasonal distribution of precipitation. Therefore, one should expect changes in ANPP and PUE to occur with shifts in precipitation patterns until PFT composition becomes adjusted to the regime.
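The PUE figures in this abstract are simple ratios: ANPP (g of dry matter per m²) divided by precipitation received (mm). The sketch below restates that arithmetic; the precipitation totals are back-calculated from the reported ANPP and PUE values as a consistency check, not taken from the study itself.

```python
def precipitation_use_efficiency(anpp_g_m2, precip_mm):
    """PUE in g dry matter per m^2 per mm of precipitation."""
    return anpp_g_m2 / precip_mm

# Precipitation implied by the reported ANPP and PUE values
# (derived for illustration; not reported in the abstract):
montana_precip = 188 / 0.56   # ~336 mm
texas_precip = 237 / 0.40     # ~593 mm
```

The back-calculation makes the sites' contrast concrete: Montana produced less biomass overall but extracted more production per millimetre of a smaller precipitation budget.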
Data from: High precipitation and seeded species competition reduce seeded shrub establishment during dryland restoration
Drylands comprise 40% of Earth's land mass and are critical to food security, carbon sequestration, and threatened and endangered wildlife. Exotic weed invasions, overgrazing, energy extraction, and other factors have degraded many drylands, and this has placed an increased emphasis on dryland restoration. The increased restoration focus has generated a wealth of experience, innovations, and empirical data, yet the goal of restoring diverse, native, dryland plant assemblages composed of grasses, forbs, and shrubs has generally proven beyond reach. Of particular concern are shrubs, which often fail to establish or establish at trivially low densities. We used data from two Great Plains, USA coal mines to explore factors regulating shrub establishment. Our predictor data related to weather and restoration (e.g., seed rates, rock cover) variables, and our response data described shrub abundances on fields of the mines. We found that seeded non-shrubs, especially grasses, formed an important competitive barrier to shrub establishment: with every one standard deviation increase in non-shrub seed rate, the probability shrubs were present decreased by ~0.1 and shrub cover decreased by ~35%. Because new fields were seeded almost every year for >20 years, the data also provided a unique opportunity to explore effects of stochastic drivers (i.e., precipitation, year effects). With every one standard deviation increase in precipitation during the first growing season following seeding, the probability shrubs were present decreased by ~0.07 and shrub cover decreased by ~47%. High precipitation appeared to harm shrubs by increasing grass growth and competition. Weak evidence also suggested shrub establishment was better in rockier fields, where grass abundance and competition were lower. Multiple lines of evidence suggest that reducing grass seed rates below levels typically used in Great Plains restoration would benefit shrubs without substantially impacting grass stand development over the long term.
We used Bayesian statistics to estimate effects of seed rates and other restoration predictors probabilistically to allow knowledge of the predictors' effects to be refined through time in an adaptive management framework. We believe this framework could improve restoration planning in a variety of systems where restoration outcomes remain highly uncertain and ongoing restoration efforts are continually providing new data of value for reducing the uncertainty.
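The adaptive-management idea sketched above is that a probabilistic effect estimate becomes the prior for the next round of monitoring data. The toy example below shows the mechanics with a conjugate normal-normal update on a single standardized effect; it is a schematic assumption for illustration, not the authors' actual model, and the yearly effect estimates are hypothetical.

```python
def normal_update(prior_mean, prior_var, obs_mean, obs_var):
    """Conjugate normal-normal update for one effect parameter.

    Precisions (inverse variances) add; the posterior mean is the
    precision-weighted average of prior and observation.
    """
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs_mean / obs_var)
    return post_mean, post_var

# Vague prior on a standardized seed-rate effect (hypothetical values).
mean, var = 0.0, 1.0

# Each year's monitoring yields a new effect estimate with its own
# uncertainty (invented numbers); the posterior sharpens as data accrue.
for obs_mean, obs_var in [(-0.3, 0.5), (-0.25, 0.4)]:
    mean, var = normal_update(mean, var, obs_mean, obs_var)
```

Each pass through the loop is one adaptive-management cycle: act, monitor, update, and plan the next action from the narrowed posterior.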
High precipitation and seeded species competition reduce seeded shrub establishment during dryland restoration