72 research outputs found
Distinguishing between ‘Normal’ and ‘Extreme’ Price Volatility in Food Security Assessment
Volatile food prices are held to threaten food security worldwide, but controversy over how to distinguish between ‘normal’ and ‘extreme’ volatility compromises threat assessment and the identification of countermeasures. The source of the controversy is whether food-market dynamics normally stabilize or destabilize prices. The conventional view is that market dynamics are inherently stable, so that price volatility arising from exogenous shocks normally subsides through the forces of supply and demand. On this view, extended food panics are improbable, reducing the need for interventionist public policy. An emergent alternative view is that market dynamics are inherently unstable, so that volatility persists endogenously; interventionist public policy is then needed to deal with chronic food panics.
Modeling soil water dynamics considering measurement uncertainty
In shallow water table controlled environments, surface water management impacts groundwater table levels and soil water dynamics. The study goal was to simulate soil water dynamics in response to canal stage raises while considering uncertainty in measured soil water content. WAVE (Water and Agrochemicals in the soil, crop and Vadose Environment) was applied to simulate unsaturated flow above a shallow aquifer. Global sensitivity analysis was performed to identify the model input factors with the greatest influence on predicted soil water content. The Nash-Sutcliffe efficiency increased and the Root Mean Square Error decreased when uncertainties in the measured data were considered in goodness-of-fit calculations using measurement probability distributions and probable asymmetric error boundaries, implying that appropriate model performance evaluation should use uncertainty ranges instead of single values. Although uncertainty in the experimental measured data limited evaluation of the absolute predictions by the model, WAVE was found to be a useful exploratory tool for estimating temporal variation in soil water content. Visual analysis of soil water content time series under proposed changes in canal stage management indicated that sites with land surface elevation of less than 2.0 m NGVD29 were predicted to periodically experience saturated conditions in the root zone and a shortened growing season if canal stage is raised more than 9 cm and maintained at this level. The models developed could be combined with high resolution digital elevation models in future studies to identify areas with the greatest risk of experiencing a saturated root zone. The study also highlighted the need to incorporate measurement uncertainty when evaluating the performance of unsaturated flow models.
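The goodness-of-fit idea in the abstract can be sketched numerically: compute residuals against a measurement uncertainty band rather than against point values, so a prediction inside the band contributes no error. The series, the symmetric ±0.02 band, and the function names below are all illustrative assumptions, not the study's data or code.

```python
import numpy as np

def bounded_residuals(pred, lower, upper):
    """Residual is zero when the prediction falls inside the measurement
    uncertainty band, otherwise the distance to the nearest bound."""
    return np.where(pred < lower, lower - pred,
                    np.where(pred > upper, pred - upper, 0.0))

def nse(pred, obs, residuals=None):
    """Nash-Sutcliffe efficiency; optionally using band-based residuals."""
    err = residuals if residuals is not None else pred - obs
    return float(1.0 - np.sum(err**2) / np.sum((obs - obs.mean())**2))

def rmse(err):
    return float(np.sqrt(np.mean(err**2)))

# Hypothetical soil water content series (m3/m3)
obs = np.array([0.31, 0.33, 0.35, 0.30, 0.28])
pred = np.array([0.32, 0.36, 0.34, 0.27, 0.29])
band = 0.02  # assumed symmetric measurement uncertainty
res = bounded_residuals(pred, obs - band, obs + band)
print(nse(pred, obs), nse(pred, obs, res))  # band-based NSE is higher
```

With asymmetric error boundaries, as used in the study, the `lower` and `upper` arrays would simply come from per-point probability distributions instead of a constant band.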
Dynamic factor analysis of surface water management impacts on soil and bedrock water contents in Southern Florida Lowlands
As part of the C111 spreader canal project, structural and operational modifications involving incremental raises in canal stage are planned along one of the major canals (C111) separating Everglades National Park from agricultural production areas to the east of the park. This study used Dynamic Factor Analysis (DFA) as an alternative to physically based models to explore the relationship between different hydrologic variables and the effect of proposed changes in surface water management on soil and bedrock water contents in south Florida. The objectives were to: (1) use DFA to identify the most important factors affecting temporal variation in soil and bedrock water contents, (2) develop a simplified DFA-based regression model for predicting soil and bedrock water contents as a function of canal stage, and (3) assess the effect of the proposed incremental raises in canal stage on soil and bedrock water contents. DFA revealed that 5 common trends were the minimum required to describe unexplained variation in the 11 time series studied. Introducing canal stage, water table evaporation and net recharge resulted in lower Akaike information criterion (AIC) and higher Nash-Sutcliffe efficiency (Ceff) values. Results indicated that canal stage significantly (t > 2) drives temporal variation in soil and bedrock water contents (represented as scaled frequency), while net surface recharge was significant in 7 of the 11 time series analyzed. The effect of water table evaporation was not significant at any site. Canal stage was also the most important factor influencing temporal variation in soil and bedrock water contents in terms of regression coefficient magnitude. Based on the DFA results, a simple regression model was developed to predict soil and bedrock water contents at various elevations as a function of canal stage and net recharge.
The performance of the simple model ranged from good (Ceff from 0.56 to 0.74) to poor (Ceff from 0.10 to 0.15). Performance was better at sites with smaller depths to the water table (< 1 m), highlighting the effect of micro-topography on soil and bedrock water content dynamics. Assessment of the effect of 6, 9 and 12 cm increases in canal stage using the simple regression model indicated that changes in temporal variation in soil and bedrock water contents were negligible (< 1.0% average change) at 500 to 2000 m from C111 (or at low elevations), which may be attributed to the near-saturation conditions already occurring at these sites. This study used DFA to explore the relationship between soil and bedrock water dynamics and surface water stage in shallow water table environments. The approach can be applied to any system in which detailed physical modeling would be limited by inadequate information on the parameters or processes governing the physical system.
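The DFA-derived predictive model described above reduces to a linear regression of water content on canal stage and net recharge. A minimal sketch fitted by least squares on synthetic series (all data and coefficients below are hypothetical, not values from the study), with the same Ceff metric used for evaluation:

```python
import numpy as np

# Hypothetical daily series: canal stage (m) and net recharge (cm/d)
rng = np.random.default_rng(0)
stage = 1.2 + 0.1 * np.sin(np.linspace(0, 6, 200)) + 0.01 * rng.standard_normal(200)
recharge = 0.5 * rng.standard_normal(200)
# Assumed "true" response, used only to generate synthetic observations
theta = 0.05 + 0.20 * stage + 0.02 * recharge + 0.005 * rng.standard_normal(200)

# Regression of the form theta = a + b*stage + c*recharge,
# analogous in structure to the DFA-based predictive model
X = np.column_stack([np.ones_like(stage), stage, recharge])
coef, *_ = np.linalg.lstsq(X, theta, rcond=None)
pred = X @ coef

# Nash-Sutcliffe efficiency (Ceff) of the fitted model
ceff = 1 - np.sum((theta - pred)**2) / np.sum((theta - theta.mean())**2)
print(coef, ceff)
```

In the study, the equivalent fit would be done per monitoring site, which is where the reported spread of Ceff values (0.10 to 0.74) comes from.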
Simulating water table response to proposed changes in surface water management in the C-111 agricultural basin of south Florida
As part of an effort to restore the hydrology of Everglades National Park (ENP), incremental raises in canal stage are proposed along a major canal draining south Florida called C-111, which separates ENP from agricultural lands. The study purpose was to use monitoring and modeling to investigate the effect of the proposed incremental raises in canal stage on water table elevation in agricultural lands. The objectives were to: (1) develop a MODFLOW-based model for simulating groundwater flow within the study area, (2) apply the developed model to determine whether the proposed changes in canal stage result in significant changes in water table elevation, root zone saturation or groundwater flooding, and (3) assess aquifer response to large rainfall events. Results indicate the developed model was able to reproduce measured water table elevation with an average Nash-Sutcliffe > 0.9 and a low Root Mean Square Error. … (> 2 year return period storm) reduced water table intrusion into the root zone. We conclude that the impact of operational changes in canal stage management on root zone saturation and groundwater flooding depended on the micro-topography within the field and the depth of storm events. The findings of this study can be used to fine-tune canal stage operations to minimize root zone saturation and groundwater flooding of agricultural fields while maximizing environmental benefits through increased water flow into the natural wetland areas. This study also highlights the benefit of detailed field-scale simulations.
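The water table response to a canal stage raise can be illustrated with a deliberately simplified sketch (not MODFLOW): a linearized 1-D groundwater diffusion model with a fixed-head canal boundary. The hydraulic diffusivity, grid, duration and the 9 cm raise are all illustrative assumptions.

```python
import numpy as np

# Minimal 1-D linearized groundwater sketch: head change h(x,t) diffusing
# from a canal boundary whose stage is raised by 9 cm. D = T/S is the
# hydraulic diffusivity; every value here is illustrative.
D = 5000.0                   # m^2/d, assumed transmissivity/storativity ratio
dx, dt = 100.0, 0.1          # grid spacing (m), time step (d)
x = np.arange(0.0, 2000.0 + dx, dx)
h = np.zeros_like(x)         # head change relative to the initial state (m)
h[0] = 0.09                  # canal boundary: +9 cm stage raise

alpha = D * dt / dx**2       # must be <= 0.5 for explicit stability
for _ in range(int(30 / dt)):                    # simulate 30 days
    h[1:-1] += alpha * (h[2:] - 2 * h[1:-1] + h[:-2])
    h[0] = 0.09                                  # hold canal stage
    h[-1] = h[-2]                                # no-flow far boundary

print(h[x == 500][0], h[x == 2000][0])           # response decays with distance
```

The decay of the response with distance from the canal is the qualitative behavior behind the field-scale findings; micro-topography and storm recharge, which the study found decisive, are absent from this sketch.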
Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study
Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected, ART treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miRs -21, -122 and -200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miRs -31, -150 and -223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found for any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality, or of the major underlying diseases that contribute to mortality, in participants treated for HIV-1 infection.
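The correlation step can be sketched as a rank correlation between a miRNA's relative abundance and a soluble biomarker. The abstract does not state which correlation statistic was used, so Spearman is an assumption here, and the miR-21/IL-6 values are simulated, not study data.

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation (this sketch assumes no tied values)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float(np.sum(rx * ry) / np.sqrt(np.sum(rx**2) * np.sum(ry**2)))

# Hypothetical log-scale values: miR-21 relative abundance and IL-6 level
rng = np.random.default_rng(1)
mir21 = rng.normal(size=50)
il6 = 0.6 * mir21 + rng.normal(scale=0.8, size=50)
print(spearman(mir21, il6))  # moderate positive correlation
```

A rank-based statistic is a common choice for RT-qPCR abundance data because it is insensitive to the skewed scales such measurements often have.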
Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study
Ristola M. is a member of the D:A:D Study Group, the Royal Free Hospital Clinic Cohort, the INSIGHT Study Group, the SMART Study Group and the ESPRIT Study Group. Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (> 3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with considerably higher chances in the medium and high risk groups (risk score >= 5, 505 events).
Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance, allowing patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
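Two pieces of arithmetic underlie the numbers above: summing scaled points into a risk score, and computing NNTH as the reciprocal of the absolute risk increase. A sketch with hypothetical point weights (not the published D:A:D weights) and an illustrative risk difference:

```python
# Sketch of the two arithmetic steps behind the abstract's figures.
# The point values below are illustrative only, not the D:A:D weights.

def risk_score(points, profile):
    """Sum the points for each risk factor present in the profile."""
    return sum(points[f] for f in profile)

points = {"age>50": 4, "IVDU": 2, "HCV": 2, "low_nadir_CD4": 2,
          "hypertension": 1, "diabetes": 3, "CVD": 3}  # hypothetical
print(risk_score(points, ["age>50", "hypertension"]))  # 4 + 1 = 5

def nnth(risk_with_drug, risk_without_drug):
    """Number needed to harm: reciprocal of the absolute risk increase
    over the same follow-up period."""
    return 1.0 / (risk_with_drug - risk_without_drug)

# e.g. if the 5-y CKD risk rose from 1/393 to 1/343 on a nephrotoxic drug
print(round(nnth(1 / 343, 1 / 393)))
```

The large low-risk NNTH values reported (1,702 and 739) reflect exactly this structure: when baseline risk is tiny, even a meaningful relative increase yields a small absolute difference, so its reciprocal is large.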