90 research outputs found

    Calibration of the APEX Model to Simulate Management Practice Effects on Runoff, Sediment, and Phosphorus Loss

    Process-based computer models have been proposed as a tool to generate data for Phosphorus (P) Index assessment and development. Although models are commonly used to simulate P loss from agriculture under managements that differ from the calibration data, this use of models has not been fully tested. The objective of this study was to determine whether the Agricultural Policy Environmental eXtender (APEX) model can accurately simulate runoff, sediment, total P, and dissolved P loss from 0.4 to 1.5 ha agricultural fields under managements different from the calibration data. The APEX model was calibrated with field-scale data from eight different managements at two locations (management-specific models). The calibrated models were then validated, either with the same management used for calibration or with different managements. Location models were also developed by calibrating APEX with data from all managements. The management-specific models gave satisfactory performance when used to simulate runoff, total P, and dissolved P within their respective systems, with r2 > 0.50, Nash–Sutcliffe efficiency > 0.30, and percent bias within ±35% for runoff and ±70% for total and dissolved P. When applied outside the calibration management, the management-specific models met the minimum performance criteria in only one-third of the tests. The location models performed better across all managements than the management-specific models. Our results suggest that models only be applied within the managements used for calibration, and that data from multiple management systems be included in calibration when using models to assess management effects on P loss or to evaluate P Indices.
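    The three statistics above can be reproduced from paired observed and simulated event data. The sketch below, in Python with made-up numbers (not the study's data), shows one way to compute r2, NSE, and PBIAS and screen them against the satisfactory thresholds quoted in this abstract; the function names and example values are illustrative only.

```python
# Minimal sketch: computing r2, Nash-Sutcliffe efficiency (NSE), and percent
# bias (PBIAS) for paired observed/simulated event values, then screening them
# against the satisfactory thresholds quoted in the abstract. Data are made up.
import numpy as np

def r_squared(obs, sim):
    """Square of the Pearson correlation between observed and simulated values."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations about their mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias; positive values indicate the model underestimates on average."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def satisfactory(obs, sim, pbias_limit):
    """Apply the criteria quoted above: r2 > 0.50, NSE > 0.30, |PBIAS| within the limit."""
    return (r_squared(obs, sim) > 0.50
            and nse(obs, sim) > 0.30
            and abs(pbias(obs, sim)) <= pbias_limit)

# Hypothetical event runoff (mm) and total P loads (kg/ha), not study data.
runoff_obs, runoff_sim = [12.0, 3.5, 20.1, 7.8], [10.5, 4.0, 18.0, 9.1]
tp_obs, tp_sim = [0.42, 0.10, 0.65, 0.22], [0.30, 0.12, 0.70, 0.35]

print("runoff satisfactory:", satisfactory(runoff_obs, runoff_sim, pbias_limit=35))
print("total P satisfactory:", satisfactory(tp_obs, tp_sim, pbias_limit=70))
```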

    Supplemental Material for: Multi-site evaluation of APEX for water quality: II. Regional parameterization

    Model performance was assessed using Nash–Sutcliffe model efficiency (NSE), the coefficient of determination (r2), and percent bias (PBIAS) as defined by Moriasi et al. (2007, 2015). Threshold values indicating acceptable model performance based on these statistics depend on the spatial and temporal scales of the data, the water quality constituents of interest, and the modeling objectives (Moriasi et al., 2015). Although some standard values have been suggested (Moriasi et al., 2007, 2015), considerable variability exists in the published literature. For instance, Ramanarayan et al. (1997) considered r2 > 0.5 and NSE > 0.40 satisfactory for simulation of monthly surface water quality with the APEX model. Chung et al. (2002) defined r2 > 0.5 and NSE > 0.3 as satisfactory for monthly tile flow and NO3-N loss simulated with the Erosion Productivity Impact Calculator (EPIC) model. Wang et al. (2008) indicated r2 > 0.5 and NSE > 0.4 as acceptable for monthly runoff and nutrient concentrations using the APEX model. Moriasi et al. (2007) suggested NSE > 0.5 with PBIAS within ±25% for streamflow, ±55% for sediment, and ±70% for nitrogen and phosphorus at monthly time steps, and indicated that NSE criteria can be relaxed for shorter time steps (daily events). Yin et al. (2009) reported NSE for event-based runoff and sediment between 0.41 and 0.84 and r2 between 0.55 and 0.85. Mudgal et al. (2010) regarded r2 > 0.5 and NSE > 0.45 as thresholds for satisfactory calibration and validation with event data.
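    For reference, the statistics cited above are conventionally defined as follows (after Moriasi et al., 2007), with O_i the observed values, S_i the simulated values, and barred symbols their means; these are the standard forms of the statistics, not formulas reproduced from the supplemental material itself.

```latex
% Conventional definitions of NSE, PBIAS, and r^2 as used in Moriasi et al. (2007),
% with O_i observed, S_i simulated, and \bar{O}, \bar{S} the respective means.
\begin{align}
  \mathrm{NSE}   &= 1 - \frac{\sum_{i=1}^{n} (O_i - S_i)^2}{\sum_{i=1}^{n} (O_i - \bar{O})^2} \\
  \mathrm{PBIAS} &= 100 \times \frac{\sum_{i=1}^{n} (O_i - S_i)}{\sum_{i=1}^{n} O_i} \\
  r^2            &= \left[ \frac{\sum_{i=1}^{n} (O_i - \bar{O})(S_i - \bar{S})}
                                 {\sqrt{\sum_{i=1}^{n} (O_i - \bar{O})^2}\,
                                  \sqrt{\sum_{i=1}^{n} (S_i - \bar{S})^2}} \right]^2
\end{align}
```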

    Association of Accelerometry-Measured Physical Activity and Cardiovascular Events in Mobility-Limited Older Adults: The LIFE (Lifestyle Interventions and Independence for Elders) Study.

    BACKGROUND: Data are sparse regarding the value of physical activity (PA) surveillance among older adults, particularly among those with mobility limitations. The objective of this study was to examine longitudinal associations between objectively measured daily PA and the incidence of cardiovascular events among older adults in the LIFE (Lifestyle Interventions and Independence for Elders) study. METHODS AND RESULTS: Cardiovascular events were adjudicated based on medical records review, and cardiovascular risk factors were controlled for in the analysis. Home-based activity data were collected by hip-worn accelerometers at baseline and at 6, 12, and 24 months postrandomization to either a physical activity or health education intervention. Based on baseline activity data, LIFE study participants (n=1590; age 78.9±5.2 [SD] years; 67.2% women) had an 11% lower incidence of a subsequent cardiovascular event per 500 steps taken per day (hazard ratio, 0.89; 95% confidence interval, 0.84-0.96; P=0.001). At baseline, every 30 minutes spent performing activities ≥500 counts per minute was also associated with a lower incidence of cardiovascular events (hazard ratio, 0.75; confidence interval, 0.65-0.89; P=0.001). Throughout follow-up (6, 12, and 24 months), both the number of steps per day (per 500 steps; hazard ratio, 0.90; confidence interval, 0.85-0.96; P=0.001) and the duration of activity ≥500 counts per minute (per 30 minutes; hazard ratio, 0.76; confidence interval, 0.63-0.90; P=0.002) were significantly associated with lower cardiovascular event rates. CONCLUSIONS: Objective accelerometry measurements of physical activity were associated with cardiovascular events among older adults with limited mobility (summary score <10 on the Short Physical Performance Battery) in both baseline and longitudinal data. CLINICAL TRIAL REGISTRATION: URL: http://www.clinicaltrials.gov. Unique identifier: NCT01072500.
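    As a rough illustration of how per-unit hazard ratios like those above are obtained, the following Python sketch fits a Cox proportional hazards model with the lifelines package to simulated data. The data frame, column names, and covariates are hypothetical; the published analysis additionally adjusted for cardiovascular risk factors and used repeated accelerometry measures, which this simplified sketch omits.

```python
# Minimal sketch (not the LIFE study's analysis code): estimating a hazard
# ratio per 500 steps/day with a Cox proportional hazards model.
# All data and column names below are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "followup_years": rng.uniform(0.5, 2.5, n),      # time to event or censoring
    "cv_event": rng.integers(0, 2, n),               # 1 = adjudicated CV event
    "steps_per_day": rng.normal(3500, 1500, n).clip(0),
    "age": rng.normal(79, 5, n),
})

# Rescale so the fitted coefficient is interpreted per 500 steps/day.
df["steps_per_500"] = df["steps_per_day"] / 500.0

cph = CoxPHFitter()
cph.fit(df[["followup_years", "cv_event", "steps_per_500", "age"]],
        duration_col="followup_years", event_col="cv_event")
cph.print_summary()

# exp(coef) for steps_per_500 is the hazard ratio per additional 500 steps/day;
# e.g. an HR of 0.89 per 500 steps implies roughly 0.89**2 ~ 0.79 per 1000 steps.
hr_per_500 = np.exp(cph.params_["steps_per_500"])
print("HR per 500 steps/day:", round(hr_per_500, 2))
```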

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with a field of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over 20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with fifteen-second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects at both low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring the properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy. Comment: 596 pages. Also available at full resolution at http://www.lsst.org/lsst/sciboo
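    Some quick back-of-the-envelope arithmetic can be sketched from the parameters quoted above (effective aperture, field of view, number of visits, and exposure time); the quantities computed below are illustrative estimates only, not official LSST figures.

```python
# Back-of-the-envelope arithmetic from the survey parameters quoted in the
# abstract; purely illustrative, not an official LSST calculation.
import math

effective_aperture_m = 6.7          # effective aperture diameter (quoted)
fov_deg2 = 9.6                      # camera field of view (quoted)
visits_per_pointing = 2000          # quoted visits per pointing
exposure_s = 15.0                   # quoted exposure time
survey_area_deg2 = 20000.0          # quoted survey area

# Etendue (collecting area x field of view), ignoring vignetting details.
collecting_area_m2 = math.pi * (effective_aperture_m / 2.0) ** 2
print(f"etendue ~ {collecting_area_m2 * fov_deg2:.0f} m^2 deg^2")

# Total open-shutter time accumulated per pointing over the survey.
hours_per_pointing = visits_per_pointing * exposure_s / 3600.0
print(f"integration per pointing ~ {hours_per_pointing:.1f} h (summed over all bands)")

# Number of ~9.6 deg^2 pointings needed to tile 20,000 deg^2 (ignoring overlaps).
print(f"pointings to tile the survey ~ {survey_area_deg2 / fov_deg2:.0f}")
```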

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg^2 field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg^2 with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode that will uniformly observe an 18,000 deg^2 region about 800 times (summed over all six bands) during the anticipated 10 years of operations, yielding a coadded map to r~27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world. Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
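    The single-visit and coadded depths quoted above can be loosely cross-checked by assuming pure √N noise scaling for coadds and an even split of the ~800 visits across the six bands; both assumptions are simplifications (the real cadence allocates visits unevenly), so the sketch below is only a rough consistency check.

```python
# Rough consistency check of the depths quoted above: if coadding N visits
# improves the 5-sigma point-source depth by 2.5*log10(sqrt(N)) magnitudes
# (pure sqrt(N) noise scaling), how deep does the r-band coadd get?
# The per-band visit count is an assumption (even split over six bands),
# not a number taken from the abstract.
import math

single_visit_depth_r = 24.5          # quoted 5-sigma single-visit depth in r (AB)
total_visits = 800                   # quoted visits per pointing, summed over bands
n_bands = 6
visits_r = total_visits / n_bands    # assumed even split; the real cadence differs

coadd_gain = 2.5 * math.log10(math.sqrt(visits_r))
coadd_depth_r = single_visit_depth_r + coadd_gain
print(f"~{visits_r:.0f} r-band visits -> coadded depth r ~ {coadd_depth_r:.1f}")
# Lands within a few tenths of a magnitude of the quoted r~27.5 coadded depth.
```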

    Guidelines for the use and interpretation of assays for monitoring autophagy (4th edition)

    In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to update the guidelines for monitoring autophagy in different organisms on a regular basis. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways, including apoptosis, not all of them can be used as specific markers for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.

    The IDENTIFY study: the investigation and detection of urological neoplasia in patients referred with suspected urinary tract cancer - a multicentre observational study

    Objective: To evaluate the contemporary prevalence of urinary tract cancer (bladder cancer, upper tract urothelial cancer [UTUC] and renal cancer) in patients referred to secondary care with haematuria, adjusted for established patient risk markers and geographical variation. Patients and Methods: This was an international multicentre prospective observational study. We included patients aged ≥16 years referred to secondary care with suspected urinary tract cancer. Patients with a known or previous urological malignancy were excluded. We estimated the prevalence of bladder cancer, UTUC, renal cancer and prostate cancer, stratified by age, type of haematuria, sex, and smoking. We used multivariable mixed-effects logistic regression to adjust cancer prevalence for age, type of haematuria, sex, smoking, hospital, and country. Results: Of the 11 059 patients assessed for eligibility, 10 896 were included from 110 hospitals across 26 countries. The overall adjusted cancer prevalence (n = 2257) was 28.2% (95% confidence interval [CI] 22.3–34.1): bladder cancer (n = 1951) 24.7% (95% CI 19.1–30.2), UTUC (n = 128) 1.14% (95% CI 0.77–1.52), renal cancer (n = 107) 1.05% (95% CI 0.80–1.29), and prostate cancer (n = 124) 1.75% (95% CI 1.32–2.18). The odds ratios for patient risk markers in the model for all cancers were: age 1.04 (95% CI 1.03–1.05; P < 0.001), visible haematuria 3.47 (95% CI 2.90–4.15; P < 0.001), male sex 1.30 (95% CI 1.14–1.50; P < 0.001), and smoking 2.70 (95% CI 2.30–3.18; P < 0.001). Conclusions: A better understanding of cancer prevalence across an international population is required to inform clinical guidelines. We are the first to report urinary tract cancer prevalence across an international population of patients referred to secondary care, adjusted for patient risk markers and geographical variation. Bladder cancer was the most prevalent disease, and visible haematuria was the strongest predictor of urinary tract cancer.
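    As a rough illustration of the modelling approach described above, the Python sketch below fits a logistic regression to simulated data and reports exponentiated coefficients as odds ratios with 95% confidence intervals. The data, column names, and coefficients are hypothetical, and the study's actual model was a multivariable mixed-effects logistic regression with hospital and country effects, which this simplified sketch omits.

```python
# Minimal sketch (hypothetical data, simplified model): odds ratios for patient
# risk markers from a logistic regression, in the spirit of the analysis above.
# The study itself fitted a mixed-effects model with hospital and country terms;
# those random effects are omitted here for brevity.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(65, 12, n),
    "visible_haematuria": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
})
# Hypothetical outcome generated with effects loosely shaped like those reported.
logit_p = -4.0 + 0.04 * df["age"] + 1.2 * df["visible_haematuria"] \
          + 0.26 * df["male"] + 1.0 * df["smoker"]
df["cancer"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("cancer ~ age + visible_haematuria + male + smoker", data=df).fit()
odds_ratios = np.exp(model.params)          # exponentiated coefficients = odds ratios
conf_int = np.exp(model.conf_int())         # 95% CIs on the odds-ratio scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```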