221 research outputs found

    Decreasing stream habitat for Greenback cutthroat trout under future climate projections in headwater streams of the southern Rocky Mountains, Colorado

    Get PDF
    2022 Summer. Includes bibliographical references. Headwaters are vital to the abundance and diversity of biota, as they provide a wide range of temperatures, light conditions, hydrologic regimes, water chemistry, substrate types, food resources, and species pools. Many studies have shown that headwater streams are especially vulnerable to a changing climate, and coldwater fish are especially sensitive to fluctuations in streamflow and water temperature during summertime low flows. Although previous studies have provided insight into how changes in climate and alterations in stream discharge may affect the habitat requirements of native cutthroat trout, suitable physical habitat has not been evaluated under future climate projections for the threatened Greenback Cutthroat Trout (GBCT) occupying the headwater regions of the Southern Rocky Mountains. This study therefore used field data collected in the summers of 2019 and 2020 from selected headwater streams across the Front Range of the Southern Rocky Mountains to construct one-dimensional hydraulic models (HEC-RAS) and evaluate streamflow and physical habitat under four future climate projections. A principal component analysis (PCA) was then performed to assess the importance of each morphological feature of these streams. Results show high variation in both predicted streamflow reductions and physical habitat across all future climate projections. The projected mean summer streamflow shows a much greater decline than the projected mean August flow. Moreover, sites located at higher elevations with larger substrate (D50 and D84) and steeper slopes may experience greater reductions in physical habitat under the mean-summer future climate projections. Future climate change studies on coldwater fisheries need to take multiple influential factors into account rather than focusing heavily on thermal characteristics. Reintroduction and management efforts for GBCT should be tailored to individual headwater streams and supported by adequate on-site monitoring so that they can be applied in a more holistic manner.
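    A minimal sketch of the PCA step mentioned above, assuming standardized site-by-feature data (this is illustrative code, not the study's; the feature names follow the abstract and every value is invented):

```python
# Illustrative only: PCA over hypothetical stream-morphology data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = study sites, columns = morphological features (invented values)
features = ["elevation_m", "slope", "D50_mm", "D84_mm"]
X = np.array([
    [2900, 0.04, 45, 120],
    [3100, 0.07, 90, 260],
    [2750, 0.02, 30, 80],
    [3300, 0.09, 110, 310],
    [2850, 0.03, 55, 150],
])

# Standardize so every feature contributes on a comparable scale
X_std = StandardScaler().fit_transform(X)

pca = PCA()
pca.fit(X_std)

# Loadings indicate how strongly each feature drives each component,
# one way to gauge the relative importance of the morphological features.
for i, var in enumerate(pca.explained_variance_ratio_):
    loadings = dict(zip(features, np.round(pca.components_[i], 2)))
    print(f"PC{i + 1}: {var:.1%} of variance, loadings: {loadings}")
```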

    Evaluating and correcting sensor change artifacts in the SNOTEL temperature records, southern Rocky Mountains, Colorado

    Get PDF
    2017 Summer. Includes bibliographical references. In many high-elevation mountain regions, documented warming rates have been greater than the global surface average. These warming rates directly affect the snowpack, runoff, ecosystems, agriculture, and species that rely on a high-elevation snowpack. Temperature records from the snow telemetry (SNOTEL) network across the Southern Rocky Mountains in the western United States show high warming rates, which may have been affected by systematic inhomogeneities in the temperature data caused by sensor changes. This study evaluates the maximum, average, and minimum temperature trends from 68 long-term SNOTEL stations across Colorado for the period from the 1980s through 2015, using non-parametric Mann-Kendall/Theil-Sen analyses, before and after the temperature records were corrected for the sensor-caused inhomogeneities. Three homogenization methods were tested using a simple temperature-index snow accumulation and melt model. Results show that the significant warming trends found in the original datasets, especially in minimum temperature (an average increase of 1.2 °C per decade), decreased (to an average of 0.5 °C per decade) after homogenization. Step-like shifts were observed in the SNOTEL temperature records at the time of the sensor change, creating discontinuities in the temperature datasets. The temperature-index snow model simulated snow water equivalent (SWE) well using the new-sensor temperature dataset, with more than 93% of the calibrated stations falling within the "good" and "very good" performance categories for all three statistical evaluation periods based on the Nash-Sutcliffe coefficient of efficiency (NSCE). However, the model did not perform as well when using the original (pre-sensor-change) and homogenized temperatures, with 23% of stations for the original temperature data and 44-69% of stations for the two homogenized temperature datasets falling within the "good" and "very good" categories. The homogenization methods improved the temperature data, but they did not fully correct for the effects of sensor change on the temperature records. The NSCE and bias statistics from SWE modeling using the original and homogenized datasets suggest that the homogenization methods evaluated in this study are applicable to many, but not all, of the SNOTEL stations in Colorado and need to be applied with caution. Potential users of temperature products from the SNOTEL network should also be careful when choosing time periods for future climate change research and assessments. More long-term climate monitoring stations should be installed in high-elevation mountain regions to document and investigate elevation-dependent warming.
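    A hedged sketch of the trend and model-evaluation statistics named above: a Theil-Sen slope with a Kendall-tau significance check (a simple form of the Mann-Kendall test) and the Nash-Sutcliffe coefficient used to score the SWE simulations. The temperature series is synthetic and none of this is the study's code:

```python
# Illustrative only: synthetic annual temperature series, not SNOTEL data.
import numpy as np
from scipy.stats import kendalltau, theilslopes

years = np.arange(1985, 2016)
temps = 0.05 * (years - years[0]) + np.random.default_rng(0).normal(0, 0.4, years.size)

slope, intercept, lo, hi = theilslopes(temps, years)   # degrees C per year
tau, p_value = kendalltau(years, temps)                # monotonic-trend test
print(f"Theil-Sen trend: {slope * 10:.2f} C/decade (Kendall p = {p_value:.3f})")

def nsce(simulated, observed):
    """Nash-Sutcliffe coefficient of efficiency; 1.0 means a perfect fit."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2
    )
```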

    Bridging Parametric and Nonparametric Methods in Cognitive Diagnosis

    Full text link
    A number of parametric and nonparametric methods for estimating cognitive diagnosis models (CDMs) have been developed and applied in a wide range of contexts. However, in the literature a wide chasm exists between these two families of methods, and their relationship to each other is not well understood. In this paper, we propose a unified estimation framework to bridge the divide between parametric and nonparametric methods in cognitive diagnosis and to better understand their relationship. We also develop iterative joint estimation algorithms and establish consistency properties within the proposed framework. Lastly, we present comprehensive simulation results to compare the different methods and provide practical recommendations on the appropriate use of the proposed framework in various CDM contexts.
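    Purely as an illustration of the nonparametric side of this literature (not necessarily the method proposed in the paper), the sketch below classifies an examinee by choosing the attribute profile whose DINA-style ideal response pattern has the smallest Hamming distance to the observed responses; the Q-matrix and responses are made up:

```python
# Illustrative nonparametric classification step for cognitive diagnosis.
import itertools
import numpy as np

Q = np.array([[1, 0], [0, 1], [1, 1]])   # 3 items x 2 attributes (hypothetical)
responses = np.array([1, 0, 0])          # one examinee's 0/1 answers (hypothetical)

def ideal_response(alpha, Q):
    # DINA-style ideal response: an item is answered correctly only if
    # all attributes required by the item are mastered.
    return np.all(Q <= alpha, axis=1).astype(int)

profiles = [np.array(a) for a in itertools.product([0, 1], repeat=Q.shape[1])]
distances = [np.sum(np.abs(responses - ideal_response(a, Q))) for a in profiles]
best = profiles[int(np.argmin(distances))]
print("Estimated attribute profile:", best)
```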

    A Note on Improving Variational Estimation for Multidimensional Item Response Theory

    Full text link
    Survey instruments and assessments are frequently used in many domains of social science. When the constructs that these assessments try to measure become multifaceted, multidimensional item response theory (MIRT) provides a unified framework and a convenient statistical tool for item analysis, calibration, and scoring. However, the computational challenge of estimating MIRT models hinders its wide use, because many of the extant methods can hardly provide results in a realistic time frame when the number of dimensions, the sample size, and the test length are large. Variational estimation methods, such as the Gaussian Variational Expectation-Maximization (GVEM) algorithm, have recently been proposed to solve this estimation challenge by providing a fast and accurate solution. However, results have shown that variational estimation methods may produce bias in the discrimination parameters during confirmatory model estimation. This note proposes an importance-weighted version of GVEM (IW-GVEM) to correct for such bias under MIRT models. We also use the adaptive moment estimation (Adam) method to update the learning rate for gradient descent automatically. Our simulations show that IW-GVEM can effectively correct the bias with a modest increase in computation time compared with GVEM. The proposed method may also shed light on improving variational estimation for other psychometric models.
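    A hedged sketch of the importance-weighting idea (assumptions only, not the paper's IW-GVEM implementation): several latent-trait draws from the variational posterior are combined through a log-mean of importance weights, which tightens a single-sample ELBO. The 2PL likelihood, item parameters, and responses below are toy values:

```python
# Illustrative importance-weighted bound for a unidimensional 2PL model.
import numpy as np

rng = np.random.default_rng(1)

def log_likelihood(theta, responses, a, b):
    """2PL log-likelihood of one examinee's responses for each sampled theta."""
    p = 1.0 / (1.0 + np.exp(-(a * theta[:, None] + b)))          # (S, items)
    return np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p), axis=1)

def iw_bound(responses, a, b, mu, sigma, n_samples=10):
    theta = rng.normal(mu, sigma, size=n_samples)                 # draws from q(theta)
    log_prior = -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)         # N(0, 1) prior
    log_q = -0.5 * ((theta - mu) / sigma) ** 2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)
    log_w = log_likelihood(theta, responses, a, b) + log_prior - log_q
    # log of the mean importance weight, computed stably (log-sum-exp trick)
    return np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()

responses = np.array([1, 0, 1, 1])          # toy item responses
a, b = np.ones(4), np.zeros(4)              # toy item parameters
print("Importance-weighted bound:", iw_bound(responses, a, b, mu=0.3, sigma=0.8))
```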

    Food supplementation as a conservation intervention: A framework and a case of helping threatened shorebirds at a refuelling site

    Get PDF
    Supplemental feeding to mitigate the effects of food shortages may in some cases provide critical help to species conservation. However, supplemental feeding may have both positive and negative effects on wildlife and the environment. A scientifically designed feeding project helps to achieve conservation targets and reduces adverse effects. Here, we summarize a three-step framework for food supplementation that we used in practice: (1) determining whether supplemental feeding is required; (2) designing and implementing a practical feeding scheme; and (3) evaluating the effectiveness of food supplementation. We supplemented food for great knots (Calidris tenuirostris), an endangered migratory shorebird, at a recently impoverished refuelling site (the Yalu Jiang estuary) in the Yellow Sea in spring 2018. The abundance of the great knots' staple food, Potamocorbula laevis, which had become very rare after 2012, was insufficient for the birds to refuel before their migratory flight to the breeding grounds. In our practical test, live P. laevis were collected from subtidal areas and transported to the intertidal area where great knots had foraged in earlier years. The supplemented areas attracted 48% of all the great knots present in the 200 km² study area. Nearly 90% of the supplemented food was consumed. Most great knots (>80%) foraged in the high-density supplementation zone, where the densities of P. laevis were restored to the levels naturally occurring in 2011–2012. There, food intake rates (mg AFDM/s) were 4.2 times those in the adjacent control zones. The framework and the feeding practice should help guide future supplemental feeding in a wide range of species.

    Context-aware Event Forecasting via Graph Disentanglement

    Full text link
    Event forecasting has been a demanding and challenging task throughout human history. It plays a pivotal role in crisis warning and disaster prevention across many aspects of society. The task of event forecasting aims to model relational and temporal patterns based on historical events and to forecast what will happen in the future. Most existing studies on event forecasting formulate it as a problem of link prediction on temporal event graphs. However, such a purely structural formulation suffers from two main limitations: 1) most events fall into general, high-level types in the event ontology, so they tend to be coarse-grained and offer little utility, which inevitably harms forecasting accuracy; and 2) events defined by a fixed ontology cannot retain out-of-ontology contextual information. To address these limitations, we propose a novel task of context-aware event forecasting that incorporates auxiliary contextual information. First, the categorical context provides supplementary fine-grained information for the coarse-grained events. Second, and more importantly, the context provides additional information about the specific situation and conditions, which can be crucial or even decisive for what will happen next. However, it is challenging to integrate context properly into the event forecasting framework, given the complex patterns of the multi-context scenario. To this end, we design a novel framework named Separation and Collaboration Graph Disentanglement (SeCoGD for short) for context-aware event forecasting. Since no dataset is available for this novel task, we construct three large-scale datasets based on GDELT. Experimental results demonstrate that our model outperforms a list of SOTA methods. Comment: KDD 2023, 9 pages, 7 figures, 4 tables.
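    A toy sketch, not SeCoGD: temporal event graphs are commonly stored as (subject, relation, object, timestamp) quadruples; here each quadruple also carries an illustrative context label, and forecasting is framed as ranking candidate objects for a (subject, relation) query. All events and the frequency-baseline scorer are invented:

```python
# Illustrative data structure and frequency baseline for context-aware forecasting.
from collections import Counter

events = [
    ("countryA", "make_statement", "countryB", 1, "trade dispute"),
    ("countryA", "impose_sanctions", "countryB", 2, "trade dispute"),
    ("countryA", "make_statement", "countryB", 3, "border incident"),
    ("countryA", "negotiate", "countryB", 4, "trade dispute"),
]

def forecast(subject, relation, context, history, until):
    """Rank candidate objects for a (subject, relation) query, up-weighting
    past events that share the query context."""
    scores = Counter()
    for s, r, o, t, ctx in history:
        if s == subject and r == relation and t < until:
            scores[o] += 2.0 if ctx == context else 1.0
    return scores.most_common()

print(forecast("countryA", "make_statement", "trade dispute", events, until=5))
```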

    Carbon Nanotube Coated Fibrous Tubes for Highly Stretchable Strain Sensors Having High Linearity

    Get PDF
    Strain sensors are currently limited by an inability to operate over large deformations or to exhibit linear responses to strain. Producing strain sensors that meet both criteria remains a particularly difficult challenge. In this work, the fabrication of a highly flexible strain sensor based on electrospun thermoplastic polyurethane (TPU) fibrous tubes comprising wavy, oriented fibers coated with carboxylated multiwall carbon nanotubes (CNTs) is described. By combining spraying and ultrasonic-assisted deposition, the amount of CNTs deposited on the electrospun TPU fibrous tube could reach 12 wt%, which can lead to the formation of an excellent conductive network with a high conductivity of 0.01 S/cm. The as-prepared strain sensors exhibited a wide strain-sensing range of 0–760% and, importantly, high linearity over the whole sensing range, while maintaining high sensitivity with a gauge factor (GF) of 57. Moreover, the strain sensors were capable of detecting a low strain of 2% and achieved a fast response time whilst retaining a high level of durability. The TPU/CNT fibrous-tube-based strain sensors were found capable of accurately monitoring both large and small human body motions. Additionally, the strain sensors exhibited a rapid response time (e.g., 45 ms) combined with reliable long-term stability and durability when subjected to 60 min of water washing. The strain sensors developed in this research were able to detect large and subtle human motions (e.g., bending of the finger, wrist, and knee, and swallowing). Consequently, this work provides an effective method for designing and manufacturing high-performance fiber-based wearable strain sensors that offer wide sensing ranges and high linearity across the working strain range.
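    A worked sketch with made-up resistance readings: the gauge factor (GF) reported above is the relative resistance change divided by the applied strain, and a highly linear sensor keeps GF roughly constant across its sensing range:

```python
# Illustrative gauge-factor calculation with invented resistance values.
def gauge_factor(r0_ohm, r_ohm, strain):
    """GF = (delta_R / R0) / strain, with strain as a fraction (7.6 means 760%)."""
    return ((r_ohm - r0_ohm) / r0_ohm) / strain

# Hypothetical readings for a perfectly linear sensor with GF near 57
print(gauge_factor(r0_ohm=1000.0, r_ohm=1000.0 * (1 + 57 * 0.5), strain=0.5))   # 57.0
print(gauge_factor(r0_ohm=1000.0, r_ohm=1000.0 * (1 + 57 * 7.6), strain=7.6))   # 57.0
```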

    Application of the improved Dynamical–Statistical–Analog ensemble forecast model for landfalling typhoon precipitation in Fujian Province

    Get PDF
    The forecasting performance of the Dynamical–Statistical–Analog Ensemble Forecast (DSAEF) model for Landfalling Typhoon [or tropical cyclone (TC)] Precipitation (DSAEF_LTP), with new values of two parameters (i.e., the similarity region and the ensemble method) for landfalling TC precipitation over Fujian Province, is tested in four experiments. Forty-two TCs with precipitation over 100 mm in Fujian Province during 2004–2020 are chosen as experimental samples; thirty of them are training samples and twelve are independent samples. First, simulation experiments on the training samples are used to determine the best scheme of the DSAEF_LTP model. Then, the forecasting performance of this best scheme is evaluated through forecast experiments. In the forecast experiments, the TSsum (the sum of the threat scores for predicting TC accumulated rainfall of ≥250 mm and ≥100 mm) of experiments DSAEF_A, B, C, and D is 0.0974, 0.2615, 0.2496, and 0.4153, respectively. The results show that the DSAEF_LTP model performs best when new values of both the similarity region and the ensemble method are added (DSAEF_D). At the same time, the TSsum of the best-performing numerical weather prediction (NWP) model is only 0.2403, so the improved DSAEF_LTP model shows advantages over the NWP models. Adopting different schemes in different regions is an important way to improve the forecast performance of the DSAEF_LTP model.
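    A hedged sketch of how TSsum is assembled (the contingency counts are invented): the threat score is conventionally hits / (hits + misses + false alarms), and TSsum adds the scores for the ≥100 mm and ≥250 mm accumulated-rainfall thresholds:

```python
# Illustrative threat-score arithmetic with invented contingency counts.
def threat_score(hits, misses, false_alarms):
    """Threat score (critical success index): hits / (hits + misses + false alarms)."""
    denom = hits + misses + false_alarms
    return hits / denom if denom else 0.0

# Hypothetical counts for the two rainfall thresholds
ts_100 = threat_score(hits=18, misses=20, false_alarms=25)
ts_250 = threat_score(hits=3, misses=9, false_alarms=10)
print(f"TSsum = {ts_100 + ts_250:.4f}")
```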