
    Sensitivity of snow models to the accuracy of meteorological forcings in mountain environments

    Snow models are usually evaluated at sites providing high-quality meteorological data, so that the uncertainty in the meteorological input data can be neglected when assessing model performance. However, high-quality input data are rarely available in mountain areas and, in practical applications, the meteorological forcing used to drive snow models is typically derived from spatial interpolation of the available in situ data or from reanalyses, whose accuracy can be considerably lower. In order to fully characterize the performance of a snow model, the model sensitivity to errors in the input data should be quantified. In this study we test the ability of six snow models to reproduce snow water equivalent, snow density and snow depth when they are forced by meteorological input data of gradually lower accuracy. The SNOWPACK, GEOTOP, HTESSEL, UTOPIA, SMASH and S3M snow models are forced, first, with high-quality measurements performed at the experimental site of Torgnon, located at 2160 m a.s.l. in the Italian Alps (control run). Then, the models are forced by data at gradually lower temporal and/or spatial resolution, obtained by (i) sampling the original Torgnon 30 min time series at 3, 6 and 12 h, (ii) spatially interpolating neighbouring in situ station measurements and (iii) extracting information from the GLDAS, ERA5 and ERA-Interim reanalyses at the grid point closest to the Torgnon site. Since the selected models are characterized by different degrees of complexity, from highly sophisticated multi-layer snow models to simple, empirical, single-layer snow schemes, we also discuss the results of these experiments in relation to model complexity. The results show that, when forced by accurate 30 min resolution weather station data, the single-layer, intermediate-complexity snow models HTESSEL and UTOPIA provide skills similar to the more sophisticated multi-layer model SNOWPACK, and these three models show better agreement with observations and more robust performance across seasons than the lower-complexity models SMASH and S3M. All models forced by 3-hourly data provide skills similar to the control run, while the use of 6- and 12-hourly temporal resolution forcings may lead to a reduction in model performance if the incoming shortwave radiation is not properly represented. The SMASH model generally shows low sensitivity to the temporal degradation of the input data. Spatially interpolated data from neighbouring stations and reanalyses are found to be adequate forcings, provided that the temperature and precipitation variables are not affected by large biases over the considered period. However, a simple bias-adjustment technique applied to ERA-Interim temperatures allowed all models to achieve performance similar to the control run. Regardless of their complexity, all models show weaknesses in the representation of snow density.
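
    The temporal-degradation and bias-adjustment steps described above can be sketched in a few lines of code. The snippet below is a hypothetical illustration rather than the study's actual processing chain: the file name, the column names (tair, swin, precip) and the simple mean-bias correction are assumptions.

```python
# Hypothetical sketch: degrade a 30 min forcing series to coarser time steps
# and remove the mean temperature bias of a reanalysis series.
# File name and column names are assumptions, not the study's actual data.
import pandas as pd

# 30 min station forcing with air temperature, incoming shortwave radiation
# and precipitation columns, indexed by timestamp.
forcing = pd.read_csv("torgnon_30min.csv", parse_dates=["time"], index_col="time")

def degrade(forcing: pd.DataFrame, step: str) -> pd.DataFrame:
    """Resample the forcing to a coarser time step.

    State-like variables (temperature, radiation) are averaged over the
    window, while precipitation is accumulated so water input is conserved.
    """
    return pd.DataFrame({
        "tair": forcing["tair"].resample(step).mean(),
        "swin": forcing["swin"].resample(step).mean(),
        "precip": forcing["precip"].resample(step).sum(),
    })

# Forcings at 3, 6 and 12 h used to re-run each snow model.
degraded = {step: degrade(forcing, step) for step in ("3h", "6h", "12h")}

def bias_adjust(reanalysis_t: pd.Series, station_t: pd.Series) -> pd.Series:
    """Remove the mean temperature bias of a reanalysis series relative to
    the station record over the overlapping period."""
    common = reanalysis_t.index.intersection(station_t.index)
    bias = (reanalysis_t.loc[common] - station_t.loc[common]).mean()
    return reanalysis_t - bias
```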

    Multiple Objective D-Optimal Sensor Management for Group Target Tracking

    A group target is moving in an area well covered by a network of passive sensor nodes with known positions. Additionally, there are a number of mobile robots with active sensors. In order to obtain a robust estimate of the position of the target and to decrease the amount of energy spent on active sensing and communications by the sensor network and the mobile robots, a sensor management system optimises the spatial configuration of the mobile robots over time. A tracking algorithm predicts the position of the target over multiple steps. An estimate of the tracking accuracy for each possible sensor action is calculated from a function of the expected resulting posterior inverse covariance (information) matrix, given the positions of the nodes of the sensor network and the feasible positions of the mobile robots at future time instants. We propose a novel approach to active sensor management that combines a Rao-Blackwellised particle filter/predictor with multi-objective D-optimal optimisation. The designed decentralised Rao-Blackwellised particle filter (RBPF) is composed of two parts: a decentralised information or Kalman filter and a particle filter (PF). A sensor management framework based on generalised D-optimal optimisation with slack variables is proposed.
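
    The D-optimal selection step can be illustrated with a minimal sketch: for each candidate robot position, the expected posterior information matrix is the predicted (prior) information plus the measurement information contributed from that position, and the candidate with the largest log-determinant is chosen. The bearing-only measurement model, noise variance and numbers below are illustrative assumptions, not the paper's actual filter or its formulation with slack variables.

```python
# Minimal sketch of a D-optimality criterion for choosing a sensor position:
# expected posterior information = prior information + H^T R^-1 H, and the
# candidate maximising its log-determinant wins. The bearing-only model and
# all numbers are illustrative assumptions.
import numpy as np

def bearing_jacobian(sensor_xy: np.ndarray, target_xy: np.ndarray) -> np.ndarray:
    """Jacobian of a bearing-only measurement w.r.t. the 2-D target position."""
    dx, dy = target_xy - sensor_xy
    r2 = dx**2 + dy**2
    return np.array([[-dy / r2, dx / r2]])

def d_optimal_choice(prior_info: np.ndarray,
                     predicted_target: np.ndarray,
                     candidates: list[np.ndarray],
                     meas_var: float = 0.01) -> int:
    """Index of the candidate position with the largest log-determinant of the
    expected posterior information matrix."""
    scores = []
    for pos in candidates:
        H = bearing_jacobian(pos, predicted_target)
        info = prior_info + H.T @ H / meas_var   # H^T R^-1 H with scalar R
        scores.append(np.linalg.slogdet(info)[1])
    return int(np.argmax(scores))

prior_info = np.diag([0.5, 0.5])                 # predicted target information
predicted_target = np.array([10.0, 5.0])         # predicted target position (m)
candidates = [np.array([0.0, 0.0]), np.array([8.0, 0.0]), np.array([12.0, 12.0])]
best = d_optimal_choice(prior_info, predicted_target, candidates)
print("best candidate position:", candidates[best])
```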

    THE STUDY OF ROAD TRANSPORT IN LOGISTICS CHAINS

    The article considers the efficiency of road transport operation in a logistics chain. On the basis of a mathematical model of the functioning of the logistics chain, a number of studies are carried out on the influence of the characteristics of the transport participant (using road transport as an example) on the efficiency of the operation of the other participants and of the chain as a whole.

    Coastal Risk Assessment Framework: Comparison of modelled fluvial and marine inundation impacts, Bocca di Magra, Ligurian coast, Italy

    The identification and classification of critical coastal areas is becoming increasingly important from a coastal management point of view, especially considering future climate change. The standardized assessment of multiple hazards and their potential impacts is crucial, in terms of risk management, for those coastal areas where both marine and fluvial hazards can occur. Nevertheless, in Bocca di Magra (Liguria Region, Italy), where both coastal and fluvial flooding can occur, the potential impacts of marine flooding have until now not been considered important; only the impact of fluvial flooding has been systematically analysed. Now, however, the Liguria Region stakeholders have become interested in understanding the potential impact of marine inundation compared to fluvial inundation, applying the CRAF (Coastal Risk Assessment Framework) methodology developed within the RISC-KIT project. Hazard modelling of coastal and fluvial inundation was used, together with exposure data, to evaluate the direct and systemic impacts generated by each flooding mechanism separately. An End-User-driven Multi-Criteria Analysis was implemented to compare coastal and fluvial impacts over the same area. For an event with a 1-in-200-year return period, the CRAF predicts that fluvial inundation generates higher impacts than the marine one. Even though the impacts in the coastal area are smaller, the exposed elements affected are different from those affected by fluvial inundation, and none of them can be excluded from the analysis. This work highlights the need for regional managers to develop combined coastal-fluvial flooding assessments; such assessments should be seen as a priority for flood disaster risk management in locations affected by both marine flooding and riverine flash flooding.
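
    A weighted-sum sketch of the kind of end-user-driven multi-criteria comparison described above is shown below; the criteria, weights and normalised impact scores are illustrative assumptions, not the CRAF values computed for Bocca di Magra.

```python
# Hypothetical weighted-sum multi-criteria comparison of flooding mechanisms.
# Criteria, weights and scores are illustrative assumptions, not RISC-KIT data.

# Stakeholder weights per impact criterion (sum to 1).
weights = {"people_affected": 0.4, "buildings": 0.3,
           "infrastructure": 0.2, "ecosystems": 0.1}

# Normalised impact scores (0-1) per mechanism for a 1-in-200-year event.
impacts = {
    "fluvial": {"people_affected": 0.8, "buildings": 0.7,
                "infrastructure": 0.6, "ecosystems": 0.3},
    "marine":  {"people_affected": 0.3, "buildings": 0.2,
                "infrastructure": 0.4, "ecosystems": 0.5},
}

def mca_index(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted-sum multi-criteria index for one flooding mechanism."""
    return sum(weights[c] * scores[c] for c in weights)

for mechanism, scores in impacts.items():
    print(f"{mechanism}: {mca_index(scores, weights):.2f}")
```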