Voice quality estimation in combined radio-VoIP networks for dispatching systems
Voice quality modelling, assessment and planning are well mastered, both theoretically and practically, for common voice communication systems, especially for public fixed and mobile telephone networks, including Next Generation Networks (NGN, internet-protocol-based networks). This article contributes to voice quality modelling, assessment and planning for dispatching communication systems based on Internet Protocol (IP) and private radio networks. The network plan, a correction to the E-model calculation and default values for the model are presented and discussed.
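The E-model referenced above (ITU-T G.107) summarises transmission impairments in a rating factor R, which maps to an estimated mean opinion score (MOS) by a standard formula. A minimal sketch of that standard mapping follows; the article's dispatching-specific corrections and default values are not reproduced here.

```python
def r_to_mos(r: float) -> float:
    """Convert an E-model transmission rating factor R to an
    estimated mean opinion score (MOS), per ITU-T G.107."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + r * (r - 60.0) * (100.0 - r) * 7e-6

# A high-quality connection with R = 93.2 maps to a MOS near 4.4.
print(round(r_to_mos(93.2), 2))
```

The clamping at R = 0 and R = 100 reflects the bounded MOS scale (1.0 to 4.5) used by the E-model.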
Choosing the observational likelihood in state-space stock assessment models
Data used in stock assessment models result from combinations of biological,
ecological, fishery, and sampling processes. Since different types of errors
propagate through these processes it can be difficult to identify a particular
family of distributions for modelling errors on observations a priori. By
implementing several observational likelihoods, modelling both numbers- and
proportions-at-age, in an age based state-space stock assessment model, we
compare the model fit for each choice of likelihood along with the implications
for spawning stock biomass and average fishing mortality. We propose using AIC
intervals based on fitting the full observational model for comparing different
observational likelihoods. Using data from four stocks, we show that the model
fit is improved by modelling the correlation of observations within years.
However, the best choice of observational likelihood differs for different
stocks, and the choice is important for the short-term conclusions drawn from
the assessment model; in particular, the choice can influence total allowable
catch advice based on reference points.

Comment: To be published in the Canadian Journal of Fisheries and Aquatic Sciences.
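The comparison the abstract describes (fitting the same model under several observational likelihoods and ranking by AIC) can be sketched minimally as follows; the log-likelihoods and parameter counts are invented for illustration, not results for the paper's four stocks.

```python
def aic(log_lik: float, n_params: int) -> float:
    """Akaike information criterion: lower is better."""
    return 2 * n_params - 2 * log_lik

# Hypothetical fits of the same state-space assessment model under
# two observational likelihoods (numbers are illustrative only).
fits = {
    "independent lognormal": (-312.4, 8),
    "within-year correlated multivariate normal": (-301.7, 10),
}
scores = {name: aic(ll, k) for name, (ll, k) in fits.items()}
best = min(scores, key=scores.get)
print(best, round(scores[best], 1))
```

Here the correlated likelihood wins despite its extra parameters, mirroring the abstract's finding that modelling within-year correlation improves the fit.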
Architecture of Environmental Risk Modelling: for a faster and more robust response to natural disasters
Demands on the disaster response capacity of the European Union are likely to
increase, as the impacts of disasters continue to grow both in size and
frequency. This has resulted in intensive research on issues concerning
spatially-explicit information and modelling and their multiple sources of
uncertainty. Geospatial support is one of the forms of assistance frequently
required by emergency response centres along with hazard forecast and event
management assessment. Robust modelling of natural hazards requires dynamic
simulations under an array of multiple inputs from different sources.
Uncertainty is associated with meteorological forecast and calibration of the
model parameters. Software uncertainty also derives from the data
transformation models (D-TM) needed for predicting hazard behaviour and its
consequences. On the other hand, social contributions have recently been
recognized as valuable in raw-data collection and mapping efforts traditionally
dominated by professional organizations. Here an architecture overview is
proposed for adaptive and robust modelling of natural hazards, following the
Semantic Array Programming paradigm to also include the distributed array of
social contributors called Citizen Sensor in a semantically-enhanced strategy
for D-TM modelling. The modelling architecture proposes a multicriteria
approach for assessing the array of potential impacts with qualitative rapid
assessment methods based on a Partial Open Loop Feedback Control (POLFC) schema
and complementing more traditional and accurate a-posteriori assessment. We
discuss the computational aspect of environmental risk modelling using
array-based parallel paradigms on High Performance Computing (HPC) platforms,
in order for the implications of urgency to be introduced into the systems
(Urgent-HPC).

Comment: 12 pages, 1 figure, 1 text box; presented at the 3rd Conference of Computational Interdisciplinary Sciences (CCIS 2014), Asuncion, Paraguay.
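Array-based formulations like the one the abstract describes vectorise over the whole spatial grid instead of looping cell by cell, which is what makes them amenable to HPC parallelism. A toy illustration with NumPy; the hazard and exposure layers are invented.

```python
import numpy as np

# Toy raster layers for a study area: hazard intensity (0..1) and
# exposed asset value per grid cell (values invented for illustration).
hazard = np.array([[0.1, 0.8],
                   [0.5, 0.9]])
exposure = np.array([[100.0, 250.0],
                     [80.0, 300.0]])

# Array programming: the impact map is one vectorised expression,
# evaluated for every cell at once and trivially parallelisable.
impact = hazard * exposure
print(float(impact.sum()))
```

A real implementation would chain many such data-transformation modules (D-TM) with semantic checks on array shapes and units, per the Semantic Array Programming paradigm the abstract names.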
Risk Assessment of Bioaccumulation Substances. Part II: Description of a Model Framework
This report provides a proposal for a framework for risk assessment of bioaccumulative substances, either from produced water discharges or present as background contamination. The proposed framework is compatible with the current EIF risk assessment models used in the Norwegian offshore oil and gas industry. The risk assessment approach selected for this framework is based on the use of critical body residues (CBR), i.e., body-tissue concentrations above which adverse effects are expected. A three-tiered risk assessment approach is distinguished: tier 1 for worst-case screening purposes; tier 2, based on probabilistic risk assessment using species sensitivity distributions; and tier 3, focusing on population modelling for specific species. The latter tier is, because of its specific characteristics, not elaborated in detail. It is proposed to use a food-chain accumulation model to translate species sensitivity thresholds based on CBR into external threshold concentrations; those external thresholds could then be used either to derive an ecosystem PNEC (tier 1) or a Species Sensitivity Distribution (tier 2). This would provide a pragmatic approach to risk assessment of bioaccumulative substances in the context of the EIF modelling framework. Finally, an outline is provided for a research project in which a risk assessment model for bioaccumulative substances is developed. This model will then be applied to two cases for purposes of demonstration and evaluation. An indication of workload and planning is provided.
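In its simplest form, the translation the report proposes rests on steady-state bioconcentration: body residue equals external concentration times a bioconcentration factor, so a critical body residue maps to an external threshold by division. The function and numbers below are an illustrative sketch, not the report's food-chain parameterisation.

```python
def external_threshold(cbr_mg_per_kg: float, bcf_l_per_kg: float) -> float:
    """Translate a critical body residue (CBR) into the external water
    concentration (mg/L) at which that residue is reached, assuming
    steady-state bioconcentration: C_body = BCF * C_water."""
    return cbr_mg_per_kg / bcf_l_per_kg

# Illustrative values: CBR of 50 mg/kg and BCF of 2000 L/kg.
print(external_threshold(50.0, 2000.0))  # 0.025 mg/L
```

A full food-chain model would chain such factors across trophic levels; per-species thresholds derived this way feed tier 1 (PNEC) or tier 2 (species sensitivity distribution).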
An Integrated Assessment approach to linking biophysical modelling and economic valuation tools
Natural resource management (NRM) typically involves complex decisions that affect a variety of stakeholder values. Efficient NRM, which achieves the greatest net environmental, social and financial benefits, needs to integrate the assessment of environmental impacts with the costs and benefits of investment. Integrated assessment (IA) is one approach that incorporates the several dimensions of catchment NRM by considering multiple issues and knowledge from various disciplines and stakeholders. Despite the need for IA, there are few studies that integrate biophysical modelling tools with economic valuation. In this paper, we demonstrate how economic non-market valuation tools can be used to support an IA of catchment NRM changes. We develop a Bayesian Network model that integrates: a process-based water quality model; ecological assessments of native riparian vegetation; estimates of management costs; and non-market (intangible) values of changes in riparian vegetation. This modelling approach illustrates how information from different sources can be integrated in one framework to evaluate the environmental and economic impacts of NRM actions. It also shows the uncertainties associated with the estimated welfare effects. By estimating the marginal social costs and benefits, a cost-benefit analysis of alternative management interventions can be performed, providing more economic rationality to NRM decisions.

Keywords: Bayesian networks, bio-economic modelling, catchment management, cost-benefit analysis, environmental values, integrated assessment and modelling, non-market valuation, riparian vegetation, Environmental Economics and Policy, Research Methods/Statistical Methods.
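The core of the cost-benefit step is marginalising an uncertain ecological outcome against management cost. The toy calculation below stands in for the paper's Bayesian Network; the outcome probabilities and non-market dollar values are invented for illustration.

```python
# Expected net benefit of a hypothetical riparian revegetation action,
# marginalising over an uncertain ecological outcome (toy numbers).
outcomes = {
    # outcome: (probability, non-market benefit in $)
    "vegetation recovers": (0.6, 120_000),
    "partial recovery":    (0.3, 40_000),
    "no recovery":         (0.1, 0),
}
management_cost = 50_000

expected_benefit = sum(p * v for p, v in outcomes.values())
expected_net = expected_benefit - management_cost
print(expected_net)
```

A Bayesian Network generalises this by propagating such probabilities through linked biophysical, ecological and economic nodes, so the welfare estimate carries its uncertainty with it.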
The Integration of Coastal Flooding into an ArcFLOOD Data Model
With the impact of global climate change, the speedy, intelligent and accessible dissemination of coastal flood predictions from a number of modelling tools, at a range of temporal and spatial scales, becomes increasingly important for policy decision makers. This thesis provides a novel approach that integrates coastal flood data into an ArcFLOOD data model to improve the analysis, assessment and mitigation of potential flood risk in coastal zones. This methodology has improved the accessibility, dissemination and visualisation of coastal flood risk. The results were condensed into spatial information flows, data model schematic diagrams and an XML schema for end-user extension, customisation and spatial analysis. More importantly, software developers can now use these applications to build rich internet applications with little knowledge of numerical flood modelling systems. Specifically, this work has developed a coastal flooding geodatabase based upon the amalgamation, reconditioning and analysis of numerical flood modelling.
In this research, a distinct lack of Geographic Information Systems (GIS) data modelling for coastal flooding prediction was identified in the literature. A schema was developed to provide the linkage between numerical flood modelling, flood risk assessment and information technology (IT) by extending the ESRI ArcGIS Marine Data Model (MDM) to include coastal flooding. The results of a linked hybrid hydrodynamic-morphological numerical flood model were used to define the time-series representation of a coastal flood in the schema.
The results generated from GIS spatial analyses have improved the interpretation of numerical flood modelling output by effectively mapping the flood risk in the study site, with an improved definition according to the time-series duration of a flood. The improved results include flood water depth at a point and flood water increase, which equates to the difference in significant wave height for each time step of coastal flooding. The flood risk mapping has indicated the potential risk to infrastructure and property and depicted the failure of flood defence structures. In the wider context, the results have been provided to allow knowledge transfer to a range of coastal flooding end-users.

Natural Environment Research Council.
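The per-time-step "flood water increase" described above is a first difference over the model's time series at a point. A minimal illustration; the depth values are invented, not model output.

```python
# Water depth at one grid point for successive time steps of a
# hydrodynamic model run (metres; values invented for illustration).
depths = [0.0, 0.4, 1.1, 1.6, 1.5]

# Flood water increase per time step: difference between consecutive
# time steps; a negative value indicates receding water.
increase = [b - a for a, b in zip(depths, depths[1:])]
print(increase)
```

Storing this derived field alongside the depth series is what lets the geodatabase map rising versus receding phases of an event.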
Don't know, can't know: Embracing deeper uncertainties when analysing risks
This article is available open access through the publisher's website at the link below. Copyright @ 2011 The Royal Society.

Numerous types of uncertainty arise when using formal models in the analysis of risks. Uncertainty is best seen as a relation, allowing a clear separation of the object, source and 'owner' of the uncertainty, and we argue that all expressions of uncertainty are constructed from judgements based on possibly inadequate assumptions, and are therefore contingent. We consider a five-level structure for assessing and communicating uncertainties, distinguishing three within-model levels (event, parameter and model uncertainty) and two extra-model levels concerning acknowledged and unknown inadequacies in the modelling process, including possible disagreements about the framing of the problem. We consider the forms of expression of uncertainty within the five levels, providing numerous examples of the way in which inadequacies in understanding are handled, and examining criticisms of the attempts taken by the Intergovernmental Panel on Climate Change to separate the likelihood of events from the confidence in the science. Expressing our confidence in the adequacy of the modelling process requires an assessment of the quality of the underlying evidence, and we draw on a scale that is widely used within evidence-based medicine. We conclude that the contingent nature of risk modelling needs to be explicitly acknowledged in advice given to policy-makers, and that unconditional expressions of uncertainty remain an aspiration.
Applying Bayesian model averaging for uncertainty estimation of input data in energy modelling
Background
Energy scenarios that are used for policy advice have ecological and social impacts on society. Policy measures that are based on modelling exercises may lead to far-reaching financial and ecological consequences. The purpose of this study is to raise awareness that energy modelling results are accompanied by uncertainties that should be addressed explicitly.
Methods
In view of existing approaches to uncertainty assessment in energy economics and climate science, relevant requirements for an uncertainty assessment are defined: it should be explicit, independent of the assessor's expertise, applicable to different models, inclusive of both subjective quantitative and statistical quantitative aspects, intuitively understandable, and reproducible. Bayesian model averaging for input variables of energy models is discussed as a method that satisfies these requirements. A definition of uncertainty based on posterior model probabilities of input variables to energy models is presented.
Results
The main findings are that (1) expert elicitation, as the predominant assessment method, does not satisfy all requirements; (2) Bayesian model averaging for input variable modelling meets the requirements and allows evaluating a vast number of potentially relevant influences on input variables; and (3) posterior model probabilities of input variable models can be translated into the uncertainty associated with the input variable.
Conclusions
An uncertainty assessment of energy scenarios is relevant if policy measures are (partially) based on modelling exercises. Potential implications of these findings include that energy scenarios may carry uncertainty that is presently neither assessed explicitly nor communicated adequately.
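The conclusions above can be sketched as a small calculation: posterior model probabilities are derived from how well each candidate input-variable model fits (here approximated from BIC, under equal prior model probabilities), and the spread of the model-averaged forecast is the uncertainty measure. All numbers are invented for illustration.

```python
import math

def posterior_model_probs(bics):
    """Approximate posterior model probabilities from BIC values,
    assuming equal prior model probabilities.
    exp(-BIC/2) is proportional to the marginal likelihood."""
    rel = [math.exp(-(b - min(bics)) / 2) for b in bics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical candidate models for one energy-model input variable
# (e.g. a fuel price), summarised as (BIC, point forecast):
candidates = [(210.3, 42.0), (212.1, 47.5), (215.8, 39.0)]

probs = posterior_model_probs([b for b, _ in candidates])
bma_mean = sum(p * f for p, (_, f) in zip(probs, candidates))
# Between-model variance of the forecasts: the uncertainty attached
# to the input variable.
bma_var = sum(p * (f - bma_mean) ** 2 for p, (_, f) in zip(probs, candidates))
print(round(bma_mean, 2), round(bma_var, 2))
```

Unlike a single expert guess, this attaches a reproducible, data-driven uncertainty to the input variable, which is the study's central point.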
Maintenance strategy optimisation for infrastructure assets through cost modelling
In infrastructure asset management, cost modelling is normally adopted to support maintenance strategies with two broad strategic objectives: to ensure that sufficient funding is available to maintain the portfolio of assets, and to ensure that a minimum cost is achieved while maintaining safety. The data and information required for carrying out cost modelling are often insufficient in quantity and quality. Even when data are available, the uncertainty associated with the data and with the assessment of the assets' condition remains a challenge. We report in this paper that cost modelling can be carried out at the initial stage instead of being delayed due to data insufficiency. Subjective expert knowledge is elicited and utilised together with information gathered for only a small sample of assets. Linear Bayes methods are adopted to combine the sample data with the subjective expert knowledge to estimate unknown parameters of the cost model. We use a case study from the rail industry to demonstrate the proposed methods. The assets are metal girders on bridges operated by a rail company. The optimal maintenance strategy is obtained via simulation based on the estimated model parameters.
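The combination of elicited expert knowledge with a small sample can be illustrated with the simplest conjugate case: a precision-weighted normal update, where the posterior mean sits between the expert prior and the sample mean according to their relative precisions. The numbers are invented; the paper's actual linear Bayes cost model is not reproduced here.

```python
def linear_bayes_update(prior_mean, prior_var, sample_mean, sample_var, n):
    """Precision-weighted combination of elicited expert knowledge
    (prior) with a small sample of inspection data (conjugate
    normal sketch)."""
    prior_prec = 1.0 / prior_var
    data_prec = n / sample_var
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * sample_mean)
    return post_mean, post_var

# Illustrative: expert-elicited repair cost with mean 10.0 and
# variance 4.0, updated with a sample of 5 inspected assets that
# have mean 14.0 and variance 9.0 (units arbitrary).
mean, var = linear_bayes_update(10.0, 4.0, 14.0, 9.0, 5)
print(round(mean, 2), round(var, 2))
```

Even five observations pull the estimate most of the way toward the data while shrinking the variance, which is why modelling need not wait for a full data set.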
A Crash Risk Assessment Model for Road Curves
A comprehensive model to assess crash risks and reduce drivers' exposure to risk on road curves is still unavailable. We aim to create a model that can assist a driver to negotiate road curves safely. The overall model uses situation awareness, ubiquitous data mining and driver behaviour modelling concepts to assess crash risks on road curves; however, only the risk assessment model, which is part of the overall model, is presented in this paper. Crash risks are assessed using the predictions and a risk assessment scale that is created based on driver behaviours on road curves. This paper identifies the contributing factors from which we assess the crash risk level. Five risk levels are defined, and the contributing factors for each crash risk level are used to determine risk. The contributing factors are identified from a set of insurance crash records using link analysis; these factors are then compared with the actual factors of the driving context in order to determine the risk level.
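Mapping observed contributing factors to one of five risk levels can be sketched as a weighted lookup. The factor names, weights and thresholds below are invented for illustration; the paper derives its factors from insurance crash records via link analysis.

```python
# Toy mapping from contributing factors observed in the driving
# context to one of five crash-risk levels (weights and bucket
# thresholds invented for illustration).
FACTOR_WEIGHTS = {
    "wet_road": 2,
    "tight_curve_radius": 3,
    "excess_speed": 3,
    "night_time": 1,
}
LEVELS = ["negligible", "low", "moderate", "high", "extreme"]

def risk_level(observed_factors):
    """Score the observed factors and bucket the score into one of
    the five defined risk levels."""
    score = sum(FACTOR_WEIGHTS.get(f, 0) for f in observed_factors)
    index = min(score // 2, len(LEVELS) - 1)
    return LEVELS[index]

print(risk_level(["wet_road", "excess_speed"]))
```

In the full system this lookup would run continuously against the sensed driving context, comparing it with the factors associated with each risk level.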