A New Weighting Scheme in Weighted Markov Model for Predicting the Probability of Drought Episodes
Drought is a complex stochastic natural hazard caused by a prolonged shortage
of rainfall. Several environmental factors are involved in determining drought
classes at a specific monitoring station. Therefore, efficient sequence-processing
techniques are required to explore and predict the periodic
information about the various episodes of drought classes. In this study, we
propose a new weighting scheme to predict the probability of various drought
classes under the Weighted Markov Chain (WMC) model. We provide a standardized
scheme of weights for ordinal sequences of drought classifications by
normalizing the squared weighted Cohen's kappa. Illustrations of the proposed scheme
are given using temporal ordinal data on drought classes determined by
the standardized precipitation temperature index (SPTI). Experimental results
show that the proposed weighting scheme for the WMC model is sufficiently flexible
to address actual changes in drought classifications by restructuring the
transient behavior of the Markov chain. In summary, this paper proposes a new
weighting scheme to improve the accuracy of the WMC, specifically in the field
of hydrology.
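The prediction mechanics of a weighted Markov chain can be sketched generically. The sketch below is a minimal Python illustration, assuming drought classes coded as integers 0..n_states-1 and a generic weight vector; the paper's actual weights come from its normalized squared weighted Cohen's kappa scheme, which is not reproduced here.

```python
import numpy as np

def transition_matrix(seq, n_states, lag=1):
    """Empirical lag-k transition matrix from an ordinal state sequence."""
    P = np.zeros((n_states, n_states))
    for i, j in zip(seq[:-lag], seq[lag:]):
        P[i, j] += 1
    row_sums = P.sum(axis=1, keepdims=True)
    # Rows with no observations fall back to a uniform distribution.
    return np.divide(P, row_sums, out=np.full_like(P, 1.0 / n_states),
                     where=row_sums > 0)

def wmc_predict(seq, n_states, weights):
    """Weighted Markov chain forecast: blend lag-k transition rows, one per weight."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalize the weight vector
    probs = np.zeros(n_states)
    for k, wk in enumerate(w, start=1):
        P_k = transition_matrix(seq, n_states, lag=k)
        probs += wk * P_k[seq[-k]]       # row conditioned on the state seen k steps back
    return probs
```

A forecast for the next period blends the transition rows conditioned on the last `len(weights)` observed states, which is what lets a reweighting scheme restructure the chain's transient behavior.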
Forecasting in Mathematics
Mathematical probability and statistics are an attractive, thriving, and respectable part of mathematics. Some mathematicians and philosophers of science say they are the gateway to mathematics' deepest mysteries. Moreover, mathematical statistics denotes an accumulation of mathematical discussions connected with efforts to collect and use, as efficiently as possible, numerical data subject to random or deterministic variations. Currently, the concepts of probability and mathematical statistics have become fundamental notions of modern science and the philosophy of nature. This book is an illustration of the use of mathematics to solve specific problems in engineering, statistics, and science in general.
Common mortality modeling and coherent forecasts. An empirical analysis of worldwide mortality data
A new common mortality modeling structure is presented for analyzing mortality dynamics for a pool of countries, under the framework of generalized linear models (GLM). The countries are first classified by fuzzy c-means cluster analysis in order to construct the common sparse age-period model structure for the mortality experience. Next, we propose a method to create the common sex-difference age-period model structure and then use this to produce the residual age-period model structure for each country and sex. The time-related principal components are extrapolated using dynamic linear regression (DLR) models and coherent mortality forecasts are investigated. We make use of mortality data from the “Human Mortality Database”.
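The first stage of the pipeline, fuzzy c-means clustering of countries, can be illustrated with a minimal self-contained sketch. This is standard fuzzy c-means, not the authors' implementation; the feature matrix and parameters below are placeholders.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=50, seed=0):
    """Basic fuzzy c-means: soft-assign n samples to c clusters (fuzzifier m > 1)."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                       # memberships sum to 1 per sample
    for _ in range(iters):
        Um = U ** m
        # Cluster centers: membership-weighted means of the samples.
        centers = Um @ X / Um.sum(axis=1, keepdims=True)
        # Distances of every sample to every center (small eps avoids 0-division).
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        # Membership update for the standard FCM objective.
        U = d ** (-2.0 / (m - 1))
        U /= U.sum(axis=0)
    return centers, U
```

Each column of U gives one country's graded membership in every cluster, which is what makes the classification fuzzy rather than hard.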
Scenarios, probability and possible futures
This paper provides an introduction to the mathematical theory of possibility and examines how this tool can contribute to the analysis of far-distant futures. The degree of mathematical possibility of a future is a number between 0 and 1. It quantifies the extent to which a future event is implausible or surprising, without implying that it has to happen somehow. Intuitively, a degree of possibility can be seen as the upper bound of a range of admissible probability levels which goes all the way down to zero. Thus, the proposition 'The possibility of X is Pi(X)' can be read as 'The probability of X is not greater than Pi(X)'. Possibility levels offer a measure to quantify the degree of unlikelihood of far-distant futures. They offer an alternative to forecasts and scenarios, which are both problematic. Long-range planning using forecasts with precise probabilities is problematic because it tends to suggest a false degree of precision. Using scenarios without any quantified uncertainty levels is problematic because it may lead to unjustified attention to the extreme scenarios. This paper further deals with the question of extreme cases. It examines how experts should build a set of two to four well-contrasted and precisely described futures that summarizes their knowledge in a simple way. Like scenario makers, these experts face multiple objectives: they have to anchor their analysis in credible expertise and depict thought-provoking possible futures, but not futures so provocative as to be dismissed out of hand. The first objective can be achieved by describing a future of possibility level 1. The second and third objectives, however, balance each other. We find that a satisfying balance can be achieved by selecting extreme cases that do not rule out equiprobability. For example, if there are three cases, the possibility level of the extremes should be about 1/3.
Keywords: futures, futurible, scenarios, possibility, imprecise probabilities, uncertainty, fuzzy logic
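The reading "Pi(X) is an upper bound on the probability of X" is easy to make concrete. A minimal Python sketch follows, with invented future names used purely for illustration:

```python
def possibility(event, pi):
    """Possibility of an event = max possibility of its outcomes (0 for the empty set)."""
    return max((pi[x] for x in event), default=0.0)

# An illustrative possibility distribution over four contrasted futures;
# normalization requires at least one fully possible outcome (pi = 1).
pi = {"baseline": 1.0, "rapid_growth": 0.5, "collapse": 1 / 3, "stagnation": 1 / 3}

# 'The possibility of X is Pi(X)' reads as
# 'The probability of X is not greater than Pi(X)'.
extreme = possibility({"collapse", "stagnation"}, pi)
```

With three well-contrasted cases, giving the extremes a possibility level of about 1/3 keeps equiprobability admissible, which is the balance the abstract recommends.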
Prognostics in switching systems: Evidential Markovian classification of real-time neuro-fuzzy predictions.
Condition-based maintenance is nowadays considered a key process in maintenance strategies, and prognostics appears to be a very promising activity, as it should make it possible to avoid inopportune spending. Various approaches have been developed, and data-driven methods are increasingly applied. The training step of these methods generally requires huge datasets, since many methods rely on probability theory and/or artificial neural networks. This step is thus time-consuming and generally performed in batch mode, which can be restrictive in practical applications where few data are available. A method for prognostics is proposed to address this problem of scarce information and missing prior knowledge. The approach is based on the integration of three complementary modules and aims at predicting the failure mode early while the system can switch between several functioning modes. The three modules are: 1) observation selection based on information theory and the Choquet integral, 2) prediction relying on an evolving real-time neuro-fuzzy system and 3) classification into one of the possible functioning modes using an evidential Markovian classifier based on Dempster-Shafer theory. Experiments concern the prediction of engine health based on more than twenty observations.
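The evidential classification step rests on Dempster-Shafer combination of mass functions. Below is a minimal generic sketch of Dempster's rule, with hypothetical functioning-mode names; it is not the paper's classifier.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions keyed by frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: the two sources are incompatible")
    # Renormalize by the non-conflicting mass.
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two sources of evidence about the current functioning mode (names invented):
nominal, either = frozenset({"nominal"}), frozenset({"nominal", "degraded"})
m1 = {nominal: 0.6, either: 0.4}
m2 = {nominal: 0.7, either: 0.3}
fused = dempster_combine(m1, m2)         # belief concentrates on "nominal"
```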
A fuzzy approach for measuring development of topics in patents using Latent Dirichlet Allocation
© 2015 IEEE. Technological progress brings very rapid growth of patent publications, which increases the difficulty for domain experts to measure the development of various topics, handle the linguistic terms used in evaluation, and understand massive technological content. To overcome the limitations of keyword-ranking text-mining results in existing research, and at the same time deal with the vagueness of linguistic terms in thematic evaluation, this research proposes a fuzzy set-based topic development measurement (FTDM) approach to estimate and evaluate the topics hidden in a large volume of patent claims using Latent Dirichlet Allocation. In this study, latent semantic topics are first discovered from the patent corpus and measured by a temporal-weight matrix to reveal the importance of all topics in different years. For each topic, we then calculate a temporal-weight coefficient based on the matrix, which is associated with a set of linguistic terms to describe its development state over time. After choosing a suitable linguistic term set, fuzzy membership functions are created for each term. The temporal-weight coefficients are then transformed to membership vectors related to the linguistic terms, which can be used to measure the development states of all topics directly and effectively. A case study using solar-cell-related patents is given to show the effectiveness of the proposed FTDM approach and its applicability for estimating hidden topics and measuring their corresponding development states efficiently.
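The mapping from a temporal-weight coefficient to linguistic terms via fuzzy membership functions can be sketched with triangular memberships. The term set and breakpoints below are hypothetical; the paper's own linguistic term set differs.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic term set over a normalized temporal-weight
# coefficient in [0, 1]; each entry is (a, b, c) for a triangular function.
TERMS = {
    "declining": (-0.5, 0.0, 0.5),
    "steady":    ( 0.0, 0.5, 1.0),
    "emerging":  ( 0.5, 1.0, 1.5),
}

def membership_vector(coef):
    """Map a topic's coefficient to its membership in each linguistic term."""
    return {term: triangular(coef, *abc) for term, abc in TERMS.items()}
```

A topic's development state is then read off from the term(s) with the highest membership, rather than from a hard threshold.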
Bayesian recursive parameter estimation for hydrologic models
The uncertainty in a given hydrologic prediction is the compound effect of the parameter, data, and structural uncertainties associated with the underlying model. In general, therefore, the confidence in a hydrologic prediction can be improved by reducing the uncertainty associated with the parameter estimates. However, the classical approach to doing this via model calibration typically requires that considerable amounts of data be collected and assimilated before the model can be used. This limitation becomes immediately apparent when hydrologic predictions must be generated for a previously ungauged watershed that has only recently been instrumented. This paper presents the framework for a Bayesian recursive estimation approach to hydrologic prediction that can be used for simultaneous parameter estimation and prediction in an operational setting. The prediction is described in terms of the probabilities associated with different output values. The uncertainty associated with the parameter estimates is updated (reduced) recursively, resulting in smaller prediction uncertainties as measurement data are successively assimilated. The effectiveness and efficiency of the method are illustrated in the context of two models: a simple unit hydrograph model and the more complex Sacramento soil moisture accounting model, using data from the Leaf River basin in Mississippi
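The recursive estimation idea can be illustrated on a grid for a toy rainfall-runoff relation. This is a minimal sketch of grid-based Bayesian recursion, not the paper's method (which uses the unit hydrograph and Sacramento models); the linear model, noise level, and data below are invented.

```python
import numpy as np

def recursive_update(prior, likelihood_fn, observation, grid):
    """One Bayesian recursion step: posterior ∝ likelihood × prior over a parameter grid."""
    posterior = likelihood_fn(observation, grid) * prior
    return posterior / posterior.sum()

def likelihood(obs, theta_grid):
    """Gaussian likelihood for a toy model: runoff = theta * rainfall + noise."""
    rainfall, runoff = obs
    sigma = 0.5                                  # assumed observation-error std
    resid = runoff - theta_grid * rainfall
    return np.exp(-0.5 * (resid / sigma) ** 2)

theta_grid = np.linspace(0.0, 1.0, 201)
belief = np.ones_like(theta_grid) / theta_grid.size      # flat prior
for obs in [(10.0, 3.1), (8.0, 2.3), (12.0, 3.8)]:       # (rainfall, runoff) pairs
    belief = recursive_update(belief, likelihood, obs, theta_grid)
# belief now concentrates around theta ≈ 0.3
```

Each assimilation step narrows the posterior, which is the "uncertainty updated (reduced) recursively" behavior the abstract describes, and the same loop can run online as new measurements arrive.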