Minimizing value-at-risk in the single-machine total weighted tardiness problem
The vast majority of the machine scheduling literature focuses on deterministic
problems, in which all data is known with certainty a priori. This may be a reasonable assumption when the variability in the problem parameters is low. However, as variability in the parameters increases, incorporating this uncertainty explicitly into a scheduling model is essential to mitigate the resulting adverse effects. In this paper, we consider the celebrated single-machine total weighted tardiness (TWT) problem in the presence of uncertain problem parameters. We impose a probabilistic constraint on the random TWT and introduce a risk-averse stochastic programming model. In particular, the objective of the proposed model is to find a non-preemptive static job processing sequence that minimizes the value-at-risk (VaR) measure on the random
TWT at a specified confidence level. Furthermore, we develop a lower bound on the optimal VaR that may also benefit alternate solution approaches in the future. In this study, we implement a tabu-search heuristic to obtain reasonably good feasible solutions and present results to demonstrate the effect of the risk parameter and the value of the proposed model with respect to a corresponding risk-neutral approach.
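The VaR objective can be made concrete with a small Monte-Carlo sketch; this is a hypothetical illustration under invented data, not the paper's model or its tabu-search heuristic. For a fixed job sequence, sample the uncertain processing times, evaluate the random TWT on each sample, and read off the quantile at the chosen confidence level:

```python
import random

def total_weighted_tardiness(sequence, proc_times, due, weights):
    """Total weighted tardiness of a non-preemptive job sequence."""
    t, twt = 0.0, 0.0
    for j in sequence:
        t += proc_times[j]
        twt += weights[j] * max(0.0, t - due[j])
    return twt

def var_estimate(sequence, sample_proc_times, due, weights, alpha=0.95, n=10000):
    """Monte-Carlo estimate of VaR_alpha of the random TWT:
    the alpha-quantile of the sampled TWT distribution."""
    samples = sorted(
        total_weighted_tardiness(sequence, sample_proc_times(), due, weights)
        for _ in range(n)
    )
    return samples[min(int(alpha * n), n - 1)]

# Hypothetical 3-job instance with uniformly distributed processing times.
due = [4.0, 6.0, 8.0]
weights = [2.0, 1.0, 3.0]
draw = lambda: [random.uniform(1, 3), random.uniform(2, 4), random.uniform(1, 5)]
print(var_estimate([2, 0, 1], draw, due, weights))
```

A risk-neutral model would instead minimize the mean of `samples`; minimizing the quantile is what makes the model risk-averse.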
Parsing of Spoken Language under Time Constraints
Spoken language applications in natural dialogue settings place serious
requirements on the choice of processing architecture. Especially under adverse
phonetic and acoustic conditions, parsing procedures have to be developed which
not only analyse the incoming speech in a time-synchronous and incremental
manner, but are also able to schedule their resources according to the varying
conditions of the recognition process. Depending on the actual degree of local
ambiguity the parser has to select among the available constraints in order to
narrow down the search space with as little effort as possible.
A parsing approach based on constraint satisfaction techniques is discussed.
It provides important characteristics of the desired real-time behaviour and
attempts to mimic some of the attention focussing capabilities of the human
speech comprehension mechanism.
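The resource-scheduling idea, applying cheap constraints first and stopping as soon as local ambiguity is resolved, can be sketched as follows; the function name, the cost-ordered constraint list, and the word hypotheses are illustrative assumptions, not the paper's architecture:

```python
def narrow_search_space(hypotheses, constraints, target_size=1):
    """Anytime constraint filtering: constraints are (cost, predicate) pairs,
    applied cheapest-first, and filtering stops once the hypothesis set is
    small enough, so effort scales with the ambiguity actually present."""
    for cost, predicate in sorted(constraints, key=lambda c: c[0]):
        if len(hypotheses) <= target_size:
            break  # local ambiguity already resolved; skip costlier checks
        hypotheses = [h for h in hypotheses if predicate(h)]
    return hypotheses

# Hypothetical word hypotheses, a cheap length check and a costlier lexical one.
hyps = ["the", "thee", "tree", "three"]
constraints = [
    (1, lambda w: len(w) <= 4),         # cheap phonetic-length constraint
    (5, lambda w: w.startswith("th")),  # costlier lexical constraint
]
print(narrow_search_space(hyps, constraints))  # ['the', 'thee']
```

An unambiguous input never pays for the expensive constraints, which is the real-time behaviour the abstract describes.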
Relational reasoning via probabilistic coupling
Probabilistic coupling is a powerful tool for analyzing pairs of
probabilistic processes. Roughly, coupling two processes requires finding an
appropriate witness process that models both processes in the same probability
space. Couplings are powerful tools for proving properties about the relation
between two processes, including reasoning about convergence of distributions
and stochastic dominance, a probabilistic version of a monotonicity property.
While the mathematical definition of coupling looks rather complex and
cumbersome to manipulate, we show that the relational program logic pRHL---the
logic underlying the EasyCrypt cryptographic proof assistant---already
internalizes a generalization of probabilistic coupling. With this insight,
constructing couplings is no harder than constructing logical proofs. We
demonstrate how to express and verify classic examples of couplings in pRHL,
and we mechanically verify several couplings in EasyCrypt.
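A classic instance of the couplings mentioned above is the monotone coupling of two biased coins, which certifies stochastic dominance; the sketch below is purely illustrative and sits outside pRHL/EasyCrypt:

```python
import random

def coupled_bernoulli(p, q, u=None):
    """Witness coupling of Bernoulli(p) and Bernoulli(q), assuming p <= q,
    on one probability space: both coins are driven by the same uniform
    draw u, so X <= Y holds pointwise, certifying stochastic dominance."""
    if u is None:
        u = random.random()
    x = 1 if u < p else 0  # marginally Bernoulli(p)
    y = 1 if u < q else 0  # marginally Bernoulli(q)
    return x, y

# Every sample from the coupling satisfies X <= Y.
assert all(x <= y for x, y in (coupled_bernoulli(0.3, 0.7) for _ in range(10000)))
```

Each coin keeps its correct marginal distribution, yet the shared randomness makes the pointwise comparison trivial; this is exactly the "witness process" role the abstract describes.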
Heuristic Voting as Ordinal Dominance Strategies
Decision making under uncertainty is a key component of many AI settings, and
in particular of voting scenarios where strategic agents are trying to reach a
joint decision. The common approach to handle uncertainty is by maximizing
expected utility, which requires a cardinal utility function as well as
detailed probabilistic information. However, often such probabilities are not
easy to estimate or apply.
To this end, we present a framework that allows "shades of gray" of
likelihood without probabilities. Specifically, we create a hierarchy of sets
of world states based on a prospective poll, with inner sets containing more
likely outcomes. This hierarchy of likelihoods allows us to define what we term
ordinally-dominated strategies. We use this approach to justify various known
voting heuristics as bounded-rational strategies.
Comment: This is the full version of paper #6080 accepted to AAAI'1
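Under one simplified reading, assumed here for illustration and not necessarily the paper's exact definition, a strategy ordinally dominates another if its worst-case utility is at least as good on every tier of the nested likelihood hierarchy, and strictly better on some tier:

```python
def ordinally_dominates(util_a, util_b, hierarchy):
    """Simplified ordinal-dominance check. hierarchy is a list of nested sets
    of world states (inner sets = more likely outcomes); util_a and util_b
    map each state to a utility for the two strategies being compared."""
    worst = lambda util, tier: min(util[s] for s in tier)
    at_least = all(worst(util_a, t) >= worst(util_b, t) for t in hierarchy)
    strictly = any(worst(util_a, t) > worst(util_b, t) for t in hierarchy)
    return at_least and strictly

# Hypothetical 3-state example; inner tier {"s0"} is the most likely outcome.
hierarchy = [{"s0"}, {"s0", "s1"}, {"s0", "s1", "s2"}]
a = {"s0": 3, "s1": 2, "s2": 0}
b = {"s0": 2, "s1": 2, "s2": 0}
print(ordinally_dominates(a, b, hierarchy))  # True
```

No probabilities appear anywhere: only the ordering of the tiers matters, which is the "shades of gray of likelihood without probabilities" idea.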
Analysis of the potentials of multi criteria decision analysis methods to conduct sustainability assessment
Sustainability assessments require the management of a wide variety of information types, parameters and uncertainties. Multi criteria decision analysis (MCDA) has been regarded as a suitable set of methods to perform sustainability evaluations as a result of its flexibility and the possibility of facilitating the dialogue between stakeholders, analysts and scientists. However, it has been reported that researchers do not usually properly justify the choice of one MCDA method over another; familiarity and affinity with a certain approach seem to drive the selection. This review paper presents the performance of five MCDA methods (i.e. MAUT, AHP, PROMETHEE, ELECTRE and DRSA) with respect to ten crucial criteria that sustainability assessment tools should satisfy, among which are a life cycle perspective, thresholds and uncertainty management, software support and ease of use. The review shows that MAUT and AHP are fairly simple to understand and have good software support, but they are cognitively demanding for decision makers and can only embrace a weak sustainability perspective, as trade-offs are the norm. Mixed information and uncertainty can be managed by all the methods, while robust results can only be obtained with MAUT. ELECTRE, PROMETHEE and DRSA are non-compensatory approaches that allow for a strong sustainability concept and accept a variety of thresholds, but suffer from rank reversal. DRSA is less demanding in terms of preference elicitation, is very easy to understand and provides a straightforward set of decision rules expressed in the form of elementary “if … then …” conditions. Dedicated software is available for all the approaches, with a medium to wide range of result-representation capabilities. DRSA emerges as the easiest method, followed by AHP, PROMETHEE and MAUT, while ELECTRE is regarded as fairly difficult.
Overall, the analysis has shown that most of the requirements are satisfied by the MCDA methods (although to different extents), with the exception of the management of mixed data types and the adoption of a life cycle perspective, which are covered by all the considered approaches.
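The compensatory versus non-compensatory distinction drawn above can be made concrete with a toy aggregation sketch; the weights, scores, and veto threshold are invented for illustration and do not come from the review:

```python
def maut_score(scores, weights):
    """Additive MAUT-style aggregation: fully compensatory, so a poor score
    on one criterion can be offset by good scores elsewhere, which is why
    the review associates it with a weak sustainability perspective."""
    return sum(w * s for w, s in zip(weights, scores))

def with_veto(scores, weights, threshold=0.2):
    """Non-compensatory variant in the spirit of outranking methods: any
    criterion below the veto threshold disqualifies the alternative."""
    if min(scores) < threshold:
        return None
    return maut_score(scores, weights)

weights = [0.4, 0.3, 0.3]      # hypothetical criteria weights
a = [0.9, 0.9, 0.1]            # strong on two criteria, very weak on the third
print(maut_score(a, weights))  # compensated, approximately 0.66
print(with_veto(a, weights))   # vetoed: None
```

The same alternative scores well under the compensatory rule and is rejected under the veto, which is the strong-sustainability behaviour attributed to ELECTRE, PROMETHEE and DRSA.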
Research on Rough Set Model Based on Golden Ratio
How to make decisions with pre-defined preference-ordered criteria also depends on the environment of the problem. The dominance rough set model is suitable for preference analysis, and the probabilistic rough set introduces probabilistic approaches to rough sets. In this paper, new dominance rough set models are given by taking the golden ratio into account. We also present steps to make decisions using the new dominance rough set models.
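One plausible instantiation of the golden-ratio idea, assumed here for illustration rather than taken from the paper, is a probabilistic rough set whose (alpha, beta) thresholds are set to 1/φ ≈ 0.618 and 1 − 1/φ ≈ 0.382:

```python
PHI = (5 ** 0.5 - 1) / 2  # 1/golden-ratio, approximately 0.618

def probabilistic_approximations(universe, target, equiv_class,
                                 alpha=PHI, beta=1 - PHI):
    """Probabilistic rough-set approximations with golden-ratio thresholds:
    an object joins the lower approximation if Pr(target | its equivalence
    class) >= alpha, and the upper approximation if that probability > beta."""
    lower, upper = set(), set()
    for x in universe:
        cls = equiv_class(x)
        pr = len(cls & target) / len(cls)
        if pr >= alpha:
            lower.add(x)
        if pr > beta:
            upper.add(x)
    return lower, upper

# Hypothetical universe partitioned into three equivalence classes.
blocks = {1: {1, 2}, 2: {1, 2}, 3: {3, 4}, 4: {3, 4}, 5: {5, 6}, 6: {5, 6}}
lower, upper = probabilistic_approximations({1, 2, 3, 4, 5, 6}, {1, 2, 3},
                                            blocks.get)
print(sorted(lower), sorted(upper))  # [1, 2] [1, 2, 3, 4]
```

The class {3, 4} overlaps the target with probability 0.5, which clears the 0.382 lower bar but not the 0.618 upper one, so it lands in the boundary region.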
Introduction
The task "Digitization and online publication, in the Digital Repository of the University of Łódź, of the collection of scientific journals published by the University of Łódź", no. 885/P-DUN/2014, was co-financed by MNiSW funds allocated to activities promoting science