Linguistic probability theory
In recent years probabilistic knowledge-based systems such as Bayesian networks and influence diagrams have come to the fore as a means of representing and reasoning about complex real-world situations. Although some of the
probabilities used in these models may be obtained statistically, where this is
impossible or simply inconvenient, modellers rely on expert knowledge. Experts, however, typically find it difficult to specify exact probabilities and conventional representations cannot reflect any uncertainty they may have. In
this way, the use of conventional point probabilities can damage the accuracy,
robustness and interpretability of acquired models. With these concerns in
mind, psychometric researchers have demonstrated that fuzzy numbers are
good candidates for representing the inherent vagueness of probability estimates, and the fuzzy community has responded with two distinct theories of
fuzzy probabilities.

This thesis, however, identifies formal and presentational problems with these
theories which render them unable to represent even very simple scenarios.
This analysis leads to the development of a novel and intuitively appealing
alternative: a theory of linguistic probabilities patterned after the standard Kolmogorov axioms of probability theory. Since fuzzy numbers lack algebraic inverses, the resulting theory is weaker than, but generalises, its classical counterpart. Nevertheless, it is demonstrated that analogues of classical probabilistic concepts such as conditional probability and random variables can be
constructed. In the classical theory, representation theorems mean that most of
the time the distinction between mass/density distributions and probability
measures can be ignored. Similar results are proven for linguistic probabilities.

From these results it is shown that directed acyclic graphs annotated with linguistic probabilities (under certain identified conditions) represent systems of linguistic random variables. It is then demonstrated that these linguistic Bayesian
networks can utilise adapted best-of-breed Bayesian network algorithms (junction tree based inference and Bayes' ball irrelevancy calculation). These algorithms are implemented in ARBOR, an interactive design, editing and querying
tool for linguistic Bayesian networks.

To explore the applications of these techniques, a realistic example drawn from
the domain of forensic statistics is developed. In this domain the knowledge
engineering problems cited above are especially pronounced and expert estimates are commonplace. Moreover, robust conclusions are of unusually critical importance. An analysis of the resulting linguistic Bayesian network for
assessing evidential support in glass-transfer scenarios highlights the potential
utility of the approach.
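The abstract's central technical point — that fuzzy numbers lack algebraic inverses, which is why the linguistic theory is weaker than classical probability — can be illustrated with interval arithmetic, since each alpha-cut of a fuzzy number is an interval. This is a minimal sketch of that general phenomenon, not the thesis's own formalism:

```python
# Interval arithmetic illustrates why fuzzy numbers lack additive inverses:
# subtraction widens uncertainty instead of cancelling it, so an imprecise
# probability minus itself is not the point zero.

def interval_add(a, b):
    """[a0, a1] + [b0, b1] = [a0 + b0, a1 + b1]"""
    return (a[0] + b[0], a[1] + b[1])

def interval_sub(a, b):
    """[a0, a1] - [b0, b1] = [a0 - b1, a1 - b0]"""
    return (a[0] - b[1], a[1] - b[0])

p = (0.3, 0.5)  # an imprecise probability, "roughly 0.4"

print(interval_sub(p, p))         # (-0.2, 0.2), not (0, 0)
print(interval_add(p, (0.5, 0.7)))  # (0.8, 1.2)
```

Because p - p spreads out rather than collapsing to zero, equations cannot in general be solved by "moving terms across", which is the algebraic weakness the abstract refers to.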
Fuzzy expert systems in civil engineering
A practical development of multi-attribute decision making using fuzzy set theory
The foundations of multi-attribute utility theory are reviewed and compared with the author's practical experience and other psychological studies of decision-making. The case is presented for a new approach to decision-making, moving away from strictly numerical techniques. Instead of concentrating on the normative or descriptive aspects of decision-making, the problem of decision-making itself is studied, thereby giving the decision-maker more control over the decision-making process and ensuring a more truly participative approach to design and decision-making. The problem of uncertainty is also tackled by considering it from both the stochastic and fuzzy standpoints. A revised approach to the assessment of uncertainty and its incorporation in the decision-making process is advocated. The theoretical framework behind these ideas is expressed using fuzzy set theory. Previous attempts to apply fuzzy set theory to multi-attribute decision-making are reviewed and criticised for their failure to tackle the basic assumptions of multi-attribute utility theory. A practical methodology for using verbal descriptions is derived and illustrated with a worked example. A practical description of how to apply the method is included, and the results of some applications are presented.
Constructing 3D faces from natural language interface
This thesis presents a system by which 3D images of human faces can be constructed
using a natural language interface. The driving force behind the project was the need to
create a system whereby a machine could produce artistic images from verbal or
composed descriptions. This research is the first to look at constructing and modifying
facial image artwork using a natural language interface.
Specialised modules have been developed to control geometry of 3D polygonal head
models in a commercial modeller from natural language descriptions. These modules
were produced from research on human physiognomy, 3D modelling techniques and
tools, facial modelling and natural language processing.
Methods for designing and optimizing fuzzy controllers
We start by discussing fuzzy sets and the algebra of fuzzy sets. We consider some properties of fuzzy modeling tools. This is followed by considering the Mamdani and Sugeno models for designing fuzzy controllers. Various methods for using sets of data for designing controllers are discussed. This is followed by a chapter illustrating the use of genetic algorithms in designing and optimizing fuzzy controllers.

Finally we look at some previous applications of fuzzy control in telecommunication networks, and illustrate a simple application that was developed as part of the present work.
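The Mamdani model mentioned above can be sketched in a few lines. This is an illustrative toy (the membership functions, rule base, and discretised centroid defuzzifier here are assumptions, not the thesis's own design):

```python
# Minimal Mamdani-style fuzzy controller: one input (error), one output
# (power), two rules, min-implication, max-aggregation, and centroid
# defuzzification over a discretised output universe.

def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def mamdani(error):
    # Fuzzify: degree to which the error is "low" or "high".
    low  = tri(error, -1.0, 0.0, 1.0)
    high = tri(error,  0.0, 1.0, 2.0)

    # Rules: IF error is low THEN power is small; IF error is high THEN
    # power is large. Clip each consequent (min), then aggregate (max).
    ys = [i / 100.0 for i in range(101)]          # output universe [0, 1]
    agg = [max(min(low,  tri(y, -0.5, 0.0, 0.5)),
               min(high, tri(y,  0.5, 1.0, 1.5)))
           for y in ys]

    # Centroid defuzzification over the aggregated fuzzy output.
    num = sum(y * m for y, m in zip(ys, agg))
    den = sum(agg)
    return num / den if den else 0.0

print(mamdani(0.2))  # mostly "low" error: crisp power near the small peak
print(mamdani(0.8))  # mostly "high" error: crisp power near the large peak
```

The same skeleton extends to data-driven or genetic-algorithm tuning, as discussed in the abstract, by treating the triangle parameters as the optimisation variables.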
Death of Paradox: The Killer Logic Beneath the Standards of Proof
The prevailing but contested view of proof standards is that factfinders should determine facts by probabilistic reasoning. Given imperfect evidence, they should ask themselves what they think the chances are that the burdened party would be right if the truth were to become known; they then compare those chances to the applicable standard of proof.
I contend that for understanding the standards of proof, the modern versions of logic — in particular, fuzzy logic and belief functions — work better than classical probability. This modern logic suggests that factfinders view evidence of an imprecisely perceived and described reality to form a fuzzy degree of belief in a fact’s existence; they then apply the standard of proof in accordance with the theory of belief functions, by comparing their belief in a fact’s existence to their belief in its negation.
This understanding explains how the standard of proof actually works in the law world. It gives a superior mental image of the factfinders’ task, conforms more closely to what we know of people’s cognition, and captures better what the law says its standards are and how it manipulates them. One virtue of this conceptualization is that it is not a radically new view. Another virtue is that it nevertheless manages to resolve some stubborn problems of proof, including the infamous conjunction paradox.
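The comparison the article describes — belief in a fact's existence versus belief in its negation, with some mass left uncommitted — can be sketched as follows. This is a hypothetical illustration of the idea, not the article's formalism:

```python
# Standard-of-proof test in belief-function terms: some belief mass supports
# the fact (bel_a), some supports its negation (bel_not_a), and the remainder
# stays uncommitted (indeterminate). A preponderance standard compares the two
# committed beliefs directly, ignoring the uncommitted residue.

def preponderance(bel_a, bel_not_a):
    """True if the burdened party prevails: belief in the fact exceeds
    belief in its negation."""
    assert 0 <= bel_a and 0 <= bel_not_a and bel_a + bel_not_a <= 1
    return bel_a > bel_not_a

# 0.4 belief vs 0.3 disbelief, 0.3 uncommitted: the burdened party prevails,
# even though neither belief reaches the probabilist's 0.5 threshold.
print(preponderance(0.4, 0.3))  # True
print(preponderance(0.3, 0.3))  # False
```

The uncommitted mass is what distinguishes this picture from classical probability, where belief in a fact and in its negation must sum to exactly one.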
Neutrosophic Theory and its Applications : Collected Papers - vol. 1
Neutrosophic Theory means Neutrosophy applied in many fields in order to solve problems related to indeterminacy. Neutrosophy is a new branch of philosophy that studies the origin, nature, and scope of neutralities, as well as their interactions with different ideational spectra. This theory considers every entity &lt;A&gt; together with its opposite or negation &lt;Anti-A&gt; and with their spectrum of neutralities &lt;Neut-A&gt; in between them (i.e. entities supporting neither &lt;A&gt; nor &lt;Anti-A&gt;). The &lt;Neut-A&gt; and &lt;Anti-A&gt; ideas together are referred to as &lt;Non-A&gt;. Neutrosophy is a generalization of Hegel's dialectics (the latter is based on &lt;A&gt; and &lt;Anti-A&gt; only). According to this theory every entity &lt;A&gt; tends to be neutralized and balanced by &lt;Anti-A&gt; and &lt;Non-A&gt; entities - as a state of equilibrium. In a classical way &lt;A&gt;, &lt;Neut-A&gt;, &lt;Anti-A&gt; are disjoint two by two. But, since in many cases the borders between notions are vague, imprecise, Sorites, it is possible that &lt;A&gt;, &lt;Neut-A&gt;, &lt;Anti-A&gt; (and &lt;Non-A&gt; of course) have common parts two by two, or even all three of them as well. Hence, on one hand, Neutrosophic Theory is based on the triad &lt;A&gt;, &lt;Neut-A&gt;, and &lt;Anti-A&gt;. On the other hand, Neutrosophic Theory studies the indeterminacy, labelled as I, with I^n = I for n ≥ 1 and mI + nI = (m+n)I, in neutrosophic structures developed in algebra, geometry, topology, etc. The most developed fields of Neutrosophic Theory are Neutrosophic Set, Neutrosophic Logic, Neutrosophic Probability, and Neutrosophic Statistics - which started in 1995 - and, more recently, Neutrosophic Precalculus and Neutrosophic Calculus, together with their applications in practice.
Neutrosophic Set and Neutrosophic Logic are generalizations of the fuzzy set and of fuzzy logic respectively (especially of the intuitionistic fuzzy set and intuitionistic fuzzy logic). In neutrosophic logic a proposition has a degree of truth (T), a degree of indeterminacy (I), and a degree of falsity (F), where T, I, F are standard or non-standard subsets of ]-0, 1+[. Neutrosophic Probability is a generalization of classical probability and imprecise probability. Neutrosophic Statistics is a generalization of classical statistics. What distinguishes the neutrosophics from other fields is the &lt;Neut-A&gt;, which means neither &lt;A&gt; nor &lt;Anti-A&gt;. &lt;Neut-A&gt;, which of course depends on &lt;A&gt;, can be indeterminacy, neutrality, tie (game), unknown, contradiction, vagueness, ignorance, incompleteness, imprecision, etc.
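The indeterminacy algebra stated in the abstract — I^n = I for n ≥ 1 and mI + nI = (m+n)I — determines how numbers of the form a + bI add and multiply. A minimal sketch (the class name and representation are illustrative choices, not from the collected papers):

```python
# Numbers of the form a + b*I, where I is indeterminacy satisfying
# I^n = I for n >= 1 (so I*I = I) and mI + nI = (m + n)I.

class NeutrosophicNumber:
    def __init__(self, a, b):
        self.a, self.b = a, b  # represents a + b*I

    def __add__(self, other):
        # (a + bI) + (c + dI) = (a + c) + (b + d)I, by mI + nI = (m+n)I
        return NeutrosophicNumber(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a + bI)(c + dI) = ac + (ad + bc + bd)I, using I*I = I
        return NeutrosophicNumber(
            self.a * other.a,
            self.a * other.b + self.b * other.a + self.b * other.b)

    def __repr__(self):
        return f"{self.a} + {self.b}I"

x = NeutrosophicNumber(2, 3)
y = NeutrosophicNumber(1, 4)
print(x + y)  # 3 + 7I
print(x * y)  # 2 + 23I  (indeterminate part: 2*4 + 3*1 + 3*4 = 23)
```

The bd term in multiplication is exactly where I^2 = I enters: it folds the product of two indeterminate parts back into a single indeterminate part.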
Multivalued Logic, Neutrosophy and Schrödinger Equation
This book was intended to discuss some paradoxes in Quantum Mechanics from the viewpoint of the multi-valued logic pioneered by Lukasiewicz and the recent concept of Neutrosophic Logic. Essentially, this new concept offers new insights on the idea of ‘identity’, which too often has been accepted as given. Neutrosophy itself was developed in an attempt to generalize the fuzzy logic introduced by L. Zadeh. While some aspects of the theoretical foundations of logic are discussed, this book is not intended solely for pure mathematicians, but rather for physicists, in the hope that some of the ideas presented herein will be found useful. The book is motivated by the observation that, despite almost eight decades, there are indications that some of the paradoxes known in Quantum Physics are not yet solved. To our knowledge, this is because the solution of those paradoxes requires a re-examination of the foundations of logic itself, in particular of the notion of identity and the multi-valuedness of entities.
The book is also intended for young physicists who think that somewhere there should be a ‘complete’ explanation of these paradoxes in Quantum Mechanics. If this book doesn’t answer all of their questions, it is our hope that it at least offers a new alternative viewpoint on these old questions.