Protocol modelling : synchronous composition of data and behaviour
This thesis develops and explores a technique called Protocol Modelling, a mathematics
for the description of orderings. Protocol Modelling can be viewed as a hybrid
of object orientation, as it supports ideas of data encapsulation and object instantiation;
and process algebra, as it supports a formally defined idea of process and process composition.
The first half of the thesis focuses on describing and defining the Protocol Modelling
technique. A formal denotational semantics for protocol machines is developed and
used to establish various properties; in particular that composition is closed and preserves
type safety. The formal semantics is extended to cover instantiation of objects.
Comparison is made with other process algebras and an approach to unification of
different formulations of the semantics of process composition is proposed.
The second half of the thesis explores three applications of Protocol Modelling:
Object Modelling. This explores the use of Protocol Modelling as a medium for object
modelling, and the facility to execute protocol models is described. Protocol Modelling
is compared with other object modelling techniques; in particular by contrasting
its compositional style with traditional hierarchical inheritance.
Protocol Contracts. This proposes the use of protocol models as a medium for expressing
formal behavioural contracts. This is compared with more traditional forms
of software contract in the generalization of the notion of contractual obligation as a
mechanism for software specification.
Choreographed Collaborations. In this application Protocol Modelling is used as a
medium to describe choreographies for asynchronous multiparty collaborations. A
compositional approach to choreography engineering, enabled by the synchronous
semantics of Protocol Modelling, is explored and results established concerning sufficient
conditions for choreography realizability. The results are extended to address
choreographies that employ behavioural rules based on data.
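The synchronous composition described above can be illustrated with a small sketch. In the published Protocol Modelling semantics, a composed model accepts an event only if every machine that recognises the event accepts it in its current state; machines that do not recognise the event simply ignore it. The class and method names below are illustrative choices, not notation from the thesis.

```python
# Minimal sketch of CSP-like synchronous composition of protocol machines.
class ProtocolMachine:
    def __init__(self, alphabet, transitions, state):
        self.alphabet = alphabet        # events this machine recognises
        self.transitions = transitions  # (state, event) -> next state
        self.state = state

    def accepts(self, event):
        # Events outside the alphabet are ignored, not refused.
        if event not in self.alphabet:
            return True
        return (self.state, event) in self.transitions

    def fire(self, event):
        if event in self.alphabet:
            self.state = self.transitions[(self.state, event)]

def handle(machines, event):
    """Fire the event only if every machine accepts it (synchronous composition)."""
    if all(m.accepts(event) for m in machines):
        for m in machines:
            m.fire(event)
        return True
    return False

lifecycle = ProtocolMachine({"open", "close"},
                            {("closed", "open"): "open",
                             ("open", "close"): "closed"}, "closed")
lock = ProtocolMachine({"lock", "unlock", "open"},
                       {("unlocked", "lock"): "locked",
                        ("locked", "unlock"): "unlocked",
                        ("unlocked", "open"): "unlocked"}, "unlocked")

print(handle([lifecycle, lock], "open"))   # True: both machines accept
print(handle([lifecycle, lock], "close"))  # True: lock ignores 'close'
print(handle([lifecycle, lock], "lock"))   # True: lifecycle ignores 'lock'
print(handle([lifecycle, lock], "open"))   # False: lock refuses while locked
```

Note how the compositional style differs from inheritance: the `lock` machine constrains `open` without the `lifecycle` machine knowing it exists.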
Projection in discourse: A data-driven formal semantic analysis
A sentence like "Bertrand, a famous linguist, wrote a book" contains different contributions: there is a person named "Bertrand", he is a famous linguist, and he wrote a book. These contributions convey different types of information; while the existence of Bertrand is presented as given information---it is presupposed---the other contributions signal new information. Moreover, the contributions are affected differently by linguistic constructions. The inference that Bertrand wrote a book disappears when the sentence is negated or turned into interrogative form, while the other contributions survive; this is called 'projection'. In this thesis, I investigate the relation between different types of contributions in a sentence from a theoretical and empirical perspective. I focus on projection phenomena, which include presuppositions ('Bertrand exists' in the aforementioned example) and conventional implicatures ('Bertrand is a famous linguist'). I argue that the differences between the contributions can be explained in terms of information status, which describes how content relates to the unfolding discourse context. Based on this analysis, I extend the widely used formal representational system Discourse Representation Theory (DRT) with an explicit representation of the different contributions made by projection phenomena; this extension is called 'Projective Discourse Representation Theory' (PDRT). I present a data-driven computational analysis based on data from the Groningen Meaning Bank, a corpus of semantically annotated texts. This analysis shows how PDRT can be used to learn more about different kinds of projection behaviour. These results can be used to improve linguistically oriented computational applications such as automatic translation systems.
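The idea of attaching projection behaviour to individual contributions can be sketched as a data structure: each condition carries a pointer saying which context it projects to. The class and field names below are illustrative assumptions, not the notation of the thesis.

```python
# Illustrative PDRT-style structure: conditions carry projection pointers.
from dataclasses import dataclass

@dataclass
class PDRS:
    label: int          # label of the local context
    referents: list     # (pointer, variable) pairs
    conditions: list    # (pointer, condition) pairs

# "Bertrand, a famous linguist, wrote a book":
# presupposed/implicated content points outside the local context
# (pointer 0), asserted content points to the local context (pointer 1).
s = PDRS(
    label=1,
    referents=[(0, "x"), (1, "y")],
    conditions=[
        (0, "named(x, 'Bertrand')"),   # presupposed: projects out
        (0, "famous_linguist(x)"),     # conventional implicature
        (1, "book(y)"),                # asserted
        (1, "write(x, y)"),            # asserted
    ],
)

# Projected content is whatever does not point at the local context,
# which is why it survives negation or questioning of the sentence.
projected = [c for p, c in s.conditions if p != s.label]
print(projected)  # ["named(x, 'Bertrand')", 'famous_linguist(x)']
```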
A study in metaphysics for free will: using models of causality, determinism and supervenience in the search for free will
We have two main aims: to construct mathematical models for analysing determinism, causality and supervenience; and then to use these to demonstrate the possibility of constructing an ontic construal of the operation of free will - one requiring both the presentation of genuine alternatives to an agent and their selecting between them in a manner that permits the attribution of responsibility.
Determinism is modelled using trans-temporal ontic links between discrete juxtaposed universe states and shown to be distinct from predictability. Causality is defined on a temporal sequence of σ-algebras and quantified using a measure. The measure leads to definitions of causal overdetermination and epiphenomena. Proofs are constructed to demonstrate that deterministic universes must carry their properties essentially, but not necessarily locally. We argue that determinism and causality are separate doctrines.
These models and results are marshalled to put the case that a counterfactual construal of ontic choice cannot work. In response we propose ‘immanence’ - a modified form of indeterminism whereby a universe can present choices to its denizens.
We prove that beings subsumed within a universe cannot pilot their own actions. We then argue these beings can exercise free will only when selecting between choices inhering within immanent relata. A being is responsible for its selections if and only if it is constituted of a temporally evolving deterministic substructure. Our proposal is novel: it avoids injecting indeterminism into the decision process.
Topological models for property supervenience are developed and used to reconstruct standard definitions from the literature. These are then used to demonstrate that considerations of supervenience do not affect our arguments. We have demonstrated that a model of the exercise of free will involving both genuine choices and responsibility is possible, but can only operate within a non-deterministic universe possessing specific traits.
Treatment of imprecision in data repositories with the aid of KNOLAP
Traditional data repositories, introduced for the needs of business processing, typically focus on the storage and querying of crisp data domains. As a result, current commercial data repositories have no facilities for either storing or querying imprecise/approximate data.
No significant attempt has been made at a generic and application-independent representation of value imprecision, mainly as a property of axes of analysis and also as part of a dynamic environment where potential users may wish to define their “own” axes of analysis for querying either precise or imprecise facts. In such cases, measured values and facts are characterised by descriptive values drawn from a number of dimensions, whereas the values of a dimension are organised into hierarchical levels.
A solution named H-IFS is presented that allows the representation of flexible hierarchies as part of the dimension structures. An extended multidimensional model named IF-Cube is put forward, which allows the representation of imprecision in facts and dimensions and the answering of queries based on imprecise hierarchical preferences. Based on the H-IFS and IF-Cube concepts, a post-relational OLAP environment is delivered, whose implementation is DBMS-independent and whose performance depends solely on the underlying DBMS engine.
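A flexible hierarchy of the kind described can be sketched as dimension members whose membership in a parent level is graded rather than crisp, in the spirit of intuitionistic fuzzy sets (a membership degree mu and a non-membership degree nu). All names and numbers below are invented for illustration; this is not the H-IFS implementation itself.

```python
# Rolling up imprecise facts through a fuzzy dimension hierarchy:
# a city may belong to a sales region only to degree (mu, nu).
hierarchy = {
    "Ghent":    {"North": (1.0, 0.0)},                      # crisp member
    "Brussels": {"North": (0.6, 0.3), "South": (0.4, 0.5)}, # imprecise member
}
facts = [("Ghent", 100.0), ("Brussels", 50.0)]  # (city, measure value)

def rollup(facts, hierarchy, region):
    """Membership-weighted aggregate of fact values for one region."""
    return sum(value * hierarchy[city].get(region, (0.0, 1.0))[0]
               for city, value in facts)

print(rollup(facts, hierarchy, "North"))  # 100*1.0 + 50*0.6 = 130.0
```

A crisp roll-up would force Brussels wholly into one region; the graded hierarchy instead distributes its contribution across both.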
Algebraic Structures of Neutrosophic Triplets, Neutrosophic Duplets, or Neutrosophic Multisets
Neutrosophy (1995) is a new branch of philosophy that studies triads of the form (A, neutA, antiA), where A is an entity (i.e. element, concept, idea, theory, logical proposition, etc.), antiA is the opposite of A, while neutA is the neutral (or indeterminate) between them, i.e., neither A nor antiA. Based on neutrosophy, the neutrosophic triplets were founded; they have a similar form (x, neut(x), anti(x)) and satisfy several axioms, for each element x in a given set. This collective book presents original research papers by many neutrosophic researchers from around the world, reporting on the state of the art and recent advancements of neutrosophic triplets, neutrosophic duplets, neutrosophic multisets and their algebraic structures, which were defined in 2016 and have since gained interest from researchers worldwide. Connections between classical algebraic structures and neutrosophic triplet/duplet/multiset structures are also studied, along with numerous neutrosophic applications in various fields, such as: multi-criteria decision making, image segmentation, medical diagnosis, fault diagnosis, clustering data, neutrosophic probability, human resource management, strategic planning, forecasting models, multi-granulation, supplier selection problems, typhoon disaster evaluation, skin lesion detection, and mining algorithms for big data analysis.
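A neutrosophic triplet (x, neut(x), anti(x)) in a set with a binary operation * requires x * neut(x) = neut(x) * x = x and x * anti(x) = anti(x) * x = neut(x), where neut(x) differs from the classical identity. The standard small example is (Z_10, · mod 10); the brute-force search below is an illustrative sketch, not code from the book.

```python
# Brute-force search for neutrosophic triplets in (Z_10, * mod 10).
N = 10

def mul(a, b):
    return (a * b) % N

triplets = []
for x in range(N):
    for neut in range(N):
        if neut == 1:  # exclude the classical unity of (Z_10, *)
            continue
        if mul(x, neut) != x or mul(neut, x) != x:
            continue
        for anti in range(N):
            if mul(x, anti) == neut and mul(anti, x) == neut:
                triplets.append((x, neut, anti))

print([t for t in triplets if t[0] == 4])  # [(4, 6, 4), (4, 6, 9)]
```

For instance, 4 · 6 = 24 ≡ 4 and 4 · 4 = 16 ≡ 6 (mod 10), so (4, 6, 4) is a neutrosophic triplet with neut(4) = 6.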
Effective Quantum Extended Spacetime of Polymer Schwarzschild Black Hole
The physical interpretation and eventual fate of gravitational singularities
in a theory surpassing classical general relativity are puzzling questions that
have generated a great deal of interest among various quantum gravity
approaches. In the context of loop quantum gravity (LQG), one of the major
candidates for a non-perturbative background-independent quantisation of
general relativity, considerable effort has been devoted to construct effective
models in which these questions can be studied. In these models, classical
singularities are replaced by a "bounce" induced by quantum geometry
corrections. Undesirable features may arise however depending on the details of
the model. In this paper, we focus on Schwarzschild black holes and propose a
new effective quantum theory based on polymerisation of new canonical phase
space variables inspired by those successful in loop quantum cosmology. The
quantum corrected spacetime resulting from the solutions of the effective
dynamics is characterised by infinitely many pairs of trapped and anti-trapped
regions connected via a space-like transition surface replacing the central
singularity. Quantum effects become relevant at a unique mass independent
curvature scale, while they become negligible in the low curvature region near
the horizon. The effective quantum metric describes also the exterior regions
and asymptotically classical Schwarzschild geometry is recovered. We however
find that physically acceptable solutions require us to select a certain subset
of initial conditions, corresponding to a specific mass (de-)amplification
after the bounce. We also sketch the corresponding quantum theory and
explicitly compute the kernel of the Hamiltonian constraint operator.
Comment: 50 pages, 10 figures; v2: journal version, minor comment and references added; v3: minor corrections in section 5.3 to match journal version
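The abstract does not state the explicit polymerisation scheme, but in effective LQG/LQC models of this kind the quantum geometry corrections are conventionally implemented as a holonomy-type replacement of a canonical momentum. A schematic sketch, following the standard LQC convention rather than quoting the paper:

```latex
% Schematic polymerisation of a canonical momentum $p$ at polymer scale $\lambda$:
p \;\longrightarrow\; \frac{\sin(\lambda p)}{\lambda},
\qquad
H_{\mathrm{eff}}(q,p) \;=\; H_{\mathrm{class}}\!\left(q,\,\frac{\sin(\lambda p)}{\lambda}\right).
% The classical theory is recovered as $\lambda \to 0$; because the
% substitute for $p$ is bounded, the classical singularity is replaced
% by a bounce where $\sin(\lambda p)$ reaches its extremum.
```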