Investigation and Modelling of Quantum-like User Cognitive Behaviour in Information Access and Retrieval
This thesis is fundamentally about using conceptual and mathematical constructs from the area of Quantum Theory in Information Retrieval (IR). The need and motivation for this is twofold. First, it has been increasingly shown in the decision sciences that human decision-making does not always conform to the norms of the traditional probability and logic framework. The quantum framework offers a generalised probability and logic framework which can model decisions or judgements under dynamic context and ambiguity. Second, there is a need in IR for theories and models which improve our understanding of user behaviour. Hence it is worth exploring the combination of the quantum framework and IR, with a focus on the user aspects of IR. The overarching research question is whether there is evidence of user behaviour in IR scenarios which warrants a quantum-based approach, by way of showing the limitations of the traditional (classical) approach. The methodology involves analysing data to detect quantum-like phenomena of interference, contextuality, incompatibility, etc. from two common data sources in IR: standard datasets like query log data, and crowdsourced user studies designed similarly to some quantum physics or cognitive science experiments. While the evidence of quantum-like phenomena from standard datasets is not convincing, we find that some of the user studies reveal a quantum-like structure of document judgements. One of the key findings, which has implications for IR, is the dynamic interaction between the different dimensions of relevance. For example, a user's judgement of the reliability of a document depends significantly on whether they found it understandable or not. Thus, the consideration of one relevance dimension or document feature can provide a context for another dimension, contrary to current IR models which consider these features to be independent of each other and an objective property of the document.
The quantum framework has been especially designed to deal with such scenarios, where properties of systems or objects do not exist independently of the measurement context. The thesis concludes with suggestions for incorporating quantum mathematical constructs into state-of-the-art IR algorithms.
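The interference phenomenon the thesis tests for can be stated compactly. In a classical model the law of total probability holds exactly, whereas a quantum probabilistic model adds an interference term; this is the standard formulation used in the quantum cognition literature, not a formula quoted from the thesis itself:

```latex
% Classical law of total probability:
%   P(A) = P(B)\,P(A \mid B) + P(\bar{B})\,P(A \mid \bar{B})
% Quantum-like model, with an interference term parameterised by a phase \theta:
P(A) = P(B)\,P(A \mid B) + P(\bar{B})\,P(A \mid \bar{B})
     + 2\sqrt{P(B)\,P(A \mid B)\,P(\bar{B})\,P(A \mid \bar{B})}\,\cos\theta
```

A nonzero interference term ($\cos\theta \neq 0$) in judgement data is the kind of signature the classical framework cannot accommodate.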
Automated Validation of State-Based Client-Centric Isolation with TLA+
Clear consistency guarantees on data are paramount for the design and implementation of distributed systems. When implementing distributed applications, developers require approaches to verify the data consistency guarantees of an implementation choice. Crooks et al. define a state-based and client-centric model of database isolation. This paper formalizes this state-based model in TLA+, reproduces their examples, and shows how to model check runtime traces and algorithms with this formalization. The formalized model enables semi-automatic model checking of different implementation alternatives for transactional operations and allows checking of conformance to isolation levels. We reproduce examples from the original paper and confirm the isolation guarantees of the combination of the well-known two-phase locking and two-phase commit algorithms. Using model checking, this formalization can also help find bugs in incorrect specifications. This improves the feasibility of automated checking of isolation guarantees in synthesized synchronization implementations, and it provides an environment for experimenting with new designs.
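The client-centric intuition behind the state-based model can be sketched in a few lines of Python; this is my own illustrative simplification, not the paper's TLA+ specification. A set of committed transactions is serializable when some total order of them exists in which every read observes the value left in the state by the preceding writes:

```python
from itertools import permutations

def serializable(transactions, init=None):
    """Brute-force serializability check over committed transactions.

    Each transaction is a list of ops: ('r', key, expected_value) or
    ('w', key, new_value). The history is serializable if some total
    order replays cleanly against an evolving key-value state.
    """
    init = init or {}
    for order in permutations(transactions):
        state = dict(init)
        ok = True
        for txn in order:
            for op, key, val in txn:
                if op == 'r':
                    if state.get(key) != val:  # read must match current state
                        ok = False
                        break
                else:
                    state[key] = val           # write updates the state
            if not ok:
                break
        if ok:
            return True
    return False

# Classic write skew: each transaction reads both keys at 0, writes one of them.
# No serial order makes both reads consistent, so the history is rejected.
t1 = [('r', 'x', 0), ('r', 'y', 0), ('w', 'x', 1)]
t2 = [('r', 'x', 0), ('r', 'y', 0), ('w', 'y', 1)]
print(serializable([t1, t2], {'x': 0, 'y': 0}))  # False
```

The brute force over permutations is exponential, which is exactly why the paper's TLA+/TLC approach, with its systematic state-space exploration, is the practical route for nontrivial histories.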
On politics and social science – the subject-object problem in social science and Foucault's engaged epistemology
The epistemological problem of the relationship between the subject of knowledge and the object being known takes its form in social science as the problem of the relationship between the social scientist as a researcher and society and its phenomena as the object of this inquiry. As Berger and Kellner note in their book “Sociology Reinterpreted”, a social scientist is necessarily a part of the object he studies, being embedded in a position in society from which he studies it. Hence the social sciences as scientific endeavours face the problem of the inseparability of their researchers from the object they study. Two main solutions to this problem have arisen: positivism
and interpretivism. Positivism postulates that rigorous research methods will ensure that objective knowledge is produced, while interpretivism sees society only as an aggregate of individuals whose interactions should be interpreted. A third epistemological framework arose in the first half of the twentieth century, usually called “critical theory”. Critical theory states that researchers should aim their research towards changing the object they are researching; their scientific practice should therefore have extra-scientific effects, namely political effects. This perspective violates Weber's postulate of value neutrality, which claims that the social sciences can only study the state of affairs but cannot prescribe desirable courses of action. As
we will see, the main topic of our paper is the epistemological framework of the work of Michel Foucault and his contribution to the resolution of the problematic relation between a researcher and his research object in social science. We will claim that Foucault broadly falls into the critical theory paradigm but manages to resolve its conflict with the value neutrality postulate. Foucault envisions society as an amalgam of discursive and non-discursive practices that interconnect in a way that gives them regularity and coherence through time. As Gayatri Spivak notices, for Foucault discursive practices create meaning and in doing so chart a way for non-discursive practices and therefore for action. This can be seen as an explanation of Foucault's well-known postulate of the relationship between power and knowledge: discursive practices create knowledge that makes visible certain paths for action. Both of these types of practices intertwine to create what Foucault calls “dispositifs”, which can be seen as mechanisms that bind discursive and non-discursive practices in a coherent manner and enable their regular repetition
through time. Foucault calls his methodology “genealogy” and sees it as a historical investigation of the emergence of dispositifs. Genealogy is a historical study of the contingent ways in which practices became interconnected in the past to create the dispositifs we see today. As Foucault claims, genealogy begins with a “question posed in the present” about a certain dispositif and then charts the historical events and processes that led to its current form. The main aim of genealogy is to show that there is no transcendental necessity for a certain dispositif to exist in its current form, by exposing the historical contingency that led to its current state. Foucault claimed that his intent was to show that there is no metaphysical necessity that grounds the existence of dispositifs and hence that their current form is arbitrary. As we can see, Foucault follows his postulate on the relationship between knowledge and power and formulates his scientific practice as an opening of possibilities for different forms of action. This is why he calls his books
“experiments” and claims that they are to be used by readers to re-examine their own links to the currently existing dispositifs and the possibilities of their alternative arrangements. But, as Foucault claims, the genealogical method does not include normative prescriptions and can be seen only as a form of anti-metaphysical “unmasking” of current dispositifs. This unmasking does not prescribe a desirable form to any dispositif but only shows that it can be arranged in different ways. Hence we can say that Foucault sees the relationship between a researcher and his object of study as a form of intervention by the subject that aims at showing that the object is an arbitrary construction. In that regard Foucault falls into the critical theory paradigm. Where he differs from critical theory is his anti-normative stance, which refuses to prescribe any desirable form of action, unlike, for example, Horkheimer, who in his essay on critical theory claims that “the task of the theorist is to push society towards justice”. Foucault claims that his research results should be used as “instruments” in political struggles, but he himself never proclaims a desirable political goal. So we can conclude that Foucault solves the problem of the subject-object relation in social science by envisioning the research process as a practice of producing tools for social change. He thereby connects social science to extra-scientific political goals, yet does not violate the value neutrality postulate, because his research does not prescribe any concrete political goals but only shows the possibility of social change.
Bell Nonlocality
Nonlocality was discovered by John Bell in 1964, in the context of the debates about quantum theory, but it is a phenomenon that can be studied in its own right. Its observation proves that measurements are not revealing pre-determined values, falsifying the idea of “local hidden variables” suggested by Einstein and others. One is then forced to make a radical choice: either nature is intrinsically statistical and individual events are unspeakable, or our familiar space-time cannot be the setting for the whole of physics. As phenomena, nonlocality and its consequences will have to be predicted by any future theory, and may possibly play the role of foundational principles in these developments. But nonlocality has found a role in applied physics too: it can be used for “device-independent” certification of the correct functioning of random number generators and other devices. After a self-contained introduction to the topic, this monograph on nonlocality presents the main tools and results following a logical, rather than a chronological, order.
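A compact illustration of the bound Bell derived: for the quantum singlet state, the correlation between spin measurements along angles a and b is E(a, b) = -cos(a - b), and the CHSH combination of four such correlations reaches 2√2, exceeding the value 2 that any local hidden-variable model must satisfy. This is a standard textbook calculation, sketched here in Python, not material drawn from the monograph itself:

```python
import math

def E(a, b):
    # Quantum singlet-state correlation for measurement angles a and b.
    return -math.cos(a - b)

def chsh(a, a2, b, b2):
    # CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
    # Local hidden-variable models satisfy |S| <= 2.
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# Angles that maximise the quantum value: a = 0, a' = pi/2, b = pi/4, b' = 3*pi/4.
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(abs(S))  # 2*sqrt(2) ~ 2.8284..., above the local bound of 2
```

The gap between 2 and 2√2 (the Tsirelson bound) is exactly what device-independent certification schemes exploit: observing |S| > 2 certifies nonclassical behaviour without trusting the internal workings of the devices.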
PSA 2020
These preprints were automatically compiled into a PDF from the collection of papers deposited in PhilSci-Archive in conjunction with PSA 2020.
Foundations of Trusted Autonomy
Trusted Autonomy; Automation Technology; Autonomous Systems; Self-Governance; Trusted Autonomous Systems; Design of Algorithms and Methodologies
COMPLEX ACTION METHODOLOGY FOR ENTERPRISE SYSTEMS (CAMES): AN EXPERIMENTAL ACTION RESEARCH INQUIRY INTO COMMUNICATIVE ACTION AND QUANTUM MECHANICS FOR ACTION RESEARCH FIELD STUDIES IN ORGANISATIONAL CONTEXT
Current action research methodologies bias observations severely and render quantification models of subjective data uncertain. This research thesis therefore aims to design a scientifically rigorous action-science methodology that establishes a subject-bias-free method for communication in an organisational context. The investigation aims to apply scientific rigour to this issue and to verify the general applicability of the mathematical formalism of quantum mechanics to an organisational venture that includes a “wicked problem” (Stubbart, 1987, quoted in Pearson and Clair, 1998, p. 62) of how to communicate and collaborate appropriately. The subjective data collection and quantification models of this thesis build on the quantitative formalism of quantum mechanics and the qualitative formalism of the theory of communicative action. Mathematical and ontological formalism combine into a novel research strategy, with planned instrumentation for action research field studies, summarised under the term “Complex Action Methodology for Enterprise Systems” (CAMES). The outcome is a process for understanding the behavioural action of organisational members. The process is not technical, and it involves no machine or apparatus; it is primarily mathematical and requires that participants act under a new identity, a virtual identity. Similarly, the data analysis does not require a specific machine, technology or apparatus: a spreadsheet calculator will generally be sufficient, giving a low barrier to entry. Data collection occurs in one block, with an average duration of 10 minutes, in a virtual location. Practitioners can therefore use this thesis's procedures for bias-free quantification of subjective data and prediction of an individual's future behaviour with certainty.
Such certainty supplies what organisational practice lacks but urgently requires: grounds for claimed findings about behaviour in an organisational context that justify intervening and steering. Certainty and justification for planned intervention and steering initiatives, in turn, secure funding.
International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book
The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. The ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions.
This book comprises the full conference program. It contains the scientific program, both in survey style and in full detail, as well as information on the social program, the venue, special meetings, and more.
Irreversibility, coherence and quantum fluctuation theorems
Irreversible processes have long been the focus of much attention in physics, forming cornerstones of thermodynamics and of the foundations of quantum mechanics (principally the measurement problem). Recent interest in the marriage of these two fields has laid bare the partial inadequacy of definitions of thermodynamic work in a quantum context. The problems are fundamental to quantum mechanics, in that projective measurements irreversibly destroy coherence in a state. To attempt to resolve this incompatibility, we begin with a deterministic quantum work process that adequately generalises the Newtonian framework for deterministic work processes. In doing so, we uncover a structure with strong links to an old problem in probability theory on the decomposability of random variables. Crucially, we define coherent work as a state and Hamiltonian pair, sidestepping the measurement problem. We then look to fluctuation theorems, which detail thermodynamic irreversibility, and further develop a recent framework to show how our coherent work state appears just as Newtonian work appears in Crooks' fluctuation theorem, providing an infinite hierarchy of correction terms. To round this off, we discuss the implications of incorporating additional observables, both commuting and complementary, on work processes and thermodynamics.
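For context, the classical Crooks fluctuation theorem that the abstract builds on relates the forward and reverse work distributions of a driven process at inverse temperature β to the free-energy difference ΔF; this is the standard form, not a formula quoted from the thesis:

```latex
\frac{P_F(W)}{P_R(-W)} = e^{\beta (W - \Delta F)}
```

The thesis's coherent work states are presented as entering this relation in the role that the Newtonian work value W plays classically, with quantum coherence generating the hierarchy of correction terms.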