
    Automated Validation of State-Based Client-Centric Isolation with TLA+

    Clear consistency guarantees on data are paramount for the design and implementation of distributed systems. When implementing distributed applications, developers require approaches to verify the data consistency guarantees of an implementation choice. Crooks et al. define a state-based and client-centric model of database isolation. This paper formalizes this state-based model in TLA+, reproduces their examples, and shows how to model check runtime traces and algorithms with this formalization. The formalized model in TLA+ enables semi-automatic model checking of different implementation alternatives for transactional operations and allows checking of conformance to isolation levels. We reproduce examples of the original paper and confirm the isolation guarantees of the combination of the well-known 2-phase locking and 2-phase commit algorithms. Using model checking, this formalization can also help find bugs in incorrect specifications. This improves the feasibility of automated checking of isolation guarantees in synthesized synchronization implementations, and it provides an environment for experimenting with new designs.
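The core idea of the state-based, client-centric model can be illustrated with a toy sketch (this is not the paper's TLA+ specification; the transaction encoding and function names here are invented for illustration): a history is serializable iff there exists some commit order in which every transaction's reads are satisfied by the database state produced by applying all preceding transactions.

```python
from itertools import permutations

def apply_txn(state, txn):
    """Return the new database state after a transaction's writes."""
    new = dict(state)
    new.update(txn["writes"])
    return new

def reads_satisfied(state, txn):
    """A transaction's reads must match the state it reads from."""
    return all(state.get(k) == v for k, v in txn["reads"].items())

def serializable(txns, initial=None):
    """Brute-force search for a valid serial commit order."""
    state0 = dict(initial or {})
    for order in permutations(txns):
        state, ok = dict(state0), True
        for t in order:
            if not reads_satisfied(state, t):
                ok = False
                break
            state = apply_txn(state, t)
        if ok:
            return True
    return False

# A classic lost-update anomaly: both transactions read x=0, so no
# serial order can satisfy both once one of them has written x.
t1 = {"reads": {"x": 0}, "writes": {"x": 1}}
t2 = {"reads": {"x": 0}, "writes": {"x": 2}}
print(serializable([t1, t2], {"x": 0}))  # False
```

A model checker like TLC performs essentially this kind of exhaustive search over states, but symbolically and for weaker isolation levels as well, which is what makes the semi-automatic conformance checking described above feasible.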

    On politics and social science – the subject-object problem in social science and Foucault’s engaged epistemology

    The epistemological problem of the relationship between the subject of knowledge and the object being known takes its form in social science as the problem of the relationship between the social scientist as a researcher and society and its phenomena as the object of inquiry. As Berger and Kellner note in their book “Sociology Reinterpreted”, a social scientist is necessarily a part of the object he studies, being embedded in a position in society from which he studies it. Hence the social sciences as scientific endeavors face the problem of the inseparability of their researchers from the object they study. Two main solutions to this problem have arisen: positivism and interpretivism. Positivism postulates that rigorous research methods will ensure that objective knowledge is produced, while interpretivism sees society only as an aggregate of individuals whose interactions should be interpreted. A third epistemological framework arose in the first half of the twentieth century, usually called “critical theory”. Critical theory states that researchers should aim their research towards changing the object they are researching; their scientific practice should therefore have extra-scientific effects, namely political effects. This perspective violates Weber’s postulate of value neutrality, which claims that the social sciences can only study the state of affairs but cannot prescribe desirable courses of action. The main topic of our paper is the epistemological framework of the work of Michel Foucault and his contribution to resolving the problematic relation between a researcher and his research object in social science. We will claim that Foucault broadly falls into the critical theory paradigm but manages to resolve its conflict with the value neutrality postulate. Foucault envisions society as an amalgam of discursive and non-discursive practices that interconnect in a way that gives them regularity and coherence through time.
As Gayatri Spivak notes, for Foucault discursive practices create meaning and in doing so chart a way for non-discursive practices and therefore for action. This can be seen as an explanation of Foucault’s well-known postulate of the relationship between power and knowledge: discursive practices create knowledge that makes certain paths for action visible. Both types of practices intertwine to create what Foucault calls “dispositifs”, mechanisms that bind discursive and non-discursive practices in a coherent manner and enable their regular repetition through time. Foucault calls his methodology “genealogy” and sees it as historical research into the emergence of dispositifs: research into the contingent ways in which practices became interconnected in the past to create the dispositifs we see today. As Foucault claims, genealogy begins with a “question posed in the present” about a certain dispositif and then charts the historical events and processes that led to its current form. The main aim of genealogy is to show that there is no transcendental necessity for a dispositif to exist in its current form, by exposing the historical contingency that led to its current state. Foucault claimed that his intent was to show that there is no metaphysical necessity that grounds the existence of dispositifs and hence that their current form is arbitrary. Foucault thus follows his postulate on the relationship between knowledge and power and formulates his scientific practice as an opening of possibilities for different forms of action. This is why he calls his books “experiments” and claims that they are to be used by readers to re-examine their own links to the currently existing dispositifs and the possibilities of their alternative arrangements.
But, as Foucault claims, the genealogical method does not include normative prescriptions and can be seen only as a form of anti-metaphysical “unmasking” of current dispositifs. This unmasking does not prescribe a desirable form for any dispositif but only shows that it can be arranged in different ways. Hence we can say that Foucault sees the relationship between a researcher and his object of study as a form of intervention by the subject that aims at showing that the object is an arbitrary construction. In that regard Foucault falls into the critical theory paradigm. Where he differs from critical theory is in his anti-normative stance, which refuses to prescribe any desirable form of action, unlike, for example, Horkheimer, who in his essay on critical theory claims that “the task of the theorist is to push society towards justice”. Foucault claims that his research results should be used as “instruments” in political struggles, but he himself never proclaims a desirable political goal. So we can conclude that Foucault solves the problem of the subject-object relation in social science by envisioning the research process as the production of tools for social change. He thereby connects social science to extra-scientific political goals but does not violate the value neutrality postulate, because his research does not prescribe any concrete political goals but only shows the possibility of social change.

    Bell Nonlocality

    Nonlocality was discovered by John Bell in 1964, in the context of the debates about quantum theory, but is a phenomenon that can be studied in its own right. Its observation proves that measurements are not revealing pre-determined values, falsifying the idea of “local hidden variables” suggested by Einstein and others. One is then forced to make a radical choice: either nature is intrinsically statistical and individual events are unspeakable, or our familiar space-time cannot be the setting for the whole of physics. As phenomena, nonlocality and its consequences will have to be predicted by any future theory, and may possibly play the role of foundational principles in these developments. But nonlocality has found a role in applied physics too: it can be used for “device-independent” certification of the correct functioning of random number generators and other devices. After a self-contained introduction to the topic, this monograph on nonlocality presents the main tools and results following a logical, rather than a chronological, order.
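The falsification of local hidden variables can be made concrete with the standard CHSH inequality: any local hidden-variable model bounds the CHSH combination of correlators by 2, while quantum mechanics reaches 2√2. The sketch below (an illustration of the textbook calculation, not taken from the monograph) computes the quantum correlators directly from the Bell state |Φ⁺⟩ with the standard optimal measurement angles:

```python
import math

def A(theta):
    # Spin observable along angle theta in the x-z plane:
    # A(theta) = cos(theta)*Z + sin(theta)*X
    c, s = math.cos(theta), math.sin(theta)
    return [[c, s], [s, -c]]

def kron(M, N):
    # Kronecker product of two 2x2 matrices
    return [[M[i][j] * N[k][l] for j in range(2) for l in range(2)]
            for i in range(2) for k in range(2)]

def correlator(a, b):
    # <phi+| A(a) (x) A(b) |phi+> with |phi+> = (|00> + |11>)/sqrt(2)
    phi = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
    M = kron(A(a), A(b))
    Mphi = [sum(M[i][j] * phi[j] for j in range(4)) for i in range(4)]
    return sum(phi[i] * Mphi[i] for i in range(4))

# CHSH combination at the optimal angles
a1, a2, b1, b2 = 0.0, math.pi / 2, math.pi / 4, -math.pi / 4
S = (correlator(a1, b1) + correlator(a1, b2)
     + correlator(a2, b1) - correlator(a2, b2))
print(S)  # ~2.8284 = 2*sqrt(2), exceeding the local bound of 2
```

Observing S > 2 in an experiment is exactly the model-independent evidence that no pre-determined values can explain the outcomes, which is what makes device-independent certification possible.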

    PSA 2020

    These preprints were automatically compiled into a PDF from the collection of papers deposited in PhilSci-Archive in conjunction with the PSA 2020.

    Foundations of Trusted Autonomy

    Trusted Autonomy; Automation Technology; Autonomous Systems; Self-Governance; Trusted Autonomous Systems; Design of Algorithms and Methodologies

    COMPLEX ACTION METHODOLOGY FOR ENTERPRISE SYSTEMS (CAMES): AN EXPERIMENTAL ACTION RESEARCH INQUIRY INTO COMMUNICATIVE ACTION AND QUANTUM MECHANICS FOR ACTION RESEARCH FIELD STUDIES IN ORGANISATIONAL CONTEXT

    Current action research methodologies bias observations severely and render quantification models of subjective data uncertain. This research thesis therefore aims to design a scientifically rigorous action-science methodology that establishes a subject-bias-free method for communication in an organisational context. The investigation applies scientific rigour to this issue and verifies the general applicability of the mathematical formalism of quantum mechanics to an organisational venture that includes the “wicked problem” (Stubbart, 1987, quoted in Pearson and Clair, 1998, p. 62) of how to communicate and collaborate appropriately. The subjective data collection and quantification models of this thesis build on the quantitative formalism of quantum mechanics and the qualitative formalism of the theory of communicative action. Mathematical and ontological formalism combine into a novel research strategy with planned instrumentation for action research field studies, summarised under the term ‘Complex Action Methodology for Enterprise Systems’ (CAMES). The outcome is a process for understanding the behavioural action of organisational members. The process is not technical and involves no machine or apparatus; it is primarily mathematical and requires that participants act under a new, virtual identity. Likewise, the data analysis requires no specific machine, technology or apparatus: a spreadsheet calculator is sufficient for low-cost entry. Data collection occurs in one block, with an average duration of 10 minutes, in a virtual location. Practitioners can therefore use this thesis’ procedures for bias-free quantification of subjective data and prediction of an individual’s future behaviour with certainty. Such prediction provides what organisational practice lacks but urgently requires: the certainty needed to justify claimed findings about behaviour in an organisational context and to intervene and steer. Certainty and justification for planned intervention and steering initiatives secure funding.

    International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book

    The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. The ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions. This book comprises the full conference program. It contains the scientific program both in survey style and in full detail, as well as information on the social program, the venue, special meetings, and more.

    Irreversibility, coherence and quantum fluctuation theorems

    Irreversible processes have long been the focus of much attention in physics, forming cornerstones of thermodynamics and the foundations of quantum mechanics (principally the measurement problem). Recent interest in the marriage of these two fields has laid bare the partial inadequacy of definitions of thermodynamic work in a quantum context. Its problems are fundamental to quantum mechanics, in that projective measurements irreversibly destroy coherence in a state. To attempt to resolve this incompatibility, we begin with a deterministic quantum work process that adequately generalises the Newtonian framework for deterministic work processes. In doing so, we uncover a structure that has strong links to an old problem in probability theory on the decomposability of random variables. Crucially, we define coherent work as a state and Hamiltonian pair, sidestepping the measurement problem. We then look to fluctuation theorems, which detail thermodynamic irreversibility, and further develop a recent framework to show how our coherent work state appears just as Newtonian work appears in Crooks’ fluctuation theorem, providing an infinite hierarchy of correction terms. To round this off, we discuss the implications of incorporating additional observables, both commuting and complementary, on work processes and thermodynamics.
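For reference, the classical Crooks fluctuation theorem that this coherent-work framework generalises relates the forward and reverse work distributions (this is the standard textbook statement, not an equation taken from the thesis itself):

```latex
\frac{P_F(W)}{P_R(-W)} = e^{\beta (W - \Delta F)}
```

where $P_F$ and $P_R$ are the work distributions of the forward and time-reversed protocols, $\beta$ is the inverse temperature, and $\Delta F$ is the free-energy difference between the initial and final equilibrium states.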