7,658 research outputs found

    From the end of Unitary Science Projection to the Causally Complete Complexity Science: Extended Mathematics, Solved Problems, New Organisation and Superior Purposes

    The deep crisis in the development of modern fundamental science is ever more evident and is now openly recognised even by mainstream, official science professionals and leaders. Not by coincidence, it occurs in parallel to the world civilisation crisis and related global change processes, where the true power of unreduced scientific knowledge is badly missing as the indispensable and unique tool for solving the emerging greater problems and for further progress at a superior level of complex world dynamics. Here we reveal the mathematically exact reason for the crisis in conventional science, which also contains the natural and unified solution to the problem in the form of a well-specified extension of the usual, artificially restricted paradigm. We show how that extended, now causally complete science content provides solutions to various "unsolvable" problems and opens new development possibilities for both science and society, where the former plays the role of the main, direct driver of the latter. We outline the related qualitative changes in science organisation, practice and purposes, giving rise to the sustainability transition in the entire civilisation dynamics towards the well-specified superior level of its unreduced, now well understood and universally defined complexity.

    Fisheye Consistency: Keeping Data in Synch in a Georeplicated World

    Over the last thirty years, numerous consistency conditions for replicated data have been proposed and implemented. Popular examples of such conditions include linearizability (or atomicity), sequential consistency, causal consistency, and eventual consistency. These consistency conditions are usually defined independently of the computing entities (nodes) that manipulate the replicated data; i.e., they do not take into account how computing entities might be linked to one another, or geographically distributed. To address this lack, as a first contribution, this paper introduces the notion of a proximity graph between computing nodes. If two nodes are connected in this graph, their operations must satisfy a strong consistency condition, while the operations invoked by other nodes are allowed to satisfy a weaker condition. The second contribution is the use of such a graph to provide a generic approach to the hybridization of data consistency conditions within the same system. We illustrate this approach on sequential consistency and causal consistency, and present a model in which all data operations are causally consistent, while operations by neighboring processes in the proximity graph are sequentially consistent. The third contribution of the paper is the design and the proof of a distributed algorithm based on this proximity graph, which combines sequential consistency and causal consistency (the resulting condition is called fisheye consistency). In doing so, the paper not only extends the domain of consistency conditions, but provides a generic, provably correct solution of direct relevance to modern georeplicated systems.
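    The proximity-graph idea described in this abstract can be sketched in a few lines: operations by nodes that are neighbours in the graph must satisfy the strong condition (sequential consistency), while all other pairs need only the weaker one (causal consistency). The class and names below are illustrative assumptions, not the paper's actual algorithm or API.

```python
# Hypothetical sketch of a proximity graph deciding which consistency
# condition governs a pair of nodes. Edges mark "close" nodes whose
# operations must be sequentially consistent; everything else only
# needs to be causally consistent.

class ProximityGraph:
    def __init__(self):
        self.edges = set()

    def connect(self, a, b):
        # Proximity is symmetric: record the edge in both directions.
        self.edges.add((a, b))
        self.edges.add((b, a))

    def required_consistency(self, a, b):
        """Consistency condition governing operations of nodes a and b."""
        if a == b or (a, b) in self.edges:
            return "sequential"   # strong condition for neighbours
        return "causal"           # weaker condition otherwise

g = ProximityGraph()
g.connect("paris", "london")      # e.g. geographically close replicas
print(g.required_consistency("paris", "london"))  # sequential
print(g.required_consistency("paris", "tokyo"))   # causal
```

    The point of the sketch is only the dispatch rule: the actual paper additionally designs and proves a distributed algorithm that enforces both conditions simultaneously.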

    Out of Nowhere: Spacetime from causality: causal set theory

    This is a chapter of the planned monograph "Out of Nowhere: The Emergence of Spacetime in Quantum Theories of Gravity", co-authored by Nick Huggett and Christian Wüthrich and under contract with Oxford University Press. (More information at www.beyondspacetime.net.) This chapter introduces causal set theory and identifies and articulates a 'problem of space' in the theory. Comment: 29 pages, 5 figures

    Asynchronous Message Orderings Beyond Causality


    Against the Tide. A Critical Review by Scientists of How Physics and Astronomy Get Done

    Nobody should have a monopoly on the truth in this universe. The censorship and suppression of challenging ideas that go against the tide of mainstream research, and the blacklisting of scientists, for instance, are neither the best way to do and filter science, nor to promote progress in human knowledge. The removal of good and novel ideas from the scientific stage is very detrimental to the pursuit of truth. There are instances in which a mere unqualified belief can occasionally be converted into a generally accepted scientific theory through the screening action of refereed literature and meetings planned by scientific organizing committees, and through the distribution of funds controlled by "club opinions". This leads to unitary paradigms and unitary thinking not necessarily associated with the unique truth. This is the topic of this book: to critically analyze the problems of the official (and sometimes illicit) mechanisms under which current science (physics and astronomy in particular) is administered and filtered today, along with the onerous consequences these mechanisms have for all of us.

    The authors, all of them professional researchers, reveal a pessimistic view of the miseries of the current system, while a glimmer of hope remains in the "leitmotiv" claim for freedom in doing research and for an acceptable level of ethics in science.

    Transcendental Aspects, Ontological Commitments and Naturalistic Elements in Nietzsche's Thought

    Nietzsche's views on knowledge have been interpreted in at least three incompatible ways - as transcendental, naturalistic or proto-deconstructionist. While the first two share a commitment to the possibility of objective truth, the third reading denies this by highlighting Nietzsche's claims about the necessarily falsifying character of human knowledge (his so-called error theory). This paper examines how his work can be construed as seeking to overcome the strict opposition between naturalism and transcendental philosophy whilst fully taking into account the error theory (interpreted non-literally, as a hyperbolic warning against uncritical forms of realism). In doing so, it clarifies the nature of Nietzsche's ontological commitments, in both the early and the later work, and shows that his relation to transcendental idealism is more subtle than naturalistic interpreters allow, while conversely accounting for the impossibility of conceiving the conditions of the possibility of knowledge as genuinely a priori.

    Distributed shared memory for virtual environments

    Bibliography: leaves 71-77.

    This work investigated making virtual environments easier to program by designing a suitable distributed shared memory system. To be usable, the system must keep latency to a minimum, as virtual environments are very sensitive to it. The resulting design is push-based and non-consistent. Another requirement is that the system should be scalable, both over large distances and over large numbers of participants. The latter is hard to achieve with current network protocols, and a proposal was made for a more scalable multicast addressing system than the one used in the Internet protocol. Two sample virtual environments were developed to test the ease of use of the system. This showed that the basic concept is sound, but that more support is needed. The next step should be to extend the language and add compiler support, which will enhance ease of use and allow numerous optimisations. This can be improved further by providing system-supported containers.
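    The push-based, non-consistent design described in this abstract can be illustrated with a minimal sketch: a write is pushed to every replica immediately, without locking or acknowledgements, so write latency stays low while readers may briefly observe stale values. All class and method names below are illustrative assumptions, not the thesis's actual API.

```python
# Minimal sketch of a push-based, weakly consistent shared-memory cell.
# Writes are pushed to all replicas without coordination; there is no
# consistency protocol, which is what keeps latency minimal.

class Replica:
    def __init__(self):
        self.store = {}

    def apply(self, key, value):
        # Last pushed write wins; no locking, no acknowledgements.
        self.store[key] = value

class PushDSM:
    def __init__(self, replicas):
        self.replicas = replicas

    def write(self, key, value):
        # Push the update to every replica without waiting for replies.
        # In a real system this would be an asynchronous multicast.
        for r in self.replicas:
            r.apply(key, value)

replicas = [Replica(), Replica()]
dsm = PushDSM(replicas)
dsm.write("avatar_pos", (10, 20))
print(replicas[0].store["avatar_pos"])  # (10, 20)
```

    In a real deployment the push would ride on the multicast addressing scheme the thesis proposes, rather than the loop shown here.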
