
    Determinism and the Antiquated Deontology of the Social Sciences

    This article shows how the social sciences, and human geography in particular, rejected hard determinism by the mid-twentieth century largely on the deontological ground that it is irreconcilable with social justice. Yet this rejection came just before a burst of creative development in consequentialist theories of social justice that problematize any facile rejection of determinism on moral grounds, a development that has seldom been recognized in the social sciences. Many current social science and human geography views on determinism and social justice are therefore antiquated, ignoring numerous common and well-respected arguments within philosophy that hard determinism can be reconciled with a just society. We support this argument by briefly tracing the parallel development of stances on determinism in the social sciences and of the deontological-consequentialist debate in philosophy. The purpose of the article is to resituate social science and human geography debates on determinism and social justice within a modern ethical framework.

    On determinism and well-posedness in multiple time dimensions

    We study the initial value problem for the wave equation and the ultrahyperbolic equation with data posed on an initial surface of mixed signature (both spacelike and timelike). Under a nonlocal constraint, we show that the Cauchy problem on codimension-one hypersurfaces has global unique solutions in the Sobolev spaces $H^m$, and is therefore well-posed. In contrast, we show that the initial value problem on higher-codimension hypersurfaces is ill-posed, at least when only a finite number of derivatives of the data are specified, owing to the failure of uniqueness. This stands in contrast to a uniqueness result that Courant and Hilbert deduce from Asgeirsson's mean value theorem, for which we give an independent derivation. The proofs use Fourier synthesis and the Holmgren-John uniqueness theorem.
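
    For context, the following is a standard formulation rather than a quotation from the paper: in the simplest two-time case, the ultrahyperbolic equation with Cauchy data on a hypersurface $\Sigma$ of mixed signature reads

        \[
          \partial_{t_1}^2 u + \partial_{t_2}^2 u = \partial_{x_1}^2 u + \cdots + \partial_{x_n}^2 u,
          \qquad u\big|_{\Sigma} = f, \qquad \partial_{\nu} u\big|_{\Sigma} = g,
        \]

    where $\partial_{\nu}$ is the derivative normal to $\Sigma$. Well-posedness in $H^m$ asks for existence, uniqueness, and continuous dependence of $u$ on the data $(f, g)$; the nonlocal constraint and the precise hypersurfaces considered are specific to the paper.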

    On Determinism and Unambiguity of Weighted Two-way Automata

    In this paper, we first study the conversion of weighted two-way automata into one-way automata. We show that this conversion preserves unambiguity but does not preserve determinism. Nevertheless, we prove that converting an unambiguous weighted one-way automaton into a two-way automaton yields a deterministic two-way automaton. As a consequence, we prove that unambiguous weighted two-way automata are equivalent to deterministic weighted two-way automata over commutative semirings.

    Comment: In Proceedings AFL 2014, arXiv:1405.527
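
    To make the distinction concrete, here is a minimal sketch of a weighted one-way automaton that is unambiguous but not deterministic. It is illustrative only: integer weights stand in for an arbitrary commutative semiring, and none of the names below come from the paper.

        class WeightedNFA:
            def __init__(self, init, final, delta):
                # delta maps (state, symbol) -> list of (next_state, weight)
                self.init, self.final, self.delta = init, final, delta

            def is_deterministic(self):
                # syntactic property: at most one transition per (state, symbol)
                return all(len(ts) <= 1 for ts in self.delta.values())

            def accepting_run_weights(self, word, state=None, weight=1):
                # yield the weight (product along the run) of every accepting run
                state = self.init if state is None else state
                if not word:
                    if state in self.final:
                        yield weight
                    return
                for nxt, w in self.delta.get((state, word[0]), []):
                    yield from self.accepting_run_weights(word[1:], nxt, weight * w)

            def is_unambiguous_on(self, word):
                # semantic property: at most one accepting run on this word
                return len(list(self.accepting_run_weights(word))) <= 1

            def weight(self, word):
                # behaviour: semiring sum (here +) over all accepting runs
                return sum(self.accepting_run_weights(word))

        # Two transitions leave q0 on 'a', so A is not deterministic, yet each
        # word has at most one accepting run, because only the q1 branch accepts.
        A = WeightedNFA(init='q0', final={'q1'},
                        delta={('q0', 'a'): [('q1', 2), ('q2', 3)],
                               ('q2', 'a'): [('q2', 1)]})
        assert not A.is_deterministic()
        assert A.is_unambiguous_on('a') and A.weight('a') == 2

    The paper's results concern the two-way versions of these notions; the sketch only illustrates the one-way definitions they build on.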

    Lorenz, Gödel and Penrose: New perspectives on determinism and causality in fundamental physics

    Although meteorologist Ed Lorenz is best known for his pioneering work on chaotic unpredictability, the key discovery at the core of his work is the link between space-time calculus and state-space fractal geometry. Indeed, properties of Lorenz's fractal invariant set relate space-time calculus to deep areas of mathematics such as Gödel's incompleteness theorem. These properties, combined with some recent developments in theoretical and observational cosmology, motivate what is referred to as the 'cosmological invariant set postulate': that the universe U can be considered a deterministic dynamical system evolving on a causal measure-zero fractal invariant set I_U in its state space. Symbolic representations of I_U are constructed explicitly based on permutation representations of quaternions. The resulting 'invariant set theory' provides some new perspectives on determinism and causality in fundamental physics. For example, whilst the cosmological invariant set appears to have a rich enough structure to allow a description of quantum probability, its measure-zero character ensures it is sparse enough to prevent invariant set theory being constrained by the Bell inequality (consistent with a partial violation of the so-called measurement independence postulate). The primacy of geometry as embodied in the proposed theory extends the principles underpinning general relativity. As a result, the physical basis for contemporary programmes which apply standard field quantisation to some putative gravitational Lagrangian is questioned. Consistent with Penrose's suggestion of a deterministic but non-computable theory of fundamental physics, a 'gravitational theory of the quantum' is proposed based on the geometry of I_U, with potential observational consequences for the dark universe.

    Comment: This manuscript has been accepted for publication in Contemporary Physics and is based on the author's 9th Dennis Sciama Lecture, given in Oxford and Trieste.
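
    For background, and not restated in the abstract, Lorenz's fractal invariant set is the attractor of the deterministic system

        \[
          \dot{x} = \sigma (y - x), \qquad \dot{y} = x(\rho - z) - y, \qquad \dot{z} = xy - \beta z,
        \]

    which for the classical parameter values $\sigma = 10$, $\rho = 28$, $\beta = 8/3$ confines trajectories to a measure-zero fractal subset of its three-dimensional state space; the cosmological invariant set postulate transfers this picture to the state space of the universe U.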

    POSIWID and determinism in design for behaviour change

    Copyright © 2012 Social Services Research Group. When designing to influence behaviour for social or environmental benefit, does designers' intent matter? Or are the effects on behaviour more important, regardless of the intent involved? This brief paper explores, in the context of design for behaviour change, some treatments of design, intentionality, purpose and responsibility from a variety of fields, including Stafford Beer's "The purpose of a system is what it does" and Maurice Broady's perspective on determinism. The paper attempts to extract useful implications for designers working on behaviour-related problems, in terms of analytical or reflective questions to ask during the design process.

    Strawson and Prasad on Determinism and Resentment

    P. F. Strawson's influential article "Freedom and Resentment" has been much commented on, and one of the most trenchant commentaries is Rajendra Prasad's "Reactive Attitudes, Rationality, and Determinism." In his article, Prasad contests the significance of the reactive attitudes against a precise theory of determinism, concluding that Strawson's argument is ultimately unconvincing. In this article, I evaluate Prasad's challenges to Strawson by summarizing and categorizing all of the relevant arguments in both Strawson's and Prasad's pieces. Strawson offers four types of arguments to demonstrate that determinism and free agency cannot be incompatible, showing that the reactive attitude is natural and desirable, and that the objective attitude is not natural, not desirable, not sustainable, and not compatible with the reactive attitude. Prasad targets Strawson's compatibilist arguments, aiming to show that determinism and free agency are incompatible. Of Prasad's seven types of arguments, four target Strawson's four above; three of these succeed and one fails. The remaining three target Strawson's support of the reactive attitude, and of these, one succeeds and the others fail. Although Prasad's arguments miss the mark at times, he does succeed in putting forth a legitimate challenge to Strawson's notion that determinism is no inhibitor of the reactive attitude.

    On determinism versus nondeterminism for restarting automata

    A restarting automaton processes a given word by executing a sequence of local simplifications until a simple word is obtained, which the automaton then accepts. Such a computation is expressed as a sequence of cycles. A nondeterministic restarting automaton M is called correctness preserving if, for each cycle u ⊢_M^c v, the string v belongs to the characteristic language L_C(M) of M whenever the string u does. Our first result states that for each type of restarting automaton X ∈ {R, RW, RWW, RL, RLW, RLWW}, if M is a nondeterministic X-automaton that is correctness preserving, then there exists a deterministic X-automaton M_1 such that the characteristic languages L_C(M_1) and L_C(M) coincide. When a restarting automaton M executes a cycle that transforms a string from the language L_C(M) into a string not belonging to L_C(M), this can be interpreted as an error of M. By counting the number of cycles it may take M to detect this error, we obtain a measure of the influence that errors have on computations; accordingly, this measure is called the error detection distance. It turns out, however, that an X-automaton with bounded error detection distance is equivalent to a correctness-preserving X-automaton, and therewith to a deterministic X-automaton. This means that nondeterminism increases the expressive power of X-automata only in combination with an unbounded error detection distance.
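
    As a toy illustration of the cycle behaviour described above: each cycle applies one length-reducing local rewrite and restarts. This is a simplification, not the formal X-automaton model, and the rule set and "simple word" test below are invented for the example.

        def restarting_run(word, rules, simple_words):
            # Each pass of the while-loop is one cycle u |-_M^c v: scan for an
            # applicable length-reducing rewrite, apply it once, then restart.
            # Nondeterministic choice is collapsed here to "first rule found".
            while word not in simple_words:    # accept once a simple word remains
                for factor, repl in rules:
                    i = word.find(factor)
                    if i != -1 and len(repl) < len(factor):
                        word = word[:i] + repl + word[i + len(factor):]
                        break                  # restart: begin the next cycle
                else:
                    return False               # no cycle applicable: reject
            return True

        # Instance recognising { a^n b^n : n >= 0 }: every cycle deletes one
        # innermost 'ab', so 'aaabbb' reduces to the simple word '' in 3 cycles.
        assert restarting_run('aaabbb', rules=[('ab', '')], simple_words={''})
        assert not restarting_run('aab', rules=[('ab', '')], simple_words={''})

    In this picture, correctness preservation is the requirement that every cycle keeps the tape content inside the characteristic language, whichever applicable rewrite a nondeterministic automaton happens to pick.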
