
    Joint attention and perceptual experience

    Joint attention customarily refers to the coordinated focus of attention between two or more individuals on a common object or event, where it is mutually “open” to all attenders that they are so engaged. We identify two broad approaches to analysing joint attention: one in terms of cognitive notions like common knowledge and common awareness, and one according to which joint attention is fundamentally a primitive phenomenon of sensory experience. John Campbell’s relational theory is a prominent representative of the latter approach and the main focus of this paper. We argue that Campbell’s theory is problematic for a variety of reasons, through which runs a common thread: most of the problems the theory faces arise from the relational view of perception that he endorses, and, more generally, they suggest that perceptual experience is not sufficient for an analysis of joint attention.

    What is "true" in internal realism?

    This paper is a critical examination of Putnam's theory of truth as it evolves from metaphysical to internal realism. First, I analyze the model-theoretic argument that led Putnam to abandon the metaphysical concept of truth as correspondence and to adopt an epistemic view of truth. Though a powerful critique of the metaphysical realist conception of truth, this argument does not establish conclusively that the concept of truth has any epistemic content. Secondly, I discuss Putnam's idealization theory of truth, arguing that the identification of truth with "acceptability under ideal conditions" is at odds with the claim that truth is context-transcendent, since the notion of justification is intrinsically context-dependent and no amount of idealization can redeem its contextual character. Finally, I suggest that the realist intuitions that Putnam's internal realism tries to capture call for no more than a deflationary view of truth. Acceptance of this view requires abandoning not only the idea that truth is an epistemic property, but also the idea that truth is a substantive property that all true statements share and, therefore, a proper object of philosophical theorizing.

    Negotiating ethical residue associated with bedside rationing: a grounded theory study with family physicians

    In any health care system where funds are limited, priorities must be set. Resource allocation, also called priority setting or rationing, may determine who receives treatment, what treatment they receive, or how much time they spend with professionals. Bedside rationing decisions are those that health care professionals make at the clinical level. Ethical distress and residue theory may inform the investigation of the discomfort and conflict identified in physicians who ration care. Ethical distress is experienced when external constraints make it nearly impossible to do “the right thing,” and ethical residue represents the traces that remain following unresolved ethical distress. This study uses a grounded theory methodology to explore how physicians negotiate, on an ongoing basis, the ethical residue associated with making bedside rationing decisions. The findings indicate that “doing everything I think patients need” is central to this process, and they may inform medical ethics education and training interventions for practicing physicians.

    Improvements to photometry. Part 1: Better estimation of derivatives in extinction and transformation equations

    Atmospheric extinction in wideband photometry is examined both analytically and through numerical simulations. If the derivatives that appear in the Stromgren-King theory are estimated carefully, it appears that wideband measurements can be transformed to outside the atmosphere with errors no greater than a millimagnitude. A numerical-analysis approach is used to estimate derivatives of both the stellar and atmospheric extinction spectra, avoiding the previous assumption that the extinction follows a power law. However, it is essential to satisfy the requirements of the sampling theorem to keep aliasing errors small. Typically, this means that band separations cannot exceed half of the full width at half-peak response. Further work is needed to examine higher-order effects, which may well be significant.
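
    A minimal sketch of this kind of numerical derivative estimation, using a placeholder extinction table (the wavelengths and coefficients below are illustrative, not values from the paper):

```python
import numpy as np

# Hypothetical tabulated extinction curve: wavelength (nm) versus extinction
# coefficient k(lambda) in magnitudes per airmass. The numbers are placeholders;
# real values would come from site-monitoring data.
wavelength = np.array([400.0, 450.0, 500.0, 550.0, 600.0, 650.0, 700.0])
k = np.array([0.48, 0.35, 0.26, 0.20, 0.16, 0.13, 0.11])

# Central-difference estimate of dk/dlambda on the tabulated grid
# (np.gradient falls back to one-sided differences at the endpoints).
dk_dlambda = np.gradient(k, wavelength)

# If extinction really followed a power law k ~ lambda^(-n), this log-log
# slope would be constant; any variation illustrates why the derivative is
# estimated numerically instead of being assumed from a power-law fit.
loglog_slope = np.gradient(np.log(k), np.log(wavelength))

print("dk/dlambda [mag/airmass/nm]:", dk_dlambda)
print("log-log slope:", loglog_slope)
```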

    Self-Relative (or Machian) Information: Entropy-Area Relation

    The entropy-area relation of black holes is one of the important results of theoretical physics. It is one of the few relations used to test theories of quantum gravity in the absence of any experimental evidence. It states that $4 \times \ell_P^2$ is the fundamental area that holds one bit of information. Consequently, a question arises: why is $4 \times \ell_P^2$, and not $1 \times \ell_P^2$, the fundamental holder of one bit of information? In any case, the latter choice seems more natural. We show that this question can be answered with a more explicit counting of the independent states of a black hole. To do this we introduce a method of counting which we name self-relative information. It says that a bit alone does not carry any information unless it is considered near other bits. Utilizing this approach we obtain the correct entropy-area relation for black holes with $1 \times \ell_P^2$ as the fundamental holder of one bit of information. This method also naturally predicts the existence of logarithmic corrections to the entropy-area relation.
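
    For reference, the standard entropy-area relation the abstract refers to, together with the generic shape of a logarithmic correction (the coefficient c_0 is model dependent and not taken from the paper):

```latex
% Bekenstein--Hawking entropy--area relation, in units with k_B = 1:
\[
  S_{\mathrm{BH}} \;=\; \frac{A}{4\,\ell_P^{2}},
  \qquad
  \ell_P^{2} \;=\; \frac{G\hbar}{c^{3}} .
\]
% Generic form of the logarithmic corrections mentioned in the abstract
% (the coefficient c_0 depends on the counting scheme):
\[
  S \;=\; \frac{A}{4\,\ell_P^{2}} \;+\; c_{0}\,\ln\!\frac{A}{4\,\ell_P^{2}} \;+\; \cdots
\]
```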

    Measurement in the de Broglie-Bohm interpretation: Double-slit, Stern-Gerlach and EPR-B

    Full text link
    We propose a pedagogical presentation of measurement in the de Broglie-Bohm interpretation. In this heterodox interpretation, the position of a quantum particle exists and is piloted by the phase of the wave function. We show how this position explains determinism and realism in the three most important experiments of quantum measurement: double-slit, Stern-Gerlach and EPR-B. First, we demonstrate the conditions in which the de Broglie-Bohm interpretation can be assumed to be valid through continuity with classical mechanics. Second, we present a numerical simulation of the double-slit experiment performed by Jönsson in 1961 with electrons. It demonstrates the continuity between classical mechanics and quantum mechanics: the evolution of the probability density at various distances and the convergence of the quantum trajectories to the classical trajectories when h tends to 0. Third, we present an analytic expression of the wave function in the Stern-Gerlach experiment. This explicit solution requires the calculation of a Pauli spinor with a spatial extension. It enables us to demonstrate the decoherence of the wave function and the three postulates of quantum measurement: quantization, the Born interpretation and wave function reduction. The spinor's spatial extension also enables the introduction of the de Broglie-Bohm trajectories, which gives a very simple explanation of the particles' impacts and of the measurement process. Finally, we study the EPR-B experiment, the Bohm version of the Einstein-Podolsky-Rosen experiment. Its theoretical resolution in space and time shows that a causal interpretation exists in which each atom has a position and a spin. This interpretation avoids the flaw of the previous causal interpretation. We recall that a physical explanation of non-local influences is possible.
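
    A minimal illustration (not the authors' simulation) of how de Broglie-Bohm trajectories are computed for a double-slit-like state: two freely spreading Gaussian packets in one transverse dimension guide the particles via v = (hbar/m) Im(d_x psi / psi). All parameters below are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative units and parameters (not taken from the paper).
hbar, m = 1.0, 1.0
sigma0 = 1.0      # initial width of each packet
d = 5.0           # half-separation of the two "slits"

def packet(x, t, a):
    """Freely spreading Gaussian packet centred at a with zero mean momentum."""
    alpha = 1.0 + 1j * hbar * t / (2.0 * m * sigma0**2)
    norm = (2.0 * np.pi * sigma0**2) ** -0.25 / np.sqrt(alpha)
    return norm * np.exp(-((x - a) ** 2) / (4.0 * sigma0**2 * alpha))

def packet_dx(x, t, a):
    """Analytic spatial derivative of the packet."""
    alpha = 1.0 + 1j * hbar * t / (2.0 * m * sigma0**2)
    return packet(x, t, a) * (-(x - a) / (2.0 * sigma0**2 * alpha))

def velocity(x, t):
    """Bohmian guidance velocity v = (hbar/m) Im(d_x psi / psi)."""
    psi = packet(x, t, -d) + packet(x, t, +d)
    dpsi = packet_dx(x, t, -d) + packet_dx(x, t, +d)
    return (hbar / m) * np.imag(dpsi / psi)

# At t = 0 the packets barely overlap, so |psi|^2 is approximately an equal
# mixture of two Gaussians of width sigma0 centred at -d and +d.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(rng.choice([-d, d], size=n), sigma0)

# Integrate dx/dt = v(x, t) with a simple Euler scheme.
t, dt, t_end = 0.0, 0.01, 20.0
while t < t_end:
    x = x + velocity(x, t) * dt
    t += dt

# The final positions cluster on interference fringes: each particle follows
# a definite trajectory, yet the ensemble reproduces the fringe pattern.
counts, edges = np.histogram(x, bins=40)
print(counts)
```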

    Problem Theory

    The Turing machine, as it was presented by Turing himself, models the calculations done by a person. This means that we can compute whatever any Turing machine can compute, and therefore we are Turing complete. The question addressed here is: why are we Turing complete? Being Turing complete also means that somehow our brain implements the function that a universal Turing machine implements. The point is that evolution achieved Turing completeness, so the explanation should be evolutionary, but our explanation is mathematical. The trick is to introduce a mathematical theory of problems, under the basic assumption that solving more problems provides more survival opportunities. So we build a problem theory by fusing set theory and computing theory. Then we construct a series of resolvers, where each resolver is defined by its computing capacity, that exhibits the following property: all problems solved by a resolver are also solved by the next resolver in the series if a certain condition is satisfied. The last of these conditions is Turing completeness. This series defines a hierarchy of resolvers that could be seen as a framework for the evolution of cognition. Then the answer to our question would be: to solve most problems. Along the way, the problem theory defines adaptation, perception, and learning, and it shows that there are just three ways to resolve any problem: routine, trial, and analogy. And, most importantly, this theory demonstrates how problems can be used to found mathematics and computing on biology.