Theories of Reference: What Was the Question?
The new theory of reference has gained wide popularity. However, a number of noted philosophers have attempted to reply to the critical arguments of Kripke and others, aiming to vindicate the description theory of reference. Such responses are often based on ingenious novel kinds of descriptions, such as rigidified descriptions, causal descriptions, and metalinguistic descriptions. This prolonged debate raises the doubt of whether the different parties really share any common understanding of what the central question of the philosophical theory of reference is: what is the main question to which descriptivism and the causal-historical theory have presented competing answers? One aim of the paper is to clarify this issue. The most influential objections to the new theory of reference are critically reviewed. Special attention is also paid to certain important later advances in the new theory of reference, due to Devitt and others.
Quantum System Identification by Bayesian Analysis of Noisy Data: Beyond Hamiltonian Tomography
We consider how to characterize the dynamics of a quantum system from a
restricted set of initial states and measurements using Bayesian analysis.
Previous work has shown that Hamiltonian systems can be well estimated from
analysis of noisy data. Here we show how to generalize this approach to systems
with moderate dephasing in the eigenbasis of the Hamiltonian. We illustrate the
process for a range of three-level quantum systems. The results suggest that
the Bayesian estimation of the frequencies and dephasing rates is generally
highly accurate, and that the main source of error is inaccuracy in the
reconstructed Hamiltonian basis.
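As a rough illustration of the kind of estimation described above, the sketch below fits an oscillation frequency and a dephasing rate to simulated noisy data via a grid-based Bayesian posterior. The signal model, parameter values, grid, and noise level are assumptions made for the sketch, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-level signal model: survival probability with
# oscillation frequency omega and dephasing rate gamma (illustration only).
def model(t, omega, gamma):
    return 0.5 * (1.0 + np.exp(-gamma * t) * np.cos(omega * t))

# Simulate noisy measurement data at the true parameters.
true_omega, true_gamma = 2.0, 0.1
t = np.linspace(0.0, 10.0, 50)
data = model(t, true_omega, true_gamma) + rng.normal(0.0, 0.02, t.size)

# Grid posterior under a Gaussian noise likelihood and flat priors.
omegas = np.linspace(1.0, 3.0, 201)
gammas = np.linspace(0.0, 0.5, 101)
log_post = np.zeros((omegas.size, gammas.size))
for i, w in enumerate(omegas):
    for j, g in enumerate(gammas):
        resid = data - model(t, w, g)
        log_post[i, j] = -0.5 * np.sum(resid**2) / 0.02**2

# Posterior-mode estimates of the frequency and dephasing rate.
i, j = np.unravel_index(np.argmax(log_post), log_post.shape)
est_omega, est_gamma = omegas[i], gammas[j]
```

With moderate noise, the posterior mode recovers both parameters closely; in the full problem the same machinery runs over the joint space of Hamiltonian and dephasing parameters.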
Precision characterisation of two-qubit Hamiltonians via entanglement mapping
We show that the general Heisenberg Hamiltonian with non-uniform couplings
can be characterised by mapping the entanglement it generates as a function of
time. Identification of the Hamiltonian in this way is possible as the
coefficients of each operator control the oscillation frequencies of the
entanglement function. The number of measurements required to achieve a given
precision in the Hamiltonian parameters is determined and an efficient
measurement strategy designed. We derive the relationship between the number of
measurements, the resulting precision and the ultimate discrete error
probability generated by a systematic mis-characterisation, when implementing
two-qubit gates for quantum computing.
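The core idea, that the coupling coefficients control the oscillation frequencies of the entanglement function, can be seen in a minimal numerical sketch. The uniform coupling J, the initial state, and the units (hbar = 1) are assumptions for illustration; the paper treats the general non-uniform case:

```python
import numpy as np

# Pauli matrices and a uniform two-qubit Heisenberg Hamiltonian
# H = J (XX + YY + ZZ), with hbar = 1.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
J = 0.5
H = J * (np.kron(X, X) + np.kron(Y, Y) + np.kron(Z, Z))

# Concurrence of a pure two-qubit state: C = |<psi| Y(x)Y |psi*>|.
YY = np.kron(Y, Y)
def concurrence(psi):
    return abs(psi.conj() @ YY @ psi.conj())

# Evolve |01> and map the entanglement it generates as a function of time.
evals, evecs = np.linalg.eigh(H)
psi0 = np.array([0, 1, 0, 0], dtype=complex)
times = np.linspace(0.0, 2 * np.pi, 200)
C = np.array([
    concurrence(evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T @ psi0)
    for t in times
])
```

For this Hamiltonian the concurrence oscillates as |sin(4Jt)|, so reading off the oscillation frequency of the entanglement function identifies the coupling strength J.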
Mind and body, form and content: how not to do petitio principii analysis
Few theoretical insights have emerged from the extensive literature discussions of the petitio principii argument. In particular, the pattern of petitio analysis has largely been one of movement between the two sides of a dichotomy, that of form and content. In this paper, I trace the basis of this dichotomy to a dualist conception of mind and world. I argue for the rejection of the form/content dichotomy on the ground that its dualist presuppositions generate a reductionist analysis of certain concepts which are central to the analysis of petitio argument. I contend, for example, that no syntactic relation can assimilate within its analysis the essentially holistic nature of a notion like justification. In this regard, I expound a form of dialectical criticism which has been frequently employed in the philosophical arguments of Hilary Putnam. Here the focus of analysis is upon the way in which the proponent of a position proceeds to explain or argue for his/her own particular theses. My conclusion points to the use of such dialectic within future analyses of petitio principii.
Justifying the Special Theory of Relativity with Unconceived Methods
Many realists argue that present scientific theories will not follow the fate of past scientific theories because the former are more successful than the latter. Critics object that realists need to show that present theories have reached the level of success that warrants their truth. I reply that the special theory of relativity has been repeatedly reinforced by unconceived scientific methods, so it will be reinforced by infinitely many unconceived scientific methods. This argument for the special theory of relativity overcomes the critics' objection, and has advantages over the no-miracle argument and the selective induction for it.
Apoptotic cell-derived ICAM-3 promotes both macrophage chemoattraction to and tethering of apoptotic cells
A wide range of molecules acting as apoptotic cell-associated ligands, phagocyte-associated receptors or soluble bridging molecules have been implicated within the complex sequential processes that result in phagocytosis and degradation of apoptotic cells. Intercellular adhesion molecule 3 (ICAM-3, also known as CD50), a human leukocyte-restricted immunoglobulin super-family (IgSF) member, has previously been implicated in apoptotic cell clearance, although its precise role in the clearance process is ill defined. The main objective of this work is to further characterise the function of ICAM-3 in the removal of apoptotic cells. Using a range of novel anti-ICAM-3 monoclonal antibodies (mAbs), including one (MA4) that blocks apoptotic cell clearance by macrophages, alongside apoptotic human leukocytes that are normal or deficient for ICAM-3, we demonstrate that ICAM-3 promotes a domain 1–2-dependent tethering interaction with phagocytes. Furthermore, we demonstrate an apoptosis-associated reduction in ICAM-3 that results from release of ICAM-3 within microparticles that potently attract macrophages to apoptotic cells. Taken together, these data suggest that apoptotic cell-derived microparticles bearing ICAM-3 promote macrophage chemoattraction to sites of leukocyte cell death and that ICAM-3 mediates subsequent cell corpse tethering to macrophages. The defined function of ICAM-3 in these processes and the profound defect in chemotaxis noted to ICAM-3-deficient microparticles suggest that ICAM-3 may be an important adhesion molecule involved in chemotaxis to apoptotic human leukocytes.
A Self-Reference False Memory Effect in the DRM Paradigm: Evidence from Eastern and Western Samples
It is well established that processing information in relation to oneself (i.e., self-referencing) leads to better memory for that information than processing the same information in relation to others (i.e., other-referencing). However, it is unknown whether self-referencing also leads to more false memories than other-referencing. In the current two experiments with European and East Asian samples, we presented participants with Deese-Roediger-McDermott (DRM) lists together with their own name or another person's name (i.e., "Trump" in Experiment 1 and "Li Ming" in Experiment 2). We found consistent results across the two experiments; that is, in the self-reference condition, participants had higher true and false memory rates than those in the other-reference condition. Moreover, we found that self-referencing did not exhibit a superior mnemonic advantage in terms of net accuracy compared to the other-reference and neutral conditions. These findings are discussed in terms of theoretical frameworks such as spreading-activation theories and fuzzy-trace theory. We propose that our results reflect the adaptive nature of memory, in the sense that cognitive processes that increase mnemonic efficiency may also increase susceptibility to associative false memories.
The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector.
The development and operation of liquid-argon time-projection chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies.
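The multi-algorithm pattern described above, in which many small algorithms each address one task and together build up the event picture, can be sketched schematically. The class names, the proximity-clustering rule, and the cosmic-tagging heuristic below are invented for illustration and do not reflect the actual Pandora SDK API:

```python
from dataclasses import dataclass, field

# Schematic multi-algorithm reconstruction chain (hypothetical names,
# not the Pandora SDK API): each algorithm mutates a shared event record.
@dataclass
class Event:
    hits: list
    clusters: list = field(default_factory=list)
    tags: dict = field(default_factory=dict)

class Algorithm:
    def run(self, event: Event) -> None:
        raise NotImplementedError

class ClusterByProximity(Algorithm):
    """Group 1-D hit positions whose spacing is below a threshold."""
    def __init__(self, gap=1.5):
        self.gap = gap
    def run(self, event):
        current = []
        for hit in sorted(event.hits):
            if current and hit - current[-1] > self.gap:
                event.clusters.append(current)
                current = []
            current.append(hit)
        if current:
            event.clusters.append(current)

class TagCosmicLike(Algorithm):
    """Tag long clusters as cosmic-ray-like, short ones as neutrino-like."""
    def run(self, event):
        for i, c in enumerate(event.clusters):
            event.tags[i] = "cosmic" if len(c) >= 5 else "neutrino"

# Each algorithm addresses one task; the chain builds up the event picture.
def reconstruct(event, chain):
    for alg in chain:
        alg.run(event)
    return event

event = reconstruct(Event(hits=[0, 1, 2, 3, 4, 10, 11]),
                    [ClusterByProximity(), TagCosmicLike()])
```

The design choice mirrored here is that no single algorithm attempts the whole reconstruction; robustness comes from composing many narrowly scoped, individually testable steps.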
Convolutional Neural Networks Applied to Neutrino Events in a Liquid Argon Time Projection Chamber
We present several studies of convolutional neural networks applied to data
coming from the MicroBooNE detector, a liquid argon time projection chamber
(LArTPC). The algorithms studied include the classification of single particle
images, the localization of single particle and neutrino interactions in an
image, and the detection of a simulated neutrino event overlaid with cosmic ray
backgrounds taken from real detector data. These studies demonstrate the
potential of convolutional neural networks for particle identification or event
detection on simulated neutrino interactions. We also address technical issues
that arise when applying this technique to data from a large LArTPC at or near
ground level.
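A minimal numpy sketch of the kind of convolutional classification involved is given below; the toy "track" image, filter shapes, and two-class head are assumptions for illustration and bear no relation to the actual networks studied in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal CNN forward pass for small "detector images" (schematic sketch).
def conv2d(img, kernels):
    # Valid-mode 2-D cross-correlation of one image with a stack of kernels.
    kh, kw = kernels.shape[1:]
    h, w = img.shape
    out = np.zeros((kernels.shape[0], h - kh + 1, w - kw + 1))
    for k, ker in enumerate(kernels):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[k, i, j] = np.sum(img[i:i + kh, j:j + kw] * ker)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def global_max_pool(x):
    # One scalar per feature map.
    return x.reshape(x.shape[0], -1).max(axis=1)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy 16x16 image with a bright diagonal "track"; random, untrained weights.
img = np.zeros((16, 16))
img[np.arange(16), np.arange(16)] = 1.0
kernels = rng.normal(size=(4, 3, 3))
W = rng.normal(size=(2, 4))  # two hypothetical classes, e.g. track vs shower

probs = softmax(W @ global_max_pool(relu(conv2d(img, kernels))))
```

In practice such networks are trained on labelled (simulated) events; the sketch only shows the data flow from pixel image to class probabilities.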