
    Smear fitting: a new deconvolution method for interferometric data

    A new technique is presented for producing images from interferometric data. The method, ``smear fitting'', makes the constraints necessary for interferometric imaging double as a model, with uncertainties, of the sky brightness distribution. It does this by modelling the sky with a set of functions and then convolving each component with its own elliptical Gaussian to account for the uncertainty in its shape and location that arises from noise. This yields much sharper resolution than CLEAN for significantly detected features, without sacrificing any sensitivity. Using appropriate functional forms for the components provides both a scientifically interesting model and imaging constraints that tend to be better than those used by traditional deconvolution methods. This allows it to avoid the most serious problems that limit the imaging quality of those methods. Comparisons of smear fitting to CLEAN and maximum entropy are given, using both real and simulated observations. It is also shown that the famous Rayleigh criterion (resolution = wavelength / baseline) is inappropriate for interferometers, as it does not consider the reliability of the measurements.
    Comment: 16 pages, 38 figures (some have been lossily compressed for astro-ph). Uses the hyperref LaTeX package. Accepted for publication by the Monthly Notices of the Royal Astronomical Society.
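    To make the mechanism concrete, here is a minimal sketch of the core idea (not the authors' code, and simplified to circular rather than elliptical Gaussians): each sky component is a point source whose convolution with its own Gaussian becomes, in the visibility (u, v) plane, multiplication by a Gaussian taper, and the component fluxes, positions, and smear widths are fitted directly to the measured visibilities.

        # Hypothetical sketch of smear fitting with point components and
        # circular Gaussian smears; a real implementation would use elliptical
        # Gaussians (major axis, minor axis, position angle) per component.
        import numpy as np
        from scipy.optimize import least_squares

        def model_vis(params, u, v):
            """Visibilities of point components, each smeared by its own Gaussian."""
            vis = np.zeros(len(u), dtype=complex)
            for F, x, y, sigma in params.reshape(-1, 4):
                taper = np.exp(-2.0 * np.pi**2 * sigma**2 * (u**2 + v**2))
                vis += F * taper * np.exp(2j * np.pi * (u * x + v * y))
            return vis

        def residuals(params, u, v, data, weights):
            r = (model_vis(params, u, v) - data) * np.sqrt(weights)
            return np.concatenate([r.real, r.imag])

        # Synthetic data standing in for a real measurement set.
        rng = np.random.default_rng(0)
        u, v = rng.normal(0, 1e4, 200), rng.normal(0, 1e4, 200)  # baselines [wavelengths]
        truth = np.array([1.0, 1e-5, -2e-5, 5e-6])               # flux [Jy], x, y, sigma [rad]
        data = model_vis(truth, u, v) + 0.01 * (rng.normal(size=200)
                                                + 1j * rng.normal(size=200))
        fit = least_squares(residuals, x0=[0.5, 0.0, 0.0, 1e-5],
                            args=(u, v, data, np.ones(200)))
        print(fit.x)  # fitted flux, position, and smear width

    The fitted sigma plays the role described in the abstract: it records how tightly the noise constrains each component's shape and location, so significantly detected features come out sharper than the CLEAN beam would allow.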

    Method Pluralism, Method Mismatch & Method Bias

    Pluralism about scientific method is more-or-less accepted, but its consequences have yet to be drawn out. Scientists adopt different methods in response to different epistemic situations: depending on the system they are interested in, the resources at their disposal, and so forth. If it is right that different methods are appropriate in different situations, then mismatches between methods and situations are possible. This is most likely to occur through method bias: when we prefer a particular kind of method despite that method clashing with the evidential context or with our aims. To explore these ideas, we sketch a kind of method pluralism which turns on two properties of evidence, before using agent-based models to examine the relationship between methods, epistemic situations, and bias. Based on our results, we suggest that although method bias can undermine the efficiency of a scientific community, it can also be productive by preserving a diversity of evidence. We consider circumstances where method bias could be particularly egregious, and those where it is a potential virtue, and argue that consideration of method bias reveals that community standards deserve a central place in the epistemology of science.
    Templeton World Charity Foundation
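    Since the argument leans on agent-based models, a toy version may help fix ideas. Everything below is an assumption for illustration (two methods, two kinds of epistemic situation, a fixed penalty for mismatch), not the authors' model:

        # Hypothetical toy model: biased agents use their preferred method even
        # when the epistemic situation calls for the other one.
        import random

        N_AGENTS, N_ROUNDS, BIAS_RATE = 100, 1000, 0.3

        agents = [{"preferred": random.choice([0, 1]),
                   "biased": random.random() < BIAS_RATE}
                  for _ in range(N_AGENTS)]

        quality, counts = 0.0, [0, 0]
        for _ in range(N_ROUNDS):
            agent = random.choice(agents)
            situation = 0 if random.random() < 0.9 else 1   # situations mostly favour method 0
            method = agent["preferred"] if agent["biased"] else situation
            quality += 1.0 if method == situation else 0.4  # mismatched evidence is weaker
            counts[method] += 1

        print(f"mean evidence quality: {quality / N_ROUNDS:.2f}")
        print(f"uses per method: {counts}")

    Raising BIAS_RATE lowers mean evidence quality (the efficiency cost), but it also keeps the rarely-called-for method in regular use (the diversity benefit), which mimics the trade-off the abstract describes.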

    Bridging the gap between general probabilistic theories and the device-independent framework for nonlocality and contextuality

    Characterizing quantum correlations in terms of information-theoretic principles is a popular theme in quantum foundations. Traditionally, the principles adopted for this purpose have been expressed in terms of conditional probability distributions, specifying the probability that a black box produces a certain output upon receiving a certain input. This framework is known as "device-independent". Another major theme in quantum foundations is the information-theoretic characterization of quantum theory itself, with its sets of states and measurements and its allowed dynamics. The frameworks adopted for this purpose are known under the umbrella term "general probabilistic theories". With only a few exceptions, the two programmes of characterizing quantum correlations and characterizing quantum theory have so far proceeded on separate tracks, each developing its own methods and its own agenda. This paper aims at bridging the gap, by comparing the two frameworks and illustrating how the two programmes can benefit each other.
    Comment: 61 pages, no figures, published version.
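    In the device-independent framework the basic object is just the table of conditional probabilities P(a, b | x, y). As a hedged illustration (the construction below is the textbook Tsirelson-optimal CHSH behaviour, not anything specific to this paper), one can write down such a behaviour and evaluate the CHSH expression on it:

        # A device-independent "behavior" is a table P[a, b, x, y]; here we
        # build the quantum behavior achieving Tsirelson's bound and evaluate
        # CHSH: S = E(0,0) + E(0,1) + E(1,0) - E(1,1).
        import numpy as np

        def chsh_quantum_behavior():
            P = np.zeros((2, 2, 2, 2))
            for x in range(2):
                for y in range(2):
                    # Correlators at optimal measurement angles: +/- 1/sqrt(2).
                    E = (1 if (x, y) != (1, 1) else -1) / np.sqrt(2)
                    for a in range(2):
                        for b in range(2):
                            s = 1 if a == b else -1
                            P[a, b, x, y] = (1 + s * E) / 4
            return P

        P = chsh_quantum_behavior()
        S = sum((-1) ** (x * y) *
                sum((1 if a == b else -1) * P[a, b, x, y]
                    for a in range(2) for b in range(2))
                for x in range(2) for y in range(2))
        print(S)  # 2*sqrt(2) ~ 2.828, above the local bound of 2

    Nothing in the table refers to states or measurements; that is exactly the black-box abstraction the paper contrasts with the general-probabilistic-theories description.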

    The Standard Model of Quantum Measurement Theory: History and Applications

    The standard model of the quantum theory of measurement is based on an interaction Hamiltonian in which the observable-to-be-measured is multiplied by some observable of a probe system. This simple Ansatz has proved extremely fruitful in the development of the foundations of quantum mechanics. While the ensuing type of models has often been argued to be rather artificial, recent advances in quantum optics have demonstrated their principal and practical feasibility. A brief historical review of the standard model, together with an outline of its virtues and limitations, is presented as an illustration of the mutual inspiration that has always taken place between foundational and experimental research in quantum physics.
    Comment: 22 pages, to appear in Found. Phys. 199
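    For readers who want the Ansatz itself: the abstract does not display it, but the standard (von Neumann) form of the coupling is, in LaTeX notation (sign and factor conventions vary between treatments),

        % Standard (von Neumann) measurement coupling: A is the measured
        % observable, P the momentum conjugate to the probe's pointer position Q.
        H_{\mathrm{int}}(t) = \lambda(t)\, A \otimes P

        % For an impulsive coupling of integrated strength g = \int \lambda(t)\,dt,
        % the pointer is displaced by an amount set by the eigenvalue a of A:
        e^{-\frac{i}{\hbar} g\, A \otimes P}\,\bigl(|a\rangle \otimes \psi(Q)\bigr)
          = |a\rangle \otimes \psi(Q - g a)

    Reading off the pointer position Q then amounts to a measurement of A, which is why this one multiplication of observables underwrites the whole family of models the paper reviews.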

    Stationarity of Inflation and Predictions of Quantum Cosmology

    We describe several different regimes which are possible in inflationary cosmology. The simplest one is inflation without self-reproduction of the universe. In this scenario the universe is not stationary. The second regime, which exists in a broad class of inflationary models, is eternal inflation with self-reproduction of inflationary domains. In this regime the local properties of domains with a given density and given values of fields do not depend on the time when these domains were produced. The probability distribution to find a domain with given properties in a self-reproducing universe may or may not be stationary, depending on the choice of inflationary model. We give examples of models where each of these possibilities is realized, and discuss some implications of our results for quantum cosmology. In particular, we propose a new mechanism which may help to solve the cosmological constant problem.
    Comment: 30 pages, Stanford preprint SU-ITP-94-24, LaTeX
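    The notion of stationarity here can be made precise in the stochastic approach to inflation. As a hedged sketch (the abstract does not write the equation, and factor-ordering conventions differ between papers), the distribution P(\phi, t) of the inflaton field is commonly taken to obey a Fokker-Planck equation of the form

        % Fokker-Planck equation for the inflaton distribution P(\phi, t):
        % quantum diffusion (first term) competing with classical rolling in V(\phi).
        \frac{\partial P}{\partial t}
          = \frac{\partial}{\partial \phi}
            \left[ \frac{H^{3}(\phi)}{8\pi^{2}}\,\frac{\partial P}{\partial \phi}
                 + \frac{V'(\phi)}{3 H(\phi)}\, P \right]

    and a stationary regime in the sense of the abstract is one with \partial P / \partial t = 0, i.e. a distribution over domains that no longer depends on when the domains were produced.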

    The Methods of Normativity

    Get PDF
    This essay is an examination of the relationship between phenomenology and analytic method in the philosophy of law. It proceeds by way of a case study, the requirement of compliance in Raz’s theory of mandatory norms. Proceeding in this way provides a degree of specificity that is otherwise neglected in the relevant literature on method. Drawing on insights from the philosophy of art and cognitive neuroscience, it is argued that the requirement of compliance is beset by a range of epistemological difficulties. The implications of these difficulties are then reviewed for method and normativity in practical reason. A topology of normativity emerges nearer the end of the paper, followed by a brief examination of how certain normative categories must satisfy distinct burdens of proof.