3,276 research outputs found

    Four-colour photometry of eclipsing binaries. XLI uvby light curves for AD Bootis, HW Canis Majoris, SW Canis Majoris, V636 Centauri, VZ Hydrae, and WZ Ophiuchi

    CONTEXT: Accurate mass, radius, and abundance determinations from binaries provide important information on stellar evolution, fundamental to central fields in modern astrophysics and cosmology. AIMS: Within the long-term Copenhagen Binary Project, we aim to obtain high-quality light curves and standard photometry for double-lined detached eclipsing binaries with late A, F, and G type main-sequence components, needed for the determination of accurate absolute dimensions and abundances, and for detailed comparisons with results from recent stellar evolutionary models. METHODS: Between March 1985 and July 2007, we carried out photometric observations of AD Boo, HW CMa, SW CMa, V636 Cen, VZ Hya, and WZ Oph at the Strömgren Automatic Telescope at ESO, La Silla. RESULTS: We obtained complete uvby light curves, ephemerides, and standard uvby\beta indices for all six systems. For V636 Cen and HW CMa, we present the first modern light curves, whereas for AD Boo, SW CMa, VZ Hya, and WZ Oph, they are both more accurate and more complete than earlier data. Due to a high orbital eccentricity (e = 0.50) combined with a low orbital inclination (i = 84.7), only one eclipse, close to periastron, occurs for HW CMa. For the two other eccentric systems, V636 Cen (e = 0.134) and SW CMa (e = 0.316), apsidal motion has been detected with periods of 5270 +/- 335 and 14900 +/- 3600 years, respectively. Comment: Accepted for publication in Astronomy & Astrophysics.

    Facts, Values and Quanta

    Quantum mechanics is a fundamentally probabilistic theory (at least so far as the empirical predictions are concerned). It follows that, if one wants to properly understand quantum mechanics, it is essential to clearly understand the meaning of probability statements. The interpretation of probability has excited nearly as much philosophical controversy as the interpretation of quantum mechanics. Twentieth-century physicists have mostly adopted a frequentist conception. In this paper it is argued that we ought, instead, to adopt a logical or Bayesian conception. The paper includes a comparison of the orthodox and Bayesian theories of statistical inference. It concludes with a few remarks concerning the implications for the concept of physical reality. Comment: 30 pages, AMS LaTeX.

    A perspective on the landscape problem

    I discuss the historical roots of the landscape problem and propose criteria for its successful resolution. This provides a perspective from which to evaluate the possibility of solving it in several of the speculative cosmological scenarios under study, including eternal inflation, cosmological natural selection, and cyclic cosmologies. Comment: Invited contribution for a special issue of Foundations of Physics titled "Forty Years Of String Theory: Reflecting On the Foundations". 31 pages, no figures.

    Bayes and health care research.

    Bayes’ rule shows how one might rationally change one’s beliefs in the light of evidence. It is the foundation of a statistical method called Bayesianism. In health care research, Bayesianism has its advocates but the dominant statistical method is frequentism. There are at least two important philosophical differences between these methods. First, Bayesianism takes a subjectivist view of probability (i.e. that probability scores are statements of subjective belief, not objective fact) whilst frequentism takes an objectivist view. Second, Bayesianism is explicitly inductive (i.e. it shows how we may induce views about the world based on partial data from it) whereas frequentism is at least compatible with non-inductive views of scientific method, particularly the critical realism of Popper. Popper and others detail significant problems with induction. Frequentism’s apparent ability to avoid these, plus its ability to give a seemingly more scientific and objective take on probability, lies behind its philosophical appeal to health care researchers. However, there are also significant problems with frequentism, particularly its inability to assign probability scores to single events. Popper thus proposed an alternative objectivist view of probability, called propensity theory, which he allies to a theory of corroboration; but this too has significant problems, in particular, it may not successfully avoid induction. If this is so then Bayesianism might be philosophically the strongest of the statistical approaches. The article sets out a number of its philosophical and methodological attractions. Finally, it outlines a way in which critical realism and Bayesianism might work together.
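    The belief updating that the abstract describes can be sketched numerically. The scenario below (a diagnostic test with an invented prevalence, sensitivity, and false-positive rate) is purely illustrative and not taken from the article:

    ```python
    def bayes_update(prior, likelihood, evidence):
        """Posterior P(H|E) = P(E|H) * P(H) / P(E)."""
        return likelihood * prior / evidence

    # Hypothetical numbers: 1% prevalence, 90% sensitivity, 5% false positives.
    prior = 0.01
    p_pos_given_disease = 0.90
    p_pos = p_pos_given_disease * prior + 0.05 * (1 - prior)  # total probability

    posterior = bayes_update(prior, p_pos_given_disease, p_pos)
    print(round(posterior, 3))  # → 0.154
    ```

    Even a positive result from a fairly accurate test leaves the posterior low when the prior is low, which is the kind of single-case probability assignment the abstract notes frequentism struggles to express.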

    Playing dice with mice: building experimental futures in Singapore

    This is a postprint of an article published in New Genetics and Society, 2011, Vol. 30, Issue 4, pp. 433–441, © 2011 Taylor & Francis. New Genetics and Society is available online at: http://www.tandfonline.com/loi/cngs20. This short paper adds to debates on the unfolding spaces and logics of biotechnological development brought together in the 2009 special issue of New Genetics and Society on ‘Biopolitics in Asia’. Through an unlikely comparison between the development of the genomic sciences and the building of gambling casinos in the city state of Singapore, it reflects on the nature of political and technological investments in this South-East Asian city. It argues that Western expectations of a link between scientific practices and civic epistemologies linked to democratic decision-making are replaced by a rather different future orientation to scientific experimentation, economic investment and social development in Singapore.

    Is the quantum world composed of propensitons?

    In this paper I outline my propensiton version of quantum theory (PQT). PQT is a fully micro-realistic version of quantum theory that provides us with a very natural possible solution to the fundamental wave/particle problem, and is as a result free of the severe defects of orthodox quantum theory (OQT). PQT makes sense of the quantum world. PQT recovers all the empirical success of OQT and is, furthermore, empirically testable (although not as yet tested). I argue that Einstein almost put forward this version of quantum theory in 1916/17 in his papers on spontaneous and induced radiative transitions, but retreated from doing so because he disliked the probabilistic character of the idea. Subsequently, the idea was overlooked because debates about quantum theory polarised into the Bohr/Heisenberg camp, which argued for the abandonment of realism and determinism, and the Einstein/Schrödinger camp, which argued for the retention of realism and determinism; no one, as a result, pursued the most obvious option of retaining realism while abandoning determinism. It is this third, overlooked option that leads to PQT. PQT has implications for quantum field theory, the standard model, string theory, and cosmology. The really important point, however, is that it is experimentally testable. I indicate two experiments in principle capable of deciding between PQT and OQT.

    Accurate masses and radii of normal stars: modern results and applications

    This paper presents and discusses a critical compilation of accurate, fundamental determinations of stellar masses and radii. We have identified 95 detached binary systems containing 190 stars (94 eclipsing systems, and alpha Centauri) that satisfy our criterion that the mass and radius of both stars be known to 3% or better. To these we add interstellar reddening, effective temperature, metal abundance, rotational velocity and apsidal motion determinations when available, and we compute a number of other physical parameters, notably luminosity and distance. We discuss the use of this information for testing models of stellar evolution. The amount and quality of the data also allow us to analyse the tidal evolution of the systems in considerable depth, testing prescriptions of rotational synchronisation and orbital circularisation in greater detail than possible before. The new data also enable us to derive empirical calibrations of M and R for single (post-) main-sequence stars above 0.6 M(Sun). Simple, polynomial functions of T(eff), log g and [Fe/H] yield M and R with errors of 6% and 3%, respectively. Excellent agreement is found with independent determinations for host stars of transiting extrasolar planets, and good agreement with determinations of M and R from stellar models as constrained by trigonometric parallaxes and spectroscopic values of T(eff) and [Fe/H]. Finally, we list a set of 23 interferometric binaries with masses known to better than 3%, but without fundamental radius determinations (except alpha Aur). We discuss the prospects for improving these and other stellar parameters in the near future. Comment: 56 pages including figures and tables. To appear in The Astronomy and Astrophysics Review. ASCII versions of the tables will appear in the online version of the article.
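    A calibration of the kind the abstract describes (a polynomial in T(eff), log g and [Fe/H] returning mass) can be sketched as below. The functional form and every coefficient here are placeholder assumptions for illustration only; the published calibration and its fitted coefficients are given in the article itself:

    ```python
    import math

    # Placeholder coefficients: NOT the published values from the paper.
    COEFFS = [0.3, 1.5, -0.5, 0.1, -0.05, 0.0, 0.02]

    def log_mass(teff, logg, feh, coeffs=COEFFS):
        """Hypothetical sketch: log10(M/Msun) as a low-order polynomial
        in X = log10(Teff) - 4.1, log g, and [Fe/H]."""
        x = math.log10(teff) - 4.1
        terms = [1.0, x, x**2, x**3, logg, logg**2, feh]
        return sum(c * t for c, t in zip(coeffs, terms))

    # Solar-like inputs: Teff = 5777 K, log g = 4.44, [Fe/H] = 0.0.
    m = 10 ** log_mass(5777.0, 4.44, 0.0)
    print(f"M = {m:.2f} Msun (illustrative only)")
    ```

    The attraction of such a calibration is that T(eff), log g and [Fe/H] are spectroscopically measurable for single stars, so masses and radii anchored in binary data can be estimated without a companion.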

    Contracting for the unknown and the logic of innovation

    This paper discusses the components of contracts adequate for governing innovation, and their microfoundations in the logic of innovative decision processes. Drawing on models of discovery and design processes, distinctive logical features of innovative decision making are specified and connected to features of contracts that can sustain innovation processes and do not fail under radical uncertainty. It is argued that if new knowledge is to be generated under uncertainty and risk, 'relational contracts', as usually intended, are not enough; a more robust type of contracting is needed, and is in fact often used: formal constitutional contracts that associate resources, leave their uses rationally unspecified, but exhaustively specify the assignment of residual decision rights and other property rights, and the decision rules to be followed in governance. The argument is supported by an analysis of a large international database on the governance of multi-party projects in discovery-intensive and design-intensive industries.

    The ‘Galilean Style in Science’ and the Inconsistency of Linguistic Theorising

    Chomsky’s principle of epistemological tolerance says that in theoretical linguistics contradictions between the data and the hypotheses may be temporarily tolerated in order to protect the explanatory power of the theory. The paper raises the following problem: what kinds of contradictions may be tolerated between the data and the hypotheses in theoretical linguistics? First, a model of paraconsistent logic is introduced which differentiates between weak and strong contradiction. As a second step, a case study is carried out which exemplifies that the principle of epistemological tolerance may be interpreted as the tolerance of weak contradiction. The third step of the argumentation focuses on another case study which exemplifies that the principle of epistemological tolerance must not be interpreted as the tolerance of strong contradiction. The reason for the latter insight is the unreliability and uncertainty of introspective data. From this finding the author draws the conclusion that it is the integration of different data types that may lead to the improvement of current theoretical linguistics, and that the integration of different data types requires a novel methodology which, for the time being, is not available.