Bayesian Conditioning, the Reflection Principle, and Quantum Decoherence
The probabilities a Bayesian agent assigns to a set of events typically
change with time, for instance when the agent updates them in the light of new
data. In this paper we address the question of how an agent's probabilities at
different times are constrained by Dutch-book coherence. We review and attempt
to clarify the argument that, although an agent is not forced by coherence to
use the usual Bayesian conditioning rule to update his probabilities, coherence
does require the agent's probabilities to satisfy van Fraassen's [1984]
reflection principle (which entails a related constraint pointed out by
Goldstein [1983]). We then exhibit the specialized assumption needed to recover
Bayesian conditioning from an analogous reflection-style consideration.
Bringing the argument to the context of quantum measurement theory, we show
that "quantum decoherence" can be understood in purely personalist
terms---quantum decoherence (as supposed in a von Neumann chain) is not a
physical process at all, but an application of the reflection principle. From
this point of view, the decoherence theory of Zeh, Zurek, and others as a story
of quantum measurement has the plot turned exactly backward.
Comment: 14 pages, written in memory of Itamar Pitowsky
Nuclear rDNA-based molecular clock of the evolution of triatominae (Hemiptera: Reduviidae), vectors of Chagas disease
The evolutionary history and times of divergence of triatomine bug lineages are estimated from molecular clocks inferred from nucleotide sequences of the small subunit SSU (18S) and the second internal transcribed spacer (ITS-2) of the nuclear ribosomal DNA of these reduviids. The 18S rDNA molecular clock rate in Triatominae, and Prosorrhynchan Hemiptera in general, appears to be 1.8% per 100 million years (my). The ITS-2 molecular clock rate in Triatominae is estimated to be around 0.4-1% per 1 my, indicating that ITS-2 evolves 23-55 times faster than 18S rDNA. Inferred chronological data about the evolution of Triatominae fit well with current hypotheses on their evolutionary histories, but suggest reconsideration of the current taxonomy of North American species complexes.
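The quoted rate comparison can be checked with simple arithmetic; a minimal sketch using only the two clock rates stated in the abstract above:

```python
# 18S rDNA clock: 1.8% divergence per 100 million years (My),
# expressed here as percent per My.
rate_18s = 1.8 / 100  # = 0.018% per My

# ITS-2 clock: 0.4-1% divergence per My.
rate_its2_low, rate_its2_high = 0.4, 1.0  # percent per My

# Ratio of the two clocks; close to the 23-55-fold range quoted above,
# with small differences attributable to rounding of the underlying rates.
ratio_low = rate_its2_low / rate_18s    # ~22
ratio_high = rate_its2_high / rate_18s  # ~56
print(f"ITS-2 evolves roughly {ratio_low:.0f}-{ratio_high:.0f} times faster than 18S rDNA")
```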
Palaeoclimate inferred from δ18O and palaeobotanical indicators in freshwater tufa of Lake Äntu Sinijärv, Estonia
We investigated a 3.75-m-long lacustrine sediment record from Lake Äntu Sinijärv, northern Estonia, which has a modeled basal age >12,800 cal yr BP. Our multi-proxy approach focused on the stable oxygen isotope composition (δ18O) of freshwater tufa. Our new palaeoclimate information for the Eastern Baltic region, based on high-resolution δ18O data (219 samples), is supported by pollen and plant macrofossil data. Radiocarbon dates were used to develop a core chronology and estimate sedimentation rates. Freshwater tufa precipitation started ca. 10,700 cal yr BP, ca. 2,000 years later than suggested by previous studies on the same lake. Younger Dryas cooling is documented clearly in Lake Äntu Sinijärv sediments by the abrupt appearance of diagnostic pollen (Betula nana, Dryas octopetala), the highest mineral matter content in the sediments (up to 90 %) and low values of δ18O (less than −12 ‰). The globally recognized 9.3- and 8.2-ka cold events are weakly defined by negative shifts in δ18O values, to −11.3 and −11.7 ‰, respectively, and low concentrations of herb pollen and charcoal particles. The Holocene thermal maximum (HTM) is palaeobotanically well documented by the first appearance and establishment of nemoral thermophilous taxa and the presence of water lilies requiring warm conditions. Isotope values show an increasing trend during the HTM, from −11.5 to −10.5 ‰. Relatively stable environmental conditions, represented by only a small-scale increase in δ18O (up to 1 ‰) and high pollen concentrations between 5,000 and 3,000 cal yr BP, were followed by a decrease in δ18O, reaching the most negative value (−12.7 ‰) recorded in the freshwater tufa ca. 900 cal yr BP.
Evaluation of physicians' professional performance: An iterative development and validation study of multisource feedback instruments
BACKGROUND: There is a global need to assess physicians' professional performance in actual clinical practice. Valid and reliable instruments are necessary to support these efforts. This study focuses on the reliability and validity, the influences of some sociodemographic biasing factors, associations between self and other evaluations, and the number of evaluations needed for reliable assessment of a physician, based on the three instruments used for the multisource assessment of physicians' professional performance in the Netherlands. METHODS: This observational validation study of three instruments underlying multisource feedback (MSF) was set in 26 non-academic hospitals in the Netherlands. In total, 146 hospital-based physicians took part in the study. Each physician's professional performance was assessed by peers (physician colleagues), co-workers (including nurses, secretary assistants and other healthcare professionals) and patients. Physicians also completed a self-evaluation. Ratings of 864 peers, 894 co-workers and 1960 patients on MSF were available. We used principal components analysis and methods of classical test theory to evaluate the factor structure, reliability and validity of the instruments. We used Pearson's correlation coefficient and linear mixed models to address the other objectives. RESULTS: The peer, co-worker and patient instruments respectively had six factors, three factors and one factor with high internal consistencies (Cronbach's alpha 0.95 - 0.96). It appeared that only 2 percent of variance in the mean ratings could be attributed to biasing factors. Self-ratings were not correlated with peer, co-worker or patient ratings. However, ratings of peers, co-workers and patients were correlated. Five peer evaluations, five co-worker evaluations and 11 patient evaluations are required to achieve reliable results (reliability coefficient ≥ 0.70). 
CONCLUSIONS: The study demonstrated that the three MSF instruments produced reliable and valid data for evaluating physicians' professional performance in the Netherlands. Scores from peers, co-workers and patients were not correlated with self-evaluations. Future research should examine whether performance improves when MSF is used.
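The relationship between the number of evaluations and the reliability of their mean can be illustrated with the Spearman-Brown prophecy formula; a minimal sketch, where the single-rating reliabilities are illustrative assumptions chosen to reproduce the counts quoted above, not figures taken from the study:

```python
def composite_reliability(r: float, k: int) -> float:
    # Spearman-Brown prophecy formula: reliability of the mean of k ratings,
    # given the reliability r of a single rating.
    return k * r / (1 + (k - 1) * r)

def raters_needed(r: float, target: float = 0.70) -> int:
    # Smallest number of ratings whose composite reliability reaches the target.
    k = 1
    while composite_reliability(r, k) < target:
        k += 1
    return k

# Hypothetical single-rating reliabilities (assumed for illustration only):
for label, r in [("peer", 0.32), ("co-worker", 0.32), ("patient", 0.18)]:
    print(label, raters_needed(r))  # 5, 5 and 11 evaluations respectively
```

With these assumed inputs the formula yields five peer, five co-worker and 11 patient evaluations for a reliability coefficient of at least 0.70, matching the numbers reported in the abstract.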
Very Cold Gas and Dark Matter
We have recently proposed a new candidate for baryonic dark matter: very cold
molecular gas, in near-isothermal equilibrium with the cosmic background
radiation at 2.73 K. The cold gas, of quasi-primordial abundances, is condensed
in a fractal structure, resembling the hierarchical structure of the detected
interstellar medium.
We present some perspectives of detecting this very cold gas, either directly
or indirectly. The H2 molecule has an "ultrafine" structure, due to the
interaction between the rotation-induced magnetic moment and the nuclear spins.
But the lines fall in the km domain, and are very weak. The best opportunity
might be the UV absorption of H2 in front of quasars. The unexpected cold
dust component, revealed by the COBE/FIRAS submillimetric results, could also
be due to this very cold H2 gas, through collision-induced radiation, or
solid H2 grains or snowflakes. The gamma-ray distribution, much more
radially extended than the supernovae at the origin of cosmic-ray
acceleration, also points towards an extended gas distribution.
Comment: 16 pages, LaTeX, crckapb macro, 3 postscript figures, uuencoded
compressed tar file. To be published in the proceedings of the
"Dust-Morphology" conference, Johannesburg, 22-26 January, 1996, D. Block
(ed.), (Kluwer, Dordrecht)
MSSM Baryogenesis and Electric Dipole Moments: An Update on the Phenomenology
We explore the implications of electroweak baryogenesis for future searches
for permanent electric dipole moments in the context of the minimal
supersymmetric extension of the Standard Model (MSSM). From a cosmological
standpoint, we point out that regions of parameter space that over-produce
relic lightest supersymmetric particles can be salvaged only by assuming a
dilution of the particle relic density that makes it compatible with the dark
matter density: this dilution must occur after dark matter freeze-out, which
ordinarily takes place after electroweak baryogenesis, implying the same degree
of dilution for the generated baryon number density as well. We expand on
previous studies on the viable MSSM regions for baryogenesis, exploring for the
first time an orthogonal slice of the relevant parameter space, namely the
(tan\beta, m_A) plane, and the case of non-universal relative gaugino-higgsino
CP violating phases. The main result of our study is that in all cases lower
limits on the size of the electric dipole moments exist, and are typically on
the same order, or above, the expected sensitivity of the next generation of
experimental searches, implying that MSSM electroweak baryogenesis will be soon
conclusively tested.
Comment: 23 pages, 10 figures, matches version published in JHEP
Assessing time to pulmonary function benefit following antibiotic treatment of acute cystic fibrosis exacerbations
Background: Cystic Fibrosis (CF) is a life-shortening genetic disease in which ~80% of deaths result from loss of lung function linked to inflammation due to chronic bacterial infection (principally Pseudomonas aeruginosa). Pulmonary exacerbations (intermittent episodes during which symptoms of lung infection increase and lung function decreases) can cause substantial resource utilization, morbidity, and irreversible loss of lung function. Intravenous antibiotic treatment to reduce exacerbation symptoms is standard management practice. However, no prospective studies have identified an optimal antibiotic treatment duration, and this lack of objective data has been identified as an area of concern and interest.
Methods: We retrospectively analyzed pulmonary function response data (as forced expiratory volume in one second; FEV1) from a previous blinded controlled CF exacerbation management study of intravenous ceftazidime/tobramycin and meropenem/tobramycin in which spirometry was conducted daily to assess the time course of pulmonary function response.
Results: Ninety-five patients in the study received antibiotics for at least 4 days and were included in our analyses. Patients received antibiotics for an average of 12.6 days (median = 13, SD = 3.2 days), with a range of 4 to 27 days. No significant differences were observed in mean or median treatment durations as functions of either treatment group or baseline lung disease stage. Average time from initiation of antibiotic treatment to highest observed FEV1 was 8.7 days (median = 10, SD = 4.0 days), with a range of zero to 19 days. Patients were treated an average of 3.9 days beyond the day of peak FEV1 (median = 3, SD = 3.8 days), with 89 patients (93.7%) experiencing their peak FEV1 improvement within 13 days. There were no differences in mean or median times to peak FEV1 as a function of treatment group, although the magnitude of FEV1 improvement differed between groups.
Conclusions: Our results suggest that antibiotic response to exacerbation as assessed by pulmonary function is essentially complete within 2 weeks of treatment initiation and relatively independent of the magnitude of pulmonary function response observed.
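The time-to-peak computation described above can be sketched as follows; the daily FEV1 series is hypothetical and this only illustrates the kind of per-patient calculation involved, not the study's actual analysis code:

```python
def time_to_peak(fev1_by_day: list[float]) -> tuple[int, int]:
    # Given daily FEV1 measurements (e.g. % predicted) from the start of
    # IV antibiotics, return (day of first peak value, days treated beyond it).
    peak_day = max(range(len(fev1_by_day)), key=fev1_by_day.__getitem__)
    days_beyond_peak = len(fev1_by_day) - 1 - peak_day
    return peak_day, days_beyond_peak

# Hypothetical 14-day course (day 0 = treatment start):
course = [52, 54, 55, 58, 60, 61, 63, 64, 66, 67, 68, 68, 67, 67]
peak_day, extra_days = time_to_peak(course)
print(f"peak FEV1 on day {peak_day}, treated {extra_days} days beyond peak")
```

Aggregating such per-patient values gives the cohort statistics reported above (mean time to peak, mean days treated beyond peak).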
Diagnosing Spin at the LHC via Vector Boson Fusion
We propose a new technique for determining the spin of new massive particles
that might be discovered at the Large Hadron Collider. The method relies on
pair-production of the new particles in a kinematic regime where the vector
boson fusion production mechanism is enhanced. For this regime, we show that
the distribution of the leading jets as a function of their relative azimuthal
angle can be used to distinguish spin-0 from spin-1/2 particles. We illustrate
this effect by considering the particular cases of (i) strongly-interacting,
stable particles and (ii) supersymmetric particles carrying color charge. We
argue that this method should be applicable in a wide range of new physics
scenarios.
Comment: 5 pages, 4 figures
The costs of preventing and treating chagas disease in Colombia
Background: The objective of this study is to report the costs of Chagas disease in Colombia, in terms of vector disease control programmes and the costs of providing care to chronic Chagas disease patients with cardiomyopathy.
Methods: Data were collected from Colombia in 2004. A retrospective review of costs for vector control programmes carried out in rural areas included 3,084 houses surveyed for infestation with triatomine bugs and 3,305 houses sprayed with insecticide. A total of 63 patient records from 3 different hospitals were selected for a retrospective review of resource use. Consensus methodology with local experts was used to estimate care seeking behaviour and to complement observed data on utilisation.
Findings: The mean cost per house per entomological survey and the mean cost of spraying a house with insecticide were estimated in 2004 US dollars. Annual costs of care ranged between $46.4 and $1,028 per patient, whereas lifetime costs averaged $11,619 per patient. Chronic Chagas disease patients have limited access to healthcare, with an estimated 22% of patients never seeking care.
Conclusion: Chagas disease is a preventable condition that affects mostly poor populations living in rural areas. The mean costs of surveying houses for infestation and spraying infested houses were low in comparison to other studies and in line with treatment costs. Care seeking behaviour and the type of insurance affiliation seem to play a role in the facilities and type of care that patients use, thus raising concerns about equitable access to care. Preventing Chagas disease in Colombia would be cost-effective and could contribute to preventing inequalities in health and healthcare.
Wellcome Trust
Reciprocity as a foundation of financial economics
This paper argues that the subsistence of the fundamental theorem of contemporary financial mathematics is the ethical concept 'reciprocity'. The argument is based on identifying an equivalence between the contemporary, and ostensibly 'value neutral', Fundamental Theorem of Asset Pricing and theories of mathematical probability that emerged in the seventeenth century in the context of the ethical assessment of commercial contracts in a framework of Aristotelian ethics. This observation, the main claim of the paper, is justified on the basis of results from the Ultimatum Game and is analysed within a framework of Pragmatic philosophy. The analysis leads to the explanatory hypothesis that markets are centres of communicative action with reciprocity as a rule of discourse. The purpose of the paper is to reorientate financial economics to emphasise the objectives of cooperation and social cohesion, and to this end we offer specific policy advice.