292 research outputs found

    Screening versus routine practice in detection of atrial fibrillation in patients aged 65 or over: cluster randomised controlled trial

    Get PDF
    Objectives: To assess whether screening improves the detection of atrial fibrillation (cluster randomisation) and to compare systematic and opportunistic screening (individual randomisation). Design: Multicentred cluster randomised controlled trial, with a subsidiary trial embedded within the intervention arm. Setting: 50 primary care centres in England, with further individual randomisation of patients in the intervention practices. Participants: 14,802 patients aged 65 or over in 25 intervention and 25 control practices. Interventions: Patients in intervention practices were randomly allocated to systematic screening (invitation for electrocardiography) or opportunistic screening (pulse taking and invitation for electrocardiography if the pulse was irregular). Screening took place over 12 months in each practice from October 2001 to February 2003. No active screening took place in control practices. Main outcome measure: Newly identified atrial fibrillation. Results: The detection rate of new cases of atrial fibrillation was 1.63% a year in the intervention practices and 1.04% in control practices (difference 0.59%, 95% confidence interval 0.20% to 0.98%). Systematic and opportunistic screening detected similar numbers of new cases (1.62% v 1.64%, difference 0.02%, −0.5% to 0.5%). Conclusion: Active screening for atrial fibrillation detects additional cases over current practice. The preferred method of screening in patients aged 65 or over in primary care is opportunistic pulse taking with follow-up electrocardiography. Trial registration: Current Controlled Trials ISRCTN19633732.
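
    A minimal sketch of the confidence-interval arithmetic behind results quoted in this form, assuming hypothetical per-arm counts (the abstract reports only rates, and a full analysis would also account for the cluster design, which the simple Wald interval below ignores):

        # Hedged sketch: Wald 95% CI for a difference in detection proportions.
        # The counts are hypothetical placeholders chosen to roughly match the
        # quoted rates; they are not the trial's data, and clustering is ignored.
        import math

        def diff_proportion_ci(x1, n1, x2, n2, z=1.96):
            """Return (difference, lower, upper) for p1 - p2 with a Wald interval."""
            p1, p2 = x1 / n1, x2 / n2
            diff = p1 - p2
            se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
            return diff, diff - z * se, diff + z * se

        # Hypothetical example: 120 new cases among 7,360 screened patients
        # versus 77 among 7,442 controls (roughly 1.63% v 1.04%).
        print(diff_proportion_ci(120, 7360, 77, 7442))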

    MaxEnt power spectrum estimation using the Fourier transform for irregularly sampled data applied to a record of stellar luminosity

    Full text link
    The principle of maximum entropy is applied to the spectral analysis of a data signal with a general variance matrix and gaps in the record. The role of the entropic regularizer is to prevent overestimating structure in the spectrum when faced with imperfect data. Several arguments are presented suggesting that an arbitrary prefactor should not be introduced to the entropy term; no such factor is required when a continuous Poisson distribution is used for the amplitude coefficients. We compare the formalism for the case in which the variance of the data is known explicitly with that in which the variance is known only to lie in some finite range. Including the entropic measure factor yields a spectrum that is consistent with the variance of the data but has less structure than that given by the forward transform. An application of the methodology to example data is demonstrated. Comment: 15 pages, 13 figures, 1 table, major revision, final version, accepted for publication in Astrophysics & Space Science.
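
    The entry describes an entropy-regularised spectrum estimate for gappy, irregularly sampled data. The sketch below is not the authors' formalism; it only illustrates the generic idea of fitting sinusoid amplitudes to irregular samples under a standard MaxEnt penalty, and the frequency grid, noise level and regularisation weight are illustrative assumptions:

        # Hedged sketch: entropy-regularised least-squares fit of sinusoid amplitudes
        # to irregularly sampled data. Not the paper's algorithm; alpha, m, the
        # frequency grid and the noise level are illustrative assumptions.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        t = np.sort(rng.uniform(0.0, 30.0, 80))          # irregular sample times
        y = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.normal(size=t.size)
        sigma = 0.3                                       # assumed noise level

        freqs = np.linspace(0.1, 1.2, 40)                 # trial frequency grid
        C = np.cos(2 * np.pi * np.outer(t, freqs))
        S = np.sin(2 * np.pi * np.outer(t, freqs))

        def objective(x, alpha=5.0, m=1e-3):
            c, s = np.split(x, 2)
            chi2 = np.sum(((y - C @ c - S @ s) / sigma) ** 2)
            p = c**2 + s**2 + 1e-12                       # power at each trial frequency
            entropy = np.sum(p - m - p * np.log(p / m))   # standard MaxEnt measure
            return 0.5 * chi2 - alpha * entropy           # fit the data, favour high entropy

        res = minimize(objective, np.zeros(2 * freqs.size), method="L-BFGS-B",
                       options={"maxfun": 100000})
        c, s = np.split(res.x, 2)
        print("peak frequency:", freqs[np.argmax(c**2 + s**2)])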

    Evolution of the Luminosity Function and Colors of Galaxies in a Lambda-CDM Universe

    Full text link
    The luminosity function of galaxies is derived from a cosmological hydrodynamic simulation of a Lambda cold dark matter (CDM) universe with the aid of a stellar population synthesis model. At z=0, the resulting B band luminosity function has a flat faint end slope of \alpha \approx -1.15, with the characteristic luminosity and the normalization in fair agreement with observations, while the dark matter halo mass function is steep, with a slope of \alpha \approx -2. The colour distribution of galaxies also agrees well with local observations. We also discuss the evolution of the luminosity function and the colour distribution of galaxies from z=0 to 5. A large evolution of the characteristic mass in the stellar mass function due to number evolution is compensated by luminosity evolution; the characteristic luminosity increases only by 0.8 mag from z=0 to 2 and then declines towards higher redshift, while the B band luminosity density continues to increase from z=0 to 5 (but only slowly at z>3). Comment: 6 pages, including 4 figures, mn2e style. Accepted to MNRAS pink pages.
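
    The faint-end slope quoted above refers to the usual Schechter parametrisation of the luminosity function; the short sketch below evaluates that form with illustrative parameter values, not the values fitted in the paper:

        # Hedged sketch: the Schechter luminosity function, the form in which a
        # faint-end slope alpha and characteristic luminosity L* are usually quoted.
        # phi_star, L_star and alpha below are illustrative, not the paper's fits.
        import numpy as np

        def schechter(L, phi_star=5e-3, L_star=1.0, alpha=-1.15):
            """Number density per unit L/L* as a function of luminosity L."""
            x = L / L_star
            return phi_star * x**alpha * np.exp(-x)

        L = np.logspace(-3, 1, 5)      # luminosities in units of L*
        print(schechter(L))            # the faint end rises roughly as L**alpha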

    Computer simulations of hard pear-shaped particles

    Get PDF
    We report results obtained from Monte Carlo simulations investigating mesophase formation in two model systems of hard pear-shaped particles. The first model considered is a hard variant of the truncated Stone-expansion model previously shown to form nematic and smectic mesophases when embedded within a 12-6 Gay-Berne-like potential [1]. When stripped of its attractive interactions, however, this system is found to lose its liquid crystalline phases. For particles of length to breadth ratio k = 3, glassy behaviour is seen at high pressures, whereas for k = 5 several bilayer-like domains are seen, with high intradomain order but little interdomain orientational correlation. For the second model, which uses a parametric shape based on the generalised Gay-Berne formalism, results are presented for particles with elongation k = 3, 4 and 5. Here, the systems with k = 3 and 4 fail to display orientationally ordered phases, but that with k = 5 shows isotropic, nematic and, unusually for a hard-particle model, interdigitated smectic A2 phases.
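
    The defining feature of hard-particle Monte Carlo is that a trial move is accepted only if it creates no overlaps. The sketch below shows that acceptance rule for hard spheres, which stand in for the pear-shaped particles (whose overlap test is considerably more involved); box size, density and step size are illustrative:

        # Hedged sketch of a hard-particle Metropolis move: accept a trial
        # displacement only if it produces no overlap. Hard spheres are used here;
        # the pear-shaped models in the paper need a more elaborate overlap test.
        import numpy as np

        rng = np.random.default_rng(1)
        box, sigma, n_part = 10.0, 1.0, 50                # box length, diameter, particle count
        pos = rng.uniform(0.0, box, size=(n_part, 3))

        def overlaps(i, trial):
            d = pos - trial
            d -= box * np.round(d / box)                  # minimum-image convention
            r2 = np.einsum("ij,ij->i", d, d)
            r2[i] = np.inf                                # ignore self-distance
            return np.any(r2 < sigma**2)

        accepted = 0
        for _ in range(5000):
            i = rng.integers(n_part)
            trial = (pos[i] + rng.uniform(-0.2, 0.2, 3)) % box
            if not overlaps(i, trial):                    # hard-core acceptance rule
                pos[i] = trial
                accepted += 1
        print("acceptance fraction:", accepted / 5000)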

    Apoptotic signaling clears engineered Salmonella in an organ-specific manner

    Get PDF
    Pyroptosis and apoptosis are two forms of regulated cell death that can defend against intracellular infection. When a cell fails to complete pyroptosis, backup pathways will initiate apoptosis. Here, we investigated the utility of apoptosis compared to pyroptosis in defense against an intracellular bacterial infection. We previously engineered Salmonella enterica serovar Typhimurium to persistently express flagellin, and thereby activate NLRC4 during systemic infection in mice. The resulting pyroptosis clears this flagellin-engineered strain. We now show that infection of caspase-1 or gasdermin D deficient macrophages by this flagellin-engineered S. Typhimurium induces apoptosis in vitro. Additionally, we engineered S. Typhimurium to translocate the pro-apoptotic BH3 domain of BID, which also triggers apoptosis in macrophages in vitro. During mouse infection, the apoptotic pathway successfully cleared these engineered S. Typhimurium from the intestinal niche but failed to clear the bacteria from the myeloid niche in the spleen or lymph nodes. In contrast, the pyroptotic pathway was beneficial in defense of both niches. To clear an infection, cells may have specific tasks that they must complete before they die; different modes of cell death could initiate these ‘bucket lists’ in either convergent or divergent ways.

    SPIDER: Probing the Early Universe with a Suborbital Polarimeter

    Full text link
    We evaluate the ability of SPIDER, a balloon-borne polarimeter, to detect a divergence-free polarization pattern ("B-modes") in the Cosmic Microwave Background (CMB). In the inflationary scenario, the amplitude of this signal is proportional to that of the primordial scalar perturbations through the tensor-to-scalar ratio r. We show that the expected level of systematic error in the SPIDER instrument is significantly below the amplitude of an interesting cosmological signal with r=0.03. We present a scanning strategy that enables us to minimize uncertainty in the reconstruction of the Stokes parameters used to characterize the CMB, while accessing a relatively wide range of angular scales. Evaluating the amplitude of the polarized Galactic emission in the SPIDER field, we conclude that the polarized emission from interstellar dust is as bright or brighter than the cosmological signal at all SPIDER frequencies (90 GHz, 150 GHz, and 280 GHz), a situation similar to that found in the "Southern Hole." We show that two ~20-day flights of the SPIDER instrument can constrain the amplitude of the B-mode signal to r<0.03 (99% CL) even when foreground contamination is taken into account. In the absence of foregrounds, the same limit can be reached after one 20-day flight. Comment: 29 pages, 8 figures, 4 tables; v2: matches published version, flight schedule updated, two typos fixed in Table 2, references and minor clarifications added, results unchanged.

    Simulation techniques for cosmological simulations

    Get PDF
    Modern cosmological observations allow us to study in great detail the evolution and history of the large scale structure hierarchy. Obtaining accurate constraints on the cosmological parameters within a given cosmological model requires precise modelling of the observed structure. In this paper we briefly review the current most effective techniques of large scale structure simulations, emphasising both their advantages and shortcomings. Starting with the basics of direct N-body simulations appropriate to modelling cold dark matter evolution, we then discuss the direct-sum technique GRAPE, particle-mesh (PM) codes, and hybrid methods combining the PM and tree algorithms. Simulations of baryonic matter in the Universe often use hydrodynamic codes based on both particle methods that discretise mass and grid-based methods. We briefly describe Eulerian grid methods and also some variants of Lagrangian smoothed particle hydrodynamics (SPH) methods. Comment: 42 pages, 16 figures, accepted for publication in Space Science Reviews, special issue "Clusters of galaxies: beyond the thermal view", Editor J.S. Kaastra, Chapter 12; work done by an international team at the International Space Science Institute (ISSI), Bern, organised by J.S. Kaastra, A.M. Bykov, S. Schindler & J.A.M. Bleeker.
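
    As a concrete illustration of the direct-summation approach mentioned above, the sketch below implements an O(N^2) softened-gravity kernel advanced with a simple leapfrog step; the units, softening length and time step are illustrative choices and are not tied to any specific code discussed in the review:

        # Hedged sketch: direct-summation (O(N^2)) gravitational accelerations with
        # Plummer softening, advanced with a kick-drift-kick leapfrog integrator.
        import numpy as np

        def accelerations(pos, mass, eps=0.05, G=1.0):
            """Pairwise softened gravitational accelerations by direct summation."""
            d = pos[None, :, :] - pos[:, None, :]         # vectors from particle i to j
            inv_r3 = (np.sum(d**2, axis=-1) + eps**2) ** -1.5
            np.fill_diagonal(inv_r3, 0.0)                 # exclude self-interaction
            return G * np.einsum("ij,j,ijk->ik", inv_r3, mass, d)

        rng = np.random.default_rng(2)
        pos = rng.normal(size=(64, 3))
        vel = np.zeros_like(pos)
        mass = np.full(64, 1.0 / 64)

        dt = 0.01
        for _ in range(100):                              # kick-drift-kick leapfrog
            vel += 0.5 * dt * accelerations(pos, mass)
            pos += dt * vel
            vel += 0.5 * dt * accelerations(pos, mass)
        print("centre of mass:", mass @ pos)              # conserved with zero initial momentum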

    The PHENIX Experiment at RHIC

    Full text link
    The physics emphases of the PHENIX collaboration and the design and current status of the PHENIX detector are discussed. The plan of the collaboration for making the most effective use of the available luminosity in the first years of RHIC operation is also presented. Comment: 5 pages, 1 figure. Further details of the PHENIX physics program are available at http://www.rhic.bnl.gov/phenix.