
    A Scalable Correlator Architecture Based on Modular FPGA Hardware, Reusable Gateware, and Data Packetization

    A new generation of radio telescopes is achieving unprecedented levels of sensitivity and resolution, as well as increased agility and field-of-view, by employing high-performance digital signal processing hardware to phase and correlate large numbers of antennas. The computational demands of these imaging systems scale in proportion to BMN^2, where B is the signal bandwidth, M is the number of independent beams, and N is the number of antennas. The specifications of many new arrays lead to demands in excess of tens of PetaOps per second. To meet this challenge, we have developed a general-purpose correlator architecture using standard 10-Gbit Ethernet switches to pass data between flexible hardware modules containing Field Programmable Gate Array (FPGA) chips. These chips are programmed using open-source signal processing libraries we have developed to be flexible, scalable, and chip-independent. This work reduces the time and cost of implementing a wide range of signal processing systems, with correlators foremost among them, and facilitates upgrading to new generations of processing technology. We present several correlator deployments, including a 16-antenna, 200-MHz bandwidth, 4-bit, full-Stokes parameter application deployed on the Precision Array for Probing the Epoch of Reionization.
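
    To make the BMN^2 scaling concrete, here is a minimal sketch (ours, not from the paper) that estimates the compute demand for the 16-antenna, 200-MHz deployment described above and for a hypothetical larger array; the operations-per-sample constant is an illustrative assumption.

```python
def correlator_ops_per_second(bandwidth_hz, n_beams, n_antennas, ops_per_sample=8):
    """Order-of-magnitude correlator compute demand, proportional to B*M*N^2.

    ops_per_sample is an assumed constant for the complex multiply-accumulate
    work per baseline sample; the true constant depends on the implementation.
    """
    # N^2 counts all antenna pairs (baselines), which dominates the cost.
    return bandwidth_hz * n_beams * n_antennas**2 * ops_per_sample

# The 16-antenna, 200-MHz, single-beam PAPER deployment described above:
print(correlator_ops_per_second(200e6, 1, 16) / 1e12, "TeraOps/s")    # ~0.4

# A hypothetical 1000-antenna, 500-MHz, 4-beam array shows the N^2 growth:
print(correlator_ops_per_second(500e6, 4, 1000) / 1e15, "PetaOps/s")  # ~16
```

    As the second case shows, demands in the tens of PetaOps per second arise naturally once N reaches the hundreds-to-thousands range.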

    The Precision Array for Probing the Epoch of Reionization: 8 Station Results

    We are developing the Precision Array for Probing the Epoch of Reionization (PAPER) to detect 21cm emission from the early Universe, when the first stars and galaxies were forming. We describe the overall experiment strategy and architecture and summarize two PAPER deployments: a 4-antenna array in the low-RFI environment of Western Australia and an 8-antenna array at our prototyping site in Green Bank, WV. From these activities we report on system performance, including primary beam model verification, dependence of system gain on ambient temperature, measurements of receiver and overall system temperatures, and characterization of the RFI environment at each deployment site. We present an all-sky map synthesized between 139 MHz and 174 MHz using data from both arrays that reaches down to 80 mJy (4.9 K, for a beam size of 2.15e-5 steradians at 154 MHz), with a 10 mJy (620 mK) thermal noise level that indicates what would be achievable with better foreground subtraction. We calculate angular power spectra ($C_\ell$) in a cold patch and determine them to be dominated by point sources, but with contributions from galactic synchrotron emission at lower radio frequencies and angular wavemodes. Although the cosmic variance of foregrounds dominates errors in these power spectra, we measure a thermal noise level of 310 mK at $\ell = 100$ for a 1.46-MHz band centered at 164.5 MHz. This sensitivity level is approximately three orders of magnitude in temperature above the level of the fluctuations in 21cm emission associated with reionization.
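
    The flux-to-temperature pairings quoted above (80 mJy ≈ 4.9 K, 10 mJy ≈ 620 mK) follow from the standard Rayleigh-Jeans brightness-temperature relation T = S λ² / (2 k_B Ω). The sketch below (ours, not from the paper) reproduces them to within rounding of the quoted beam size.

```python
C = 2.998e8      # speed of light, m/s
K_B = 1.381e-23  # Boltzmann constant, J/K
JY = 1e-26       # 1 jansky in W m^-2 Hz^-1

def brightness_temperature(flux_jy, freq_hz, beam_sr):
    """Rayleigh-Jeans brightness temperature: T = S * lambda^2 / (2 * k_B * Omega)."""
    wavelength = C / freq_hz
    return flux_jy * JY * wavelength**2 / (2 * K_B * beam_sr)

print(brightness_temperature(0.080, 154e6, 2.15e-5))  # ~5 K    (quoted: 4.9 K)
print(brightness_temperature(0.010, 154e6, 2.15e-5))  # ~0.64 K (quoted: 620 mK)
```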

    On the alleged simplicity of impure proof

    Roughly, a proof of a theorem is “pure” if it draws only on what is “close” or “intrinsic” to that theorem. Mathematicians employ a variety of terms to identify pure proofs, saying that a pure proof is one that avoids what is “extrinsic,” “extraneous,” “distant,” “remote,” “alien,” or “foreign” to the problem or theorem under investigation. In the background of these attributions is the view that there is a distance measure (or a variety of such measures) between mathematical statements and proofs. Mathematicians have paid little attention to specifying such distance measures precisely, because in practice certain methods of proof have seemed self-evidently impure by design: think for instance of analytic geometry and analytic number theory. By contrast, mathematicians have paid considerable attention to whether such impurities are a good thing or to be avoided, and some have claimed that they are valuable because impure proofs are generally simpler than pure proofs. This article is an investigation of this claim, formulated more precisely by proof-theoretic means. After assembling evidence from proof theory that may be thought to support this claim, we will argue that on the contrary this evidence does not support the claim.

    Planar localisation analyses: a novel application of a centre of mass approach

    Sound localisation is one of the key functions of hearing, and measuring localisation performance is a mainstay of the hearing research laboratory. Such measurements consider both accuracy and, for incorrect trials, the size of the error. In terms of error analysis, localisation studies have frequently used general univariate techniques in conjunction with either mean signed or unsigned error measurements. This approach can make inappropriate distributional assumptions, and so more suitable alternatives based on directional statistics (e.g. based on von Mises-distributed data) have also been used. However, these are not readily computed using most commercially available, commonly used statistical software, and are generally only defined for simple experimental designs. We describe a novel use of a 'centre of mass' approach for describing localisation data jointly in terms of accuracy and size of error. This spatial method offers powerful, yet flexible, statistical analysis using standard multivariate analysis of variance (MANOVA).
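
    One minimal reading of the approach, sketched below under our own assumptions (the mapping and toy data are ours, not the authors'): each response azimuth is placed on the unit circle, and the centre of mass of those points jointly encodes accuracy (its direction) and error size (its distance from the circle); the resulting (x, y) pairs can then be analysed with any standard MANOVA routine.

```python
import numpy as np

def centre_of_mass(response_deg):
    """Map response azimuths (degrees) onto the unit circle and return the
    (x, y) centre of mass. Direction reflects mean localisation accuracy;
    the vector shortens as responses become more dispersed."""
    theta = np.radians(np.asarray(response_deg, dtype=float))
    return np.cos(theta).mean(), np.sin(theta).mean()

# Toy data: responses to a source at 30 degrees azimuth.
tight = [28, 31, 33, 29, 30, 32]   # accurate, low spread
loose = [5, 60, -20, 75, 30, 40]   # roughly centred, high spread

print(centre_of_mass(tight))  # ~(0.87, 0.50): near the unit circle at 30 deg
print(centre_of_mass(loose))  # similar direction, noticeably shorter vector
```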

    Search for CP Violation in the Decay Z -> b (b bar) g

    About three million hadronic decays of the Z collected by ALEPH in the years 1991-1994 are used to search for anomalous CP violation beyond the Standard Model in the decay Z -> b \bar{b} g. The study is performed by analyzing angular correlations between the two quarks and the gluon in three-jet events and by measuring the differential two-jet rate. No signal of CP violation is found. For the combinations of anomalous CP-violating couplings, $\hat{h}_b = \hat{h}_{Ab} g_{Vb} - \hat{h}_{Vb} g_{Ab}$ and $h^{\ast}_b = \sqrt{\hat{h}_{Vb}^{2} + \hat{h}_{Ab}^{2}}$, limits of $\hat{h}_b < 0.59$ and $h^{\ast}_{b} < 3.02$ are given at 95% CL.

    Development of a Core Outcome Set for effectiveness trials aimed at optimising prescribing in older adults in care homes

    Background: Prescribing medicines for older adults in care homes is known to be sub-optimal. Whilst trials testing interventions to optimise prescribing in this setting have been published, heterogeneity in outcome reporting has hindered comparison of interventions, thus limiting evidence synthesis. The aim of this study was to develop a core outcome set (COS), a list of outcomes which should be measured and reported, as a minimum, for all effectiveness trials involving optimising prescribing in care homes. The COS was developed as part of the Care Homes Independent Pharmacist Prescribing Study (CHIPPS). Methods: A long-list of outcomes was identified through a review of published literature and stakeholder input. Outcomes were reviewed and refined prior to entering a two-round online Delphi exercise, which was distributed via a web link to the CHIPPS Management Team, a multidisciplinary team including pharmacists, doctors and Patient and Public Involvement representatives (amongst others), who comprised the Delphi panel. The Delphi panellists (n = 19) rated the importance of outcomes on a 9-point Likert scale from 1 (not important) to 9 (critically important). Consensus for an outcome being included in the COS was defined as ≥70% of participants scoring 7–9 and <15% scoring 1–3. Exclusion was defined as ≥70% scoring 1–3 and <15% scoring 7–9. Individual and group scores were fed back to participants alongside the second questionnaire round, which included outcomes for which no consensus had been achieved. Results: A long-list of 63 potential outcomes was identified. Refinement of this long-list resulted in 29 outcomes, which were included in the Delphi questionnaire (round 1). Following both rounds of the Delphi exercise, 13 outcomes (organised into seven overarching domains: medication appropriateness, adverse drug events, prescribing errors, falls, quality of life, all-cause mortality and admissions to hospital (and associated costs)) met the criteria for inclusion in the final COS. Conclusions: We have developed a COS for effectiveness trials aimed at optimising prescribing in older adults in care homes using robust methodology. Widespread adoption of this COS will facilitate evidence synthesis between trials. Future work should focus on evaluating appropriate tools for these key outcomes to further reduce heterogeneity in outcome measurement in this context.
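
    The inclusion and exclusion rules above are simple enough to state as code; this sketch (ours, not study software) classifies one outcome from a round of panel scores.

```python
def delphi_verdict(scores):
    """Classify an outcome from 9-point Likert ratings (1-9) using the
    consensus thresholds described in the abstract."""
    n = len(scores)
    high = sum(1 for s in scores if s >= 7) / n  # rated 7-9, "critically important"
    low = sum(1 for s in scores if s <= 3) / n   # rated 1-3, "not important"
    if high >= 0.70 and low < 0.15:
        return "include"
    if low >= 0.70 and high < 0.15:
        return "exclude"
    return "no consensus"  # carried into the next Delphi round

# Hypothetical ratings from a 19-member panel (matching the study's panel size):
print(delphi_verdict([9, 8, 8, 7, 9, 8, 7, 7, 9, 8, 8, 7, 9, 6, 5, 8, 7, 9, 8]))
# -> "include" (17/19 = 89% scored 7-9; none scored 1-3)
```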

    Integrating plant physiology into simulation of fire behavior and effects

    Wildfires are a global crisis, but current fire models fail to capture vegetation response to a changing climate. With drought and elevated temperature increasing the importance of vegetation dynamics to fire behavior, and with the advent of next-generation models capable of capturing increasingly complex physical processes, we provide a renewed focus on the representation of woody vegetation in fire models. Currently, the most advanced representations of fire behavior and biophysical fire effects are found in distinct classes of fine-scale models and do not capture variation in live fuel (i.e. living plant) properties. We demonstrate that plant water and carbon dynamics, which influence combustion and heat transfer into the plant and often dictate plant survival, provide the mechanistic linkage between fire behavior and effects. Our conceptual framework linking remotely sensed estimates of plant water and carbon to fine-scale models of fire behavior and effects could be a critical first step toward improving the fidelity of the coarse-scale models that are now relied upon for global fire forecasting. This process-based approach will be essential to capturing the influence of physiological responses to drought and warming on live fuel conditions, strengthening the science needed to guide fire managers in an uncertain future.

    Performance of the CMS Cathode Strip Chambers with Cosmic Rays

    The Cathode Strip Chambers (CSCs) constitute the primary muon tracking device in the CMS endcaps. Their performance has been evaluated using data taken during a cosmic ray run in fall 2008. Measured noise levels are low, with the number of noisy channels well below 1%. Coordinate resolutions were measured for all types of chambers and fall in the range 47 to 243 microns. The efficiencies for local charged-track triggers and for hit and segment reconstruction were measured and are above 99%. The timing resolution per layer is approximately 5 ns.

    Measurement of χc1 and χc2 production with √s = 7 TeV pp collisions at ATLAS

    The prompt and non-prompt production cross-sections for the χc1 and χc2 charmonium states are measured in pp collisions at √s = 7 TeV with the ATLAS detector at the LHC using 4.5 fb−1 of integrated luminosity. The χc states are reconstructed through the radiative decay χc → J/ψ γ (with J/ψ → μ+μ−), where photons are reconstructed from γ → e+e− conversions. The production rate of the χc2 state relative to the χc1 state is measured for prompt and non-prompt χc as a function of J/ψ transverse momentum. The prompt χc cross-sections are combined with existing measurements of prompt J/ψ production to derive the fraction of prompt J/ψ produced in feed-down from χc decays. The fractions of χc1 and χc2 produced in b-hadron decays are also measured.
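
    Schematically, the feed-down derivation mentioned above computes R = Σ_J σ(χcJ) · B(χcJ → J/ψ γ) / σ(J/ψ) over J = 1, 2. The sketch below implements that formula; the branching fractions are approximate PDG values, while the cross-sections are purely illustrative placeholders, not results from the paper.

```python
def chi_c_feeddown_fraction(sigma_chic, br_jpsi_gamma, sigma_jpsi):
    """Fraction of prompt J/psi produced in feed-down from chi_c decays:
    R = sum_J sigma(chi_cJ) * B(chi_cJ -> J/psi gamma) / sigma(J/psi)."""
    return sum(s * b for s, b in zip(sigma_chic, br_jpsi_gamma)) / sigma_jpsi

BR = [0.34, 0.19]          # approximate B(chi_c1 / chi_c2 -> J/psi gamma)
sigma_chic = [10.0, 6.0]   # illustrative prompt cross-sections in one pT bin (nb)
sigma_jpsi = 20.0          # illustrative prompt J/psi cross-section (nb)

print(chi_c_feeddown_fraction(sigma_chic, BR, sigma_jpsi))  # -> 0.227, i.e. ~23%
```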