
    A deconvolution map-making method for experiments with circular scanning strategies

    Aims. To investigate the performance of a deconvolution map-making algorithm for an experiment with a circular scanning strategy, specifically for the analysis of Planck data, and to quantify the effects of making maps using simplified approximations to the true beams. Methods. We present an implementation of a map-making algorithm which allows the combined treatment of temperature and polarisation data, the removal of instrumental effects such as detector time constants and finite sampling intervals, and the deconvolution of arbitrarily complex beams from the maps. This method may be applied to any experiment with a circular scanning strategy. Results. Low-resolution experiments were used to demonstrate the ability of this method to remove the effects of arbitrary beams from the maps, and to demonstrate the effects on the maps of ignoring beam asymmetries. Additionally, we present results of an analysis of a realistic full-scale simulated data set for the Planck LFI 30 GHz channel. Conclusions. Our method successfully removes the effects of the beams from the maps, and although it is computationally expensive, the analysis of the Planck LFI data should be feasible with this approach. Comment: 14 pages, 14 figures, accepted
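    The core of any such map-maker is a generalised least-squares solve in which the pointing-plus-beam matrix is inverted out of the timestream. The toy sketch below (not the paper's code; all sizes, the leakage fraction, and the noise level are illustrative assumptions) shows how encoding a small asymmetric beam leakage in the pointing matrix lets the solver recover the unconvolved sky:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_pix, n_samp = 8, 400
    sky = rng.normal(size=n_pix)                     # true (toy) sky map

    # Pointing-plus-beam matrix: each sample sees mostly its own pixel,
    # with a 10% leakage into the neighbouring pixel, mimicking a simple
    # asymmetric beam (a hypothetical stand-in for a real beam model).
    hits = rng.integers(0, n_pix, size=n_samp)
    A = np.zeros((n_samp, n_pix))
    A[np.arange(n_samp), hits] = 0.9
    A[np.arange(n_samp), (hits + 1) % n_pix] = 0.1

    sigma = 0.05
    d = A @ sky + rng.normal(scale=sigma, size=n_samp)   # timestream

    # Maximum-likelihood map with white noise (N = sigma^2 I): the N^-1
    # factors cancel, leaving the normal equations (A^T A) m = A^T d.
    m_hat = np.linalg.solve(A.T @ A, A.T @ d)
    print(np.max(np.abs(m_hat - sky)))    # small reconstruction error
    ```

    Solving with the beam inside `A` is what "deconvolution map-making" amounts to in this miniature setting; binning the samples naively instead would leave the 10% leakage imprinted on the map.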

    Current state of the art and enduring issues in anthropometric data collection

    The study of human body size and shape has been a topic of research for a very long time. In the past, anthropometry used traditional measuring techniques to record the dimensions of the human body and reported variance in body dimensions as a function of the mean and standard deviation. Nowadays, the study of human body dimensions can be carried out more efficiently using three-dimensional body scanners, which can provide large amounts of anthropometric data more quickly than traditional techniques can. This paper presents a description of the broad range of issues related to the collection of anthropometric data using three-dimensional body scanners, including the different types of technologies available and their implications, the standard scanning process needed for effective data collection, and the possible sources of measurement error that might affect the reliability and validity of the data collected. This work is financed by FEDER funds through the Competitive Factors Operational Program (COMPETE), POCI-01-0145-FEDER-007043 and POCI-01-0145-FEDER-007136, and by national funds through FCT, the Portuguese Foundation for Science and Technology, under the projects UID/CEC/00319/2013 and UID/CTM/00264, respectively.
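    The traditional reporting convention the abstract refers to, a dimension summarised by its mean and standard deviation, can be sketched in a few lines (the stature values below are invented for illustration):

    ```python
    import numpy as np

    # Hypothetical stature measurements in cm for a small sample.
    stature_cm = np.array([165.2, 172.8, 158.9, 181.3, 169.5, 175.0, 162.4])

    mean = stature_cm.mean()
    sd = stature_cm.std(ddof=1)        # sample standard deviation
    print(f"stature: {mean:.1f} ± {sd:.1f} cm")
    ```

    Scanner-based surveys produce the same summaries, only from far larger samples and for many more dimensions at once.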

    Iterative destriping and photometric calibration for Planck-HFI, polarized, multi-detector map-making

    We present an iterative scheme designed to recover calibrated I, Q, and U maps from Planck-HFI data using the orbital dipole due to the satellite's motion with respect to the Solar System frame. It combines a map reconstruction based on a destriping technique with an absolute calibration algorithm. We evaluate the systematic and statistical uncertainties incurred during both steps with the help of realistic, Planck-like simulations containing CMB, foreground components, and instrumental noise, and assess the accuracy of the sky-map reconstruction by considering the maps of the residuals and their spectra. In particular, we discuss destriping residuals for polarization-sensitive detectors similar to those of Planck-HFI under different noise hypotheses, and show that these residuals are negligible (for intensity maps) or smaller than the white-noise level (for Q and U Stokes maps) for l > 50. We also demonstrate that the combined level of residuals of this scheme remains comparable to that of the destriping-only case, except at very low l where residuals from the calibration appear. For all the considered noise hypotheses, the relative calibration precision is of the order of a few times 10^-4, with a systematic bias of the same order of magnitude. Comment: 18 pages, 21 figures. Matches published version
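    Destriping models the timestream as sky signal plus a constant offset per data chunk (approximating the low-frequency 1/f noise) plus white noise, and alternates between binning a map and re-estimating the offsets. A deliberately naive sketch of that alternation, with every size and noise level chosen for illustration only, looks like this:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_pix, chunk, n_chunk = 16, 50, 40
    sky = rng.normal(size=n_pix)
    offsets = rng.normal(scale=2.0, size=n_chunk)    # 1/f-like baselines

    hits = rng.integers(0, n_pix, size=chunk * n_chunk)
    d = (sky[hits] + np.repeat(offsets, chunk)
         + rng.normal(scale=0.1, size=hits.size))

    # Alternate: bin a map with current offsets removed, then re-estimate
    # each chunk's offset as its mean residual against that map.
    off = np.zeros(n_chunk)
    counts = np.bincount(hits, minlength=n_pix)
    for _ in range(20):
        clean = d - np.repeat(off, chunk)
        m = np.bincount(hits, weights=clean, minlength=n_pix) / counts
        off = (d - m[hits]).reshape(n_chunk, chunk).mean(axis=1)
    off -= off.mean()   # the global offset is degenerate with the map
    ```

    The final line reflects the degeneracy the abstract's calibration step must also confront: a destriper determines offsets, and hence the map's zero level, only up to an overall constant.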

    SANEPIC: A Map-Making Method for Timestream Data From Large Arrays

    We describe a map-making method which we have developed for the Balloon-borne Large Aperture Submillimeter Telescope (BLAST) experiment, but which should have general application to data from other submillimeter arrays. Our method uses a maximum-likelihood-based approach, with several approximations, which allows images to be constructed using large amounts of data with fairly modest computer memory and processing requirements. This new approach, Signal And Noise Estimation Procedure Including Correlations (SANEPIC), builds upon several previous methods, but focuses specifically on the regime where a large number of detectors sample the same map of the sky, and explicitly allows for the possibility of strong correlations between the detector timestreams. We provide real and simulated examples of how well this method performs compared with more simplistic map-makers based on filtering. We discuss two separate implementations of SANEPIC: a brute-force approach, in which the inverse pixel-pixel covariance matrix is computed; and an iterative approach, which is much more efficient for large maps. SANEPIC has been successfully used to produce maps using data from the 2005 BLAST flight. Comment: 27 pages, 15 figures; submitted to the Astrophysical Journal; related results available at http://blastexperiment.info/ [the BLAST webpage]
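    The brute-force versus iterative contrast in the abstract can be made concrete on a toy version of the map-making normal equations (this is a generic sketch, not SANEPIC itself; sizes and noise are illustrative). The iterative path uses conjugate gradients, which needs only matrix-vector products and therefore scales to maps where the pixel-pixel matrix can never be formed or inverted explicitly:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_pix, n_samp = 12, 600
    sky = rng.normal(size=n_pix)
    hits = rng.integers(0, n_pix, size=n_samp)
    A = np.zeros((n_samp, n_pix))
    A[np.arange(n_samp), hits] = 0.8           # main lobe
    A[np.arange(n_samp), (hits + 1) % n_pix] = 0.2   # couples pixels
    d = A @ sky + rng.normal(scale=0.05, size=n_samp)

    M = A.T @ A                                # pixel-pixel matrix
    b = A.T @ d
    m_brute = np.linalg.inv(M) @ b             # brute force: O(n_pix^3)

    # Conjugate gradients on (A^T A) m = A^T d: matrix-vector products only.
    x = np.zeros(n_pix)
    r = b - M @ x
    p = r.copy()
    for _ in range(50):
        Mp = M @ p
        alpha = (r @ r) / (p @ Mp)
        x += alpha * p
        r_new = r - alpha * Mp
        if np.linalg.norm(r_new) < 1e-10:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    ```

    For this 12-pixel toy both routes agree to machine precision; for megapixel maps only the iterative route remains affordable, which is the trade-off the abstract describes.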

    Detection of X-ray galaxy clusters based on the Kolmogorov method

    The detection of clusters of galaxies in large surveys plays an important part in extragalactic astronomy, and particularly in cosmology, since cluster counts can give strong constraints on cosmological parameters. X-ray imaging is a particularly reliable means of discovering new clusters, and large X-ray surveys are now available. Considering XMM-Newton data for a sample of 40 Abell clusters, we show that their analysis with a Kolmogorov distribution can provide a distinctive signature for galaxy clusters. The Kolmogorov method is sensitive to the correlations in the cluster X-ray properties and can therefore be used for their identification, thus allowing a reliable and simple search for new clusters.
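    The statistic underlying any Kolmogorov-type analysis is the maximum distance between an empirical distribution function and a reference distribution; a correlated signal departs from the random reference far more than featureless noise does. The paper's method is more elaborate than this, but the basic statistic can be sketched directly (the Gaussian samples below are illustrative stand-ins, not X-ray data):

    ```python
    import math
    import numpy as np

    def norm_cdf(x):
        # Standard normal CDF via math.erf.
        return np.array([0.5 * (1.0 + math.erf(v / math.sqrt(2.0))) for v in x])

    def ks_statistic(sample):
        """Kolmogorov distance between the empirical CDF and N(0, 1)."""
        x = np.sort(sample)
        n = x.size
        f = norm_cdf(x)
        return max(np.max(np.arange(1, n + 1) / n - f),
                   np.max(f - np.arange(n) / n))

    rng = np.random.default_rng(3)
    noise = rng.normal(size=500)              # featureless random field
    excess = rng.normal(loc=0.8, size=500)    # correlated, cluster-like excess

    D_noise = ks_statistic(noise)
    D_excess = ks_statistic(excess)
    print(D_noise, D_excess)   # the shifted sample has the larger statistic
    ```

    Thresholding such a statistic is what turns the distance into a detection criterion.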

    All-sky convolution for polarimetry experiments

    We discuss all-sky convolution of the instrument beam with the sky signal in polarimetry experiments, such as the Planck mission which will map the temperature anisotropy and polarization of the cosmic microwave background (CMB). To account properly for stray light (from e.g. the galaxy, sun, and planets) in the far side-lobes of such an experiment, it is necessary to perform the beam convolution over the full sky. We discuss this process in multipole space for an arbitrary beam response, fully including the effects of beam asymmetry and cross-polarization. The form of the convolution in multipole space is such that the Wandelt-Gorski fast technique for all-sky convolution of scalar signals (e.g. temperature) can be applied with little modification. We further show that for the special case of a pure co-polarized, axisymmetric beam the effect of the convolution can be described by spin-weighted window functions. In the limits of a small angle beam and large Legendre multipoles, the spin-weight 2 window function for the linear polarization reduces to the usual scalar window function used in previous analyses of beam effects in CMB polarimetry experiments. While we focus on the example of polarimetry experiments in the context of CMB studies, we emphasise that the formalism we develop is applicable to anisotropic filtering of arbitrary tensor fields on the sphere.Comment: 8 pages, 1 figure; Minor changes to match version accepted by Phys. Rev.
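    The special case highlighted in the abstract, a pure co-polarized, axisymmetric beam, reduces the full convolution to a multiplication of each multipole by a window function. For the familiar Gaussian beam that window is B_l = exp(-l(l+1)σ²/2), with σ set by the beam FWHM; the numbers below are illustrative, and the toy power spectrum is not a physical CMB spectrum:

    ```python
    import numpy as np

    fwhm_arcmin = 30.0
    sigma = np.radians(fwhm_arcmin / 60.0) / np.sqrt(8.0 * np.log(2.0))

    ell = np.arange(2, 1501)
    B_ell = np.exp(-0.5 * ell * (ell + 1) * sigma**2)   # Gaussian window
    C_ell = 1.0 / (ell * (ell + 1))                     # toy sky spectrum
    C_smoothed = C_ell * B_ell**2                       # beam-convolved
    ```

    In the small-angle, large-l limit discussed in the abstract, the spin-weight 2 window for polarization tends to this same scalar form, which is why earlier analyses could get away with the scalar window.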

    Fast Pixel Space Convolution for CMB Surveys with Asymmetric Beams and Complex Scan Strategies: FEBeCoP

    Precise measurement of the angular power spectrum of the Cosmic Microwave Background (CMB) temperature and polarization anisotropy can tightly constrain many cosmological models and parameters. However, accurate measurements can only be realized in practice provided all major systematic effects have been taken into account. Beam asymmetry, coupled with the scan strategy, is a major source of systematic error in scanning CMB experiments such as Planck, the focus of our current interest. We envision Monte Carlo methods to rigorously study and account for the systematic effect of beams in CMB analysis. Toward that goal, we have developed a fast pixel-space convolution method that can simulate sky maps observed by a scanning instrument, taking into account real beam shapes and scan strategy. The essence is to pre-compute the "effective beams" using a computer code, "Fast Effective Beam Convolution in Pixel space" (FEBeCoP), that we have developed for the Planck mission. The code computes effective beams given the focal-plane beam characteristics of the Planck instrument and the full history of actual satellite pointing, and performs very fast convolution of sky signals using the effective beams. In this paper, we describe the algorithm and the computational scheme that has been implemented. We also outline a few applications of the effective beams in the precision analysis of Planck data, for characterizing the CMB anisotropy and for detecting and measuring properties of point sources. Comment: 26 pages, 15 figures. New subsection on beam/PSF statistics, new and better figures, more explicit algebra for polarized beams, added explanatory text at many places following the referees' comments. [Accepted for publication in ApJS]
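    The pre-compute-then-apply structure described in the abstract can be caricatured in one dimension (this is a toy illustration of the idea, not FEBeCoP's algorithm or data): build one small kernel per pixel whose shape stands in for the scan-history dependence, then observing any sky is a single matrix-vector product.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_pix = 32
    sky = rng.normal(size=n_pix)

    # "Effective beam" per pixel: a 3-tap kernel whose asymmetry varies
    # with pixel position, a hypothetical stand-in for averaging the real
    # beam over every scanning pass through that pixel.
    B = np.zeros((n_pix, n_pix))
    for p in range(n_pix):
        w = 0.1 + 0.05 * np.sin(2 * np.pi * p / n_pix)  # scan-dependent skew
        B[p, (p - 1) % n_pix] = w
        B[p, p] = 0.9 - w            # rows sum to 1: flux is conserved
        B[p, (p + 1) % n_pix] = 0.1
    observed = B @ sky   # fast convolution with pre-computed effective beams
    ```

    The expensive part, building `B` from the beam model and the full pointing history, is done once; every subsequent simulated observation reuses it, which is where the method's speed comes from.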

    Quadratic Lagrangians and Topology in Gauge Theory Gravity

    We consider topological contributions to the action integral in a gauge theory formulation of gravity. Two topological invariants are found and are shown to arise from the scalar and pseudoscalar parts of a single integral. Neither of these action integrals contributes to the classical field equations. An identity is found for the invariants that is valid for non-symmetric Riemann tensors, generalizing the usual GR expression for the topological invariants. The link with Yang-Mills instantons in Euclidean gravity is also explored. Ten independent quadratic terms are constructed from the Riemann tensor, and the topological invariants reduce these to eight possible independent terms for a quadratic Lagrangian. The resulting field equations for the parity non-violating terms are presented. Our derivations of these results are considerably simpler than those found in the literature.
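    In standard GR language, the two densities whose integrals are topological in four dimensions are the Euler (Gauss-Bonnet) and Pontryagin densities, presumably the scalar and pseudoscalar parts referred to above; their familiar symmetric-Riemann forms, which the paper's identity generalises, are

    ```latex
    % Standard four-dimensional topological densities (symmetric Riemann
    % tensor case); overall normalisations omitted.
    \begin{align}
      \chi &\propto \int d^4x \, \sqrt{-g}\,
        \left( R_{\mu\nu\rho\sigma} R^{\mu\nu\rho\sigma}
             - 4 R_{\mu\nu} R^{\mu\nu} + R^2 \right), \\
      \tau &\propto \int d^4x \,
        \epsilon^{\mu\nu\rho\sigma} R_{\mu\nu}{}^{\alpha\beta}
        R_{\rho\sigma\alpha\beta} .
    \end{align}
    ```

    Because both integrands are total derivatives, their variation vanishes, which is why neither term contributes to the classical field equations.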

    Effect of Fourier filters in removing periodic systematic effects from CMB data

    We consider the application of high-pass Fourier filters to remove periodic systematic fluctuations from full-sky-survey CMB datasets. We compare the filter performance with destriping codes commonly used to remove the effect of residual 1/f noise from timelines. As a realistic working case, we use simulations of the typical Planck scanning strategy and Planck Low Frequency Instrument noise performance, with spurious periodic fluctuations that mimic a typical thermal disturbance. We show that the application of Fourier high-pass filters in chunks always requires subsequent normalisation of the induced offsets by means of destriping. For a complex signal containing all the astrophysical and instrumental components, the result obtained by applying the filter and destriping in series is comparable to the result obtained by destriping only, which makes the usefulness of Fourier filters questionable for removing this kind of effect. Comment: 10 pages, 8 figures, published in Astronomy & Astrophysics
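    The filtering operation under study is simple to state: transform a timestream chunk, zero the modes below a cutoff, and transform back. The sketch below uses invented sampling and disturbance parameters, not Planck/LFI values; note that the filtered chunk acquires an arbitrary zero level, which is exactly why the abstract says the induced offsets must afterwards be normalised by destriping:

    ```python
    import numpy as np

    fs = 10.0                                    # sampling rate, Hz (toy)
    t = np.arange(0, 600.0, 1.0 / fs)
    signal = np.sin(2 * np.pi * 0.5 * t)         # "sky" component
    drift = 0.8 * np.sin(2 * np.pi * 0.005 * t)  # slow periodic disturbance
    d = signal + drift

    spec = np.fft.rfft(d)
    freqs = np.fft.rfftfreq(d.size, d=1.0 / fs)
    spec[freqs < 0.01] = 0.0        # high-pass: remove sub-0.01 Hz modes
    filtered = np.fft.irfft(spec, n=d.size)
    ```

    Here the disturbance sits entirely below the cutoff and is removed cleanly; the paper's point is that on realistic data this gains little over destriping alone, while still requiring the destriping step.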