
    An Algorithm for Precise Aperture Photometry of Critically Sampled Images

    We present an algorithm for performing precise aperture photometry on critically sampled astrophysical images. The method is intended to overcome the small-aperture limitations imposed by point-sampling. Aperture fluxes are numerically integrated over the desired aperture, with sinc-interpolation used to reconstruct values between pixel centers. Direct integration over the aperture is computationally intensive, but the integrals in question are shown to be convolution integrals and can be computed ~10000x faster as products in the wave-number domain. The method works equally well for annular and elliptical apertures and could be adapted for any geometry. A sample of code is provided to demonstrate the method. Comment: Accepted to MNRAS
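The wave-number trick the abstract describes can be sketched in a few lines: convolving the image with an aperture window yields the aperture flux at every possible center, and the convolution theorem turns that into a single product of Fourier transforms. This is a minimal illustration under simplifying assumptions, not the paper's algorithm; a pixelated circular top-hat stands in for the sinc-interpolated aperture kernel, and all names are made up.

```python
import numpy as np

def aperture_flux_map(image, radius):
    """Aperture flux centered at every pixel, via one FFT product.

    Convolving the image with a circular top-hat window equals, in the
    wave-number domain, a single product of transforms. Hypothetical
    sketch: a pixelated top-hat replaces the paper's sinc-interpolated
    aperture kernel.
    """
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    # Circular aperture mask centered on pixel (0, 0), wrapped so the
    # circular-convolution theorem applies without an extra shift.
    dy = np.minimum(y, ny - y)
    dx = np.minimum(x, nx - x)
    mask = (dy**2 + dx**2) <= radius**2
    # Product in the wave-number domain == convolution in real space.
    return np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(mask)).real

rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (64, 64))
img[32, 32] += 100.0                      # a "star"
fluxes = aperture_flux_map(img, radius=5)
# fluxes[32, 32] is the aperture sum centered on the star
```

One FFT pair replaces a separate pixel loop per aperture position, which is where the quoted speed-up comes from.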

    A Modified Magnitude System that Produces Well-Behaved Magnitudes, Colors, and Errors Even for Low Signal-to-Noise Ratio Measurements

    We describe a modification of the usual definition of astronomical magnitudes, replacing the usual logarithm with an inverse hyperbolic sine function; we call these modified magnitudes `asinh magnitudes'. For objects detected at signal-to-noise ratios of greater than about five, our modified definition is essentially identical to the traditional one; for fainter objects (including those with a formally negative flux) our definition is well behaved, tending to a definite value with finite errors as the flux goes to zero. This new definition is especially useful when considering the colors of faint objects, as the difference of two `asinh' magnitudes measures the usual flux ratio for bright objects, while avoiding the problems caused by dividing two very uncertain values for faint objects. The Sloan Digital Sky Survey (SDSS) data products will use this scheme to express all magnitudes in their catalogs. Comment: 11 pages, including 3 PostScript figures. Submitted to A
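The definition is compact enough to state directly. A sketch, following the standard form of the Lupton, Gunn & Szalay asinh magnitude, m = -(2.5/ln 10)[asinh(f/(2·b·f0)) + ln b], where b is a dimensionless softening parameter and f0 the zeroth-magnitude flux; the parameter values below are illustrative, not SDSS's calibrated ones.

```python
import numpy as np

def asinh_mag(flux, b, f0=1.0):
    """'asinh magnitude': log replaced by asinh, so the magnitude stays
    finite and well behaved for zero or even negative fluxes.
    b is the softening parameter, f0 the zeroth-magnitude flux."""
    return -2.5 / np.log(10) * (np.arcsinh(flux / f0 / (2 * b)) + np.log(b))

def pogson_mag(flux, f0=1.0):
    """Traditional logarithmic magnitude; undefined for flux <= 0."""
    return -2.5 * np.log10(flux / f0)

# Bright source: the two definitions agree (both give 10.0 here).
print(asinh_mag(1e-4, b=1e-10), pogson_mag(1e-4))
# Zero flux: the asinh magnitude is finite, the Pogson one diverges.
print(asinh_mag(0.0, b=1e-10))
```

For flux much larger than b·f0 the asinh term reduces to a logarithm and the traditional magnitude is recovered; as flux goes to zero the magnitude tends smoothly to a finite limit set by b.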

    Wavelength Dependent PSFs and their impact on Weak Lensing Measurements

    We measure and model the wavelength dependence of the PSF in the Hyper Suprime-Cam (HSC) Subaru Strategic Program (SSP) survey. We find that PSF chromaticity is present: redder stars appear smaller than bluer stars in the g, r, and i bands at the 1-2 per cent level, and in the z and y bands at the 0.1-0.2 per cent level. From the color dependence of the PSF, we fit a model relating the monochromatic PSF trace radius, R, to wavelength, of the form R(λ) ∝ λ^b. We find values of b between -0.2 and -0.5, depending on the epoch and filter. This is consistent with the expectations for a turbulent atmosphere with an outer scale length of ~10-100 m, indicating that the atmosphere dominates the chromaticity. We find evidence in the best-seeing data that the optical system and detector also contribute some wavelength dependence. Meyers and Burchat (2015) showed that b must be measured to an accuracy of ~0.02 so as not to dominate the systematic error budget of the Large Synoptic Survey Telescope (LSST) weak lensing (WL) survey. Using simple image simulations, we find that b can be inferred with this accuracy in the r and i bands for all positions in the LSST field of view, assuming a stellar density of 1 star arcmin^-2 and that the optical PSF can be accurately modeled. Therefore, it is possible to correct for most, if not all, of the bias that the wavelength-dependent PSF will introduce into an LSST-like WL survey. Comment: 14 pages, 10 figures. Submitted to MNRAS. Comments welcome
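Because R(λ) ∝ λ^b is linear in log-log space, b can be estimated with an ordinary linear fit of log R against log λ. A toy sketch with fabricated, noiseless numbers (not HSC measurements); the wavelengths and normalization are invented for illustration.

```python
import numpy as np

# Hypothetical per-band monochromatic PSF trace radii. The power law
# R(lambda) ∝ lambda^b becomes a straight line in log-log space,
# so b is simply the slope of log R versus log lambda.
lam = np.array([400.0, 500.0, 620.0, 750.0, 890.0])  # nm, made up
b_true = -0.3
R = 1.2 * (lam / 620.0) ** b_true                    # trace radius, arcsec

b_fit, log_amp = np.polyfit(np.log(lam), np.log(R), 1)
print(b_fit)   # recovers -0.3 on this noiseless toy data
```

With real stars one would fit per-star sizes against an effective wavelength derived from each star's color, and propagate measurement noise into the uncertainty on b.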

    A method for optimal image subtraction

    We present a new method designed for optimal subtraction of two images with different seeing. Image subtraction appears to be essential for the full analysis of microlensing survey images; however, a perfect subtraction of two images is not easy, as it requires the derivation of an extremely accurate convolution kernel. Some empirical attempts to find the kernel have used the Fourier transform of bright stars, but the statistical problem of finding the best kernel solution has never really been tackled. We demonstrate that it is possible to derive an optimal kernel solution from a simple least-squares analysis using all the pixels of both images, and show that the differential background variation can be fitted at the same time. PSF variations can also be easily handled by the method. To demonstrate its practical efficiency, we analyzed images from a Galactic Bulge field monitored by the OGLE II project. We find that the residuals in the subtracted images are very close to the photon-noise expectation. We also present light curves of variable stars and show that, despite high crowding levels, we obtain an error distribution close to that expected from photon noise alone. We thus demonstrate that nearly optimal differential photometry can be achieved even in very crowded fields. We suggest that this algorithm might be particularly important for microlensing surveys, where the photometric accuracy and completeness could be very significantly improved by its use. Comment: 8 pages, 4 PostScript figures, emulateapj.sty included
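The least-squares kernel solve can be illustrated with a stripped-down version: a free-pixel kernel plus a constant differential background, fitted linearly to all interior pixels at once. This is a hypothetical sketch under simplifying assumptions; it omits the paper's noise weighting, kernel basis functions, and spatially varying PSF handling, and every name is invented.

```python
import numpy as np

def fit_kernel(ref, target, half=2):
    """Least-squares matching kernel (plus constant differential
    background) taking `ref` to `target`, using every interior pixel.

    Each unknown kernel pixel contributes one column to a linear
    design matrix (the reference image shifted by that offset), so the
    whole problem is a single linear least-squares solve.
    """
    ny, nx = ref.shape
    cols = []
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            shifted = ref[half + dy:ny - half + dy, half + dx:nx - half + dx]
            cols.append(shifted.ravel())
    cols.append(np.ones_like(cols[0]))            # differential background
    A = np.stack(cols, axis=1)
    b = target[half:ny - half, half:nx - half].ravel()
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    kernel = coeffs[:-1].reshape(2 * half + 1, 2 * half + 1)
    return kernel, coeffs[-1]                     # kernel, background level
```

Subtracting the kernel-convolved reference (plus the fitted background) from the target then leaves residuals near the noise floor when the model holds; real pipelines add per-pixel inverse-variance weights and let the kernel vary across the field.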

    College Student Home Computer Security Adoption

    The home Internet user faces a hostile environment abundant in potential attacks on their computer. These attacks have been increasing at an alarming rate, regularly cause damage to individuals and organizations, and have the potential to cripple the critical infrastructures of entire countries. Recent research has determined that some individuals do not utilize the additional software protections available to mitigate these security risks. This paper seeks to clarify the reasons by proposing a conceptual framework that uses the Health Belief Model to explain why some people do not perceive a threat sufficient to prompt the adoption of computer security software.

    An Efficient Targeting Strategy for Multiobject Spectrograph Surveys: the Sloan Digital Sky Survey "Tiling" Algorithm

    Large surveys using multiobject spectrographs require automated methods for deciding how to efficiently point observations and how to assign targets to each pointing. The Sloan Digital Sky Survey (SDSS) will observe around 10^6 spectra from targets distributed over an area of about 10,000 deg^2, using a multiobject fiber spectrograph that can simultaneously observe 640 objects in a circular field of view (referred to as a "tile") 1°.49 in radius. No two fibers can be placed closer than 55″ during the same observation; multiple targets closer than this distance are said to "collide." We present here a method of allocating fibers to desired targets, given a set of tile centers, that accounts for collisions and is nearly optimally efficient and uniform. Because of large-scale structure in the galaxy distribution (galaxies form the bulk of the SDSS targets), a naive covering of the sky with equally spaced tiles does not yield uniform sampling. Thus, we present a heuristic for perturbing the centers of the tiles from the equally spaced distribution that provides more uniform completeness. For the SDSS sample, we can attain a sampling rate of greater than 92% for all targets, and greater than 99% for the set of targets that do not collide with each other, with an efficiency greater than 90% (defined as the fraction of available fibers assigned to targets). The methods used here may prove useful to those planning other large surveys.
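The collision constraint can be made concrete with a toy allocator: a greedy pass that accepts each target only while a fiber remains and the target sits at least 55″ from every target already accepted. The paper's actual method is near-optimal and far more sophisticated; this sketch only illustrates the constraint, and all names and coordinates are invented.

```python
import numpy as np

def assign_fibers(targets, n_fibers=640, min_sep=55.0):
    """Greedy toy sketch of fiber allocation within one tile.

    Walk the target list (e.g. in priority order) and accept a target
    only if a fiber remains and it lies >= min_sep arcsec from every
    previously accepted target. `targets` is an (N, 2) array of
    flat-sky offsets in arcseconds (hypothetical small-angle
    approximation, not the survey's spherical geometry).
    """
    accepted = []
    for t in targets:
        if len(accepted) >= n_fibers:
            break                 # every fiber on this tile is used
        if all(np.hypot(*(t - a)) >= min_sep for a in accepted):
            accepted.append(t)    # no collision with accepted targets
    return np.array(accepted)

# Two close pairs: one member of each pair collides and is skipped.
targets = np.array([[0.0, 0.0], [10.0, 0.0], [100.0, 0.0], [130.0, 0.0]])
print(assign_fibers(targets))
```

Colliding targets dropped here are exactly the ones the abstract's ">99% of non-colliding targets" figure excludes; recovering them requires overlapping tiles.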

    Development of an mHealth platform for HIV Care: gathering user perspectives through co-design workshops and interviews

    Background: Despite advances in testing and treatment, HIV incidence rates within European countries are at best stable or else increasing. mHealth technology has been advocated to increase quality and cost-effectiveness of health services while dealing with growing patient numbers. However, studies suggest that mHealth apps are rarely adopted and often considered to be of low quality by users. Only a few studies (conducted in the United States) have involved people living with HIV (PLWH) in the design of mHealth. Objective: The goal of this study was to facilitate a co-design process among PLWH and clinicians across 5 clinical sites in the European Union to inform the development of an mHealth platform to be integrated into clinical care pathways. We aimed to (1) elicit experiences of living with HIV and of working in HIV care, (2) identify mHealth functionalities that are considered useful for HIV care, and (3) identify potential benefits as well as concerns about mHealth. Methods: Between January and June 2016, 14 co-design workshops and 22 semistructured interviews were conducted, involving 97 PLWH and 63 clinicians. Data were analyzed thematically and iteratively, drawing on grounded theory techniques. Results: Findings were grouped into 3 thematic clusters: (1) approaching the mHealth platform, (2) imagining the mHealth platform, and (3) anticipating the mHealth platform's implications. Co-design participants approached the mHealth platform with pre-existing concerns arising from their experiences of receiving or providing care. PLWH particularly addressed issues of stigma and questioned how mHealth could enable them to manage their HIV. Clinicians problematized the compatibility of mHealth with existing information technology systems and questioned which patients should be targeted by mHealth. Imagining the potential of mHealth for HIV care, co-design participants suggested medical functionalities (accessing test results, managing medicines and appointments, and digital communication channels), social functionalities (peer support network, international travel, etc), and general features (security and privacy, credibility, language, etc). Co-design participants also anticipated potential implications of mHealth for self-management and the provision of care. Conclusions: Our approach to co-design facilitated early engagement with the mHealth platform, allowing patient and clinician feedback to become embedded in the development process at a pre-prototype phase. Although the technologies in question were not yet present, understanding how users approach, imagine, and anticipate technology formed an important source of knowledge and proved highly significant within the technology design and development process.
