
    Towards a Low-Cost Mobile Subcutaneous Vein Detection Solution Using Near-Infrared Spectroscopy

    Excessive venipunctures are time- and resource-consuming, cause anxiety, pain, and distress in patients, and can lead to severe injuries. We propose a low-cost mobile health solution for subcutaneous vein detection using near-infrared spectroscopy, along with an assessment of the current state of the art in this field. The first objective of this study was to gain a deeper overview of the research topic, through initial team discussions and a detailed literature review (using both academic and grey literature). The second objective, identifying the commercial systems employing near-infrared spectroscopy, was addressed using the PubMed database. The goal of the third objective was to identify and evaluate (using the IEEE Xplore database) the research efforts in the field of low-cost near-infrared imaging in general, as a basis for the conceptual model of the upcoming prototype. Although the reviewed commercial devices have demonstrated usefulness and value for peripheral vein visualization, other evaluated clinical outcomes are less conclusive. Previous studies of low-cost near-infrared systems demonstrated the general feasibility of developing cost-effective vein detection systems; however, their limitations restrict their applicability to clinical practice. Finally, based on the current findings, we outline future research directions.

    Agile software development in an earned value world: a survival guide

    Agile methodologies are current best practice in software development. They are favored for, among other reasons, preventing premature optimization by taking a somewhat short-term focus, and allowing frequent replans/reprioritizations of upcoming development work based on recent results and the current backlog. At the same time, funding agencies prescribe earned value management accounting for large projects which, these days, inevitably include substantial software components. Earned Value approaches emphasize a more comprehensive and typically longer-range plan, and tend to characterize frequent replans and reprioritizations as indicative of problems. Here we describe the planning, execution, and reporting framework used by the LSST Data Management team, which navigates these opposing tensions.

    The Zwicky Transient Facility Alert Distribution System

    The Zwicky Transient Facility (ZTF) survey generates real-time alerts for optical transients, variables, and moving objects discovered in its wide-field survey. We describe the ZTF alert stream distribution and processing (filtering) system. The system uses existing open-source technologies developed in industry: Kafka, a real-time streaming platform, and Avro, a binary serialization format. The technologies used in this system provide a number of advantages for the ZTF use case, including (1) built-in replication, scalability, and stream rewind for the distribution mechanism; (2) structured messages with strictly enforced schemas and dynamic typing for fast parsing; and (3) a Python-based stream processing interface that is similar to batch for a familiar and user-friendly plug-in filter system, all in a modular, primarily containerized system. The production deployment has successfully supported streaming up to 1.2 million alerts, or roughly 70 GB of data, per night, with each alert available to a consumer within about 10 s of alert candidate production. Data transfer rates of about 80,000 alerts/minute have been observed. In this paper, we discuss this alert distribution and processing system, the design motivations for the technology choices for the framework, performance in production, and how this system may be generally suitable for other alert stream use cases, including the upcoming Large Synoptic Survey Telescope. (Published in the PASP Focus Issue on the Zwicky Transient Facility, doi: 10.1088/1538-3873/aae904; 9 pages, 2 figures.)
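    As a quick sanity check on the throughput figures quoted in this abstract, the nightly totals imply an average alert size of roughly 58 KB and a peak rate of roughly 1,300 alerts per second. This is a back-of-envelope sketch; the input numbers come from the abstract, but the derived quantities are our own arithmetic, not figures from the paper:

```python
# Back-of-envelope consistency check of the ZTF alert-stream figures
# quoted in the abstract above. Inputs are from the text; the derived
# per-alert size and per-second rate are our own arithmetic.

alerts_per_night = 1_200_000        # up to 1.2 million alerts per night
data_per_night_bytes = 70e9         # roughly 70 GB of data per night

# Implied average alert size in kilobytes (~58 KB per alert).
avg_alert_kb = data_per_night_bytes / alerts_per_night / 1e3

# Observed transfer rate: about 80,000 alerts/minute, i.e. ~1,333/s.
alerts_per_second = 80_000 / 60

print(f"average alert size: {avg_alert_kb:.1f} KB")
print(f"peak observed rate: {alerts_per_second:.0f} alerts/s")
```

    At ~58 KB per alert, Kafka's partitioned, replicated log comfortably sustains this volume, which is consistent with the paper's choice of Kafka and Avro over heavier message-broker stacks.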

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with a field of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over 20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with fifteen-second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects both at low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy. (596 pages; also available at full resolution at http://www.lsst.org/lsst/sciboo)

    The stellar halo of the Galaxy

    Stellar halos may hold some of the best preserved fossils of the formation history of galaxies. They are a natural product of the merging processes that probably take place during the assembly of a galaxy, and hence may well be the most ubiquitous component of galaxies, independently of their Hubble type. This review focuses on our current understanding of the spatial structure, the kinematics, and the chemistry of halo stars in the Milky Way. In recent years, we have experienced a change in paradigm thanks to the discovery of large amounts of substructure, especially in the outer halo. I discuss the implications of the currently available observational constraints and fold them into several possible formation scenarios. Unraveling the formation of the Galactic halo will be possible in the near future through a combination of large wide-field photometric and spectroscopic surveys, and especially in the era of Gaia. (46 pages, 16 figures; full-resolution version available at http://www.astro.rug.nl/~ahelmi/stellar-halo-review.pd)

    The Third Data Release of the Sloan Digital Sky Survey

    This paper describes the Third Data Release of the Sloan Digital Sky Survey (SDSS). This release, containing data taken up through June 2003, includes imaging data in five bands over 5282 deg^2, photometric and astrometric catalogs of the 141 million objects detected in these imaging data, and spectra of 528,640 objects selected over 4188 deg^2. The pipelines analyzing both images and spectroscopy are unchanged from those used in our Second Data Release. (14 pages, including 2 postscript figures; submitted to AJ; data available at http://www.sdss.org/dr)

    The 4D nucleome project
