
    mFish Alpha Pilot: Building a Roadmap for Effective Mobile Technology to Sustain Fisheries and Improve Fisher Livelihoods.

    In June 2014, at the Our Ocean Conference in Washington, DC, United States Secretary of State John Kerry announced the ambitious goal of ending overfishing by 2020. To support that goal, the Secretary's Office of Global Partnerships launched mFish, a public-private partnership to harness the power of mobile technology to improve fisher livelihoods and increase the sustainability of fisheries around the world. The US Department of State provided a grant to 50in10 to create a pilot of mFish that would allow for the identification of behaviors and incentives that might drive more fishers to adopt novel technology. In May 2015, 50in10 and Future of Fish designed a pilot to evaluate how to improve adoption of a new mobile technology platform aimed at improving fisheries data capture and fisher livelihoods.

    Augmented Reality in Astrophysics

    Augmented Reality consists of merging live images with virtual layers of information. The rapid growth in the popularity of smartphones and tablets over recent years has provided a large base of potential users of Augmented Reality technology, and virtual layers of information can now be attached to a wide variety of physical objects. In this article, we explore the potential of Augmented Reality for astrophysical research with two distinct experiments: (1) Augmented Posters and (2) Augmented Articles. We demonstrate that the emerging technology of Augmented Reality can already be used and implemented without expert knowledge using currently available apps. Our experiments highlight the potential of Augmented Reality to improve the communication of scientific results in the field of astrophysics. We also present feedback gathered from the Australian astrophysics community that reveals evidence of some interest in this technology by astronomers who experimented with Augmented Posters. In addition, we discuss possible future trends for Augmented Reality applications in astrophysics, and explore the current limitations associated with the technology. This Augmented Article, the first of its kind, is designed to allow the reader to directly experiment with this technology.
    Comment: 15 pages, 11 figures. Accepted for publication in Ap&SS. The final publication will be available at link.springer.co

    Adaptive User Perspective Rendering for Handheld Augmented Reality

    Handheld Augmented Reality commonly implements some variant of magic lens rendering, which turns only a fraction of the user's real environment into AR while the rest of the environment remains unaffected. Since handheld AR devices are commonly equipped with video see-through capabilities, AR magic lens applications often suffer from spatial distortions, because the AR environment is presented from the perspective of the camera of the mobile device. Recent approaches counteract this distortion based on estimations of the user's head position, rendering the scene from the user's perspective. To this end, approaches usually apply face-tracking algorithms on the front camera of the mobile device. However, this demands high computational resources and therefore commonly affects the performance of the application beyond the already high computational load of AR applications. In this paper, we present a method to reduce the computational demands for user perspective rendering by applying lightweight optical flow tracking and an estimation of the user's motion before head tracking is started. We demonstrate the suitability of our approach for computationally limited mobile devices, and we compare it to device perspective rendering, to head-tracked user perspective rendering, and to fixed-point-of-view user perspective rendering.
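
    The core idea of substituting cheap 2D optical flow for continuous face tracking can be illustrated with a short sketch. The following is a minimal, illustrative example using OpenCV's Lucas-Kanade tracker, not the authors' implementation; the camera index, feature counts, and thresholds are assumptions chosen for readability.

import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # front-facing camera; the index is device-specific
ok, prev_frame = cap.read()
prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                 qualityLevel=0.01, minDistance=10)

while True:
    ok, frame = cap.read()
    if not ok or points is None:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Sparse Lucas-Kanade optical flow: far cheaper than running a face
    # tracker on every frame of the front camera.
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    good_new = new_points[status.flatten() == 1]
    good_old = points[status.flatten() == 1]
    if len(good_new) > 0:
        # The median 2D displacement serves as a rough estimate of the user's
        # motion relative to the device.
        motion = np.median(good_new - good_old, axis=0).flatten()
        print("estimated motion (px):", motion)
    if len(good_new) < 20:
        # Too few features survive: re-detect (or, in a full system, fall back
        # to the heavier head/face tracker to re-initialise).
        points = cv2.goodFeaturesToTrack(gray, maxCorners=50,
                                         qualityLevel=0.01, minDistance=10)
    else:
        points = good_new.reshape(-1, 1, 2)
    prev_gray = gray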

    Low-cost, smartphone-based instant three-dimensional registration system for infant functional near-infrared spectroscopy applications

    Significance: To effectively apply functional near-infrared spectroscopy (fNIRS)/diffuse optical tomography (DOT) devices, a three-dimensional (3D) model of the position of each optode on a subject's scalp and the positions of that subject's cranial landmarks are critical. Obtaining this information accurately in infants, who rarely stop moving, is an ongoing challenge.
    Aim: We propose a smartphone-based registration system that can potentially achieve a full-head 3D scan of a 6-month-old infant instantly.
    Approach: The proposed system is remotely controlled by a custom-designed Bluetooth controller. The scanned images can be either manually or automatically aligned to generate a 3D head surface model.
    Results: A full-head 3D scan of a 6-month-old infant can be achieved within 2 s via this system. In testing on a realistic but static infant head model, the average Euclidean error of optode position using this device was 1.8 mm.
    Conclusions: This low-cost 3D registration system therefore has the potential to permit accurate and near-instant fNIRS/DOT spatial registration.
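
    As a rough illustration of how such a registration can be evaluated, the sketch below rigidly aligns scanned cranial-landmark coordinates to reference coordinates with the Kabsch algorithm, applies the same transform to the scanned optode positions, and reports the mean Euclidean error. This is an assumed workflow on synthetic data, not the paper's registration pipeline; all coordinates and noise levels are invented for the example.

import numpy as np

def kabsch(P, Q):
    """Rigid transform (R, t) mapping points P onto Q in a least-squares sense."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

rng = np.random.default_rng(0)
# Hypothetical reference coordinates (mm): 4 cranial landmarks, 10 optodes.
ref_landmarks = rng.uniform(-80, 80, (4, 3))
ref_optodes = rng.uniform(-80, 80, (10, 3))

# Simulate a scan: the same points expressed in a rotated, translated scanner
# frame, with a little measurement noise on the optodes.
theta = np.radians(10)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -3.0, 2.0])
scanned_landmarks = ref_landmarks @ R_true.T + t_true
scanned_optodes = ref_optodes @ R_true.T + t_true + rng.normal(0.0, 1.0, (10, 3))

# Align the scan to the reference frame via the landmarks, then score the optodes.
R, t = kabsch(scanned_landmarks, ref_landmarks)
aligned_optodes = scanned_optodes @ R.T + t
errors = np.linalg.norm(aligned_optodes - ref_optodes, axis=1)
print(f"mean Euclidean optode error: {errors.mean():.1f} mm")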