16,968 research outputs found

    Capturing natural-colour 3D models of insects for species discovery

    Collections of biological specimens are fundamental to scientific understanding and characterization of natural diversity. This paper presents a system for liberating useful information from physical collections by bringing specimens into the digital domain so they can be more readily shared, analyzed, annotated and compared. It focuses on insects and is strongly motivated by the desire to accelerate and augment current practices in insect taxonomy, which predominantly use text, 2D diagrams and images to describe and characterize species. While these traditional kinds of descriptions are informative and useful, they cannot cover insect specimens "from all angles", and precious specimens are still exchanged between researchers and collections for this reason. Furthermore, insects can be complex in structure and pose many challenges to computer vision systems. We present a new prototype for a practical, cost-effective system of off-the-shelf components to acquire natural-colour 3D models of insects from around 3 mm to 30 mm in length. Colour images are captured from different angles and focal depths using a digital single lens reflex (DSLR) camera rig and two-axis turntable. These 2D images are processed into 3D reconstructions using software based on a visual hull algorithm. The resulting models are compact (around 10 megabytes), afford excellent optical resolution, and can be readily embedded into documents and web pages, as well as viewed on mobile devices. The system is portable, safe, relatively affordable, and complements the sort of volumetric data that can be acquired by computed tomography. This system provides a new way to augment the description and documentation of insect species holotypes, reducing the need to handle or ship specimens. It opens up new opportunities to collect data for research, education, art, entertainment, biodiversity assessment and biosecurity control. Comment: 24 pages, 17 figures, PLOS ONE journal.
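    The visual hull step described above can be illustrated with a minimal voxel-carving sketch. It assumes that binary silhouette masks and calibrated 3x4 projection matrices are already available for every view; these inputs, the bounding box, and the function name are illustrative assumptions rather than the paper's actual software.

```python
import numpy as np

def carve_visual_hull(silhouettes, projections, bounds, resolution=128):
    """Boolean voxel grid that is True where a voxel projects inside
    every silhouette (an approximation of the visual hull)."""
    (xmin, xmax), (ymin, ymax), (zmin, zmax) = bounds
    xs = np.linspace(xmin, xmax, resolution)
    ys = np.linspace(ymin, ymax, resolution)
    zs = np.linspace(zmin, zmax, resolution)
    X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
    # Homogeneous world coordinates of every voxel centre, shape (4, N).
    points = np.stack([X.ravel(), Y.ravel(), Z.ravel(), np.ones(X.size)])
    inside = np.ones(X.size, dtype=bool)
    for mask, P in zip(silhouettes, projections):   # P is a 3x4 camera matrix
        uvw = P @ points
        u = np.round(uvw[0] / uvw[2]).astype(int)
        v = np.round(uvw[1] / uvw[2]).astype(int)
        h, w = mask.shape
        in_frame = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        hit = np.zeros(X.size, dtype=bool)
        hit[in_frame] = mask[v[in_frame], u[in_frame]] > 0
        inside &= hit    # carve away voxels that fall outside any silhouette
    return inside.reshape(X.shape)
```

    A full pipeline would then mesh and texture the surviving voxels; this sketch only shows the carving idea behind a visual hull.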

    Smartphone-based, rapid, wide-field fundus photography for diagnosis of pediatric retinal diseases

    Purpose: An important, unmet clinical need is for cost-effective, reliable, easy-to-use, and portable retinal photography to evaluate preventable causes of vision loss in children. This study presents the feasibility of a novel smartphone-based retinal imaging device tailored to imaging the pediatric fundus. Methods: Several modifications for children were made to our previous device, including a child-friendly 3D-printed housing of animals, attention-grabbing targets, enhanced image stitching, and video-recording capabilities. Retinal photographs were obtained in children undergoing routine dilated eye examination. Experienced masked retina-specialist graders determined photograph quality and made diagnoses based on the images, which were compared to the treating clinician's diagnosis. Results: Dilated fundus photographs were acquired in 43 patients with a mean age of 6.7 years. The diagnoses included retinoblastoma, Coats' disease, commotio retinae, and optic nerve hypoplasia, among others. Mean time to acquire five standard photographs totaling a 90-degree field of vision was 2.3 ± 1.1 minutes. Patients rated their experience of image acquisition favorably, with a Likert score of 4.6 ± 0.8 out of 5. There was 96% agreement between image-based diagnosis and the treating clinician's diagnosis. Conclusions: We report a handheld smartphone-based device with modifications tailored for wide-field fundus photography in pediatric patients that can rapidly acquire fundus photos while being well-tolerated. Translational relevance: Advances in handheld smartphone-based fundus photography devices decrease the technical barrier for image acquisition in children and may potentially increase access to ophthalmic care in communities with limited resources.
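    The "enhanced image stitching" is not described in detail above, but combining a handful of overlapping fundus photographs into a wider mosaic can be sketched with OpenCV's high-level stitcher; the file names below are placeholders, and this is a generic approach rather than the authors' pipeline.

```python
import cv2

# Placeholder file names for five overlapping fundus photographs.
frames = [cv2.imread(f"fundus_{i}.jpg") for i in range(1, 6)]

stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)  # scans mode suits flat-ish overlap
status, mosaic = stitcher.stitch(frames)
if status == cv2.Stitcher_OK:
    cv2.imwrite("fundus_mosaic.jpg", mosaic)
else:
    print(f"Stitching failed with status code {status}")
```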

    A smartphone-based tool for rapid, portable, and automated wide-field retinal imaging


    Reflectance Transformation Imaging (RTI) System for Ancient Documentary Artefacts

    This tutorial summarises our uses of reflectance transformation imaging in archaeological contexts. It introduces the UK AHRC-funded project Reflectance Transformation Imaging for Ancient Documentary Artefacts and demonstrates imaging methodologies.

    Multi-contrast imaging and digital refocusing on a mobile microscope with a domed LED array

    We demonstrate the design and application of an add-on device for improving the diagnostic and research capabilities of CellScope, a low-cost, smartphone-based point-of-care microscope. We replace the single LED illumination of the original CellScope with a programmable domed LED array. By leveraging recent advances in computational illumination, this new device enables simultaneous multi-contrast imaging with brightfield, darkfield, and phase imaging modes. Further, we scan through illumination angles to capture lightfield datasets, which can be used to recover 3D intensity and phase images without any hardware changes. This digital refocusing procedure can be used for either 3D imaging or software-only focus correction, reducing the need for precise mechanical focusing during field experiments. All acquisition and processing are performed on the mobile phone and controlled through a smartphone application, making the computational microscope compact and portable. Using multiple samples and different objective magnifications, we demonstrate that the performance of our device is comparable to that of a commercial microscope. This unique device platform extends the field imaging capabilities of CellScope, opening up new clinical and research possibilities.
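    A rough sketch of the computational-illumination idea follows: given a stack of images captured one LED at a time with known illumination angles, brightfield, darkfield, and differential phase contrast (DPC) images can be synthesized by summing different subsets, and a shift-and-add combination approximates digital refocusing. The array geometry, numerical aperture, and calibration values below are placeholders, not the CellScope dome's actual parameters.

```python
import numpy as np

def multi_contrast(stack, led_angles, objective_na=0.25):
    """stack: (n_leds, H, W) array, one image per LED.
    led_angles: (n_leds, 2) array of illumination direction sines."""
    na = np.linalg.norm(led_angles, axis=1)
    brightfield = stack[na <= objective_na].sum(axis=0)   # LEDs inside the NA
    darkfield = stack[na > objective_na].sum(axis=0)      # LEDs outside the NA
    # DPC along x: asymmetry between LEDs left and right of the optical axis.
    left = stack[led_angles[:, 0] < 0].sum(axis=0)
    right = stack[led_angles[:, 0] >= 0].sum(axis=0)
    dpc_x = (left - right) / (left + right + 1e-9)
    return brightfield, darkfield, dpc_x

def refocus(stack, led_angles, dz, pixel_size=1.0):
    """Shift-and-add refocusing: translate each single-LED image in
    proportion to its illumination angle and the defocus distance dz."""
    shifted = []
    for img, (sx, sy) in zip(stack, led_angles):
        shift = np.round(np.array([sy, sx]) * dz / pixel_size).astype(int)
        shifted.append(np.roll(img, tuple(shift), axis=(0, 1)))
    return np.mean(shifted, axis=0)
```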

    SmartHeLP: Smartphone-based Hemoglobin Level Prediction Using an Artificial Neural Network

    Blood hemoglobin level (Hgb) measurement has a vital role in the diagnosis, evaluation, and management of numerous diseases. We describe the use of smartphone video imaging and an artificial neural network (ANN) system to estimate Hgb levels non-invasively. We recorded 10-second, 300-frame fingertip videos using a smartphone in 75 adults. Red, green, and blue pixel intensities were estimated for each of 100 area blocks in each frame, and the patterns across the 300 frames were described. An ANN was then used to develop a model using the extracted video features to predict hemoglobin levels. In our study sample, with patients 20-56 years of age and gold-standard hemoglobin levels of 7.6 to 13.5 g/dL, we observed a 0.93 rank-order correlation between model and gold-standard hemoglobin levels. Moreover, we identified specific regions of interest in the video images, which reduced the required feature space.
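    The feature extraction described above (mean red, green, and blue intensity for each of 100 blocks in each of 300 frames, fed to a neural-network regressor) can be sketched as follows; the grid size and regressor settings follow the abstract loosely, and the exact ANN architecture is an assumption.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPRegressor

def video_features(path, n_frames=300, grid=10):
    """Mean B, G, R intensity for a grid x grid block layout in each frame,
    flattened into one long feature vector for the whole video."""
    cap = cv2.VideoCapture(path)
    feats = []
    for _ in range(n_frames):
        ok, frame = cap.read()
        if not ok:
            break
        h, w, _ = frame.shape
        bh, bw = h // grid, w // grid
        for i in range(grid):
            for j in range(grid):
                block = frame[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
                feats.extend(block.reshape(-1, 3).mean(axis=0))
    cap.release()
    return np.array(feats)

# Hypothetical training call: X stacks one feature vector per subject's
# fingertip video, y holds the lab-measured hemoglobin values (g/dL).
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000)
# model.fit(X, y); predictions = model.predict(X_new)
```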

    An Interactive Zoo Guide: A Case Study of Collaborative Learning

    Real industry projects and teamwork can have a great impact on student learning, but providing these activities requires significant commitment from academics. It takes several years of planning and implementation to create a collaborative learning environment that mimics the real-world ICT (Information and Communication Technology) industry workplace. In this project, staff from all three faculties, namely the Faculty of Health, Engineering and Science, the Faculty of Arts, Education and Human Development, and the Faculty of Business and Law, work together to establish a detailed project management plan and to develop the unit guidelines for participating students. The proposed project brings together students from business, multimedia and computer science degrees, studying their three project-based units within each faculty, to work on a relatively large IT project with our industry partner, Melbourne Zoo. This paper presents one multimedia software project accomplished by one of the multi-discipline student project teams. The project, called 'Interactive ZooOz Guide', was developed on a GPS-enabled PDA device in 2007. The program allows its users to navigate through the Zoo via an interactive map and provides multimedia information about animals at hotspots in the 'Big Cats' section, enriching the user experience at the Zoo. A recent development in zoo applications is also reviewed. This paper is also intended to encourage academia to break boundaries to enhance students' learning beyond the classroom. Comment: 11 pages
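    The hotspot behaviour of the guide can be illustrated with a small proximity check: play an exhibit's multimedia content when the visitor's GPS fix falls within a set radius of it. The coordinates and radius below are made up for illustration and are not taken from the 2007 PDA implementation.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Placeholder exhibit coordinates, not the real Melbourne Zoo layout.
HOTSPOTS = {"Big Cats": (-37.784, 144.951)}

def active_hotspots(lat, lon, radius_m=25):
    """Names of exhibits whose hotspot radius contains the current GPS fix."""
    return [name for name, (hlat, hlon) in HOTSPOTS.items()
            if haversine_m(lat, lon, hlat, hlon) <= radius_m]
```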