7 research outputs found

    Creative Collaborations with Art, Music and Engineering: Improving the Perceptual Abilities of Novice Clinician

    No full text
    Mastery of the arts of inspection/observation, listening/auscultation, and touching/palpation is essential for health-care providers. This presentation will demonstrate our innovative pedagogy and its impact on the physical examination skills of novice clinicians. We will introduce the audience to the use of art and visual training to develop observational and diagnostic reasoning skills, and demonstrate our music auditory training protocol. We will also detail our research findings related to students' increased competence in detecting heart, lung, and bowel sounds. Finally, students and faculty from the Center for Engineering Innovation and Design (CEID) will describe our newest extension into the field of palpation and their experience creating a model to improve the perceptual skills of our students.

    Development of an untethered, mobile, low-cost head-mounted eye tracker

    No full text
    Abstract
    Head-mounted eye-tracking systems allow us to observe participants' gaze behaviors in largely unconstrained, real-world settings. We have developed novel, untethered, mobile, low-cost, lightweight, easily assembled head-mounted eye-tracking devices, comprised entirely of off-the-shelf components, including untethered, point-of-view sports cameras. In total, the parts we have used cost ~$153, and we suggest untested alternative components that reduce the cost of parts to ~$31. Our device can be easily assembled using hobbying skills and techniques. We have developed hardware, software, and methodological techniques to perform point-of-regard estimation and to temporally align scene and eye videos in the face of the variable frame rate that plagues low-cost, lightweight, untethered cameras. We describe an innovative technique for synchronizing eye and scene videos using synchronized flashing lights. Our hardware, software, and calibration designs will be made publicly available, and we describe them in detail here to facilitate replication of our system. We also describe a novel smooth-pursuit-based calibration methodology, which affords rich sampling of calibration data while compensating for lack of information regarding the extent of visibility in participants' scene recordings. Validation experiments indicate accuracy within 0.752 degrees of visual angle on average.
    Introduction
    Eye tracking can provide information about an individual's cognitive state and about the nature of atypical attentional processes in individuals with neuropsychiatric conditions. In contrast to table-mounted or remote systems, head-mounted eye-tracking systems allow participants to move, extending gaze tracking from constrained stimuli in controlled environments out into the real world. Unfortunately, commercially developed head-mounted eye trackers tend to be expensive, costing $10,000-$40,000 per unit. The cost is likely generated by extensive testing, research, and development. For social research in particular, study designs may necessitate eye tracking of multiple participants, making the high price of commercial eye trackers especially prohibitive. As an alternative to commercial devices, several groups have developed custom-built head-mounted eye trackers. This paper describes our efforts to create an untethered, head-mounted eye-tracking system which is affordable, usable in real-world settings, and functionally equivalent to (that is, as accurate as) a commercial system. We discuss the hardware design, the video-based gaze-estimation software design, and the calibration and operation protocols we have developed to create our system. We also describe empirical validation of acceptable accuracy in point-of-regard estimation. Our system costs approximately $153 in easily obtainable parts and can be assembled and used with minimal technical training. Our selection and modification of components and our software development are all tuned to the specific formats and shortcomings of our particular equipment. Including our freely available software, our system manifests a highly accessible, highly accurate eye-tracking device. While ours is less robust than a commercial system, we have begun using it successfully in studies of visual attention in community settings, including classrooms and museums, with adults both with typical development and with autism spectrum disorders.
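To illustrate the flash-based synchronization described above, the sketch below (Python with OpenCV) finds the timestamp of a shared flash event in each recording and derives the offset between the eye-camera and scene-camera clocks. It is a minimal illustration of the general idea, not the authors' released software; the file names and the brightness-jump threshold are assumptions. Using per-frame timestamps rather than frame indices is what makes this tolerant of the variable frame rates mentioned above.

# Sketch: align eye and scene videos via a synchronized flash.
# Not the authors' released code; thresholds and file names are illustrative.
import cv2
import numpy as np

def frame_brightness(path):
    """Return per-frame mean brightness and timestamps (ms) for a video file."""
    cap = cv2.VideoCapture(path)
    brightness, times = [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        times.append(cap.get(cv2.CAP_PROP_POS_MSEC))
        brightness.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).mean())
    cap.release()
    return np.array(brightness), np.array(times)

def flash_time(path, z_thresh=4.0):
    """Timestamp (ms) of the first frame whose brightness jump exceeds z_thresh SDs."""
    b, t = frame_brightness(path)
    jumps = np.diff(b)
    z = (jumps - jumps.mean()) / jumps.std()
    idx = int(np.argmax(z > z_thresh)) + 1  # diff at i marks the jump into frame i+1
    return t[idx]

# Offset to add to eye-video timestamps to express them on the scene-video clock.
offset_ms = flash_time("scene.mp4") - flash_time("eye.mp4")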
Device design
Cameras and storage media
We use two cameras in our design: one (the eye camera) pointed at the wearer's eye to detect the pupil, and the other (the scene camera) capturing the wearer's point of view. We selected Veho's Muvi Atom camera ($60-$90, 30 g or 1.05 oz), which records video at a variable frame rate of around 30 frames per second (fps), in VGA (640 x 480 pixel) resolution, in RGB 8-bit color, with automatic (and no manual) exposure adjustment, at an internally adjustable focal length. To compensate for observed shortcomings of the cameras, we developed extensive optical, electronic, mechanical, and software solutions. Although the parameters of our solutions
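For context on what the eye-camera video feeds into, the following sketch shows one common way to localize a dark pupil in a single grayscale eye-camera frame by thresholding and ellipse fitting. It is a generic illustration under assumed parameters, not the pupil-detection method of the system described above, and it presumes the pupil appears as the largest dark blob in the image.

# Sketch: generic dark-pupil localization in one grayscale eye-camera frame.
# Threshold and area values are assumptions, not taken from the paper.
import cv2

def pupil_center(gray, thresh=40, min_area=100):
    """Estimate the pupil center (x, y) in a grayscale eye image, or return None."""
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    # Dark pixels (below thresh) become foreground in the mask.
    _, mask = cv2.threshold(blurred, thresh, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4.x return signature: (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    candidates = [c for c in contours if cv2.contourArea(c) > min_area]
    if not candidates:
        return None
    # Fit an ellipse to the largest dark region; its center approximates the pupil.
    largest = max(candidates, key=cv2.contourArea)
    (cx, cy), _, _ = cv2.fitEllipse(largest)
    return cx, cy

Real pipelines typically add corneal-reflection handling, blink rejection, and per-wearer calibration on top of a step like this; the smooth-pursuit-based calibration mentioned in the abstract above addresses the mapping from pupil position to point of regard.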