    Immunochromatographic diagnostic test analysis using Google Glass.

    We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostic tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of the Google Glass through its hands-free, voice-controlled interface and digitally transmitted to a server for processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results, accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing code, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health.
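    The abstract describes a server-side pipeline that decodes the QR identifier in each uploaded JPEG and converts the test band into a quantitative read-out. The following is a minimal, hypothetical Python sketch of that kind of step, assuming OpenCV and NumPy; the region-of-interest geometry and calibration constants are invented for illustration and are not the authors' actual processing code.

```python
# Hypothetical server-side step: identify a QR-tagged RDT in an uploaded
# JPEG and estimate analyte concentration from test-band intensity.
# The ROI offsets and calibration constants are illustrative assumptions.
import cv2
import numpy as np

def read_rdt(jpeg_path: str) -> dict:
    image = cv2.imread(jpeg_path)
    if image is None:
        raise ValueError(f"Could not read image: {jpeg_path}")

    # Decode the QR code that identifies the test strip.
    detector = cv2.QRCodeDetector()
    qr_text, points, _ = detector.detectAndDecode(image)
    if not qr_text:
        raise ValueError("No QR code decoded; cannot identify the RDT")

    # Assume the test band sits at a fixed offset from the QR code's
    # first corner (purely illustrative geometry).
    x, y = points[0][0].astype(int)
    roi = image[y + 50:y + 80, x:x + 200]
    if roi.size == 0:
        raise ValueError("Assumed band region falls outside the image")

    # Simple intensity read-out: a darker band implies more analyte.
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    band_signal = 255.0 - float(np.mean(gray))

    # Map signal to concentration with a hypothetical linear calibration.
    SLOPE_NG_PER_ML, INTERCEPT = 1.8, -12.0  # invented constants
    concentration = max(0.0, SLOPE_NG_PER_ML * band_signal + INTERCEPT)
    return {"test_id": qr_text, "concentration_ng_per_ml": concentration}
```

    In the platform the abstract describes, a result like this would be returned to the Glass user and logged with the image, QR code, and demographic data for spatiotemporal mapping; the sketch above only covers the per-image quantification step.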

    ENHANCING USERS’ EXPERIENCE WITH SMART MOBILE TECHNOLOGY

    The aim of this thesis is to investigate mobile guides for use with smartphones. Mobile guides have been used successfully to provide information, personalisation and navigation for the user. The researcher also wanted to ascertain how and in what ways mobile guides can enhance users' experience. This research involved designing and developing web-based applications to run on smartphones. Four studies were conducted, two of which involved user testing of the applications developed: a museum mobile guide application and a university mobile guide mapping application. Initial testing examined the prototype work for the ‘Chronology of His Majesty Sultan Haji Hassanal Bolkiah’ application. The results were used to assess the potential of using similar mobile guides in Brunei Darussalam’s museums. The second study involved testing of the ‘Kent LiveMap’ application for use at the University of Kent. Students at the university tested this mapping application, which uses crowdsourcing of information to provide live data. The results were promising and indicate that users' experience was enhanced when using the application. Overall, results from testing and using the two applications developed as part of this thesis show that mobile guides have the potential to be implemented in Brunei Darussalam’s museums and on campus at the University of Kent. However, modifications to both applications are required to fulfil their potential and take them beyond the prototype stage so that they are fully functioning and commercially viable.

    Horizon Report 2009

    The annual Horizon Report investigates, identifies and ranks the emerging technologies that the experts who compile it expect to have an impact on teaching, learning, research and creative production in higher education. It also examines the key trends that indicate how those technologies are likely to be used and the challenges they pose for the classroom. Each edition identifies six technologies or practices: two whose use is expected to emerge in the near term (one year or less), two expected to emerge in the mid term (two to three years), and two anticipated over a longer horizon (five years).

    The Imperial War Museum’s Social Interpretation project

    This report presents the output of research undertaken by the University of Salford and MTM London as part of the joint Digital R&D Fund for Arts and Culture, operated by Nesta, Arts Council England and the AHRC. The University of Salford and MTM London received funding from the programme to act as researchers on the Social Interpretation (SI) project, which was led by the Imperial War Museum (IWM) and their technical partners: the Centre for Digital Humanities at University College London, Knowledge Integration, and Gooii. The project was carried out between October 2011 and October 2012.

    Integrating the Kinect camera, gesture recognition and mobile devices for interactive discussion

    The Microsoft Kinect camera is a revolutionary and useful depth camera that offers a new user experience of interactive gaming on the Xbox platform through gesture or motion detection. Besides the infrared-based depth camera, an array of built-in microphones for voice commands is installed along the horizontal bar of the Kinect camera. As a result, there is increasing interest in applying the Kinect camera to various real-life applications, including the control of squirt guns for outdoor swimming pools. In addition to the Kinect camera, mobile devices such as smartphones, which readily integrate motion sensors, have been used for different real-time control tasks such as the remote control of robots. In this project, we propose to integrate the Microsoft Kinect camera together with smartphones as intelligent controls for interactive discussion or presentation in a future e-learning system. To demonstrate the feasibility of our proposal, a prototype of our proposed gesture recognition and command specification software is built using the C# language on the MS .NET platform and will be evaluated according to a careful plan. Furthermore, there are many interesting directions for further investigation of our proposal. © 2012 IEEE.
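    The abstract's prototype maps recognized gestures and smartphone commands to presentation actions (the authors' implementation is in C# on .NET). Below is a hedged, language-neutral sketch in Python of that general idea only; the gesture names, command vocabulary, host, and port are invented for illustration.

```python
# Illustrative sketch (not the authors' C#/.NET prototype): translate a
# recognized gesture into a presentation command and forward it to the
# lecture-room display host over UDP. All names and values are assumptions.
import json
import socket

GESTURE_TO_COMMAND = {
    "swipe_left": "next_slide",
    "swipe_right": "previous_slide",
    "raise_hand": "open_question_panel",
}

def send_command(gesture: str, host: str = "127.0.0.1", port: int = 9000) -> None:
    """Map a detected gesture to a command and send it to the display host."""
    command = GESTURE_TO_COMMAND.get(gesture)
    if command is None:
        return  # ignore gestures with no mapped presentation action
    payload = json.dumps({"source": "kinect", "command": command}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

# Example: a gesture event from the recognition pipeline advances the slides.
send_command("swipe_left")
```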