8 research outputs found

    Spatiotemporal enabled Content-based Image Retrieval


    Recognition of Everyday Activities through Wearable Sensors and Machine Learning

    Over the past several years, the use of wearable devices has increased dramatically, primarily for fitness monitoring, largely due to their greater sensor reliability, increased functionality, smaller size, increased ease of use, and greater affordability. These devices have helped many people of all ages live healthier lives and achieve their personal fitness goals, as they are able to see quantifiable and graphical results of their efforts every step of the way (i.e., in real time). Yet, while these device systems work well within the fitness domain, they have yet to achieve a convincing level of functionality in the larger domain of healthcare. As an example, according to the Alzheimer’s Association, there are currently approximately 5.5 million Americans with Alzheimer’s Disease, and approximately 5.3 million of them are over the age of 65, comprising 10% of this age group in the U.S. The economic toll of this disease is estimated to be around $259 billion. By 2050, the number of Americans with Alzheimer’s disease is predicted to reach around 16 million, with an economic toll of over $1 trillion. There are other prevalent and chronic health conditions that are critically important to monitor, such as diabetes, complications from obesity, congestive heart failure, and chronic obstructive pulmonary disease (COPD), among others. The goal of this research is to explore and develop accurate and quantifiable sensing and machine learning techniques for eventual real-time health monitoring by wearable device systems. To that end, a two-tier recognition system is presented that is designed to identify health activities in a naturalistic setting based on accelerometer data of common activities. In Tier I, a traditional activity recognition approach is employed to classify short windows of data, while in Tier II these classified windows are grouped to identify instances of a specific activity. Everyday activities explored in this research include brushing one’s teeth, combing one’s hair, scratching one’s chin, washing one’s hands, taking medication, and drinking. Results show that an F-measure of 0.83 is achievable when distinguishing these activities from each other, and an F-measure of 0.82 is achievable when identifying instances of brushing teeth over the course of a day.
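    The two-tier design described above lends itself to a simple pipeline: a window-level classifier (Tier I) followed by a grouping step over the predicted labels (Tier II). The sketch below is only an illustration of that idea, assuming synthetic accelerometer windows, a hypothetical window_features helper, and a scikit-learn random forest; it is not the authors' implementation.

```python
# Hypothetical sketch of a two-tier activity recognizer (illustrative only).
# Tier I: classify fixed-length accelerometer windows.
# Tier II: group consecutive windows with the same label into activity instances.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(window):
    """Simple per-axis statistics for one accelerometer window (n_samples x 3)."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           np.abs(np.diff(window, axis=0)).mean(axis=0)])

def tier2_instances(labels, min_run=3):
    """Collapse a sequence of per-window labels into (label, start, end) runs,
    keeping only runs of at least `min_run` windows as activity instances."""
    instances, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            if i - start >= min_run:
                instances.append((labels[start], start, i))
            start = i
    return instances

# Toy data: 200 windows of 3-axis accelerometer samples with made-up labels.
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 50, 3))
y = rng.integers(0, 3, size=200)          # e.g. 0=brush teeth, 1=wash hands, 2=drink

X = np.array([window_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)  # Tier I
pred = clf.predict(X)
print(tier2_instances(pred))              # Tier II: contiguous runs become instances
```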

    The Future of Ocean Governance and Capacity Development

    The International Ocean Institute – Canada has compiled more than 80 insightful essays on the future of ocean governance and capacity development, based largely on the themes of its Training Program at Dalhousie University in Canada, to honor the work of Elisabeth Mann Borgese (1918-2002). The essays cover a broad range of ocean governance and capacity development issues and explore future benefits and challenges. This essential collection is aimed at professionals, students, and citizens alike.

    Eye Tracking Methods for Analysis of Visuo-Cognitive Behavior in Medical Imaging

    Predictive modeling of human visual search behavior and the underlying metacognitive processes is now possible thanks to significant advances in bio-sensing device technology and machine intelligence. Eye tracking bio-sensors, for example, can measure psycho-physiological response through change events in the configuration of the human eye. These events include positional changes such as visual fixations, saccadic movements, and scanpaths, and non-positional changes such as blinks and pupil dilation and constriction. Using data from eye-tracking sensors, we can model human perception, cognitive processes, and responses to external stimuli. In this study, we investigated the visuo-cognitive behavior of clinicians during the diagnostic decision process for breast cancer screening under clinically equivalent experimental conditions involving multiple monitors and breast projection views. Using a head-mounted eye tracking device and a customized user interface, we recorded eye change events and diagnostic decisions from 10 clinicians (three breast-imaging radiologists and seven Radiology residents) for a corpus of 100 screening mammograms (comprising cases of varied pathology and breast parenchyma density). We proposed novel features and gaze analysis techniques, which help to encode discriminative pattern changes in positional and non-positional measures of eye events. These changes were shown to correlate with individual image readers' identity and experience level, mammographic case pathology and breast parenchyma density, and diagnostic decision. Furthermore, our results suggest that a combination of machine intelligence and bio-sensing modalities can provide adequate predictive capability for the characterization of a mammographic case and image readers' diagnostic performance. Lastly, features characterizing eye movements can be utilized for biometric identification purposes. These findings are impactful for real-time performance monitoring and personalized intelligent training and evaluation systems in screening mammography. Further, the developed algorithms are applicable in other domains involving high-risk visual tasks.
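    As an illustration of how the positional and non-positional gaze measures above could be encoded as classifier features, the sketch below computes a small feature vector from raw gaze samples. The sampling rate, speed threshold, and signal names (e.g. pupil, valid) are assumptions for demonstration, not the study's actual pipeline.

```python
# Illustrative sketch only: turning raw gaze samples into the kinds of positional
# (fixation, saccade, scanpath) and non-positional (blink, pupil) measures
# discussed above. Thresholds and signal names are assumptions for demonstration.
import numpy as np

def gaze_features(x, y, pupil, valid, hz=60, saccade_speed=100.0):
    """x, y: gaze coordinates (deg); pupil: pupil diameter; valid: sample validity flags.
    A speed threshold (deg/s) crudely separates fixation-like from saccade-like samples."""
    step = np.hypot(np.diff(x), np.diff(y))   # point-to-point gaze displacement
    speed = step * hz                          # angular speed per sample (deg/s)
    saccade = speed > saccade_speed
    return np.array([
        (~saccade).mean(),     # proportion of fixation-like samples
        saccade.mean(),        # proportion of saccade-like samples
        step.sum(),            # total scanpath length
        1.0 - valid.mean(),    # fraction of lost samples (blink proxy)
        np.nanmean(pupil),     # mean pupil diameter
        np.nanstd(pupil),      # pupil variability (dilation/constriction)
    ])

# Toy trial: 10 seconds of synthetic gaze data sampled at 60 Hz.
rng = np.random.default_rng(1)
n = 600
x, y = np.cumsum(rng.normal(0, 0.2, n)), np.cumsum(rng.normal(0, 0.2, n))
pupil = 3.5 + rng.normal(0, 0.1, n)
valid = rng.random(n) > 0.05
print(gaze_features(x, y, pupil, valid))
```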

    Challenges and Opportunities in Applied System Innovation

    This book introduces and provides solutions to a variety of problems faced by society, companies, and individuals in a quickly changing and technology-dependent world. The wide acceptance of artificial intelligence, the upcoming fourth industrial revolution, and newly designed 6G technologies are seen as the main enablers and game changers in this environment. The book considers these issues not only from a technological viewpoint but also in terms of how society, labor, and the economy are affected, leading to a circular economy that affects the way people design, operate, and deploy complex systems.

    Envisioning a future for a spatial-health CyberGIS marketplace

    No full text