177 research outputs found

    Medical data processing and analysis for remote health and activities monitoring

    Recent developments in sensor technology, wearable computing, the Internet of Things (IoT), and wireless communication have given rise to research in ubiquitous healthcare and remote monitoring of human health and activities. Health monitoring systems involve processing and analysis of data retrieved from smartphones, smart watches, smart bracelets, as well as various sensors and wearable devices. Such systems enable continuous monitoring of patients' psychological and health conditions by sensing and transmitting measurements such as heart rate, electrocardiogram, body temperature, respiratory rate, chest sounds, or blood pressure. Pervasive healthcare, as a relevant application domain in this context, aims at revolutionizing the delivery of medical services through a medical assistive environment and facilitates the independent living of patients. In this chapter, we discuss (1) data collection, fusion, ownership and privacy issues; (2) models, technologies and solutions for medical data processing and analysis; (3) big medical data analytics for remote health monitoring; (4) research challenges and opportunities in medical data analytics; (5) examples of case studies and practical solutions.
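
    To make the data-processing step above concrete, the sketch below shows how a single wearable reading might be screened against alert thresholds before transmission; the reading fields and threshold values are illustrative assumptions, not taken from the chapter.

```python
# Minimal sketch of one stage of a remote-monitoring pipeline (illustrative only).
# The reading fields and alert thresholds below are assumptions, not values
# taken from the chapter.
from dataclasses import dataclass
from typing import List


@dataclass
class VitalReading:
    timestamp: float        # seconds since epoch
    heart_rate_bpm: float
    body_temp_c: float
    resp_rate_bpm: float


def check_reading(r: VitalReading) -> List[str]:
    """Return a list of alert strings for out-of-range vitals."""
    alerts = []
    if not 40 <= r.heart_rate_bpm <= 130:      # hypothetical safe band
        alerts.append(f"heart rate {r.heart_rate_bpm:.0f} bpm out of range")
    if r.body_temp_c >= 38.0:                  # fever threshold (assumed)
        alerts.append(f"body temperature {r.body_temp_c:.1f} C elevated")
    if not 10 <= r.resp_rate_bpm <= 25:        # hypothetical band
        alerts.append(f"respiratory rate {r.resp_rate_bpm:.0f} out of range")
    return alerts


if __name__ == "__main__":
    sample = VitalReading(timestamp=0.0, heart_rate_bpm=142,
                          body_temp_c=38.4, resp_rate_bpm=18)
    for alert in check_reading(sample):
        print("ALERT:", alert)
```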

    InContexto: Multisensor Architecture to Obtain People Context from Smartphones

    The way users interact with smartphones is changing thanks to the improvements made in their embedded sensors. Increasingly, these devices are being employed as tools to observe individuals' habits. Smartphones provide a rich set of embedded sensors, such as an accelerometer, digital compass, gyroscope, GPS, microphone, and camera. This paper describes a distributed architecture, called inContexto, to recognize user context information using mobile phones. Moreover, it aims to infer physical actions performed by users, such as walking, running, and standing still. Sensory data is collected by an application for the HTC Magic smartphone running Android OS, which was tested and achieved about 97% accuracy when classifying five different actions (including still, walking, and running). This work was supported in part by Projects CICYT TIN2011-28620-C02-01, CICYT TEC2011-28626-C02-02, CAM CONTEXTS (S2009/TIC-1485), and DPS2008-07029-C02-02.
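
    As a rough illustration of the kind of inference inContexto performs, the sketch below classifies one window of accelerometer samples by the spread of acceleration magnitudes; the features and thresholds are invented for demonstration and do not reproduce the paper's classifier or its reported accuracy.

```python
# Illustrative sketch of windowed accelerometer features for activity
# recognition; the thresholds are assumptions made for demonstration.
import math
from typing import List, Tuple


def magnitude(sample: Tuple[float, float, float]) -> float:
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)


def classify_window(samples: List[Tuple[float, float, float]]) -> str:
    """Very rough still/walking/running decision from one window of samples."""
    mags = [magnitude(s) for s in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    std = math.sqrt(var)
    if std < 0.5:        # almost no variation around gravity -> still
        return "still"
    if std < 3.0:        # moderate periodic variation -> walking
        return "walking"
    return "running"     # large variation -> running


if __name__ == "__main__":
    still_window = [(0.0, 0.0, 9.8)] * 50
    print(classify_window(still_window))   # -> "still"
```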

    Sensor Technologies for Intelligent Transportation Systems

    Modern society faces serious problems with transportation systems, including but not limited to traffic congestion, safety, and pollution. Information and communication technologies have gained increasing attention and importance in modern transportation systems. Automotive manufacturers are developing in-vehicle sensors and their applications in different areas, including safety, traffic management, and infotainment. Government institutions are implementing roadside infrastructure such as cameras and sensors to collect data about environmental and traffic conditions. By seamlessly integrating vehicles and sensing devices, their sensing and communication capabilities can be leveraged to achieve smart and intelligent transportation systems. We discuss how sensor technology can be integrated with the transportation infrastructure to achieve a sustainable Intelligent Transportation System (ITS) and how safety, traffic control, and infotainment applications can benefit from multiple sensors deployed in different elements of an ITS. Finally, we discuss some of the challenges that need to be addressed to enable a fully operational and cooperative ITS environment.
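
    A minimal sketch of how roadside sensor reports could feed a traffic-state estimate is given below; the sensor names, free-flow speed, and thresholds are assumptions for illustration only, not part of the discussion above.

```python
# Sketch of aggregating roadside sensor speed reports into a simple
# congestion indicator; sensor identifiers and thresholds are hypothetical.
from statistics import mean
from typing import Dict, List


def congestion_level(speed_reports_kmh: Dict[str, List[float]],
                     free_flow_kmh: float = 90.0) -> str:
    """Classify a road segment from per-sensor spot-speed reports."""
    all_speeds = [v for speeds in speed_reports_kmh.values() for v in speeds]
    if not all_speeds:
        return "unknown"
    ratio = mean(all_speeds) / free_flow_kmh
    if ratio > 0.75:
        return "free flow"
    if ratio > 0.4:
        return "slow"
    return "congested"


if __name__ == "__main__":
    reports = {
        "loop_detector_12": [42.0, 38.5, 40.1],   # hypothetical sensor IDs
        "camera_07": [44.2, 39.8],
    }
    print(congestion_level(reports))   # -> "slow"
```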

    Learning Sensory Representations with Minimal Supervision


    Driver Assistance Technologies

    Driver assistance technology is emerging as a new driving technology, popularly known as ADAS. It encompasses adaptive cruise control, automatic emergency braking, blind spot monitoring, lane change assistance, forward collision warnings, and related functions. ADAS provides an important platform for integrating these applications: it uses data from multifunction sensors, cameras, radars, and lidars, and sends commands to multiple actuators such as the engine, brakes, and steering. ADAS technology can detect some objects, perform basic classification, alert the driver to hazardous road conditions, and in some cases slow or stop the vehicle. The architecture of the electronic control units (ECUs) responsible for executing ADAS functions in the vehicle is evolving in response to the demands placed on it during driving. Automotive system architectures integrate multiple applications into ADAS ECUs that serve multiple sensors. The hardware architecture of ADAS and autonomous driving includes automotive Ethernet, TSN, Ethernet switches and gateways, and domain controllers, while the software architecture includes AUTOSAR Classic and Adaptive, ROS 2.0, and QNX. This chapter explains the functioning of driver assistance technology with the help of its architecture and various types of sensors.
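
    The sketch below illustrates the sensor-to-actuator flow described above with a simplified forward-collision check based on time-to-collision (TTC); the thresholds and interface names are illustrative assumptions, not production ADAS logic or anything specified in the chapter.

```python
# Simplified forward-collision decision from radar range and closing speed.
# TTC thresholds below are assumed values for illustration only.
from enum import Enum


class Action(Enum):
    NONE = "none"
    WARN = "warn driver"
    BRAKE = "automatic emergency brake"


def forward_collision_action(distance_m: float,
                             closing_speed_mps: float) -> Action:
    """Decide an action from radar range and closing speed."""
    if closing_speed_mps <= 0:          # gap is constant or opening
        return Action.NONE
    ttc = distance_m / closing_speed_mps
    if ttc < 1.5:                       # assumed hard-braking threshold (s)
        return Action.BRAKE
    if ttc < 3.0:                       # assumed warning threshold (s)
        return Action.WARN
    return Action.NONE


if __name__ == "__main__":
    # 30 m gap closing at 12 m/s -> TTC = 2.5 s -> warn the driver
    print(forward_collision_action(distance_m=30.0, closing_speed_mps=12.0))
```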

    On driver behavior recognition for increased safety:A roadmap

    Advanced Driver-Assistance Systems (ADASs) are used for increasing safety in the automotive domain, yet current ADASs notably operate without taking into account drivers’ states, e.g., whether she/he is emotionally apt to drive. In this paper, we first review the state of the art of emotional and cognitive analysis for ADAS: we consider psychological models, the sensors needed for capturing physiological signals, and the typical algorithms used for human emotion classification. Our investigation highlights a lack of advanced Driver Monitoring Systems (DMSs) for ADASs, which could increase driving quality and security for both drivers and passengers. We then provide our view on a novel perception architecture for driver monitoring, built around the concept of the Driver Complex State (DCS). The DCS relies on multiple non-obtrusive sensors and Artificial Intelligence (AI) for uncovering the driver state and uses it to implement innovative Human–Machine Interface (HMI) functionalities. This concept will be implemented and validated in the recently EU-funded NextPerception project, which is briefly introduced.
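
    As a hypothetical sketch of the Driver Complex State idea, the snippet below combines two normalized, non-obtrusive indicators into a single driver-state score; the indicator names, weights, and threshold are assumptions and are not taken from the paper or the NextPerception project.

```python
# Hypothetical fusion of two normalized driver-state indicators (1 = impaired).
# Weights and the fitness threshold are assumed values for illustration.
def driver_state_score(eye_closure_ratio: float,
                       hr_stress_index: float,
                       w_eye: float = 0.6,
                       w_hr: float = 0.4) -> float:
    """Weighted combination of two indicators clipped to the [0, 1] range."""
    eye = min(max(eye_closure_ratio, 0.0), 1.0)
    hr = min(max(hr_stress_index, 0.0), 1.0)
    return w_eye * eye + w_hr * hr


def is_fit_to_drive(score: float, threshold: float = 0.7) -> bool:
    """Driver is considered fit while the fused score stays below threshold."""
    return score < threshold


if __name__ == "__main__":
    score = driver_state_score(eye_closure_ratio=0.8, hr_stress_index=0.5)
    print(f"score={score:.2f}, fit={is_fit_to_drive(score)}")
```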

    Sensor fusion in smart camera networks for ambient intelligence

    This short report introduces the topics of PhD research that was conducted from 2008 to 2013 and defended in July 2013. The PhD thesis covers sensor fusion theory, gathers it into a framework with design rules for fusion-friendly design of vision networks, and elaborates on the rules through fusion experiments performed with four distinct applications of Ambient Intelligence.

    Review of data fusion methods for real-time and multi-sensor traffic flow analysis

    Recent developments in intelligent transportation systems (ITS) require the input of various kinds of data in real time and from multiple sources, which imposes additional research and application challenges. Ongoing studies on Data Fusion (DF) have produced significant improvements in ITS and manifested an enormous impact on its growth. This paper reviews the implementation of DF methods in ITS to facilitate traffic flow analysis (TFA) and solutions that entail the prediction of various traffic variables such as driving behavior, travel time, speed, density, incidents, and traffic flow. It attempts to identify and discuss real-time and multi-sensor data sources that are used for various traffic domains, including road/highway management, traffic state estimation, and traffic controller optimization. Moreover, it attempts to associate the abstractions of data-level fusion, feature-level fusion, and decision-level fusion with DF methods to better understand the role of DF in TFA and ITS. Consequently, the main objective of this paper is to review DF methods used for real-time and multi-sensor (heterogeneous) TFA studies. The review outcomes are (i) a guideline for constructing DF methods, which involve preprocessing, filtering, decision, and evaluation as core steps, (ii) a description of recent DF algorithms or methods that adopt real-time and multi-sensor source data and the impact of these data sources on the improvement of TFA, (iii) an examination of testing and evaluation methodologies and the popular datasets, and (iv) an identification of several research gaps, some current challenges, and new research trends.
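
    As a small example of the kind of measurement-level fusion step such methods build on, the sketch below combines speed estimates from heterogeneous sensors by inverse-variance weighting; the sensor mix and variances are assumed for illustration, and the review itself covers far more elaborate DF methods.

```python
# Illustrative fusion of (value, variance) speed estimates from several
# sensors using inverse-variance weighting; all numbers are hypothetical.
from typing import List, Tuple


def fuse_estimates(estimates: List[Tuple[float, float]]) -> float:
    """Fuse (value, variance) pairs into one inverse-variance weighted value."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    return sum(w * v for (v, _), w in zip(estimates, weights)) / total


if __name__ == "__main__":
    # e.g. a loop detector, GPS probe vehicles, and a camera, each with an
    # assumed measurement variance
    readings = [(52.0, 4.0), (48.0, 9.0), (50.0, 16.0)]
    print(f"fused speed: {fuse_estimates(readings):.1f} km/h")
```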
