2,798 research outputs found

    Fast and Robust Detection of Fallen People from a Mobile Robot

    Full text link
    This paper deals with the problem of detecting fallen people lying on the floor by means of a mobile robot equipped with a 3D depth sensor. In the proposed algorithm, inspired by semantic segmentation techniques, the 3D scene is over-segmented into small patches. Fallen people are then detected by means of two SVM classifiers: the first labels each patch, while the second captures the spatial relations between them. This novel approach proved to be robust and fast. Indeed, thanks to the use of small patches, fallen people in real cluttered scenes with objects side by side are correctly detected. Moreover, the algorithm can be executed on a mobile robot fitted with a standard laptop, making it possible to exploit the 2D environmental map built by the robot and the multiple points of view obtained during robot navigation. Additionally, the algorithm is robust to illumination changes since it relies on depth data rather than RGB data. All the methods have been thoroughly validated on the IASLAB-RGBD Fallen Person Dataset, which is published online as a further contribution. It consists of several static and dynamic sequences with 15 different people and 2 different environments.

    Non-overlapping dual camera fall detection using the NAO humanoid robot

    Get PDF
    With an aging population and a greater desire for independence, the dangers of falls among the elderly have become particularly pronounced. In light of this, several technologies have been developed with the aim of preventing or monitoring falls. A failure to strike a balance between several factors, including reliability, complexity and invasion of privacy, has proven prohibitive to the uptake of these systems. Some systems rely on cameras being mounted in all rooms of a user's home, while others require a device to be worn 24 hours a day. This paper explores a system that uses a humanoid NAO robot with dual, non-overlapping, vertically mounted cameras to perform the task of fall detection.

    Assistive technology design and development for acceptable robotics companions for ageing years

    Get PDF
    © 2013 Farshid Amirabdollahian et al., licensee Versita Sp. z o. o. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs license, which means that the text may be used for non-commercial purposes, provided credit is given to the author.

    A new stream of research and development responds to changes in life expectancy across the world. It includes technologies which enhance the well-being of individuals, specifically older people. The ACCOMPANY project focuses on home companion technologies and the issues surrounding technology development for assistive purposes. The project responds to some overlooked aspects of technology design, divided into multiple areas such as empathic and social human-robot interaction, robot learning and memory visualisation, and monitoring persons' activities at home. To bring these aspects together, a dedicated task is identified to ensure technological integration of these multiple approaches on an existing robotic platform, Care-O-Bot®3, in the context of a smart-home environment utilising a multitude of sensor arrays. Formative and summative evaluation cycles are then used to assess the emerging prototype, identifying acceptable behaviours and roles for the robot (for example, as a butler or a trainer) while also comparing user requirements to achieved progress. In a novel approach, the project considers ethical concerns: by highlighting principles such as autonomy, independence, enablement, safety and privacy, it provides a discussion medium where user views on these principles, and the tensions between some of them (for example, between privacy or autonomy and safety), can be captured and considered in design cycles throughout project development. Peer reviewed.

    Proceedings of the 23rd Bilateral Student Workshop CTU Prague: Dresden (Germany) 2019

    Get PDF
    This technical report publishes the proceedings of the 23rd Bilateral Student Workshop, which was held from 29th to 30th November 2019. The workshop offers young scientists a possibility to present their current research work in the fields of computer graphics, human-computer interaction, robotics and usability. It is meant as a platform to bring together researchers from both the Czech Technical University in Prague (CTU) and the University of Applied Sciences Dresden (HTW). The German Academic Exchange Service offers its financial support to enable the bilateral exchange of student participants between Prague and Dresden. Contents:
    1) Incremental Pose Estimation of multiple LiDAR Scanners using their Pointclouds, S.3
    2) Soft- and Hardware Developments for Immersive Learning, S.6
    3) Qualitative comparison of methods for example-based style transfer, S.13
    4) External Labeling With Utilization of Containment Information, S.16
    5) Real Time Viewing Direction Analysis to Store Recognized Faces, S.20
    6) Raising living standards of older adults - User research, S.29
    7) Raising living standards of older adults - Concept, S.33
    8) Towards the RoNiSCo Mobile Application, S.36
    9) Development of a Fallen People Detector, S.41
    10) Interactive tactile map for visually impaired older adults, S.47
    11) Physical 3D LED display, S.5

    A Fallen Person Detector with a Privacy-Preserving Edge-AI Camera

    Get PDF
    As the population ages, Ambient-Assisted Living (AAL) environments are increasingly used to support older individuals' safety and autonomy. In this study, we propose a low-cost, privacy-preserving sensor system integrated with mobile robots to enhance fall detection in AAL environments. We utilized the Luxonis OAK-D Edge-AI camera mounted on a mobile robot to detect fallen individuals. The system was trained using the YOLOv6 network on the E-FPDS dataset and optimized with a knowledge distillation approach onto the more compact YOLOv5 network, which was deployed on the camera. We evaluated the system's performance on a custom dataset captured with the robot-mounted camera, achieving a precision of 96.52%, a recall of 95.10%, and a recognition rate of 15 frames per second. The proposed system enhances the safety and autonomy of older individuals by enabling rapid detection of and response to falls.

    This work has been part-supported by the visuAAL project on Privacy-Aware and Acceptable Video-Based Technologies and Services for Active and Assisted Living (https://www.visuaal-itn.eu/), funded by the EU H2020 Marie Skłodowska-Curie grant agreement No. 861091. The project has also been part-supported by the SFI Future Innovator Award SFI/21/FIP/DO/9955 project Smart Hangar.

    THINK Robots

    Get PDF
    Retailers rely on Kiva Systems' warehouse robots to deliver order-fulfillment services, but current systems are frequently interrupted and require physical barriers to ensure compliance with safety regulations, since Kiva does not currently rely on its obstacle detection system to contribute to the functional safety of the overall system. After evaluating operating scenarios and detection technologies, a solution comprising a stereo vision system to detect static objects and a radio ranging system to identify humans in the vicinity was designed, built, and verified, with the aim of reducing undue downtime and allowing humans and robots to interact safely without physical restrictions.