8 research outputs found

    Cognitive computing and wireless communications on the edge for healthcare service robots

    In recent years, we have witnessed dramatic developments in mobile healthcare robots, which enjoy many advantages over their human counterparts. Previous communication networks for healthcare robots have suffered from high response latency and/or time-consuming computing demands. Robust, high-speed communications and swift processing are critical, and sometimes vital, for healthcare receivers, particularly in the case of healthcare robots. As a promising solution, offloading delay-sensitive and communication-intensive tasks to the robot is expected to improve the services and benefit users. In this paper, we review several state-of-the-art technologies of a mobile healthcare robot, such as the human–robot interface, environment and user status perception, navigation, robust communication, and artificial intelligence, and discuss in detail the customized demands for offloading computation and communication tasks. According to the intrinsic demands that tasks place on network usage, we categorize the abilities of a typical healthcare robot into two classes: edge functionalities and core functionalities. Many latency-sensitive tasks, such as user interaction, or time-consuming tasks, including health receiver status recognition and autonomous movement, can be processed by the robot without frequent communication with data centers. On the other hand, several fundamental abilities, such as radio resource management, mobility management, and service provisioning management, need to be kept up to date at the core with cutting-edge artificial intelligence. Robustness and safety, in this case, are the primary goals in wireless communications, for which AI may provide ground-breaking solutions. Based on this partition, this article surveys several state-of-the-art technologies of a mobile healthcare robot and reviews the challenges to be met in its wireless communications.
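    As a rough illustration of the edge/core partition sketched in this abstract, the following Python snippet classifies hypothetical robot tasks by whether offloading them over the network could still meet their latency budget. Task names, latency budgets, payload sizes, and link parameters are illustrative assumptions, not figures from the paper.

```python
from dataclasses import dataclass

@dataclass
class RobotTask:
    name: str
    latency_budget_ms: float   # maximum tolerable response time
    payload_mb: float          # data that would have to be uploaded

# Hypothetical figures, for illustration only.
TASKS = [
    RobotTask("user_interaction",          latency_budget_ms=50,   payload_mb=0.1),
    RobotTask("status_recognition",        latency_budget_ms=200,  payload_mb=5.0),
    RobotTask("autonomous_navigation",     latency_budget_ms=100,  payload_mb=2.0),
    RobotTask("radio_resource_management", latency_budget_ms=1000, payload_mb=0.5),
    RobotTask("service_provisioning",      latency_budget_ms=5000, payload_mb=0.2),
]

def assign(task: RobotTask,
           uplink_mbps: float = 20.0,
           core_rtt_ms: float = 80.0) -> str:
    """Return 'edge' (run on the robot) or 'core' (offload to the data centre)."""
    # Estimated time to ship the payload plus the round trip to the core network.
    offload_ms = core_rtt_ms + task.payload_mb * 8 / uplink_mbps * 1000
    return "edge" if offload_ms > task.latency_budget_ms else "core"

if __name__ == "__main__":
    for t in TASKS:
        print(f"{t.name:30s} -> {assign(t)}")
```

    Under these assumed link parameters, the latency-sensitive interaction, recognition, and navigation tasks stay on the robot, while the management functions tolerate a trip to the core, mirroring the partition the abstract describes.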

    The Internet of Things in Healthcare. An Overview

    The provision of healthcare is undergoing enormous changes worldwide. Population ageing, the rising incidence of chronic diseases, and shortages of resources are placing a heavy burden on current healthcare systems and have the potential to put the delivery of healthcare at risk in the next few decades. On the other hand, the growing popularity of smart devices for healthcare and wellness, along with advances in wireless communications and sensors, is opening the door to novel models of healthcare delivery supported by the Internet of Things (IoT). This paper presents a review of the trends that are driving the development of IoT-based applications for healthcare and briefly describes them at a system level.

    The Effectiveness of Medical Record Software to Improve Administrative Service Ability and Student Motivation

    Technology can make the learning process faster and more effective and improve students' cognitive skills. Technology-based learning media are not only for prospective teachers but can also be helpful for prospective health workers. One indicator of the quality of prospective health workers is their ability to deliver health services. Therefore, it is necessary to prepare prospective health worker students who are technologically literate and highly motivated. This study aims to determine and analyze the effectiveness of Medical Information System software in improving health administration service ability and student motivation. The study used a quantitative method with a pre-experimental, one-group design consisting of a pre-test, an intervention (treatment), and a post-test. The research was conducted at STIKes Cirebon with a total sample of 110 students. The results showed that, descriptively, after implementing the medical record software there was an increase in the percentages for responsiveness, reliability, assurance, empathy, and administrative service quality. In addition, each indicator of student motivation increased. Based on hypothesis testing, it can be concluded that the Medical Information System medical record software significantly improves health administration service ability and student motivation. This research contributes information to students and lecturers about health administration learning technology.
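    The pre-test/post-test comparison with hypothesis testing described in this abstract can be illustrated with a paired t-test. The sketch below uses made-up score arrays, since the paper's raw data are not reproduced here; it only shows the shape of the analysis.

```python
# Minimal sketch of a one-group pre-test/post-test comparison.
# The score arrays are hypothetical, not the study's data.
import numpy as np
from scipy import stats

pre  = np.array([62, 70, 65, 58, 74, 69, 61, 66])   # hypothetical pre-test scores
post = np.array([75, 82, 71, 70, 85, 78, 73, 77])   # hypothetical post-test scores

t_stat, p_value = stats.ttest_rel(post, pre)          # paired (dependent) t-test
print(f"mean gain = {np.mean(post - pre):.1f} points")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A p-value below 0.05 would support the claim that the software
# significantly improved service-ability scores.
```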

    Towards a Human-Centric Digital Twin for Human–Machine Collaboration: A Review on Enabling Technologies and Methods

    With the intent to further increase production efficiency while making humans the centre of the processes, human-centric manufacturing focuses on concepts such as digital twins and human–machine collaboration. This paper presents enabling technologies and methods that facilitate the creation of human-centric applications powered by digital twins, also from the perspective of Industry 5.0. It analyses and reviews the state of relevant information resources about digital twins for human–machine applications, with an emphasis on the human perspective but also on the collaborative relationship and the possibilities of its applications. Finally, it presents the results of the review and the expected directions of future research in this area.

    A survey on deep transfer learning and edge computing for mitigating the COVID-19 pandemic

    This is an accepted manuscript of an article published by Elsevier in Journal of Systems Architecture on 30/06/2020, available online: https://doi.org/10.1016/j.sysarc.2020.101830. The accepted version of the publication may differ from the final published version. Global health sometimes faces pandemics, as it currently faces the COVID-19 disease. The spreading and infection factors of this disease are very high: a huge number of people in most countries were infected within six months of its first reported appearance, and it keeps spreading. The required systems are, to some extent, not ready for any pandemic; therefore, mitigation with existing capacity becomes necessary. On the other hand, the modern era largely depends on Artificial Intelligence (AI), including Data Science, and Deep Learning (DL) is one of the current flag-bearers of these techniques. It could be used to mitigate COVID-19-like pandemics in terms of stopping the spread, diagnosing the disease, discovering drugs and vaccines, treatment, patient care, and much more. But DL requires large datasets as well as powerful computing resources, and a shortage of reliable datasets for an ongoing pandemic is a common phenomenon. Deep Transfer Learning (DTL) would therefore be effective, as it learns from one task and can work on another. In addition, Edge Devices (ED) such as IoT devices, webcams, drones, intelligent medical equipment, and robots are very useful in a pandemic situation. This kind of equipment makes infrastructures sophisticated and automated, which helps to cope with an outbreak. But such devices are equipped with low computing resources, so applying DL on them is also somewhat challenging; therefore, DTL would be effective there as well. This article studies the potential and challenges of these issues, describes the relevant technical background, and reviews the related recent state of the art. It also draws a pipeline of DTL over edge computing as a future direction to assist the mitigation of any pandemic.
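    A minimal sketch of the deep transfer learning idea the survey builds on: reuse an ImageNet-pretrained backbone, freeze it, and retrain only a small classification head on a limited medical image set, which keeps the compute footprint modest enough for edge-class hardware. The dataset path, class count, and hyperparameters below are hypothetical placeholders, not the survey's pipeline.

```python
# Sketch of deep transfer learning with a frozen pretrained backbone.
# "data/cxr" is a placeholder ImageFolder layout: data/cxr/<class>/<image>.png
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms
from torchvision.models import resnet18, ResNet18_Weights

device = "cuda" if torch.cuda.is_available() else "cpu"

# Frozen pretrained backbone; only the new head is trained.
model = resnet18(weights=ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)   # e.g. COVID / non-COVID (assumed classes)
model = model.to(device)

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("data/cxr", transform=tfm)
loader = DataLoader(train_set, batch_size=16, shuffle=True)

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):                          # short fine-tuning run
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
```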

    Dynamic Scheduling Algorithm in Cyber Mimic Defense Architecture of Volunteer Computing

    Volunteer computing uses computers volunteered by the general public to perform distributed scientific computing. It is being used in high-energy physics, molecular biology, medicine, astrophysics, climate study, and other areas, and these projects have attained unprecedented computing power. However, with the development of information technology, traditional defense systems cannot deal with the unknown security problems of volunteer computing. Cyber Mimic Defense (CMD), by contrast, can defend against unknown attack behavior through its three characteristics: dynamic, heterogeneous, and redundant. As an important part of CMD, the dynamic scheduling algorithm realizes the dynamic change of the centralized service executor, which can ensure the security and reliability of CMD for volunteer computing. Aiming at the problems of passive scheduling and large scheduling granularity in existing scheduling algorithms, this article first proposes a scheduling algorithm based on a time threshold and a task threshold, realizing the dynamic randomness of mimic defense along two different dimensions; finally, combining the time threshold and a random threshold, a dynamic scheduling algorithm based on a multi-level queue is proposed. Experiments show that the multi-level-queue dynamic scheduling algorithm takes both security and reliability into account, has better dynamic heterogeneous redundancy characteristics, and can effectively prevent attackers from mastering the transformation rule of heterogeneous executors.
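    A simplified sketch in the spirit of the threshold-based scheduling described above, showing how a time threshold and a task threshold can jointly trigger random replacement of the active heterogeneous executor. This is not the authors' exact multi-level-queue algorithm; the class and executor names are invented for illustration.

```python
# Threshold-driven rotation of heterogeneous executors (illustrative only).
import random
import time

class MimicScheduler:
    def __init__(self, executors, time_threshold_s=30.0, task_threshold=100):
        self.pool = list(executors)            # heterogeneous executor identifiers
        self.time_threshold_s = time_threshold_s
        self.task_threshold = task_threshold
        self._activate(random.choice(self.pool))

    def _activate(self, executor):
        self.active = executor
        self.started = time.monotonic()
        self.tasks_served = 0

    def submit(self, task):
        # Rotate the executor when either threshold is exceeded, so an attacker
        # cannot learn a fixed schedule from observed behaviour.
        expired = (time.monotonic() - self.started) >= self.time_threshold_s
        saturated = self.tasks_served >= self.task_threshold
        if expired or saturated:
            candidates = [e for e in self.pool if e != self.active]
            self._activate(random.choice(candidates))
        self.tasks_served += 1
        return f"{self.active} handled {task!r}"

sched = MimicScheduler(["x86-linux", "arm-bsd", "riscv-linux"],
                       time_threshold_s=5.0, task_threshold=3)
for i in range(10):
    print(sched.submit(f"job-{i}"))
```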

    A Systematic Literature Review on Distributed Machine Learning in Edge Computing

    Distributed edge intelligence is a disruptive research area that enables the execution of machine learning and deep learning (ML/DL) algorithms close to where data are generated. Since edge devices are more limited and heterogeneous than typical cloud devices, many hindrances have to be overcome to fully extract the potential benefits of such an approach (such as data-in-motion analytics). In this paper, we investigate the challenges of running ML/DL on edge devices in a distributed way, paying special attention to how techniques are adapted or designed to execute on these restricted devices. The techniques under discussion pervade the processes of caching, training, inference, and offloading on edge devices. We also explore the benefits and drawbacks of these strategies
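    One distributed-training strategy such surveys typically cover is federated averaging, where edge devices train locally on their private data and only model parameters, not raw data, are aggregated. The toy sketch below simulates three devices with synthetic data and a plain logistic-regression model; it illustrates the general technique rather than any specific method from the paper.

```python
# Toy federated averaging with synthetic data and logistic regression.
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """A few epochs of logistic-regression gradient descent on one device's private data."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Three simulated edge devices, each with its own small dataset.
true_w = np.array([1.5, -2.0, 0.5])
devices = []
for _ in range(3):
    X = rng.normal(size=(50, 3))
    y = (X @ true_w + rng.normal(scale=0.1, size=50) > 0).astype(float)
    devices.append((X, y))

global_w = np.zeros(3)
for rnd in range(10):                                    # communication rounds
    local_ws = [local_sgd(global_w.copy(), X, y) for X, y in devices]
    global_w = np.mean(local_ws, axis=0)                 # server-side averaging

print("learned weights:", np.round(global_w, 2))
```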