241 research outputs found

    Research on the Evolution of Health Information Behavior From a Chinese Perspective

    China has been undergoing tremendous development in the reform of its health system, with great effects on individual citizens and the nation as a whole. This paper focuses on individuals from the perspective of information behavior, aiming to review health information behavior in a systematic way. Statistical methods and software were employed to trace the overall progress of research on health information behavior; specifically, both vertical and horizontal comparisons were conducted, and a scientometric method was also used. Following a systematic and thorough literature review, the overall progress of the field is mapped, the main topics of importance are identified, and highly cited papers and the relationships among them are revealed.
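
    The paper itself does not include code, but as a rough, hypothetical sketch of the kind of scientometric analysis it describes (keyword relationships and highly cited papers), the Python fragment below builds a keyword co-occurrence count and a citation ranking over made-up bibliographic records; the record fields and sample entries are assumptions, not the paper's dataset.

        # Hypothetical scientometric sketch: keyword co-occurrence and
        # citation ranking over made-up bibliographic records.
        from itertools import combinations
        from collections import Counter

        records = [
            {"title": "Online health information seeking",
             "keywords": ["health information", "seeking", "internet"], "citations": 120},
            {"title": "eHealth literacy in older adults",
             "keywords": ["health information", "literacy", "elderly"], "citations": 85},
            {"title": "Social media and health behavior",
             "keywords": ["social media", "health information", "behavior"], "citations": 60},
        ]

        # Count how often each pair of keywords co-occurs within one record.
        cooccurrence = Counter()
        for rec in records:
            for a, b in combinations(sorted(set(rec["keywords"])), 2):
                cooccurrence[(a, b)] += 1

        # Rank papers by citation count to surface "highly cited" items.
        top_cited = sorted(records, key=lambda r: r["citations"], reverse=True)

        print("Most frequent keyword pairs:", cooccurrence.most_common(3))
        print("Most cited paper:", top_cited[0]["title"])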

    Maintaining privacy for a recommender system diagnosis using blockchain and deep learning.

    The healthcare sector has been revolutionized by Blockchain and AI technologies. Artificial intelligence combines algorithms, recommender systems, decision-making capabilities, and big data to present a patient's health records, while Blockchain allows healthcare professionals to access those records through a secured medical diagnostic process. Traditionally, data owners have been hesitant to share medical and personal information due to concerns about privacy and trustworthiness. Using Blockchain technology, this paper presents a model for integrating healthcare data sharing into a recommender diagnostic computer system, in which medical records are secured, controlled, authenticated, and kept confidential. The proposed framework uses the Ethereum Blockchain as an access-control mechanism, establishes hierarchical identities, and applies pre-processing and deep learning to chest X-rays to diagnose COVID-19. Along with addressing the challenges of centralized access control systems, this mechanism ensures data transparency and traceability, allowing efficient diagnosis and secure data sharing.
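
    The abstract does not reproduce the paper's network or its Ethereum access-control contracts, but a minimal, hypothetical sketch of the deep-learning side, a small convolutional classifier distinguishing COVID-19 from normal chest X-rays, might look like the PyTorch fragment below; the architecture, input size, and class count are assumptions for illustration only.

        # Hypothetical sketch of a small CNN for binary chest X-ray
        # classification (COVID-19 vs. normal); not the paper's model.
        import torch
        import torch.nn as nn

        class XRayClassifier(nn.Module):
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.classifier = nn.Linear(32 * 56 * 56, 2)  # 2 classes: COVID-19 vs. normal

            def forward(self, x):
                x = self.features(x)
                return self.classifier(x.flatten(1))

        model = XRayClassifier()
        dummy = torch.randn(1, 1, 224, 224)  # one pre-processed grayscale chest X-ray
        print(model(dummy).shape)            # torch.Size([1, 2]): class logits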

    A patient agent controlled customized blockchain based framework for internet of things

    Although Blockchain implementations have emerged as revolutionary technologies for various industrial applications including cryptocurrencies, they have not been widely deployed to store data streaming from sensors to remote servers in architectures known as the Internet of Things. New Blockchain-for-IoT models promise secure solutions for eHealth, smart cities, and other applications. These models pave the way for continuous monitoring of patients' physiological signs with wearable sensors, augmenting traditional medical practice without recourse to storing data with a trusted authority. However, existing Blockchain algorithms cannot accommodate the huge volumes, security, and privacy requirements of health data. The first contribution of this thesis is an end-to-end secure eHealth architecture that introduces an intelligent Patient Centric Agent. The Patient Centric Agent, executing on dedicated hardware, manages the storage of, and access to, streams of sensor-generated health data in a customized Blockchain and other, less secure repositories. As IoT devices cannot host Blockchain technology because of their limited memory, power, and computational resources, the Patient Centric Agent coordinates and communicates with a private customized Blockchain on behalf of the wearable devices. While the adoption of a Patient Centric Agent addresses continuous monitoring of patients' health along with storage, data privacy, and network security issues, the architecture is vulnerable to Denial of Service (DoS) and single-point-of-failure attacks. To address this issue, the second contribution is a decentralised eHealth system in which the Patient Centric Agent is replicated at three levels: the Sensing Layer, the NEAR Processing Layer, and the FAR Processing Layer, with its functionalities customized to the tasks of each level. Simulations confirm protection of the architecture against DoS attacks. Few patients require all their health data to be stored in Blockchain repositories; instead, an appropriate storage medium should be selected for each chunk of data by matching personal needs and preferences with the features of candidate storage media. Motivated by this context, the third contribution is a recommendation model for health data storage that can accommodate patient preferences and make storage decisions rapidly, in real time, even with streamed data; the mapping between health data features and the characteristics of each repository is learned using machine learning. The Blockchain's capacity to make transactions and store records without central oversight enables its application to IoT networks outside health, such as underwater IoT networks, where the unattended nature of the nodes threatens their security and privacy. However, underwater IoT differs from ground IoT in that acoustic signals are the communication medium, leading to high propagation delays and high error rates exacerbated by turbulent water currents. The fourth contribution is therefore a customized Blockchain-leveraged framework, with the Patient Centric Agent model renamed the Smart Agent, for securely monitoring underwater IoT. Finally, the Smart Agent is investigated for developing an IoT-based smart home and smart city monitoring framework. The key algorithms underpinning each contribution have been implemented and analysed using simulators.
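
    As a hedged illustration of the third contribution, the storage-recommendation model, the sketch below learns a mapping from made-up data-chunk features (sensitivity, size, access frequency) to a preferred repository using a decision-tree classifier; the feature names, labels, and training examples are assumptions, not the thesis's actual model or data.

        # Hypothetical sketch of a storage-recommendation model: learn a mapping
        # from data-chunk features to a preferred repository. All feature names,
        # labels, and examples are illustrative assumptions.
        from sklearn.tree import DecisionTreeClassifier

        # Features per chunk: [sensitivity (0-1), size in MB, accesses per day]
        X = [
            [0.9,   0.5,  2],   # highly sensitive, small, rarely read
            [0.2, 500.0, 50],   # low sensitivity, large, read often
            [0.8,   1.0,  1],
            [0.1, 200.0, 80],
            [0.7,   0.2,  5],
            [0.3, 800.0, 30],
        ]
        y = ["blockchain", "cloud", "blockchain", "cloud", "blockchain", "cloud"]

        model = DecisionTreeClassifier(max_depth=3).fit(X, y)

        # Recommend a repository for a new chunk in (near) real time.
        new_chunk = [[0.85, 0.3, 3]]
        print(model.predict(new_chunk)[0])   # e.g., "blockchain"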

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as a Final Publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. When their level of abstraction is raised to allow better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication, and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    The Internet of Everything

    In the era before IoT, the World Wide Web, the internet, Web 2.0, and social media made people's lives more comfortable by providing web services and enabling access to personal data irrespective of location. To save time and improve efficiency further, machine-to-machine communication, automation, smart computing, and ubiquitous access to personal devices are needed. This need gave birth to the phenomenon of the Internet of Things (IoT) and, further, to the concept of the Internet of Everything (IoE).

    Cybersecurity and the Digital Health: An Investigation on the State of the Art and the Position of the Actors

    Cybercrime is increasingly exposing the health domain to growing risk. The push towards a strong connection of citizens to health services through digitalization has undisputed advantages. Digital health allows remote care, the use of medical devices with high mechatronic and IT content and strong automation, and a large interconnection of hospital networks with an increasingly effective exchange of data. However, all this requires a great cybersecurity commitment, a commitment that must start with scholars in research and then reach the stakeholders. New devices and technological solutions are increasingly breaking into healthcare and are able to change the processes of interaction in the health domain. This requires cybersecurity to become a vital part of patient safety through changes in human behaviour, technology, and processes, as part of a complete solution. All professionals involved in cybersecurity in the health domain were invited to contribute with their experiences. This book contains contributions from various experts in different fields. Aspects of cybersecurity in healthcare relating to technological advances and emerging risks are addressed, as are the new boundaries of this field and the impact of COVID-19 on some sectors, such as mHealth. We dedicate the book to all those, in their different roles, involved in cybersecurity in the health domain.

    Challenges and Opportunities in Applied System Innovation

    This book introduces and provides solutions to a variety of problems faced by society, companies, and individuals in a quickly changing and technology-dependent world. The wide acceptance of artificial intelligence, the upcoming fourth industrial revolution, and newly designed 6G technologies are seen as the main enablers and game changers in this environment. The book considers these issues not only from a technological viewpoint but also in terms of how society, labor, and the economy are affected, leading to a circular economy that changes the way people design, operate, and deploy complex systems.

    Analyzing Small Businesses' Adoption of Big Data Security Analytics

    Despite the increased cost of data breaches due to advanced, persistent threats from malicious sources, the adoption of big data security analytics among U.S. small businesses has been slow. Anchored in diffusion of innovation theory, the purpose of this correlational study was to examine ways to increase the adoption of big data security analytics among small businesses in the United States by examining the relationship between small business leaders' perceptions of big data security analytics and their adoption of it. The research questions were developed to determine how to increase adoption of big data security analytics, measured as a function of the user's perceived attributes of innovation, represented by the independent variables: relative advantage, compatibility, complexity, observability, and trialability. The study included a cross-sectional survey distributed online to a convenience sample of 165 small businesses. Pearson correlations and multiple linear regression were used to assess the relationships between the variables. There were no significant positive correlations between relative advantage, compatibility, and the dependent variable, adoption; however, there were significant negative correlations between complexity, trialability, and adoption, and a significant positive correlation between observability and adoption. The implications for positive social change include an increase in knowledge, skill sets, and jobs for employees and increased confidentiality, integrity, and availability of systems and data for small businesses. Social benefits include improved decision making for small businesses and more secure transactions between systems through the detection and elimination of advanced, persistent threats.
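
    To make the analysis concrete, the fragment below sketches how Pearson correlations and a multiple linear regression of adoption on the five perceived attributes of innovation could be run; the survey responses are synthetic and only the sample size (165) is taken from the abstract, so the output will not reproduce the study's results.

        # Illustrative sketch (not the study's data): Pearson correlations and a
        # multiple linear regression of adoption on the five perceived
        # attributes of innovation, using synthetic Likert-style responses.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 165  # same sample size as the study; the data itself is synthetic
        df = pd.DataFrame({
            "relative_advantage": rng.integers(1, 8, n),
            "compatibility":      rng.integers(1, 8, n),
            "complexity":         rng.integers(1, 8, n),
            "observability":      rng.integers(1, 8, n),
            "trialability":       rng.integers(1, 8, n),
            "adoption":           rng.integers(1, 8, n),
        })

        # Pearson correlations between each attribute and adoption.
        print(df.corr(method="pearson")["adoption"])

        # Multiple linear regression: adoption ~ five attributes.
        X = sm.add_constant(df.drop(columns="adoption")).astype(float)
        model = sm.OLS(df["adoption"].astype(float), X).fit()
        print(model.summary())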

    A Fog Computing Approach for Cognitive, Reliable and Trusted Distributed Systems

    In the Internet of Things era, a large volume of data is generated and gathered every second from billions of connected devices. The current network paradigm, which relies on centralised data centres (a.k.a. Cloud computing), becomes an impractical solution for storing and processing IoT data because of the long distance between the data source (e.g., sensors) and the designated data centres. It is worth noting that the long distance in this context refers both to the physical path and to the time interval between when data is generated and when it gets processed: by the time the data reaches a distant data centre, its importance may have depreciated. Therefore, network topologies have evolved to permit data processing and storage at the edge of the network, introducing what is called fog computing. The latter leads to improvements in quality of service by processing and responding to a variety of data processing requests quickly and efficiently. Although fog computing is recognized as a promising computing paradigm, it suffers from challenging issues that involve: i) concrete adoption and management of fogs for decentralized data processing; ii) resource allocation in both the cloud and fog layers; iii) sustaining performance, since fogs have limited capacity in comparison with the cloud; and iv) providing a secure and trusted networking environment in which fogs share resources and exchange data securely and efficiently. Hence, this thesis focuses on achieving stable performance for fog nodes by enhancing resource management and allocation, along with safety procedures, to aid IoT-service delivery and cloud computing in the ever-growing industry of smart things. The main aspects related to the performance stability of fog computing involve the development of cognitive fog nodes that aim to provide fast and reliable services, efficient resource management, and trusted networking, and hence ensure the best Quality of Experience, Quality of Service, and Quality of Protection for end-users. In brief, the contributions of this thesis are a novel Fog Resource manAgeMEnt Scheme (FRAMES), proposed to crystallise fog distribution and resource management with appropriate distribution and allocation of service loads based on Fog-2-Fog coordination, and a novel COMputIng Trust manageMENT (COMITMENT) scheme, a software-based approach responsible for providing a secure and trusted environment in which fog nodes share their resources and exchange data packets. Both FRAMES and COMITMENT are encapsulated in the proposed Cognitive Fog (CF) computing, which aims to make fog able not only to act on the data but also to interpret the gathered data in a way that mimics the process of cognition in the human mind. FRAMES provides CF with elastic resource management for load balancing and congestion resolution, while COMITMENT employs trust and recommendation models to avoid malicious fog nodes in the Fog-2-Fog coordination environment. In experiments to verify their validity and performance, the proposed FRAMES and COMITMENT algorithms outperformed the competitive benchmark algorithms, namely Random Walks Offloading (RWO) and Nearest Fog Offloading (NFO). The experiments covered performance (in terms of latency), load balancing among fog nodes, and fog trustworthiness, along with the detection of malicious events and attacks in the Fog-2-Fog environment. The proposed FRAMES offloading algorithms achieve the lowest run-time (i.e., latency) among the compared algorithms (RWO and NFO) when processing an equal number of packets, and the COMITMENT algorithms were able to classify collaboration requests as secure, malicious, or anonymous. The proposed work shows potential for achieving a sustainable fog networking paradigm and highlights significant benefits of fog computing in the computing ecosystem.
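
    The FRAMES and COMITMENT algorithms themselves are not given in the abstract; as a hypothetical sketch of the offloading comparison it describes, the fragment below contrasts the two benchmark policies named above, Random Walks Offloading (RWO) and Nearest Fog Offloading (NFO), with a simple load-aware choice in the spirit of Fog-2-Fog coordination. Node attributes and the scoring rule are illustrative assumptions, not the thesis's algorithms.

        # Minimal sketch of three fog-offloading policies. Node attributes
        # (distance, load) and the load-aware score are illustrative
        # assumptions, not the FRAMES algorithm itself.
        import random
        from dataclasses import dataclass

        @dataclass
        class FogNode:
            name: str
            distance_ms: float   # network latency to this node
            load: float          # current utilisation, 0.0-1.0

        nodes = [
            FogNode("fog-A", distance_ms=5,  load=0.9),
            FogNode("fog-B", distance_ms=12, load=0.2),
            FogNode("fog-C", distance_ms=8,  load=0.5),
        ]

        def rwo(nodes):        # Random Walks Offloading: pick any node
            return random.choice(nodes)

        def nfo(nodes):        # Nearest Fog Offloading: pick the closest node
            return min(nodes, key=lambda n: n.distance_ms)

        def load_aware(nodes): # load-aware choice, penalising busy nodes
            return min(nodes, key=lambda n: n.distance_ms * (1 + n.load))

        print("RWO        ->", rwo(nodes).name)
        print("NFO        ->", nfo(nodes).name)
        print("Load-aware ->", load_aware(nodes).name)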