14 research outputs found

    Networking Architecture and Key Technologies for Human Digital Twin in Personalized Healthcare: A Comprehensive Survey

    Digital twin (DT) refers to a promising technique for digitally and accurately representing actual physical entities. One typical advantage of DT is that it can be used not only to virtually replicate a system's detailed operations but also to analyze its current condition, predict future behaviour, and refine control optimization. Although DT has been widely implemented in various fields, such as smart manufacturing and transportation, its conventional paradigm is limited to embodying non-living entities, e.g., robots and vehicles. For adoption in human-centric systems, a novel concept called human digital twin (HDT) has thus been proposed. In particular, HDT allows in silico representation of an individual human body with the ability to dynamically reflect molecular status, physiological status, emotional and psychological status, as well as lifestyle evolutions. These capabilities motivate the application of HDT in personalized healthcare (PH), where it can facilitate remote monitoring, diagnosis, prescription, surgery, and rehabilitation. However, despite its large potential, HDT faces substantial research challenges in different aspects and has recently become an increasingly popular topic. In this survey, with a specific focus on the networking architecture and key technologies for HDT in PH applications, we first discuss the differences between HDT and conventional DTs, followed by the universal framework and essential functions of HDT. We then analyze its design requirements and challenges in PH applications. After that, we provide an overview of the networking architecture of HDT, including the data acquisition layer, data communication layer, computation layer, data management layer, and data analysis and decision-making layer. Besides reviewing the key technologies for implementing such a networking architecture in detail, we conclude this survey by presenting future research directions for HDT.
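    The five-layer networking architecture named in the abstract can be pictured as a processing pipeline. The sketch below is purely illustrative, assuming hypothetical sensor fields and thresholds not taken from the surveyed systems:

    ```python
    # Illustrative sketch of the five HDT layers as a simple pipeline.
    # All function names, fields, and thresholds are hypothetical.

    def acquire():              # data acquisition layer (e.g., wearable sensors)
        return {"heart_rate": 72, "spo2": 0.98}

    def communicate(sample):    # data communication layer (transport to edge/cloud)
        return dict(sample)     # stand-in for a network transfer

    def compute(sample):        # computation layer (feature extraction)
        sample["hr_zone"] = "rest" if sample["heart_rate"] < 100 else "active"
        return sample

    def manage(sample, store):  # data management layer (persist the time series)
        store.append(sample)
        return store

    def decide(store):          # data analysis and decision-making layer
        latest = store[-1]
        return "alert" if latest["spo2"] < 0.90 else "ok"

    store = manage(compute(communicate(acquire())), [])
    print(decide(store))  # ok
    ```

    The point of the layering is that each stage can be placed independently (on-body, edge, or cloud) without changing the stages around it.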

    Edge Intelligence : Empowering Intelligence to the Edge of Network

    Edge intelligence refers to a set of connected systems and devices that perform data collection, caching, processing, and analysis in proximity to where the data are captured, based on artificial intelligence. Edge intelligence aims at enhancing data processing and protecting the privacy and security of data and users. Although it emerged only recently, spanning the period from 2011 to now, this field of research has shown explosive growth over the past five years. In this article, we present a thorough and comprehensive survey of the literature surrounding edge intelligence. We first identify four fundamental components of edge intelligence, i.e., edge caching, edge training, edge inference, and edge offloading, based on theoretical and practical results pertaining to proposed and deployed systems. We then aim for a systematic classification of the state of the solutions by examining research results and observations for each of the four components and present a taxonomy that includes practical problems, adopted techniques, and application goals. For each category, we elaborate, compare, and analyze the literature from the perspectives of adopted techniques, objectives, performance, advantages, drawbacks, and so on. This article provides a comprehensive survey of edge intelligence and its application areas. In addition, we summarize the development of the emerging research fields and the current state of the art and discuss the important open issues and possible theoretical and technical directions. Peer reviewed.
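    Of the four components the survey identifies, edge offloading is the easiest to make concrete: a device compares the cost of running a task locally against transmitting its input and running it on an edge server. The following is a minimal sketch under assumed parameter names (CPU cycles, input size, compute rates, uplink bandwidth), not a model from the surveyed papers:

    ```python
    # Hypothetical latency-based offloading decision. All parameters are
    # illustrative: cycles of work, input size in bits, local and edge
    # compute rates (cycles/s), and uplink bandwidth (bits/s).

    def offload_decision(task_cycles, input_bits, local_cps, edge_cps, uplink_bps):
        """Return 'local' or 'edge', whichever minimizes estimated latency."""
        local_latency = task_cycles / local_cps
        edge_latency = input_bits / uplink_bps + task_cycles / edge_cps
        return "edge" if edge_latency < local_latency else "local"

    # A compute-heavy task over a fast link favors the edge server:
    print(offload_decision(task_cycles=5e9, input_bits=8e6,
                           local_cps=1e9, edge_cps=10e9, uplink_bps=50e6))  # edge
    ```

    Real offloading policies in the literature also weigh energy, server load, and channel variability, but they reduce to the same comparison of local versus remote cost.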

    Electronic information sharing model for Yemen education sectors based on layered behaviour model

    Electronic Information Sharing (EIS) is proof of the advancement of ICT that supports daily transactions and decision making, including in the higher education sector. In the higher education system of the Republic of Yemen, EIS benefits the Yemen Center for Information Technology in Higher Education (YCIT-HE) and Yemen public universities in many aspects, including enhancing information accuracy and timeliness while providing fast services at minimal cost to students and employees of public universities. Despite a high degree of information sharing overall, there is limited electronic information sharing between Yemen public universities and YCIT-HE. This limitation creates difficulties and delays in obtaining services and making decisions. The majority of previous frameworks and models concentrated on EIS in the government sector, while very few focused on the higher education sector. There was also very little utilization of the Layered Behavior Model (LBM) in information sharing studies for understanding the different situations of EIS at different levels of an organization. Thus, this study aims to develop an EIS model based on Social Exchange Theory, Information Sharing Theory, and the Layered Behavior Model (LBM). This model combines the individual, environment, organization, and technology factors which influence EIS. To validate the model, Structural Equation Modelling (SEM) was applied. Purposive sampling was used to collect data from the deans of faculties, the engineers responsible for the system, the directors of the computer centres, the vice-chancellors, the senior management of the universities, and Student Affairs staff at each of six universities. A total of 260 questionnaires were distributed to the six universities in Yemen via email and face-to-face meetings, with 173 (66.53%) returned responses. 
    The model demonstrates three dimensions and ten influential factors that can essentially increase electronic information sharing between Yemen public universities and YCIT-HE. The significant technological factors include IT capability, information quality, IT compatibility, cloud computing, and social media. The significant environment factors include upper-level leadership as well as policy and law. The organization factors include top management support, interagency trust, and financial capability. The model can assist university management in planning and managing the technological, organizational, and environmental aspects of the universities as they work to improve and enhance EIS in the future.

    Central Washington University 2012-2013 Undergraduate/Graduate Catalog

    https://digitalcommons.cwu.edu/catalogs/1170/thumbnail.jp

    Central Washington University 2019-2020 Graduate Catalog

    https://digitalcommons.cwu.edu/catalogs/1181/thumbnail.jp

    Content-aware compression for big textual data analysis

    A substantial amount of information on the Internet is present in the form of text. The value of this semi-structured and unstructured data has been widely acknowledged, with consequent scientific and commercial exploitation. The ever-increasing data production, however, pushes data analytic platforms to their limit. This thesis proposes techniques for more efficient textual big data analysis suitable for the Hadoop analytic platform. This research explores the direct processing of compressed textual data. The focus is on developing novel compression methods with a number of desirable properties to support text-based big data analysis in distributed environments. The novel contributions of this work include the following. Firstly, a Content-aware Partial Compression (CaPC) scheme is developed. CaPC makes a distinction between informational and functional content, in which only the informational content is compressed. Thus, the compressed data is made transparent to existing software libraries, which often rely on functional content to work. Secondly, a context-free bit-oriented compression scheme (Approximated Huffman Compression) based on the Huffman algorithm is developed. This uses a hybrid data structure that allows pattern searching in compressed data in linear time. Thirdly, several modern compression schemes have been extended so that the compressed data can be safely split with respect to logical data records in distributed file systems. Furthermore, an innovative two-layer compression architecture is used, in which each compression layer is appropriate for the corresponding stage of data processing. Peripheral libraries are developed that seamlessly link the proposed compression schemes to existing analytic platforms and computational frameworks, and also make the use of the compressed data transparent to developers. The compression schemes have been evaluated for a number of standard MapReduce analysis tasks using a collection of real-world datasets. In comparison with existing solutions, they have shown substantial improvement in performance and significant reduction in system resource requirements.
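    The core CaPC idea, compressing only the informational content while leaving functional content (record and field delimiters) untouched, can be sketched in a few lines. This toy version substitutes frequent words with single-byte codes; it is an assumed illustration of the principle, not the thesis's actual scheme:

    ```python
    import re

    # Toy sketch of Content-aware Partial Compression (CaPC): only
    # "informational" tokens (words) are replaced with shorter codes, while
    # "functional" content (commas, newlines, etc.) passes through intact,
    # so record- and field-oriented tools can still parse the output.

    def build_codebook(text, min_len=4):
        """Map frequent long words to single-byte codes (illustrative only)."""
        freq = {}
        for w in re.findall(r"[A-Za-z]{%d,}" % min_len, text):
            freq[w] = freq.get(w, 0) + 1
        codes = [chr(c) for c in range(0x80, 0x100)]   # bytes unused by the text
        ranked = sorted(freq, key=freq.get, reverse=True)[: len(codes)]
        return dict(zip(ranked, codes))

    def capc_compress(text, codebook):
        # Replace only whole words; delimiters are never touched.
        return re.sub(r"[A-Za-z]+",
                      lambda m: codebook.get(m.group(0), m.group(0)), text)

    def capc_decompress(text, codebook):
        inverse = {v: k for k, v in codebook.items()}
        return "".join(inverse.get(ch, ch) for ch in text)

    data = "error,disk,error\nwarning,network,error\n"
    book = build_codebook(data)
    packed = capc_compress(data, book)
    assert capc_decompress(packed, book) == data
    assert packed.count(",") == data.count(",")  # functional content preserved
    ```

    Because every delimiter survives compression, the packed data remains line- and field-splittable, which is exactly the property that lets distributed frameworks such as Hadoop process it without first decompressing.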

    Central Washington University 2019-2020 Undergraduate Catalog

    https://digitalcommons.cwu.edu/catalogs/1182/thumbnail.jp

    Enhancing the conceptual design process of automotive exterior systems

    Thesis (S.M. in Engineering and Management)--Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2011. Cataloged from PDF version of thesis. Includes bibliographical references (p. 152-154). Product development cycles in the automotive industry are being reduced and competition is more demanding than ever before. To be successful in this environment, Original Equipment Manufacturers need a product development process that delivers best-in-class value at a competitive cost and with the shortest lead time. Within the development process, the conceptual design phase is the most important in delivering a no-compromise design solution. In this phase, design teams have the largest amount of latitude to create value in the product, but they also face high levels of uncertainty and incomplete information when making decisions. At a high level, the conceptual design phase encompasses four major steps. In the first step, value is defined from the stakeholder perspective and system objectives are set. The second step involves a divergent process in which the design space is explored and several concept alternatives are generated to meet the system objectives. The third is a convergent process in which design alternatives are matured and evaluated, and one is selected. In the fourth step, the architecture of the system is articulated. The intended impact of this thesis is to enhance the value delivered in the conceptual design phase and prevent waste in downstream activities within the product development process. To achieve this, the conceptual design processes of a major automotive manufacturer were studied to identify the problems that constrain value delivery and generate waste. The findings of this study and the exploration of existing concept development frameworks were synthesized into a concept development methodology focused on automotive exterior systems. By David Diaz Dominguez. S.M. in Engineering and Management.

    Central Washington University 2012-2013 Graduate Catalog

    https://digitalcommons.cwu.edu/catalogs/1285/thumbnail.jp