
    A Learning Health-Care System for Improving Renal Health Services in Peru Using Data Analytics

    The health sector worldwide faces the continuous challenge of improving the services provided to patients. Digital transformation therefore plays a key role in integrating new technologies such as artificial intelligence into health services. However, the health system in Peru has not yet taken the big step towards digitising its services, currently ranking 71st according to the World Health Organisation (WHO). This article proposes a learning health system for the management and monitoring of private health services in Peru based on the three key components of intelligent health care: (1) a health data platform (HDP); (2) intelligent technologies (IT); and (3) an intelligent health-care suite (HIS). The solution consists of four layers: (1) data source, (2) data warehousing, (3) data analytics, and (4) visualization. In layer 1, all data sources are selected to create a database. In layer 2, the proposed learning health system is built and data storage is carried out through the extract, transform and load (ETL) process. In layer 3, a Kaggle dataset and the decision tree (DT) and random forest (RF) algorithms are used to predict disease diagnoses, with the RF algorithm showing the best performance. Finally, in layer 4, the intelligent health-care suite dashboards and interfaces are designed. The proposed system was applied in a clinic focused on preventing chronic kidney disease. A total of 100 patients and six kidney health experts participated. The results showed that diagnosis of chronic kidney disease by the learning health system had a low error rate in positive diagnoses (err = 1.12%). Additionally, experts reported being "satisfied" with the dashboards and interfaces of the intelligent health-care suite as well as with the quality of the learning health system
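The layer-3 comparison of a single decision tree against a random-forest-style ensemble, and the positive-diagnosis error rate reported above, can be sketched as follows. This is an illustrative stand-in with invented predictions, not the article's actual Kaggle pipeline; `positive_error_rate` and `majority_vote` are names introduced here for the sketch.

```python
# Hypothetical sketch of the layer-3 evaluation step: a single decision
# tree's predictions versus a random-forest-style majority vote over
# several trees, scored by the error rate among positive diagnoses.

def positive_error_rate(y_true, y_pred):
    """Share of positive predictions that were wrong, i.e. false
    positives divided by all positive predictions (the err metric)."""
    positives = [(t, p) for t, p in zip(y_true, y_pred) if p == 1]
    if not positives:
        return 0.0
    wrong = sum(1 for t, p in positives if t != p)
    return wrong / len(positives)

def majority_vote(predictions_per_tree):
    """Random-forest-style aggregation: each tree votes, majority wins."""
    votes = list(zip(*predictions_per_tree))
    return [1 if sum(v) > len(v) / 2 else 0 for v in votes]

# Invented labels/predictions (1 = chronic kidney disease diagnosed).
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
tree_a = [1, 0, 1, 0, 0, 1, 1, 0]   # a single decision tree
tree_b = [1, 0, 1, 1, 0, 1, 0, 0]
tree_c = [1, 1, 1, 1, 0, 1, 0, 0]
forest = majority_vote([tree_a, tree_b, tree_c])

print(positive_error_rate(y_true, tree_a))   # single DT
print(positive_error_rate(y_true, forest))   # RF-style ensemble
```

On this toy data the ensemble's positive-diagnosis error is lower than the single tree's, which mirrors the article's finding that RF outperformed DT.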

    An Intelligent Management System for Hybrid Network between Visible Light Communication and Radio Frequency

    This thesis investigates the challenges and potential solutions associated with hybrid Visible Light Communication (VLC) and Radio Frequency (RF) systems for indoor network environments. The rapid development of VLC technology, characterized by its high data rates, energy efficiency, and inherent security features, offers promising opportunities to complement RF networks in providing seamless connectivity and improved performance. However, integrating VLC and RF technologies effectively requires addressing a range of research and engineering challenges, including network coexistence, handover mechanisms, resource allocation, localization, and standardization. We begin by conducting a comprehensive literature review encompassing existing research, technologies, and solutions related to hybrid VLC/RF architectures, handover management, indoor localization techniques, and the challenges faced by these systems. This background provides a solid foundation for understanding the current state of the art and identifying research gaps in the field of hybrid VLC/RF networks. Next, we propose a novel hybrid network architecture that integrates VLC and RF communication systems to combine their strengths while mitigating their weaknesses. We discuss various types of hybrid VLC/RF architectures found in the literature and present our proposed design, which addresses the identified challenges through innovative strategies and mechanisms. To improve system performance in our hybrid system, we develop an enhanced priority feedback channel that optimizes traffic priority based on user preferences and network conditions. This approach minimizes service disruptions, reduces latency, and maintains user Quality of Experience (QoE). Furthermore, we introduce a novel intelligent management system architecture tailored for hybrid VLC/RF networks. This system employs advanced algorithms and techniques to optimize resource allocation, load balancing, localization, and handover management, ensuring efficient operation and seamless connectivity. We evaluate the performance of our proposed solutions through extensive simulations and testbed experiments, considering different network scenarios and metrics. The results demonstrate significant improvements in data rate, latency, handover success rate, and localization accuracy, validating the effectiveness of our proposed architecture and management system. Lastly, we explore several real-world applications and case studies of our intelligent management system in various indoor environments, such as retail stores, offices, and hospitals. These examples illustrate the practical benefits of our solution in enhancing customer experiences, optimizing operational efficiency, facilitating targeted marketing, and improving energy management. In conclusion, this thesis contributes to the advancement of hybrid VLC/RF networks by proposing an innovative architecture and intelligent management system that address the key challenges these systems face in indoor environments. The findings and solutions presented in this work provide a backbone for future research and development efforts aimed at fully harnessing the potential of VLC technology in combination with RF networks
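The core decision such a hybrid VLC/RF manager repeatedly makes is link selection: prefer the high-rate optical link while it is usable, and hand over to RF otherwise. The following is a minimal sketch of that idea; the threshold, field names, and `select_link` function are assumptions for illustration, not the thesis's actual mechanism.

```python
# Illustrative link-selection rule for a hybrid VLC/RF manager:
# use VLC while the optical link is unobstructed and strong enough,
# otherwise fall back to the RF link. Thresholds are invented.

from dataclasses import dataclass

@dataclass
class LinkState:
    vlc_snr_db: float      # measured VLC signal-to-noise ratio
    vlc_blocked: bool      # line-of-sight obstruction detected
    rf_snr_db: float       # RF link quality (fallback)

def select_link(state: LinkState, vlc_threshold_db: float = 15.0) -> str:
    """Return 'VLC' when the optical link is unobstructed and strong
    enough; otherwise hand over to the RF link."""
    if not state.vlc_blocked and state.vlc_snr_db >= vlc_threshold_db:
        return "VLC"
    return "RF"

print(select_link(LinkState(vlc_snr_db=22.0, vlc_blocked=False, rf_snr_db=18.0)))
print(select_link(LinkState(vlc_snr_db=22.0, vlc_blocked=True, rf_snr_db=18.0)))
```

A real manager would add hysteresis and the priority feedback channel described above so that brief obstructions do not cause handover ping-pong.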

    Memory-full context-aware predictive mobility management in dual connectivity 5G networks

    Network densification with small cell deployment is considered one of the dominant themes in the fifth generation (5G) cellular system. Despite the capacity gains, such deployment scenarios raise several challenges from a mobility management perspective. The small cell size, which implies a short cell residence time, will increase the handover (HO) rate dramatically. Consequently, HO latency will become a critical consideration in the 5G era. The latter requires an intelligent, fast and lightweight HO procedure with minimal signalling overhead. In this direction, we propose a memory-full context-aware HO scheme with mobility prediction to achieve the aforementioned objectives. We consider a dual connectivity radio access network architecture with logical separation between control and data planes because it offers relaxed constraints in implementing the predictive approaches. The proposed scheme predicts future HO events along with the expected HO time by combining radio-frequency performance with physical proximity and the user context in terms of speed, direction and HO history. To minimise the processing and storage requirements whilst improving the prediction performance, a user-specific prediction triggering threshold is proposed. The prediction outcome is utilised to perform advance HO signalling whilst suspending the periodic transmission of measurement reports. Analytical and simulation results show that the proposed scheme provides promising gains over the conventional approach
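The user-specific prediction triggering threshold described above can be sketched as a per-user gate: the (comparatively expensive) mobility predictor only runs once a score built from the user's speed and handover history crosses that user's threshold. All weights and saturation points below are illustrative assumptions, not the paper's model.

```python
# Hypothetical sketch of a user-specific prediction trigger: mobility
# prediction is activated only when a context score (speed + handover
# history) crosses the user's threshold, saving processing and storage
# for slow or static users.

def ho_likelihood(speed_mps: float, recent_handovers: int) -> float:
    """Crude score in [0, 1]: faster users with frequent past handovers
    are more likely to hand over again soon. Weights are invented."""
    speed_term = min(speed_mps / 30.0, 1.0)          # saturate at 30 m/s
    history_term = min(recent_handovers / 10.0, 1.0)  # saturate at 10 HOs
    return 0.6 * speed_term + 0.4 * history_term

def should_predict(speed_mps: float, recent_handovers: int,
                   user_threshold: float = 0.5) -> bool:
    """Gate the expensive mobility predictor behind the threshold."""
    return ho_likelihood(speed_mps, recent_handovers) >= user_threshold

print(should_predict(25.0, 8))  # fast, handover-heavy user: predict
print(should_predict(1.0, 0))   # near-static user: skip prediction
```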

    A Literature Review on The Design of Intelligent Supply Chain for Natural Fibre Agroindustry

    Natural fibre is an environmentally friendly raw material that has great potential for development and is abundantly available in nature [1]. Currently, natural fibre processing industries are becoming increasingly important worldwide [2]. Processing abundant natural fibre in both upstream and downstream production requires effective and collaborative supply chain management in terms of information sharing; thus, an intelligent system should be implemented in supply chain management from upstream to downstream. This review covers 46 scientific papers discussing types of natural fibre, processes, technologies, and methods, as well as application areas of natural fibre in downstream industries. A further review of 55 scientific papers mapped six aspects: supply chain analytics, value chain, performance, collaboration, big data, and decision support systems. The Industry 4.0 concept underlies the opportunities for applying supply chain analytics [3]. Upcoming research opportunities include mediating relationships in the supply chain network by utilizing the Internet of Things (IoT) and Big Data (BD) in a collaborative relationship built on information sharing. The research most likely to contribute is the development of collaboration between the supply chain and genetic algorithms [4]. Integration of production and inventory planning is an approach that utilizes Particle Swarm Optimization (PSO), developed for production planning [5] and for combined production and inventory planning [6]. There is a research opportunity in the design of an intelligent supply chain for natural fibre agroindustry that implements IoT and BD as tools for supply chain analytics; collaboration through Collaborative Planning, Forecasting and Replenishment (CPFR) between stakeholders, with the aim of improving agroindustry supply chain performance in integrated production and inventory planning; and performance measurement by integrating the Value Chain Operations Reference (VCOR) model developed in supply chain analytics
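The PSO approach cited for integrated production and inventory planning can be illustrated with a toy swarm: each particle is a candidate production quantity, and the fitness balances production cost against holding and shortage costs. The cost coefficients and PSO parameters below are invented for illustration, not taken from the cited studies.

```python
# Toy particle swarm optimisation for a one-product production/inventory
# decision: minimise production + holding + shortage cost. The optimum
# of this cost function is producing exactly the demand (100 units).

import random

def cost(q, demand=100.0, prod_cost=2.0, hold_cost=1.0, short_cost=5.0):
    over = max(q - demand, 0.0)     # units held in inventory
    short = max(demand - q, 0.0)    # unmet demand
    return prod_cost * q + hold_cost * over + short_cost * short

def pso(n_particles=20, iters=100, seed=0):
    rng = random.Random(seed)
    pos = [rng.uniform(0.0, 200.0) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    best = list(pos)                 # per-particle best position
    gbest = min(pos, key=cost)       # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (0.7 * vel[i]                      # inertia
                      + 1.5 * r1 * (best[i] - pos[i])   # cognitive pull
                      + 1.5 * r2 * (gbest - pos[i]))    # social pull
            pos[i] += vel[i]
            if cost(pos[i]) < cost(best[i]):
                best[i] = pos[i]
            if cost(pos[i]) < cost(gbest):
                gbest = pos[i]
    return gbest

q = pso()
print(round(q, 1))  # should settle near the demand of 100 units
```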

    Software architecture knowledge for intelligent light maintenance

    Maintenance management plays an important role in the monitoring of business activities. It ensures a certain level of service in industrial systems by improving their ability to function in accordance with prescribed procedures. This has a decisive impact on the performance of these systems in terms of operational efficiency, reliability and associated intervention costs. To support the maintenance processes of a wide range of industrial services, a knowledge-based component is useful for performing intelligent monitoring. In this context, we propose a generic model for supporting and generating industrial light maintenance processes. The modeled intelligent approach involves structuring information and sharing knowledge in the industrial setting, and implementing specialized maintenance management software in the target information system. As a first step, we defined computerized procedures from the conceptual structure of industrial data to ensure their interoperability and the effective use of information and communication technologies in the software dedicated to maintenance management (E-candela). The second step is the implementation of this software architecture with the specification of business rules, in particular by organizing taxonomic information about the lighting systems and applying intelligence-based operations and analysis to capitalize on knowledge from maintenance experiences. Finally, the third step is the deployment of the software with contextual adaptation of the user interface to allow the management of operations, the editing of balance sheets, and real-time location obtained through geolocation data. In practice, these computational intelligence-based modes of reasoning involve an engineering framework that facilitates the continuous improvement of a comprehensive maintenance regime
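The knowledge-based step, organizing lighting assets in a taxonomy and applying business rules to decide maintenance actions, can be sketched as follows. The categories, thresholds and rules are illustrative assumptions, not taken from the E-candela software itself.

```python
# Hypothetical sketch: a small taxonomy of lighting assets plus a
# business rule that turns service hours into a maintenance action.

TAXONOMY = {
    "LED streetlight": {"family": "outdoor", "expected_life_h": 50_000},
    "fluorescent tube": {"family": "indoor", "expected_life_h": 10_000},
}

def maintenance_action(asset_type: str, hours_in_service: float) -> str:
    """Rule: schedule replacement when an asset nears its expected
    life, replace immediately once it exceeds it."""
    spec = TAXONOMY[asset_type]
    ratio = hours_in_service / spec["expected_life_h"]
    if ratio >= 1.0:
        return "replace now"
    if ratio >= 0.8:
        return "schedule replacement"
    return "routine inspection"

print(maintenance_action("LED streetlight", 45_000))
print(maintenance_action("fluorescent tube", 2_000))
```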

    Developing a dynamic digital twin at a building level: Using Cambridge campus as case study

    A Digital Twin (DT) is a digital replica of physical assets, processes and systems. DTs integrate artificial intelligence, machine learning and data analytics to create dynamic digital models that can learn from multiple sources and update the status of their physical counterparts. A DT equipped with appropriate algorithms can represent and predict the future condition and performance of its physical counterpart. Current developments related to DTs are still at an early stage with respect to buildings and other infrastructure assets. Most of these developments focus on the architectural and engineering/construction perspective; less attention has been paid to the operation and maintenance (O&M) phase, where the value potential is immense. A systematic and clear architecture, verified with practical use cases, is the foremost step in constructing a DT for the effective operation and maintenance of assets. This paper presents a system architecture for developing dynamic DTs at the building level that integrates heterogeneous data sources, supports intelligent data queries, and provides smarter decision-making processes. This will further bridge the gap between humans and buildings/regions via more intelligent, visual and sustainable channels. The architecture is brought to life through the development of a dynamic DT demonstrator of the West Cambridge site of the University of Cambridge. Specifically, this demonstrator integrates an as-is multi-layered IFC Building Information Model (BIM), building management system data, space management data, real-time Internet of Things (IoT)-based sensor data, asset registry data, and an asset tagging platform. The demonstrator also includes two applications: (1) improving asset maintenance and asset tracking using Augmented Reality (AR); and (2) equipment failure prediction. The long-term goals of this demonstrator are also discussed in this paper
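The data-integration idea at the heart of such a DT, keying static asset-registry records and streaming IoT readings by a shared asset identifier so one query can answer across sources, can be sketched as follows. The asset ids, fields and values are invented; the demonstrator's actual schemas are far richer.

```python
# Illustrative sketch of a DT integration layer: join static registry
# metadata with the most recent IoT sensor reading for the same asset.

asset_registry = {
    "AHU-01": {"building": "Civil Engineering", "type": "air handling unit"},
}

iot_readings = [
    {"asset": "AHU-01", "ts": 1, "temp_c": 18.2},
    {"asset": "AHU-01", "ts": 3, "temp_c": 19.1},
    {"asset": "AHU-01", "ts": 2, "temp_c": 18.6},
]

def latest_reading(asset_id: str) -> dict:
    """Answer a cross-source query: registry metadata plus the newest
    sensor reading (highest timestamp) for the given asset."""
    readings = [r for r in iot_readings if r["asset"] == asset_id]
    latest = max(readings, key=lambda r: r["ts"])
    return {**asset_registry[asset_id], "temp_c": latest["temp_c"]}

print(latest_reading("AHU-01"))
```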

    Sensor data-based decision making

    Increasing globalization and growing industrial system complexity have amplified interest in using the information provided by sensors to improve overall manufacturing system performance and maintainability. However, sensors can only be utilized effectively if their real-time data can be integrated into the necessary business processes, such as production planning, scheduling and execution systems. This integration requires the development of intelligent decision-making models that can effectively process the sensor data into information and suggest appropriate actions. To improve the performance of a system, the health of the system also needs to be maintained. In many cases a single sensor type cannot provide sufficient information for complex decision making, including diagnostics and prognostics of a system. Therefore, a combination of sensors should be used in an integrated manner to achieve the desired performance levels. Sensor-generated data need to be processed into information through appropriate decision-making models in order to improve overall performance. In this dissertation, which is presented as a collection of five journal papers, several reactive and proactive decision-making models that utilize data from single- and multi-sensor environments are developed. The first paper presents a testbed architecture for Auto-ID systems. An adaptive inventory management model that utilizes real-time RFID data is developed in the second paper. In the third paper, a complete hardware and inventory management solution, which involves the integration of RFID sensors into an extremely low temperature industrial freezer, is presented. The last two papers in the dissertation deal with diagnostic and prognostic decision-making models to assure the healthy operation of a manufacturing system and its components. In the fourth paper, a Mahalanobis-Taguchi System (MTS) based prognostics tool is developed and used to estimate the remaining useful life of rolling element bearings using data acquired from vibration sensors. In the final paper, an MTS-based prognostics tool is developed for a centrifugal water pump, fusing information from multiple types of sensors to make diagnostic and prognostic decisions for the pump and its components --Abstract, page iv
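At the core of an MTS prognostics tool is the Mahalanobis distance: healthy-condition vibration features define a reference distribution, and a new observation's distance from it flags degradation. The sketch below computes this for two features in pure Python; the feature values are invented and are not the dissertation's bearing or pump data.

```python
# Minimal Mahalanobis distance for 2-feature vibration data: distance
# of a new observation from a "healthy" reference group, using the
# closed-form inverse of the 2x2 sample covariance matrix.

def mean(xs):
    return sum(xs) / len(xs)

def mahalanobis_2d(x, healthy):
    """Mahalanobis distance of 2-feature vector x from healthy samples."""
    f1 = [h[0] for h in healthy]
    f2 = [h[1] for h in healthy]
    m1, m2 = mean(f1), mean(f2)
    n = len(healthy)
    # sample covariance matrix entries
    c11 = sum((a - m1) ** 2 for a in f1) / (n - 1)
    c22 = sum((b - m2) ** 2 for b in f2) / (n - 1)
    c12 = sum((a - m1) * (b - m2) for a, b in zip(f1, f2)) / (n - 1)
    det = c11 * c22 - c12 * c12
    d1, d2 = x[0] - m1, x[1] - m2
    # quadratic form d^T C^{-1} d, with the 2x2 inverse written out
    md2 = (c22 * d1 * d1 - 2 * c12 * d1 * d2 + c11 * d2 * d2) / det
    return md2 ** 0.5

# Invented (RMS amplitude, kurtosis-like) features for healthy bearings.
healthy = [(1.0, 0.10), (1.1, 0.12), (0.9, 0.09), (1.05, 0.11)]
print(mahalanobis_2d((1.0, 0.10), healthy))   # near the healthy centre
print(mahalanobis_2d((2.5, 0.40), healthy))   # degraded bearing, far away
```

A remaining-useful-life estimate then comes from tracking how this distance trends over time against a threshold.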

    Auto-configuration of Savants in a complex, variable network

    Thesis (M. Eng. and S.B.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2005. Includes bibliographical references (p. 63-64). In this thesis, I present a system design that enables Savants to automatically configure both their network settings and their required application programs when connected to an intelligent data management and application system. Savants are intelligent routers in a large network used to manage the data and events related to communications with electronic identification tags [10]. The ubiquitous nature of the identification tags and the access points that communicate with them requires an information and management system that is equally ubiquitous and able to deal with huge volumes of data. The Savant systems were designed to be such a ubiquitous information and management system. Deploying any ubiquitous system is difficult, and automation is required to streamline its deployment and improve system management, reliability, and performance. My solution to this auto-configuration problem uses NETCONF as a standard language and protocol for configuration communication among Savants. It also uses the Content-Addressable Network (CAN) as a discovery service to help Savants locate configuration information, since a new Savant may not have information about the network structure. With these tools, new Savants can configure themselves automatically with the help of other Savants. Specifically, they can configure their network settings, download and set up software, and integrate with network distributed applications. Future work could expand upon my project by studying an implementation, making provisions for resource-limited Savants, or improving security. by Joseph Hon Yu. M.Eng. and S.B.
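The CAN discovery step hashes each configuration key into a coordinate space, and the Savant owning that zone serves the record. The sketch below compresses this to a one-dimensional key-to-owner lookup; a real CAN partitions a d-dimensional torus, and the Savant names and key are invented.

```python
# Simplified stand-in for CAN-style discovery: hash a configuration key
# into [0, 1) and find the node whose zone contains that point.

import hashlib

def point(key: str) -> float:
    """Deterministically map a key into the unit interval [0, 1)."""
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") / 2**64

def owner(key: str, savants: dict) -> str:
    """savants maps name -> start of its zone in [0, 1); the owner is
    the savant whose zone start is the greatest one not exceeding the
    key's point."""
    p = point(key)
    eligible = [(start, name) for name, start in savants.items() if start <= p]
    if not eligible:  # wrap around to the savant with the largest start
        return max(savants, key=savants.get)
    return max(eligible)[1]

savants = {"savant-a": 0.0, "savant-b": 0.33, "savant-c": 0.66}
print(owner("network-settings/site-7", savants))
```

A new Savant only needs any one existing node to route its lookup, which is what makes the discovery service usable before the newcomer knows the network structure.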

    Process mining online assessment data

    Traditional data mining techniques have been extensively applied to find interesting patterns and to build descriptive and predictive models from the large volumes of data accumulated through the use of different information systems. The results of data mining can be used to gain a better understanding of the underlying educational processes, to generate recommendations and advice for students, to improve the management of learning objects, etc. However, most traditional data mining techniques focus on data dependencies or simple patterns and do not provide a visual representation of the complete educational (assessment) process ready to be analyzed. To allow for these types of analysis (in which the process plays the central role), a new line of data-mining research, called process mining, has been initiated. Process mining focuses on the development of a set of intelligent tools and techniques aimed at extracting process-related knowledge from event logs recorded by an information system. In this paper we demonstrate the applicability of process mining, and the ProM framework in particular, to the educational data mining context. We analyze assessment data from recently organized online multiple-choice tests and demonstrate the use of process discovery, conformance checking and performance analysis techniques
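The first step in process discovery is building a directly-follows relation from the event log, the raw material from which ProM-style miners construct a process model. The sketch below does this for an invented assessment log; the activity names are illustrative, not from the paper's tests.

```python
# Illustrative first step of process discovery: count how often one
# activity directly follows another across the traces of an event log.

from collections import Counter

# Invented event log: each trace is one student's ordered activities.
log = [
    ["start_test", "q1", "q2", "q3", "submit"],
    ["start_test", "q1", "q3", "q2", "submit"],
    ["start_test", "q1", "q2", "submit"],
]

def directly_follows(traces):
    """Count pairs (a, b) where activity a is immediately followed
    by activity b in some trace."""
    df = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

df = directly_follows(log)
print(df[("start_test", "q1")])  # every trace begins this way
print(df[("q2", "submit")])
```

From these counts a miner draws the process graph, and conformance checking then replays traces against it to find deviations.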

    Enhancing Government Decision Making through Knowledge Discovery from Data

    A major challenge facing management in developed countries is improving the performance of knowledge and service workers, i.e. the decision makers. In a developing country such as South Africa, with a well-developed business sector, the need to improve the performance of decision makers, especially in government, is even more crucial. South Africa faces many new challenges in the 21st century: growing environmental concerns, massive social and economic inequalities, an ageing population, low productivity, massive unemployment and the nation's evolving role in Africa. The importance of science and technology in addressing these pressing issues cannot be overemphasised. This paper discusses the development of a knowledge-base to aid government decision makers in interpreting the results of the National Research and Technology (NRT) Audit undertaken by the South African Department of Arts, Culture, Science and Technology. An intelligent data analysis tool is employed to construct a knowledge-base, using a data-driven rather than a knowledge-driven approach to knowledge-base construction. The knowledge-base is constructed directly from the data contained in the NRT Audit data warehouse. The rules contained in the knowledge-base are produced by a team of data mining techniques that cooperate as members of a learning system. This knowledge-base is used to augment the knowledge of the human experts. Results show that the information discovered during the knowledge-base construction process either enhanced or contradicted the findings of the human experts
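A data-driven knowledge-base of the kind described can be pictured as mined rules stored as condition/conclusion pairs and applied to audit records, so that discovered knowledge can be set beside the human experts' findings. The record fields and rules below are invented for illustration and are not from the NRT Audit.

```python
# Hypothetical sketch of a rule-based knowledge-base: rules mined from
# a data warehouse, each a (condition, conclusion) pair, applied to a
# record describing a research unit.

RULES = [
    (lambda r: r["phd_researchers"] < 5 and r["funding_rank"] == "low",
     "capacity at risk"),
    (lambda r: r["publications_per_year"] > 50,
     "strong output"),
]

def apply_rules(record: dict) -> list:
    """Return every conclusion whose condition holds for the record."""
    return [conclusion for cond, conclusion in RULES if cond(record)]

unit = {"phd_researchers": 3, "funding_rank": "low",
        "publications_per_year": 12}
print(apply_rules(unit))
```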