
    Digital forensics challenges and readiness for 6G Internet of Things (IoT) networks

    The development of sixth-generation (6G) wireless communication technology is expected to provide super high-speed data transmission and more advanced network performance than the current fifth-generation (5G), and to be fully functional by the 2030s. This development will have a significant impact on, and add improvements to, digital extended reality (XR), autonomous systems, vehicular ad hoc networks (VANETs), artificial intelligence (AI), underwater communications, blockchain technology, pervasive biomedical informatics and smart cities built on the digital infrastructure backbone of the Internet of Things (IoT). The ubiquitous nature of this large-scale 6G-enabled IoT, which offers faster connectivity and integrates both terrestrial and non-terrestrial networks, will not only create new data security and privacy issues but also provide a treasure trove of digital evidence useful for digital forensic examiners investigating security incidents and cybercrime. For digital forensic examiners, however, evidence collection, preservation and analysis will become a priority in the successful deployment of 6G IoT networks. In this study, we define key applications of 6G network technology to the Internet of Things and its existing architectures. The survey introduces potential digital forensic challenges and related issues affecting digital forensic investigations specific to 6G IoT networks. Finally, we highlight and discuss forensic readiness and future research directions for the identified challenges within 6G IoT network environments.

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as a Final Publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to allow a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. High Performance Computing, on the other hand, typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    Task-oriented cross-system design for Metaverse in 6G era

    As an emerging concept, the Metaverse has the potential to revolutionize social interaction in the post-pandemic era by establishing a digital world for online education, remote healthcare, immersive business, intelligent transportation, and advanced manufacturing. The goal is ambitious, yet the methodologies and technologies to achieve the full vision of the Metaverse remain unclear. In this thesis, we first introduce the three pillars of infrastructure that lay the foundation of the Metaverse, i.e., Human-Computer Interfaces (HCIs), sensing and communication systems, and network architectures. Then, we depict the roadmap towards the Metaverse, which consists of four stages with different applications. As one of the essential building blocks of the Metaverse, we also review the state of the art in Computer Vision for the Metaverse as well as its future scope. To support diverse applications in the Metaverse, we put forward a novel design methodology, task-oriented cross-system design, and further review potential solutions and future challenges. Specifically, we establish a task-oriented cross-system design for a simple case, where sampling, communication, and prediction modules are jointly optimized for the synchronization of real-world devices and their digital models in the Metaverse. We use domain knowledge to design a deep reinforcement learning (DRL) algorithm that minimizes the communication load subject to an average tracking error constraint. We validate our framework on a prototype composed of a real-world robotic arm and its digital model. The results show that our framework achieves a better trade-off between the average tracking error and the average communication load than a communication system without sampling and prediction. For example, the average communication load can be reduced to 87% when the average tracking error constraint is 0.002°.
In addition, our policy outperforms the benchmark with a static sampling rate and prediction horizon optimized by exhaustive search, in terms of the tail probability of the tracking error. Furthermore, with the assistance of expert knowledge, the proposed algorithm improves convergence time, stability, communication load, and average tracking error. We then establish a task-oriented cross-system design framework for a general case, where the goal is to minimize the required packet rate for timely and accurate modeling of a real-world robotic arm in the Metaverse. Specifically, different modules including sensing, communications, prediction, control, and rendering are considered. To optimize the scheduling policy and prediction horizons, we design a Constraint Proximal Policy Optimization (CPPO) algorithm by integrating domain knowledge from the relevant systems into an advanced reinforcement learning algorithm, Proximal Policy Optimization (PPO). Specifically, the Jacobian matrix for analyzing the motion of the robotic arm is included in the state of the CPPO algorithm, and the Conditional Value-at-Risk (CVaR) of the state-value function, characterizing the long-term modeling error, is adopted in the constraint. Moreover, the policy is represented by a two-branch neural network that determines the scheduling policy and the prediction horizons, respectively. To evaluate our algorithm, we build a prototype comprising a real-world robotic arm and its digital model in the Metaverse. The experimental results indicate that domain knowledge helps to reduce the convergence time and the required packet rate by up to 50%, and that the cross-system design framework outperforms a baseline framework in terms of the required packet rate and the tail distribution of the modeling error.
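The abstract uses the Conditional Value-at-Risk (CVaR) of the long-term modeling error as the constraint in CPPO. The thesis's exact formulation is not reproduced here, but the empirical CVaR of a sample of losses, i.e. the mean of the worst (1 - alpha) fraction, can be sketched as follows (a hypothetical helper for illustration, not the authors' code):

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Empirical Conditional Value-at-Risk: the mean of losses at or
    above the alpha-quantile (Value-at-Risk) threshold."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)   # Value-at-Risk threshold
    tail = losses[losses >= var]       # worst-case tail beyond VaR
    return tail.mean()
```

In a constrained policy-optimization setting, such a tail statistic would be estimated from rollout errors and compared against a budget, penalizing policies whose worst-case modeling error is too large rather than only the average.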

    Continuous Perception for Immersive Interaction and Computation in Molecular Sciences

    Chemistry aims to understand the structure and reactions of molecules, which involve phenomena occurring at microscopic scales. However, scientists perceive the world at macroscopic scales, making it difficult to study complex molecular objects. Graphical representations, such as structural formulas, were developed to bridge this gap and aid in understanding. The advent of Quantum Mechanics further increased the complexity of the representation of microscopic objects. This dichotomy between conceptual representation and predictive quantification forms the foundation of Chemistry, now further explored with the rise of Artificial Intelligence. Recent advancements in computational sciences, increased computational power, and developments in Machine-Learning (ML) raise questions about the traditional scientific method. Computational scientists, who have relied on approximations based on fundamental rules, now face the possibility of accurately simulating nature without strictly adhering to its laws. This shift challenges the association between progress in understanding a phenomenon and the ability to predict it. Deep learning models can not only make predictions but also create new data. While these techniques find applications in fields like Natural Language Processing, they suffer from limitations and lack true intelligence or awareness of physical laws. The thesis aims to create mathematical descriptors for atom types, bond types, and angle types in ML procedures, ensuring the retention of their chemical meaning. The goal is to make quantitative predictions while interpreting changes in descriptors as chemical changes. To achieve this, the thesis develops a software called Proxima for Molecular Perception, which automatically perceives features from molecules. Proxima treats strongly coupled electrons as covalent bonds and lone pairs, while delocalized electrons are modeled using a Tight-Binding model. 
The resulting Molecular Graph captures the weak interactions between these units. Overall, this thesis explores the intersection of computational chemistry and Machine-Learning to enhance our understanding and predictive capabilities in Chemistry by building the so-called Virtual Laboratory, a virtual environment with automatic access to structural databases to test chemical ideas on the fly (pre-processing) and to explore the output of computational software (post-processing).
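Proxima's actual tight-binding treatment of delocalized electrons is not given in the abstract. As a minimal illustration of the idea, a Hückel-style nearest-neighbour Hamiltonian for a six-membered ring (benzene's pi system) can be built and diagonalized as below; the parameter values (on-site energy alpha = 0, hopping beta = -1) are hypothetical and chosen only for illustration:

```python
import numpy as np

# Hückel/tight-binding sketch: six pi orbitals on a ring, coupled only
# to nearest neighbours. Hypothetical units: alpha = 0, beta = -1.
n = 6
H = np.zeros((n, n))
for i in range(n):
    j = (i + 1) % n            # nearest neighbour on the ring
    H[i, j] = H[j, i] = -1.0   # hopping integral beta

levels = np.sort(np.linalg.eigvalsh(H))  # orbital energy levels
```

For the six-membered ring this reproduces the familiar degenerate level pattern of benzene's pi system, which is the kind of delocalized-electron structure a tight-binding model encodes in a molecular graph.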

    Systematic Approaches for Telemedicine and Data Coordination for COVID-19 in Baja California, Mexico

    Conference proceedings info: ICICT 2023: The 6th International Conference on Information and Computer Technologies, Raleigh, HI, United States, March 24-26, 2023, Pages 529-542. https://doi.org/10.1007/978-981-99-3236-
    We provide a model for systematic implementation of telemedicine within a large evaluation center for COVID-19 in the area of Baja California, Mexico. Our model is based on human-centric design factors and cross-disciplinary collaborations for scalable data-driven enablement of smartphone, cellular, and video teleconsultation technologies to link hospitals, clinics, and emergency medical services for point-of-care assessments of COVID testing, and for subsequent treatment and quarantine decisions. A multidisciplinary team was rapidly created, in cooperation with different institutions, including the Autonomous University of Baja California, the Ministry of Health, the Command, Communication and Computer Control Center of the Ministry of the State of Baja California (C4), Colleges of Medicine, and the College of Psychologists. Our objective is to provide information to the public, to evaluate COVID-19 in real time, and to track regional, municipal, and state-wide data in real time that informs supply chains and resource allocation in anticipation of a surge in COVID-19 cases.

    Big Data and Artificial Intelligence in Digital Finance

    This open access book presents how cutting-edge digital technologies like Big Data, Machine Learning, Artificial Intelligence (AI), and Blockchain are set to disrupt the financial sector. The book illustrates how recent advances in these technologies enable banks, FinTech firms, and financial institutions to collect, process, analyze, and fully leverage the very large amounts of data that are nowadays produced and exchanged in the sector. To this end, the book also describes some of the most popular Big Data, AI and Blockchain applications in the sector, including novel applications in the areas of Know Your Customer (KYC), Personalized Wealth Management and Asset Management, and Portfolio Risk Assessment, as well as a variety of novel Usage-based Insurance applications based on Internet-of-Things data. Most of the presented applications have been developed, deployed and validated in real-life digital finance settings in the context of the European Commission-funded INFINITECH project, a flagship innovation initiative for Big Data and AI in digital finance. This book is ideal for researchers and practitioners in Big Data, AI, banking and digital finance.

    Developing digital transformative capabilities of industrial businesses by leveraging the industrial internet of things

    Industrial businesses are going through a period of digital disruption, and firms are under severe pressure to undertake Digital Transformation and leverage the Industrial Internet of Things (IIoT). Yet there is next to no scholarly guidance for such an endeavour. Most industrial firms are developing their Digital Transformation strategies; however, they are unsure what kind of capabilities they should develop for such a transformation. Although academic literature on Digital Transformation and on how firms develop digital transformative capabilities is limited, a systematic literature review was performed to disentangle capability transformation processes and to examine how firms develop dynamic capabilities to remain competitive in a high-velocity environment. The current study extended dynamic capability theory and proposed digital transformative capabilities (DTCs) for Digital Transformation. To understand the IIoT landscape and how it influences Digital Transformation, an industry review was performed. The research was conducted in two phases. Building on the literature and industry reviews, the first phase comprised two qualitative exploratory studies. The preliminary exploratory study was conducted to gain an understanding of the IIoT landscape and how firms were developing capabilities for transformation. Based on the insights from the preliminary exploratory study, a detailed exploratory study was performed, which revealed critical themes for Digital Transformation; based on these themes, a conceptual framework for Digital Transformation was derived. The conceptual framework was divided into two models. The front-end model identified three DTCs (Business Model Transformation, Operating Model Transformation and Cultural Transformation), three inputs (Digital Twin, Digital Thread and Digital Mindset) and the factors influencing the DTCs.
The back-end model examined the influence of DTCs on dynamic capabilities, which may be indicative of digital transformation in a company. In the second phase, these two models were tested through a quantitative analysis, using data from 107 respondents at 87 industrial companies collected via a self-reported online questionnaire and analysed with multiple linear regression. The Digital Twin is widely touted as an important input for DTCs, but the results did not support this. Digital Thread as an input for DTCs was supported, and Digital Mindset as an input was partially supported. Moderator analysis yielded further insights: the moderators Technology Turbulence, Market Turbulence, Competitor Turbulence and Path Dependency had some positive moderation effects. The positive influence of ‘DTC – Business Model Transformation’ on dynamic capabilities indicative of digital transformation was not supported. However, the positive influence of ‘DTC – Operating Model Transformation’ was supported, and ‘DTC – Cultural Transformation’ was partially supported. The moderation effects of ecosystem partnership and of resource scarcity and constraints were partially supported, while the moderation effects of customer and market demands and of digital commitment were neither supported nor refuted.
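The moderation effects described above are typically tested by adding a predictor-by-moderator interaction term to the regression and checking its coefficient. A minimal numpy sketch on synthetic, noise-free data follows; the variable roles and the true coefficient values are hypothetical illustrations, not the study's data or results:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)   # predictor, e.g. a DTC score (hypothetical)
m = rng.normal(size=n)   # moderator, e.g. market turbulence (hypothetical)

# Synthetic outcome with a true interaction (moderation) effect of 0.5.
y = 1.0 + 2.0 * x + 0.3 * m + 0.5 * (x * m)

# Design matrix: intercept, main effects, and the interaction term.
X = np.column_stack([np.ones(n), x, m, x * m])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With noise-free data the least-squares fit recovers the coefficients exactly; in a real survey analysis one would inspect the standard error and p-value of the interaction coefficient to decide whether a moderation effect is supported.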