
    Enhancing healthcare services through cloud service: a systematic review

    Although cloud-based healthcare services are booming, in-depth research in this field remains limited. This study addresses the shortcomings of previous research by analyzing journal articles from the last five years using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) systematic literature review methodology. The findings highlight the benefits of cloud-based healthcare services for healthcare providers and patients, including enhanced healthcare services, improved data security and privacy, and innovative information technology (IT) service delivery models. The study also identifies challenges associated with using cloud services in healthcare, notably security and privacy concerns, and proposes solutions to address them. It concludes by discussing future research directions and the need for a complete solution that reconciles the conflicting requirements of security, privacy, efficiency, and scalability of cloud technologies in healthcare.

    E-learning in the Cloud Computing Environment: Features, Architecture, Challenges and Solutions

    The need to constantly and consistently improve the quality and quantity of the educational system is essential. E-learning has emerged from the rapid cycle of change and the expansion of new technologies. Advances in information technology have increased network bandwidth and data-access speeds while reducing data-storage costs. In recent years, the adoption of cloud computing in educational settings has attracted the interest of major companies, leading to substantial investment in this area. Cloud computing describes the provision of hosting services over the Internet; it is widely expected to be the next generation of information technology architecture and offers great potential to enhance productivity and reduce costs. Cloud service providers offer their processing and memory resources to users, who pay for what they use and can access these resources for their computations anytime and anywhere. Cloud computing improves engineering education by providing an environment that can be accessed from anywhere, with on-demand access to educational resources. Using cloud computing in a system that supports remote education has its own set of characteristics and requires a distinct strategy. Thanks to cloud computing, students can access a wide variety of instructional engineering materials at any time and from any location, and can share their materials with other community members. In e-learning, cloud computing offers several advantages, such as effectively unlimited computing resources, high scalability, and reduced costs. Flexible cloud computing improves the quality of teaching and learning by offering a variety of resources for educators and students. In light of this, the current research presents cloud computing technology as a suitable and superior option for e-learning systems.

    Digital twin modeling method based on IFC standards for building construction processes

    Intelligent construction is a necessary step beyond traditional construction methods, and the digital twin is a crucial technology for enabling it. However, the construction field still lacks a unified method for building a standardized, universally applicable digital twin model, which is especially challenging in construction. This paper therefore proposes a general method for constructing a digital twin model of the construction process based on the Industry Foundation Classes (IFC) standard, aiming to realize real-time monitoring, control, and visual management of the construction site. The method builds a digital twin fusion model at three levels (geometric model, resource model, and behavioral model) by establishing an IFC semantic model of the construction process, storing the fusion-model data and the construction-site data in a database, and completing the dynamic interaction of the twin data within that database. A digital twin platform is also developed to realize visualization and control of the construction site. The method's effectiveness is demonstrated and verified through practical cases and analysis. The results show that the method can adapt to different scenarios on the construction site, which helps promote the application of digital twins in construction and provides a reference for both digital twin theory and practice.
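    The three-level fusion model with dynamic database interaction described above can be sketched in miniature. This is a hypothetical illustration only: the class and field names, the IFC-style GlobalId key, and the sensor-update interface are assumptions, not the authors' implementation.

```python
from dataclasses import dataclass, field

@dataclass
class TwinElement:
    global_id: str              # IFC GlobalId of the building element
    geometry: dict              # geometric model (e.g., element type, dimensions)
    resources: dict = field(default_factory=dict)  # crews, materials, equipment
    behavior: dict = field(default_factory=dict)   # live construction-site state

class TwinDatabase:
    """Stores the fusion model and merges real-time site data into it."""
    def __init__(self):
        self._elements: dict[str, TwinElement] = {}

    def register(self, element: TwinElement) -> None:
        self._elements[element.global_id] = element

    def sync_site_data(self, global_id: str, reading: dict) -> None:
        # Dynamic twin-data interaction: site readings update the behavior model.
        self._elements[global_id].behavior.update(reading)

    def snapshot(self, global_id: str) -> TwinElement:
        return self._elements[global_id]

db = TwinDatabase()
db.register(TwinElement("2O2Fr$t4X7Zf8NOew3FLOH",
                        geometry={"type": "IfcWall", "height_m": 3.0}))
db.sync_site_data("2O2Fr$t4X7Zf8NOew3FLOH", {"pour_status": "cured"})
print(db.snapshot("2O2Fr$t4X7Zf8NOew3FLOH").behavior["pour_status"])  # cured
```

    In a full system, the geometric and semantic data would come from parsing the IFC file itself, and the database would be a persistent store rather than an in-memory dictionary.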

    AI Lifecycle Zero-Touch Orchestration within the Edge-to-Cloud Continuum for Industry 5.0

    Human-centered artificial intelligence (HCAI) for Industry 5.0 marks a new phase of industrialization that places the worker at the center of the production process and uses new technologies to increase prosperity beyond jobs and growth. HCAI enables objectives that were unreachable by either humans or machines alone, but it also brings a new set of challenges. Our proposed method addresses these challenges through the knowlEdge architecture, which enables human operators to implement AI solutions using a zero-touch framework. It relies on containerized AI model training and execution, supported by a robust data pipeline and rounded off with human feedback and evaluation interfaces. The result is a platform built from a number of components spanning all major areas of the AI lifecycle. We outline both the architectural concepts and implementation guidelines, and explain how they advance HCAI systems and Industry 5.0. We also address the problems we encountered while implementing these ideas within the edge-to-cloud continuum. Further improvements to our approach may enhance the use of AI in Industry 5.0 and strengthen trust in AI systems.
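    A zero-touch lifecycle with a human-feedback gate, as the abstract describes, can be sketched as a staged pipeline. The stage names, the shared-context convention, and the approval gate below are illustrative assumptions, not the knowlEdge project's actual API.

```python
from typing import Callable

STAGES = ["ingest", "train", "evaluate", "deploy", "monitor"]

def run_lifecycle(handlers: dict[str, Callable[[dict], dict]],
                  context: dict) -> dict:
    """Run each lifecycle stage in order, threading a shared context through.

    The human-feedback interface can veto deployment via context['approved'].
    """
    for stage in STAGES:
        context = handlers[stage](context)
        if stage == "evaluate" and not context.get("approved", False):
            context["status"] = "halted: awaiting human feedback"
            return context
    context["status"] = "deployed"
    return context

handlers = {
    "ingest":   lambda c: {**c, "rows": 1000},
    "train":    lambda c: {**c, "model": "dnn-v1"},
    # Auto-approve only if the measured accuracy clears the operator's gate.
    "evaluate": lambda c: {**c, "accuracy": 0.93,
                           "approved": c["accuracy_gate"] <= 0.93},
    "deploy":   lambda c: c,
    "monitor":  lambda c: c,
}
result = run_lifecycle(handlers, {"accuracy_gate": 0.90})
print(result["status"])  # deployed
```

    In a real deployment each handler would wrap a containerized step (training job, model registry push, edge rollout), and the halt branch would route to the human evaluation interface instead of returning.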

    Modern computing: Vision and challenges

    Over the past six decades, the computing systems field has experienced significant transformations, profoundly impacting society through developments such as the Internet and the commodification of computing. Underpinned by technological advances, computer systems, far from being static, have continuously evolved and adapted to cover multifaceted societal niches. This has led to new paradigms such as cloud, fog, and edge computing and the Internet of Things (IoT), which offer fresh economic and creative opportunities. Nevertheless, this rapid change poses complex research challenges, especially in maximizing potential and enhancing functionality. To maintain an economical level of performance that meets ever-tighter requirements, one must understand the drivers behind the emergence and expansion of new models, and how contemporary challenges differ from past ones. To that end, this article investigates and assesses the factors influencing the evolution of computing systems, covering established systems and architectures as well as newer developments such as serverless computing, quantum computing, and on-device AI at the edge. Tracing the technological trajectory reveals trends that include the rapid obsolescence of frameworks due to business and technical constraints, a move towards specialized systems and models, and varying approaches to centralized and decentralized control. This comprehensive review of modern computing systems looks ahead to the future of research in the field, highlighting key challenges and emerging trends, and underscoring their importance in cost-effectively driving technological progress.

    Natural and Technological Hazards in Urban Areas

    Natural hazard events and technological accidents are distinct causes of environmental impact. Natural hazards are physical phenomena that have been active throughout geological time, whereas technological hazards result from actions or facilities created by humans. Increasingly, however, combined natural and man-made hazards occur together. Overpopulation and urban development in areas prone to natural hazards increase the impact of natural disasters worldwide. Urban areas are also frequently characterized by intense industrial activity and rapid, poorly planned growth that threatens the environment and degrades the quality of life. Proper urban planning is therefore crucial to minimize fatalities and reduce the environmental and economic impacts that accompany both natural and technological hazardous events.

    LIPIcs, Volume 251, ITCS 2023, Complete Volume


    Distributed Deep Neural-Network-Based Middleware for Cyber-Attacks Detection in Smart IoT Ecosystem: A Novel Framework and Performance Evaluation Approach

    Cyberattacks remain among the major threats and challenges of the modern digital world. With the growing number of Internet of Things (IoT) devices, security weaknesses in these devices, such as a lack of encryption, malware, ransomware, and IoT botnets, leave them vulnerable to attackers who can access and manipulate important data, threaten the system, and demand ransom. The lessons of earlier cyberattacks demand the development of best-practice benchmarks for cybersecurity, especially in modern smart environments. In this study, we propose a framework for discovering malware attacks using artificial intelligence (AI) methods that covers diverse and distributed scenarios. The new method proactively tracks network-traffic data to detect malware and attacks in the IoT ecosystem, making smart environments more secure and aware of possible future threats. The performance and concurrency of the deep neural network (DNN) model deployed on IoT devices are measured to validate the feasibility of in-production implementation. Deploying the DNN model on two selected IoT gateways yielded very promising results: less than a 30 kb/s increase in network bandwidth on average and just a 2% increase in CPU consumption. Physical memory and power consumption were also minimal, with 0.42 GB and 0.2 GB of memory used on the NVIDIA Jetson and Raspberry Pi devices, respectively, and an average 13.5% increase in power consumption per device with the deployed model. The ML models achieved nearly 93% detection accuracy and a 92% F1-score on both utilized datasets. These results show that our framework detects malware and attacks in smart environments accurately and efficiently.
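    The core idea of scoring network-flow features with a small DNN can be illustrated in a few lines. This is a minimal sketch with randomly initialized weights: the architecture, feature names, and threshold are assumptions for illustration, not the paper's trained model or datasets.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # 4 flow features -> 8 hidden units
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden units -> attack logit

def relu(x):
    return np.maximum(0, x)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def predict(flows: np.ndarray) -> np.ndarray:
    """flows: (n, 4) array, e.g. [pkt_rate, mean_len, syn_ratio, dst_entropy]
    (hypothetical feature names). Returns 0 (benign) or 1 (attack) per flow."""
    h = relu(flows @ W1 + b1)          # hidden-layer activations
    p = sigmoid(h @ W2 + b2)           # attack probability per flow
    return (p.ravel() >= 0.5).astype(int)

flows = np.array([[0.2, 0.5, 0.1, 0.3],    # ordinary-looking traffic
                  [9.0, 0.1, 0.9, 2.5]])   # burst of short SYN-heavy flows
labels = predict(flows)
print(labels.shape)  # (2,)
```

    A deployable version would load trained weights, read features from a live capture pipeline on the gateway, and batch inferences to keep the CPU and bandwidth overheads within the margins the study reports.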