
    Reliability issues related to the usage of Cloud Computing in Critical Infrastructures

    The use of cloud computing is extending to all kinds of systems, including those that form part of Critical Infrastructures, and measuring their reliability is becoming more difficult. Computing is becoming the fifth utility, in part thanks to the use of cloud services. Cloud computing is now used by all types of systems and organizations, including critical infrastructure, creating hidden inter-dependencies on both public and private cloud models. This paper investigates the use of cloud computing by critical infrastructure systems and the reliability and service-continuity risks associated with that use. Examples of its use in different critical industries are presented; even though cloud adoption by such systems is not yet widespread, the paper identifies a future risk. The concepts of macro and micro dependability, and the model we introduce, are useful for defining inter-dependencies and for analyzing the resilience of systems that depend on other systems, specifically in the cloud model.
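
    The macro/micro dependability concept lends itself to a simple dependency-graph calculation. Below is a minimal sketch, assuming serial dependencies combine multiplicatively; the service names and availability figures are illustrative assumptions, not the paper's actual model:

```python
# Minimal sketch of macro/micro dependability (hypothetical values).
# Micro: a service's intrinsic availability. Macro: the external
# services it depends on, e.g. a cloud provider or telecom link.

dependencies = {
    "scada_frontend": ["public_cloud", "telecom_link"],
    "public_cloud": [],
    "telecom_link": [],
}

availability = {
    "scada_frontend": 0.999,  # micro: intrinsic availability
    "public_cloud": 0.995,    # macro: cloud provider
    "telecom_link": 0.998,    # macro: network operator
}

def effective_availability(service: str) -> float:
    """Intrinsic availability times that of every dependency."""
    result = availability[service]
    for dep in dependencies[service]:
        result *= effective_availability(dep)
    return result

print(f"{effective_availability('scada_frontend'):.4f}")  # ~0.9920
```

    Even with high intrinsic availability, hidden dependencies pull the effective figure down, which is exactly the inter-dependency risk the paper highlights.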

    Hosting critical infrastructure services in the cloud environment considerations

    Critical infrastructure technology vendors will inevitably take advantage of the benefits offered by the cloud computing paradigm. While this may offer improved performance and scalability, the associated security threats impede this progression. Hosting critical infrastructure services in the cloud environment may seem inane to some, but remote access to the control system over the internet is already commonplace, and this shares the same characteristics as cloud computing, i.e., on-demand access and resource pooling. There is a wealth of data used within critical infrastructure, and there needs to be an assurance that the confidentiality, integrity and availability of this data remain. Authenticity and non-repudiation are also important security requirements for critical infrastructure systems. This paper provides an overview of the relationship between critical infrastructure and cloud computing, detailing security concerns and existing protection methods. A discussion of the direction of the area is presented, as is a survey of current protection methods and their weaknesses. Finally, we present our observations and our current research into hosting critical infrastructure services in the cloud environment, and the considerations for detecting cloud attacks. © 2015 Inderscience Enterprises Ltd.
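
    The integrity and authenticity requirements named here can be illustrated with a message authentication check. A minimal sketch using Python's standard library (the key handling and message format are assumptions for illustration; the paper does not prescribe this mechanism):

```python
import hashlib
import hmac

SHARED_KEY = b"replace-with-provisioned-key"  # hypothetical pre-shared key

def tag(message: bytes) -> str:
    """HMAC-SHA256 tag so the receiver can verify the integrity and
    authenticity of a control message in transit."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, received_tag: str) -> bool:
    # compare_digest avoids timing side channels
    return hmac.compare_digest(tag(message), received_tag)

msg = b"SET valve_7 OPEN"
t = tag(msg)
assert verify(msg, t)
assert not verify(b"SET valve_7 CLOSE", t)  # tampering is detected
```

    Note that a shared-key HMAC covers integrity and authenticity but not non-repudiation, which would require asymmetric digital signatures.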

    Cloud Computing and Big Data for Oil and Gas Industry Application in China

    The oil and gas industry is a complex data-driven industry with compute-intensive, data-intensive and business-intensive features. Cloud computing and big data have broad application prospects in the oil and gas industry. This research aims to highlight the cloud computing and big data issues and challenges arising from informatization in the oil and gas industry. In this paper, the distributed cloud storage architecture and its applications for seismic data in the oil and gas industry are addressed first. Then, cloud desktops for oil and gas industry applications are introduced in terms of efficiency, security and usability. Finally, big data architecture and security issues in the oil and gas industry are analyzed. Cloud computing and big data architectures have advantages in many aspects, such as system scalability, reliability, and serviceability. This paper also provides a brief description of the future development of cloud computing and big data in the oil and gas industry. Cloud computing and big data can provide convenient information sharing and high-quality service for the oil and gas industry.
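
    The distributed storage pattern the paper focuses on can be sketched as chunk-and-replicate placement. A minimal illustration (the chunk size, node names, and round-robin placement policy are assumptions, not the paper's architecture):

```python
from itertools import cycle

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MiB, plausible for large seismic volumes
REPLICAS = 3
NODES = ["node-a", "node-b", "node-c", "node-d"]  # hypothetical storage nodes

def place_chunks(file_size: int) -> dict:
    """Assign each chunk of a seismic file to REPLICAS distinct nodes."""
    n_chunks = -(-file_size // CHUNK_SIZE)  # ceiling division
    node_ring = cycle(range(len(NODES)))
    placement = {}
    for chunk_id in range(n_chunks):
        start = next(node_ring)
        placement[chunk_id] = [NODES[(start + r) % len(NODES)]
                               for r in range(REPLICAS)]
    return placement

for chunk, nodes in place_chunks(200 * 1024 * 1024).items():
    print(chunk, nodes)
```

    Replicating each chunk across distinct nodes is what buys the scalability, reliability, and serviceability the paper attributes to these architectures.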

    Data Migration in Cloud: A Systematic Review

    Data must be transferred securely during migration to maintain confidentiality, so that migration can be carried out robustly and effectively with no data loss due to active attacks. Many techniques and methods have already been proposed by researchers around the world to secure data migration. This paper provides a critical overview of these problems and solutions, and proposes Attunity as a solution for data migration, which can help optimize data replication and transfer, thus providing a simpler, faster and safer path to accelerating data delivery.
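
    One common safeguard against the data loss discussed here is end-to-end checksum verification. A minimal sketch with Python's standard library (the file paths and choice of SHA-256 are illustrative; Attunity itself is a commercial replication product and is not shown):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 so large datasets fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            digest.update(block)
    return digest.hexdigest()

source_hash = sha256_of("source/customers.db")    # before migration
migrated_hash = sha256_of("target/customers.db")  # after migration

if source_hash != migrated_hash:
    raise RuntimeError("Migration corrupted or tampered with the data")
```

    A replication tool automates the transfer itself; the checksum comparison is an independent end-to-end check that nothing was lost or altered in transit.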

    Contributions to Edge Computing

    Efforts related to Internet of Things (IoT), Cyber-Physical Systems (CPS), Machine to Machine (M2M) technologies, Industrial Internet, and Smart Cities aim to improve society through the coordination of distributed devices and analysis of resulting data. By the year 2020 there will be an estimated 50 billion network-connected devices globally and 43 trillion gigabytes of electronic data. Current practices of moving data directly from end-devices to remote and potentially distant cloud computing services will not be sufficient to manage future device and data growth. Edge Computing is the migration of computational functionality to sources of data generation. The importance of edge computing increases with the size and complexity of devices and resulting data. In addition, the coordination of global edge-to-edge communications, shared resources, high-level application scheduling, monitoring, measurement, and Quality of Service (QoS) enforcement will be critical to address the rapid growth of connected devices and associated data. We present a new distributed agent-based framework designed to address the challenges of edge computing. This actor-model framework implementation is designed to manage large numbers of geographically distributed services, composed of heterogeneous resources and communication protocols, in support of low-latency real-time streaming applications. As part of this framework, an application description language was developed and implemented. Using the application description language, a number of high-order management modules were implemented, including solutions for resource and workload comparison, performance observation, scheduling, and provisioning. A number of hypothetical and real-world use cases are described to support the framework implementation.
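
    The actor-model approach can be illustrated with a minimal message-driven agent. This is a sketch under stated assumptions (the agent name, message schema, and thread-per-agent mailbox are illustrative; the framework's actual implementation is not reproduced here):

```python
import queue
import threading
import time

class Agent:
    """A tiny actor: one mailbox, one thread, state touched only by itself."""
    def __init__(self, name: str):
        self.name = name
        self.mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, msg: dict):
        """Asynchronous message passing: callers never touch agent state."""
        self.mailbox.put(msg)

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg.get("type") == "measure":
                # e.g., report an edge metric upstream
                print(f"{self.name}: latency={msg['value']} ms")

edge = Agent("edge-gateway-01")
edge.send({"type": "measure", "value": 4.2})
time.sleep(0.1)  # give the daemon thread time to process before exiting
```

    A real framework layers supervision, discovery, scheduling, and QoS enforcement on top of this message-passing core.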

    Future Trends and Directions for Secure Infrastructure Architecture in the Education Sector: A Systematic Review of Recent Evidence

    The most efficient approach to giving large numbers of students access to computational resources is through a data center. A contemporary method for building the data center's computing infrastructure is the software-defined model, which enables user tasks to be processed in a reasonable amount of time and at a reasonable cost. In this article, the researcher examines potential directions and trends for a secure infrastructure design. Additionally, cloud-based educational software makes possible interoperable, highly reusable modules that can incorporate the newest trends in the education industry. The paper presents a Reference Architecture for a University Education System Using AWS Services. In conclusion, automation boosts efficiency by 20% while decreasing researcher involvement in kinetics modeling using CHEMKIN by 10%. Future work will focus on integrating GPUs into open-source programs that will be automated and shared on CloudFlame as a service resource for cooperation in the education sector.
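
    A reference architecture of this kind is often captured declaratively. The sketch below is purely illustrative: the tiers and service choices are assumptions, and the paper's actual AWS reference architecture is not reproduced here:

```python
# Hypothetical tiers of a university education architecture on AWS.
reference_architecture = {
    "presentation": {"service": "CloudFront + S3", "role": "course portals"},
    "application":  {"service": "EC2 / ECS",       "role": "LMS and lab apps"},
    "data":         {"service": "RDS",             "role": "student records"},
    "compute":      {"service": "GPU instances",
                     "role": "research workloads, e.g. kinetics modeling"},
}

for tier, spec in reference_architecture.items():
    print(f"{tier:>12}: {spec['service']:<16} -- {spec['role']}")
```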

    A forensics and compliance auditing framework for critical infrastructure protection

    Contemporary societies are increasingly dependent on products and services provided by Critical Infrastructure (CI) such as power plants, energy distribution networks, transportation systems and manufacturing facilities. Due to their nature, size and complexity, such CIs are often supported by Industrial Automation and Control Systems (IACS), which are in charge of managing assets and controlling everyday operations. As these IACS become larger and more complex, encompassing a growing number of processes and interconnected monitoring and actuating devices, the attack surface of the underlying CIs increases. This situation calls for new strategies to improve Critical Infrastructure Protection (CIP) frameworks, based on evolved approaches for data analytics, able to gather insights from the CI. In this paper, we propose an Intrusion and Anomaly Detection System (IADS) framework that adopts forensics and compliance auditing capabilities at its core to improve CIP. Adopted forensics techniques help to address, for instance, post-incident analysis and investigation, while the support of continuous auditing processes simplifies compliance management and service quality assessment. More specifically, after discussing the rationale for such a framework, this paper presents a formal description of the proposed components and functions and discusses how the framework can be implemented using a cloud-native approach, to address both functional and non-functional requirements. An experimental analysis of the framework's scalability is also provided.
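
    The continuous-auditing idea at the core of the framework can be illustrated with a simple rule check over collected event records. A minimal sketch (the record fields and the single compliance rule are assumptions for illustration, not the paper's formal component model):

```python
from datetime import datetime, timedelta

# Hypothetical IACS event records gathered by a collection layer.
events = [
    {"ts": datetime(2023, 5, 1, 3, 2), "user": "operator1", "action": "login"},
    {"ts": datetime(2023, 5, 1, 3, 3), "user": "operator1", "action": "setpoint_change"},
    {"ts": datetime(2023, 5, 1, 3, 4), "user": "unknown",   "action": "setpoint_change"},
]

def audit(events, window=timedelta(minutes=5)):
    """Flag control actions with no authenticated login inside the window."""
    findings = []
    logins = [(e["ts"], e["user"]) for e in events if e["action"] == "login"]
    for e in events:
        if e["action"] != "setpoint_change":
            continue
        ok = any(u == e["user"] and t <= e["ts"] <= t + window
                 for t, u in logins)
        if not ok:
            findings.append(e)  # evidence preserved for forensic analysis
    return findings

print(audit(events))  # the 'unknown' user's change is flagged
```

    Keeping the flagged records intact is what ties the auditing function to the post-incident, forensic role described above.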

    Analysis of the Requirements and Methods of Cloud Migration to SaaS Model

    Dissertation presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Knowledge Management and Business Intelligence.
    In a fast-changing world, where quality and security are must-haves and the threats are new every day, companies and IT administrations still doubt the benefits of running cloud-based technology and stick with local systems, which can be expensive to secure and vulnerable at their own risk. This thesis aims to provide guidance on, and an understanding of, the key factors that serve as a foundation for the process of migrating a system to cloud-based software. There are three main types of cloud computing service models; this thesis considers only SaaS, also known as Software-as-a-Service. The focus is to dispel some myths regarding cloud-based technology through a comprehensive migration model proposal with step-by-step indications, based on a thorough methodology supported by a literature review that clarifies and justifies the decisions made.