3,196 research outputs found
A survey on security issues and solutions at different layers of Cloud computing
Cloud computing offers scalable on-demand services to consumers with greater flexibility and lower infrastructure investment. Since Cloud services are delivered using classical network protocols and formats over the Internet, implicit vulnerabilities in these protocols, as well as threats introduced by newer architectures, raise many security and privacy concerns. In this paper, we survey the factors affecting Cloud computing adoption, vulnerabilities and attacks, and identify relevant solution directives to strengthen security and privacy in the Cloud environment.
A Novel Framework for Big Data Security Infrastructure Components
Big data encompasses the management of enormous volumes of data collected from various sources such as online social media content, log files, sensor records, surveys and online transactions. It is essential to provide new security models and efficient security designs and approaches for confronting the security and privacy aspects of such data. This paper intends to provide an initial analysis of the security challenges in Big Data. The paper introduces the basic concepts of Big Data and its enormous growth rate in terms of peta- and zettabytes. A Big Data Infrastructure Security Components Framework (BDAF) is proposed that includes components such as a Security Life Cycle, fine-grained data-centric access control policies, and the Dynamic Infrastructure Trust Bootstrap Protocol (DITBP). The framework allows deploying a trusted remote virtualised data processing environment with federated access control and identity management.
Efficient data uncertainty management for health industrial internet of things using machine learning
[EN] Among modern technologies, the industrial internet of things (IIoT) has seen rapid growth in the medical, transportation, and engineering fields. It consists of a self-governing configuration that cooperates with sensors to collect, process, and analyze the processes of a real-time system. In the medical domain, healthcare IIoT (HIIoT) provides analytics over huge amounts of data and offers low-cost storage systems, in collaboration with cloud systems, for monitoring patient information. However, it faces connectivity, node-failure, and rapid data delivery challenges in the development of e-health systems. Therefore, to address such concerns, this paper presents an efficient data uncertainty management model for HIIoT using machine learning (EDM-ML) that mitigates node failures and data irregularity. Its aim is to increase the efficiency of collecting and processing real-time data along with smart functionality against anonymous nodes. It develops an algorithm for sustaining health services against disruptions of network status and overheads. Also, its multi-objective function decreases the uncertainty in the management of medical data. Furthermore, it predicts routing decisions using a machine learning-based algorithm and increases the uniformity of health operations by balancing network resources and trust distribution. Finally, it provides a security algorithm and established control methods to protect the distributed data in the exposed health industry. Extensive simulations are performed, and their results reveal the significant performance of the proposed model, in terms of uncertainty and intelligence, over benchmark algorithms. This research is supported by the Artificial Intelligence & Data Analytics Lab (AIDA), CCIS, Prince Sultan University, Riyadh, Saudi Arabia. The authors are thankful for the support.
Haseeb, K.; Saba, T.; Rehman, A.; Ahmed, I.; Lloret, J. (2021). Efficient data uncertainty management for health industrial internet of things using machine learning. International Journal of Communication Systems. 34(16):1-14. https://doi.org/10.1002/dac.4948
Trusted Computing and Secure Virtualization in Cloud Computing
Large-scale deployment and use of cloud computing in industry
is accompanied, and at the same time hampered, by concerns regarding the protection of
data handled by cloud computing providers. One of the consequences of moving
data processing and storage off company premises is that organizations have
less control over their infrastructure. As a result, cloud service (CS) clients
must trust that the CS provider is able to protect their data and
infrastructure from both external and internal attacks. Currently however, such
trust can only rely on organizational processes declared by the CS
provider and cannot be remotely verified and validated by an external party.
Enabling the CS client to verify the integrity of the host where the
virtual machine instance will run, as well as to ensure that the virtual
machine image has not been tampered with, are some steps towards building
trust in the CS provider. Having the tools to perform such
verifications prior to the launch of the VM instance allows the CS
clients to decide at runtime whether certain data should be stored on, or calculations
should be made on, the VM instance offered by the CS provider.
This thesis combines three components -- trusted computing, virtualization technology
and cloud computing platforms -- to address issues of trust and
security in public cloud computing environments. Of the three components,
virtualization technology has had the longest evolution and is a cornerstone
for the realization of cloud computing. Trusted computing is a recent
industry initiative that aims to implement the root of trust in a hardware
component, the trusted platform module. The initiative has been formalized
in a set of specifications and is currently at version 1.2. Cloud computing
platforms pool virtualized computing, storage and network resources in
order to serve a large number of customers using a multi-tenant
multiplexing model, offering on-demand self-service over broad network access.
Open source cloud computing platforms are, similar to trusted computing, a
fairly recent technology in active development.
The issue of trust in public cloud environments is addressed
by examining the state of the art within cloud computing security and
subsequently addressing the issues of establishing trust in the launch of a
generic virtual machine in a public cloud environment. As a result, the thesis
proposes a trusted launch protocol that allows CS clients
to verify and ensure the integrity of the VM instance at launch time, as
well as the integrity of the host where the VM instance is launched. The protocol
relies on the use of Trusted Platform Module (TPM) for key generation and data protection.
The TPM also plays an essential part in the integrity attestation of the
VM instance host. Along with a theoretical, platform-agnostic protocol,
the thesis also describes a detailed implementation design of the protocol
using the OpenStack cloud computing platform.
In order to verify the implementability of the proposed protocol, a prototype
implementation has been built using a distributed deployment of OpenStack.
While the protocol covers only the trusted launch procedure using generic
virtual machine images, it presents a step aimed to contribute towards
the creation of a secure and trusted public cloud computing environment.
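The attestation step of a trusted launch can be illustrated with a toy sketch: the client sends a fresh nonce, the host's TPM produces a quote binding its platform measurements (PCR values) to that nonce, and the client launches the VM only if the quote verifies against the expected measurements. This is a minimal model, not the thesis's actual protocol: the function names are invented, and a real TPM's asymmetric attestation identity key (AIK) signature is stood in for by an HMAC.

```python
import hashlib
import hmac
import os

# Illustrative stand-ins for TPM operations; a hardware TPM signs quotes with
# an AIK, modeled here by a shared HMAC key. All names are hypothetical.

def tpm_quote(aik_key: bytes, pcr_values: list, nonce: bytes) -> tuple:
    """Host side: produce a 'quote' binding the PCR digest to the client's nonce."""
    digest = hashlib.sha256(b"".join(pcr_values)).digest()
    sig = hmac.new(aik_key, digest + nonce, hashlib.sha256).digest()
    return digest, sig

def client_verify(aik_key: bytes, expected_digest: bytes,
                  quote: tuple, nonce: bytes) -> bool:
    """Client side: check the quote's signature and freshness, then compare
    the reported measurements against the known-good platform digest."""
    digest, sig = quote
    expected_sig = hmac.new(aik_key, digest + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected_sig) and digest == expected_digest

# Trusted-launch flow: launch the VM only on a successful attestation.
aik = b"shared-attestation-key"
good_pcrs = [hashlib.sha256(b"bios").digest(), hashlib.sha256(b"hypervisor").digest()]
expected = hashlib.sha256(b"".join(good_pcrs)).digest()

nonce = os.urandom(16)
assert client_verify(aik, expected, tpm_quote(aik, good_pcrs, nonce), nonce)

# A tampered host reports different measurements, so the launch is refused.
tampered = [hashlib.sha256(b"bios").digest(), hashlib.sha256(b"rootkit").digest()]
assert not client_verify(aik, expected, tpm_quote(aik, tampered, nonce), nonce)
```

The nonce prevents replay of an old quote; the digest comparison is what lets the client refuse a host whose boot chain has been modified.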
Cloud Storage Performance and Security Analysis with Hadoop and GridFTP
Even though cloud servers have been around for a few years, most web hosts today have not yet converted to the cloud. If the purpose of a cloud server is to distribute and store files on the internet, FTP servers predate the cloud, and an FTP server is sufficient for distributing content. Is it therefore worthwhile to shift from an FTP server to a cloud server? Cloud storage providers declare high durability and availability for their users, and the ability to easily scale up storage space can save users considerable money. However, do they provide higher performance and better security features? Hadoop is a very popular platform for cloud computing. It is free software under the Apache License, is written in Java, and supports large-scale data processing in a distributed environment. Characteristics of Hadoop include partitioning of data, computing across thousands of hosts, and executing application computations in parallel. The Hadoop Distributed File System (HDFS) allows rapid data transfer up to thousands of terabytes and is capable of operating even in the case of node failure. GridFTP supports high-speed data transfer for wide-area networks; it is based on FTP and features multiple data channels for parallel transfers. This report describes the technology behind HDFS and the enhancement of Hadoop security with Kerberos. Based on the data transfer performance and security features of HDFS and the GridFTP server, we can decide whether we should replace the GridFTP server with HDFS. According to our experimental results, we conclude that the GridFTP server provides better throughput than HDFS, and that Kerberos has minimal impact on HDFS performance. We propose a solution in which users first authenticate with HDFS, then transfer the file from the HDFS server to the client using GridFTP.
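The proposed hybrid can be sketched as a toy model: authentication is handled on the HDFS side (Kerberos modeled as a simple ticket check), and only an authenticated client is handed to a GridFTP-style channel, whose parallel data channels are modeled by splitting and reassembling the payload. The class and method names below are illustrative assumptions, not APIs from the report.

```python
# Toy model of the proposed flow: authenticate with HDFS first, then move the
# bytes over GridFTP's parallel channels. All names are hypothetical.

class KerberosHDFS:
    """Models an HDFS namenode guarded by Kerberos tickets."""
    def __init__(self, tickets: set, files: dict):
        self.tickets = tickets    # valid Kerberos principals
        self.files = files        # path -> file contents

    def authenticate(self, ticket: str) -> bool:
        return ticket in self.tickets

class GridFTPChannel:
    """Models GridFTP's multiple parallel data channels."""
    def __init__(self, channels: int = 4):
        self.channels = channels

    def transfer(self, data: bytes) -> bytes:
        # Stripe the payload across parallel channels, then reassemble in order.
        chunks = [data[i::self.channels] for i in range(self.channels)]
        out = bytearray(len(data))
        for i, chunk in enumerate(chunks):
            out[i::self.channels] = chunk
        return bytes(out)

def fetch(hdfs: KerberosHDFS, ftp: GridFTPChannel, ticket: str, path: str) -> bytes:
    """Proposed flow: HDFS authentication gates the GridFTP transfer."""
    if not hdfs.authenticate(ticket):
        raise PermissionError("Kerberos authentication with HDFS failed")
    return ftp.transfer(hdfs.files[path])

hdfs = KerberosHDFS({"alice@REALM"}, {"/data/report.csv": b"a,b\n1,2\n"})
assert fetch(hdfs, GridFTPChannel(), "alice@REALM", "/data/report.csv") == b"a,b\n1,2\n"
```

The split mirrors the measured trade-off: HDFS (with Kerberos) supplies the access control, while the GridFTP channel supplies the higher-throughput transfer path.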
Will SDN be part of 5G?
For many, this is no longer a valid question and the case is considered
settled with SDN/NFV (Software Defined Networking/Network Function
Virtualization) providing the inevitable innovation enablers solving many
outstanding management issues regarding 5G. However, given the monumental task
of softwarization of radio access network (RAN) while 5G is just around the
corner and some companies have started unveiling their 5G equipment already,
the concern is very realistic that we may only see some point solutions
involving SDN technology instead of a fully SDN-enabled RAN. This survey paper
identifies all important obstacles in the way and looks at the state of the art
of the relevant solutions. This survey is different from the previous surveys
on SDN-based RAN as it focuses on the salient problems and discusses solutions
proposed within and outside SDN literature. Our main focus is on fronthaul,
backward compatibility, supposedly disruptive nature of SDN deployment,
business cases and monetization of SDN related upgrades, latency of general
purpose processors (GPP), and the additional security vulnerabilities that
softwarization brings to the RAN. We have also provided a summary of the
architectural developments in SDN-based RAN landscape as not all work can be
covered under the focused issues. This paper provides a comprehensive survey on
the state of the art of SDN-based RAN and clearly points out the gaps in the
technology.
Comment: 33 pages, 10 figures