Performance Improvement of Cloud Computing Data Centers Using Energy Efficient Task Scheduling Algorithms
Cloud computing is a technology that provides a platform for sharing resources such as software, infrastructure, applications, and other information. It has brought a revolution to the Information Technology industry by offering resources on demand. Clouds are essentially virtualized data centers with applications offered as services. A data center hosts hundreds or thousands of servers, comprising software and hardware, that respond to client requests. Performing these operations requires a large amount of energy. Cloud computing faces many challenges, such as data security, energy consumption, and server consolidation. This research work focuses on the study of task scheduling management in a cloud environment.
The main goal is to improve performance in data centers, that is, to increase resource utilization and reduce energy consumption. Energy-efficient scheduling of workloads helps reduce energy consumption in data centers and thus improves resource usage. This further reduces operational costs and benefits both clients and cloud service providers. In this paper, task scheduling algorithms in data centers are compared. CloudSim, a toolkit for modeling and simulating cloud computing environments, is used to implement and demonstrate the experimental results. The results analyze the energy consumed in data centers and show that reducing energy consumption improves cloud productivity.
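One common family of energy-efficient schedulers consolidates tasks onto as few active servers as possible so that idle machines can be powered down. The sketch below is illustrative only (it is not the paper's algorithm, and the task sizes and server capacity are hypothetical); it uses a first-fit-decreasing heuristic to show the consolidation idea:

```python
# Illustrative sketch (not the paper's exact algorithm): a greedy,
# energy-aware placement that packs tasks onto as few servers as
# possible, so unused servers can be powered down to save energy.
# Task sizes and server capacity are hypothetical.

def schedule(tasks, capacity):
    """Place task sizes on as few servers as possible (first-fit decreasing)."""
    servers = []       # remaining capacity of each powered-on server
    placement = {}     # task index -> server index
    for i, size in sorted(enumerate(tasks), key=lambda t: -t[1]):
        for j, free in enumerate(servers):
            if free >= size:          # fits on an already-active server
                servers[j] -= size
                placement[i] = j
                break
        else:                          # no fit: power on a new server
            servers.append(capacity - size)
            placement[i] = len(servers) - 1
    return placement, len(servers)

placement, active = schedule([4, 8, 1, 4, 2, 1], capacity=10)
print(active)  # -> 2: fewer active servers means less idle energy
```

Fewer active servers directly lowers the idle-power component of data-center energy, which is the effect the abstract's scheduling comparison targets.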
Cloud-native RStudio on Kubernetes for Hopsworks
In order to fully benefit from cloud computing, services are designed
following the "multi-tenant" architectural model, which is aimed at maximizing
resource sharing among users. However, multi-tenancy introduces challenges of
security, performance isolation, scaling, and customization. RStudio server is
an open-source Integrated Development Environment (IDE) accessible over a web
browser for the R programming language. We present the design and
implementation of a multi-user distributed system on Hopsworks, a
data-intensive AI platform, following the multi-tenant model that provides
RStudio as Software as a Service (SaaS). We use the popular cloud-native
technologies Docker and Kubernetes to solve the problems of performance
isolation, security, and scaling that arise in a multi-tenant
environment. We further enable secure data sharing in RStudio server instances
to provide data privacy and allow collaboration among RStudio users. We
integrate our system with Apache Spark, which can scale and handle Big Data
processing workloads. We also provide a UI where users can supply custom
configurations and take full control of their own RStudio server instances. Our
system was tested on a Google Cloud Platform cluster with four worker nodes,
each with 30GB of RAM. The tests on this cluster showed that 44 RStudio
servers, each with 2GB of RAM, can run concurrently. Our system can scale out
to potentially support hundreds of concurrently running RStudio servers by
adding more resources (CPUs and RAM) to the cluster or system.
Comment: 8 pages, 4 figures
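The reported capacity (44 servers on 4×30GB nodes rather than the naive 60) implies that some RAM per node is reserved for the OS and platform services. A back-of-the-envelope check, where the per-node overhead figure is an assumption of ours and not stated in the paper:

```python
# Back-of-the-envelope capacity check for the reported cluster:
# 4 worker nodes x 30GB RAM, 2GB per RStudio server instance.
# SYSTEM_OVERHEAD_GB is an ASSUMED per-node reservation (OS, kubelet,
# platform agents) chosen to match the reported figure; the paper
# does not state this number.

NODES = 4
RAM_PER_NODE_GB = 30
RAM_PER_RSTUDIO_GB = 2
SYSTEM_OVERHEAD_GB = 8  # assumption, not from the paper

servers_per_node = (RAM_PER_NODE_GB - SYSTEM_OVERHEAD_GB) // RAM_PER_RSTUDIO_GB
total = NODES * servers_per_node
print(total)  # -> 44, matching the reported concurrent-server count
```

The same arithmetic explains the scale-out claim: adding nodes (or RAM per node) grows capacity linearly once the fixed per-node overhead is paid.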
A Novel Capability Maturity Model with Quantitative Metrics for Securing Cloud Computing
University of Technology Sydney, Faculty of Engineering and Information Technology.
Cloud computing is a cutting-edge technology for building resource-sharing, on-demand infrastructures that support the Internet of Things (IoT), big data analytics, and software-defined systems/services. However, cloud infrastructures and their interconnections are increasingly exposed to attackers while accommodating a massive number of IoT devices and provisioning numerous sophisticated emerging applications.
Several cloud security models and standards deal with emerging cloud security threats. They provide simplistic, brute-force approaches to cloud security problems: preventing security breaches by cautiously avoiding possible causes, or fixing them through trial-and-error attempts. Two major issues have been identified with the current approach to cloud security. First, it lacks quantitative measures for assessing the security level of security domains within a cloud space. Second, it lacks a model that can depict the overall security status of the cloud system.
In light of the above, the aim of this dissertation is to investigate relevant quantitative security metrics and propose a novel Capability Maturity Model with quantitative security metrics for securing cloud computing. First, we propose a new security metric, Mean Security Remediation Cost, to assess the cost incurred by cloud stakeholders when a security attack has occurred. Moreover, we propose three novel quantitative models for quantifying the probability of a cloud threat materialising into an attack. Second, a new Cloud Security Capability Maturity Model (CSCMM) for the cloud is proposed. The model includes cloud-specific security domains and a quantitative assessment of the overall security of the cloud under consideration. To support the measurement of security maturity levels, a security metric framework is introduced. The CSCMM model is quantitatively validated using the proposed security metrics. We evaluate the model in a cloud computing environment and compare the outcomes by simulating different parameters of the proposed quantitative security metrics.
The thesis contributes to the theoretical body of knowledge in cloud security. It proposes, for the first time, a Capability Maturity Model for cloud security. Additionally, the model can be used in practice by managers, security experts, and practitioners both to assess the overall security status of an organisation/system and to take new quantitative measures to mitigate weaknesses in any specific aspect of the system identified by the assessment. The major research outcomes of the thesis have been published in international peer-reviewed journals and conferences in cyber security and cloud computing.
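To make the idea of a cost-based security metric concrete, here is a minimal sketch of a Mean Security Remediation Cost-style computation: an average remediation cost weighted by the probability that each threat materialises into an attack. The weighting scheme and the incident data are hypothetical; the thesis defines its own formulation.

```python
# Illustrative sketch of a "Mean Security Remediation Cost"-style
# metric: the probability-weighted average remediation cost across
# identified threats. The formula and data here are hypothetical,
# not the thesis's exact definition.

def mean_remediation_cost(incidents):
    """incidents: list of (attack_probability, remediation_cost) pairs."""
    weighted = sum(p * cost for p, cost in incidents)
    return weighted / sum(p for p, _ in incidents)

# Hypothetical threat profile: (probability of attack, cost if it occurs)
incidents = [(0.3, 10_000), (0.1, 50_000), (0.6, 2_000)]
print(mean_remediation_cost(incidents))  # -> 9200.0
```

A metric of this shape is what lets a maturity model score security domains quantitatively instead of by checklist alone.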
A threshold secure data sharing scheme for federated clouds
Cloud computing lets users view computing from a new direction, as it uses
existing technologies to provide better IT services at low cost. To offer
high QoS to customers according to the SLA, a cloud services broker or cloud
service provider uses individual cloud providers that work collaboratively to
form a federation of clouds. Federation is needed in applications such as
real-time online interactive applications and weather research and
forecasting, in which the data and applications are complex and distributed.
In these applications secret data must be shared, so a secure data sharing
mechanism is required in federated clouds to reduce the risk of data
intrusion and loss of service availability, and to ensure data integrity. In
this paper we propose a zero-knowledge data sharing scheme in which a Trusted
Cloud Authority (TCA) controls the federated clouds for data sharing: the
secret to be exchanged for computation is encrypted and retrieved by each
individual cloud at the end. Our scheme is based on the difficulty of solving
the Discrete Logarithm problem (DLOG) in a finite abelian group of large
prime order, which is believed to be computationally intractable. Our
proposed scheme thus provides data integrity in transit and data availability
even when one of the host providers is unavailable during the computation.
Comment: 8 pages, 3 figures, International Journal of Research in Computer
Science, 2012. arXiv admin note: text overlap with arXiv:1003.3920 by another
author
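The DLOG hardness assumption underpinning such schemes can be illustrated with a Diffie-Hellman-style exchange over a prime-order group: two parties derive a common key from public values, while an eavesdropper would have to solve a discrete logarithm to recover it. This is a toy sketch, not the paper's threshold scheme, and the parameters (a 127-bit Mersenne prime, generator 3) are assumptions chosen for brevity and are far too small for real use.

```python
# Toy sketch of the DLOG hardness assumption behind such schemes:
# a Diffie-Hellman-style exchange over a prime-order group. The
# paper's scheme is more elaborate (threshold, zero-knowledge);
# these parameters are toy-sized and cryptographically INSECURE.
import secrets

P = 2**127 - 1   # toy prime modulus (real deployments use far larger primes)
G = 3            # assumed generator for illustration

def keypair():
    x = secrets.randbelow(P - 2) + 1   # private exponent in [1, P-2]
    return x, pow(G, x, P)             # (private, public = G^x mod P)

tca_priv, tca_pub = keypair()      # e.g. the Trusted Cloud Authority
cloud_priv, cloud_pub = keypair()  # e.g. one federated cloud

# Both sides derive the same shared key from the other's public value;
# an eavesdropper seeing only tca_pub and cloud_pub must solve DLOG.
shared_tca = pow(cloud_pub, tca_priv, P)
shared_cloud = pow(tca_pub, cloud_priv, P)
assert shared_tca == shared_cloud
```

The shared key can then encrypt the secret the TCA distributes to each federated cloud, which is the role DLOG hardness plays in the abstract's scheme.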