Analysis of cloud storage prices
Cloud storage is fast securing its role as a major repository for both
consumers and business customers. Many companies now offer storage solutions,
sometimes for free for limited amounts of capacity. We have surveyed the
pricing plans of a selection of major cloud providers and compared them using
the unit price as the means of comparison. All the providers, excepting Amazon,
adopt a bundling pricing scheme; Amazon follows instead a block-declining
pricing policy. We compare the pricing plans through a double approach: a
pointwise comparison for each value of capacity, and an overall comparison
using a two-part tariff approximation and a Pareto-dominance criterion. Under
both approaches, most providers appear to offer pricing plans that are more
expensive and can be excluded from a procurement selection in favour of a
limited number of dominant providers.
Comment: 17 pages, 17 figures, 17 references
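The Pareto-dominance comparison described above can be sketched in a few lines. This is a minimal illustration, assuming the two-part tariff approximation (a fixed fee plus a unit price per GB) mentioned in the abstract; the provider names and prices are hypothetical, not the surveyed providers' actual figures.

```python
# Sketch of a two-part tariff comparison: each plan is a fixed fee plus
# a unit price per GB. All names and prices are illustrative placeholders.

def cost(plan, capacity_gb):
    """Total monthly cost under a two-part tariff: fixed fee + unit price."""
    return plan["fixed_fee"] + plan["unit_price"] * capacity_gb

def pareto_dominates(a, b):
    """Plan a dominates b if it is no worse on both tariff components
    and strictly better on at least one."""
    no_worse = (a["fixed_fee"] <= b["fixed_fee"]
                and a["unit_price"] <= b["unit_price"])
    strictly_better = (a["fixed_fee"] < b["fixed_fee"]
                       or a["unit_price"] < b["unit_price"])
    return no_worse and strictly_better

plans = {
    "ProviderA": {"fixed_fee": 0.0, "unit_price": 0.023},
    "ProviderB": {"fixed_fee": 2.0, "unit_price": 0.020},
    "ProviderC": {"fixed_fee": 3.0, "unit_price": 0.025},
}

# A plan is excluded from procurement if some other plan dominates it.
dominated = {
    name for name, p in plans.items()
    if any(pareto_dominates(q, p)
           for other, q in plans.items() if other != name)
}
print("Dominated plans:", dominated)  # ProviderC is dominated by ProviderB
```

A pointwise comparison (evaluating `cost(plan, c)` for each capacity value `c`) complements this: a plan can be Pareto-dominated overall yet still be cheapest over a narrow capacity range when tariffs cross.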
Cloud Computing with Reference to Data Mining
Data mining techniques deployed in cloud computing allow users to retrieve important information from a fully integrated data warehouse, reducing infrastructure and storage costs. This analysis describes how data mining is used in cloud computing by comparing the key features offered by cloud computing companies. The integration of data mining techniques into routine operations has become commonplace, and data mining techniques and applications are of considerable interest in the cloud computing paradigm. Data mining in cloud computing lets organizations combine the management of software and data storage with the guarantee of well-organized, consistent and safe services for their users.
On the combination of multi-cloud and network coding for cost-efficient storage in industrial applications
The adoption of both Cyber-Physical Systems (CPSs) and the Internet-of-Things (IoT) has
enabled the evolution towards the so-called Industry 4.0. These technologies, together with cloud
computing and artificial intelligence, foster new business opportunities. Besides, several industrial
applications need immediate decision making, and fog computing is emerging as a promising solution
to address such a requirement. In order to achieve a cost-efficient system, we propose taking advantage
of spot instances, a service offered by cloud providers that provides resources at lower prices.
The main downside of these instances is that they do not ensure service continuity and may
suffer interruptions. An architecture that combines fog and multi-cloud deployments with
Network Coding (NC) techniques guarantees the fault tolerance needed in the cloud environment
and also reduces the amount of redundant data required to provide reliable services. In this paper
we analyze how NC can help reduce the storage cost and improve resource efficiency
for industrial applications based on a multi-cloud infrastructure. The cost analysis has been carried
out using both real AWS EC2 spot instance prices and, to complement them, prices obtained from
a model based on a finite Markov chain derived from real measurements. We have analyzed the
overall system cost as a function of different parameters, showing that configurations that
minimize storage yield a higher cost reduction, due to the strong impact of the storage cost.
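The storage saving that coding brings over plain replication can be sketched with an MDS-style erasure code, where any k of n stored fragments suffice to rebuild the data. This is a generic illustration of the trade-off the abstract describes; the (n, k) values are illustrative, not the paper's actual configuration.

```python
# Sketch: storage overhead of an (n, k) MDS-style erasure code spread
# across multiple clouds, versus keeping r full replicas. Parameters
# below are illustrative placeholders.

def coded_overhead(n, k):
    """Store n coded fragments; any k of them rebuild the data.
    Stored bytes are n/k times the original size."""
    return n / k

def replication_overhead(r):
    """Keep r full copies: stored bytes are r times the original size."""
    return r

# Tolerating the loss of 2 clouds either way:
# - replication needs r = 3 copies (3x storage),
# - a (5, 3) code also survives any 2 losses at only 5/3 ~ 1.67x storage.
print("replication:", replication_overhead(3))
print("(5,3) code: ", coded_overhead(5, 3))
```

The gap between the two overheads is precisely why configurations that minimize storage see the largest cost reductions when storage dominates the bill.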
This work has been partially supported by the Basque Government through the Elkartek program (Grant agreement no. KK-2018/00115), the H2020 research framework of the European Commission under the ELASTIC project (Grant agreement no. 825473), and the Spanish Ministry of Economy and Competitiveness through the CARMEN project (TEC2016-75067-C4-3-R), the ADVICE project (TEC2015-71329-C2-1-R), and the COMONSENS network (TEC2015-69648-REDC).
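The finite-Markov-chain price model mentioned above can be sketched as a simple two-state chain for spot-instance availability. The transition probabilities and the spot price below are made-up illustration values, not the measurements the paper derives its model from.

```python
# Sketch: two-state Markov chain (RUNNING / INTERRUPTED) for spot
# instance availability. All probabilities and prices are illustrative.

import random

RUNNING, INTERRUPTED = 0, 1

# P[s][t]: probability of moving from state s to state t each hour.
P = [[0.95, 0.05],   # RUNNING     -> RUNNING / INTERRUPTED
     [0.30, 0.70]]   # INTERRUPTED -> RUNNING / INTERRUPTED

SPOT_PRICE = 0.03    # $/hour while running (illustrative)

def simulate_cost(hours, seed=0):
    """Simulated spot cost over a horizon: pay only for RUNNING hours."""
    rng = random.Random(seed)
    state, cost = RUNNING, 0.0
    for _ in range(hours):
        if state == RUNNING:
            cost += SPOT_PRICE
        state = RUNNING if rng.random() < P[state][RUNNING] else INTERRUPTED
    return cost

print(f"simulated 720h spot cost: ${simulate_cost(720):.2f}")
```

With these numbers the chain's stationary probability of being in RUNNING is 0.30 / (0.05 + 0.30) ≈ 0.857, so the long-run expected cost is roughly 0.857 × hours × price; the simulation should land near that value.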
The state of SQL-on-Hadoop in the cloud
Managed Hadoop in the cloud, especially SQL-on-Hadoop, has been gaining attention recently. On Platform-as-a-Service (PaaS), analytical services like Hive and Spark come preconfigured for general-purpose use, giving companies quick entry and on-demand deployment of ready SQL-like solutions for their big data needs. This study evaluates cloud services from an end-user perspective, comparing providers including Microsoft Azure, Amazon Web Services, Google Cloud,
and Rackspace. The study focuses on the performance, readiness, scalability, and cost-effectiveness of the different solutions at entry/test-level cluster sizes. Results are based on over 15,000 Hive queries derived from the industry-standard TPC-H benchmark.
The study is framed within the ALOJA research project, which features an open source benchmarking and analysis platform that has been recently extended to support SQL-on-Hadoop engines.
The ALOJA Project aims to lower the total cost of ownership (TCO) of big data deployments and study their performance characteristics for optimization.
The study benchmarks cloud providers across a diverse range of instance types, using input data scales from 1 GB to 1 TB, in order to survey the popular entry-level PaaS SQL-on-Hadoop solutions, thereby establishing a common results base upon which subsequent research by the project can build. Initial results already show the main performance trends with respect to hardware and software configuration, pricing, and the similarities and architectural differences of the evaluated PaaS solutions. Whereas some
providers focus on decoupling storage and computing resources while offering network-based elastic storage, others keep the local processing model from Hadoop for high performance, at the cost of flexibility. Results also show the importance of application-level tuning and how keeping hardware and software stacks up to date can influence performance even more than replicating the on-premises model in the cloud.
This work is partially supported by the Microsoft Azure for Research program, the European Research Council (ERC) under
the EU's Horizon 2020 programme (GA 639595), the Spanish Ministry of Education (TIN2015-65316-P), and the Generalitat
de Catalunya (2014-SGR-1051).
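A cost-effectiveness comparison of the kind this study performs can be sketched as cost per benchmark run, derived from a cluster's hourly price and its measured execution time. The cluster names, prices, and runtimes below are hypothetical placeholders, not ALOJA's measured results.

```python
# Sketch: price/performance metric for hourly-billed SQL-on-Hadoop
# clusters running the same workload. All figures are illustrative.

def cost_per_run(hourly_price_usd, runtime_seconds):
    """Dollar cost of one benchmark execution on an hourly-billed cluster."""
    return hourly_price_usd * runtime_seconds / 3600.0

# Two hypothetical entry-level clusters running the same TPC-H-derived workload.
clusters = {
    "provider_x": {"hourly_price": 2.40, "runtime_s": 5400},
    "provider_y": {"hourly_price": 1.80, "runtime_s": 7600},
}

for name, c in clusters.items():
    usd = cost_per_run(c["hourly_price"], c["runtime_s"])
    print(f"{name}: ${usd:.2f} per run")
```

Note how the cheaper-per-hour cluster is not automatically the cheaper-per-run one: a slower runtime can more than cancel a lower hourly price, which is why the study reports cost-effectiveness alongside raw performance.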
Decision Support Tools for Cloud Migration in the Enterprise
This paper describes two tools that aim to support decision making during the
migration of IT systems to the cloud. The first is a modeling tool that
produces cost estimates of using public IaaS clouds. The tool enables IT
architects to model their applications, data and infrastructure requirements in
addition to their computational resource usage patterns. The tool can be used
to compare the cost of different cloud providers, deployment options and usage
scenarios. The second tool is a spreadsheet that outlines the benefits and
risks of using IaaS clouds from an enterprise perspective; this tool provides a
starting point for risk assessment. Two case studies were used to evaluate the
tools. The tools were useful as they informed decision makers about the costs,
benefits and risks of using the cloud.
Comment: To appear in IEEE CLOUD 201
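The kind of cost model such a modeling tool evaluates can be sketched as monthly IaaS cost built up from compute hours, storage, and data transfer, compared across providers. The provider names and prices below are hypothetical placeholders, not any real provider's tariff.

```python
# Sketch: comparing public IaaS providers with a simple monthly cost
# model over a usage pattern. All tariffs and usage figures are
# illustrative placeholders.

def monthly_cost(tariff, usage):
    """Combine compute, storage, and egress charges under one tariff."""
    return (tariff["instance_hour"] * usage["instance_hours"]
            + tariff["gb_month"] * usage["storage_gb"]
            + tariff["egress_gb"] * usage["egress_gb"])

tariffs = {
    "cloud_a": {"instance_hour": 0.10, "gb_month": 0.025, "egress_gb": 0.09},
    "cloud_b": {"instance_hour": 0.12, "gb_month": 0.020, "egress_gb": 0.08},
}

# Usage pattern: two instances running all month, plus storage and egress.
usage = {"instance_hours": 2 * 730, "storage_gb": 500, "egress_gb": 200}

for name in sorted(tariffs, key=lambda n: monthly_cost(tariffs[n], usage)):
    print(f"{name}: ${monthly_cost(tariffs[name], usage):.2f}/month")
```

Because the ranking depends on the usage pattern, re-running the same comparison under different deployment options and usage scenarios is exactly what makes such a tool useful for migration decisions.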