BIG DATA AND CLOUD COMPUTING
Big Data is a data analysis methodology enabled by recent advances in technology and architecture. However, big data entails an enormous commitment of hardware and processing resources, and these adoption costs put big data technology out of reach for small and medium-sized businesses. Cloud computing offers these businesses a path to big data implementation. Big data processing is performed through a programming paradigm known as MapReduce. Typically, implementing the MapReduce paradigm requires network-attached storage and parallel processing. The computing needs of MapReduce programming are often beyond what small and medium-sized businesses are able to commit. Cloud computing is on-demand network access to computing resources provided by an outside entity. Common service models for cloud computing include platform as a service (PaaS), software as a service (SaaS), infrastructure as a service (IaaS), and hardware as a service (HaaS). The three deployment models of cloud computing are the public cloud, the private cloud, and the hybrid cloud. A public cloud offers pay-as-you-go services to the general public. A private cloud is a business's internal data center, built on cloud architecture but not available to the general public. The hybrid cloud is a combination of the public cloud and the private cloud. Three major reasons for small to medium-sized businesses to use cloud computing for big data technology implementation are hardware cost reduction, processing cost reduction, and the ability to test the value of big data. The major concerns regarding cloud computing are security and loss of control.
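The MapReduce paradigm described above can be sketched in miniature. The following is a single-process Python illustration of the map, shuffle, and reduce phases on a word-count task (the task, function names, and sample input are chosen for illustration; a real deployment such as Hadoop distributes these phases across a cluster):

```python
from itertools import groupby
from operator import itemgetter

def map_phase(document):
    # Map: emit a (word, 1) pair for each word in an input split.
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(word, counts):
    # Reduce: sum all counts emitted for the same word.
    return (word, sum(counts))

def mapreduce_word_count(documents):
    # Run the mapper over every input split.
    pairs = [pair for doc in documents for pair in map_phase(doc)]
    # Shuffle: group intermediate pairs by key (the word).
    pairs.sort(key=itemgetter(0))
    grouped = groupby(pairs, key=itemgetter(0))
    # Aggregate each group into a final (word, count) pair.
    return dict(reduce_phase(word, (count for _, count in group))
                for word, group in grouped)

docs = ["big data in the cloud", "cloud computing and big data"]
print(mapreduce_word_count(docs))
# → {'and': 1, 'big': 2, 'cloud': 2, 'computing': 1, 'data': 2, 'in': 1, 'the': 1}
```

In a distributed setting the shuffle step is what requires the networked storage and inter-node communication mentioned in the abstract; here it is simulated by a sort-and-group over in-memory pairs.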
High-Performance Cloud Computing: A View of Scientific Applications
Scientific computing often requires the availability of a massive number of
computers for performing large scale experiments. Traditionally, these needs
have been addressed by using high-performance computing solutions and installed
facilities such as clusters and supercomputers, which are difficult to set up,
maintain, and operate. Cloud computing provides scientists with a completely
new model of utilizing the computing infrastructure. Compute resources, storage
resources, as well as applications, can be dynamically provisioned (and
integrated within the existing infrastructure) on a pay per use basis. These
resources can be released when they are no longer needed. Such services are
often offered within the context of a Service Level Agreement (SLA), which
ensures the
desired Quality of Service (QoS). Aneka, an enterprise Cloud computing
solution, harnesses the power of compute resources by relying on private and
public Clouds and delivers to users the desired QoS. Its flexible and service
based infrastructure supports multiple programming paradigms that make Aneka
address a variety of different scenarios: from finance applications to
computational science. As examples of scientific computing in the Cloud, we
present a preliminary case study on using Aneka for the classification of gene
expression data and the execution of an fMRI brain imaging workflow.