Cloudbus Toolkit for Market-Oriented Cloud Computing
This keynote paper: (1) presents the 21st century vision of computing and
identifies various IT paradigms promising to deliver computing as a utility;
(2) defines the architecture for creating market-oriented Clouds and computing
atmosphere by leveraging technologies such as virtual machines; (3) provides
thoughts on market-based resource management strategies that encompass both
customer-driven service management and computational risk management to sustain
SLA-oriented resource allocation; (4) presents the work carried out as part of
our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a
Service software system containing SDK (Software Development Kit) for
construction of Cloud applications and deployment on private or public Clouds,
in addition to supporting market-oriented resource management; (ii)
internetworking of Clouds for dynamic creation of federated computing
environments for scaling of elastic applications; (iii) creation of 3rd party
Cloud brokering services for building content delivery networks and e-Science
applications and their deployment on capabilities of IaaS providers such as
Amazon along with Grid mashups; (iv) CloudSim supporting modelling and
simulation of Clouds for performance studies; (v) Energy Efficient Resource
Allocation Mechanisms and Techniques for creation and management of Green
Clouds; and (vi) pathways for future research.
Comment: 21 pages, 6 figures, 2 tables, conference paper
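Item (iv) above refers to CloudSim, the toolkit for modelling and simulating Cloud environments. Purely as an illustration of what such a simulation setup looks like, here is a minimal sketch assuming the classic CloudSim 3.x Java API (CloudSim.init, Datacenter, DatacenterBroker, Vm, Cloudlet); the constructor signatures and all numeric parameters (MIPS, RAM, costs, cloudlet length) are illustrative assumptions and may differ between CloudSim releases, so treat this as a sketch rather than a verified recipe.

import java.util.ArrayList;
import java.util.Calendar;
import java.util.LinkedList;
import java.util.List;

import org.cloudbus.cloudsim.Cloudlet;
import org.cloudbus.cloudsim.CloudletSchedulerTimeShared;
import org.cloudbus.cloudsim.Datacenter;
import org.cloudbus.cloudsim.DatacenterBroker;
import org.cloudbus.cloudsim.DatacenterCharacteristics;
import org.cloudbus.cloudsim.Host;
import org.cloudbus.cloudsim.Pe;
import org.cloudbus.cloudsim.Storage;
import org.cloudbus.cloudsim.UtilizationModelFull;
import org.cloudbus.cloudsim.Vm;
import org.cloudbus.cloudsim.VmAllocationPolicySimple;
import org.cloudbus.cloudsim.VmSchedulerTimeShared;
import org.cloudbus.cloudsim.core.CloudSim;
import org.cloudbus.cloudsim.provisioners.BwProvisionerSimple;
import org.cloudbus.cloudsim.provisioners.PeProvisionerSimple;
import org.cloudbus.cloudsim.provisioners.RamProvisionerSimple;

public class MinimalCloudSimSketch {
    public static void main(String[] args) throws Exception {
        // Initialise the simulation kernel: one broker (user), no event tracing.
        CloudSim.init(1, Calendar.getInstance(), false);

        // One host with a single 1000-MIPS core; RAM/bandwidth/storage values are illustrative.
        List<Pe> peList = new ArrayList<>();
        peList.add(new Pe(0, new PeProvisionerSimple(1000)));
        List<Host> hostList = new ArrayList<>();
        hostList.add(new Host(0, new RamProvisionerSimple(2048), new BwProvisionerSimple(10000),
                1_000_000, peList, new VmSchedulerTimeShared(peList)));

        // Datacenter characteristics: architecture, OS, VMM, hosts, time zone and usage costs.
        DatacenterCharacteristics characteristics = new DatacenterCharacteristics(
                "x86", "Linux", "Xen", hostList, 10.0, 3.0, 0.05, 0.001, 0.0);
        new Datacenter("Datacenter_0", characteristics,
                new VmAllocationPolicySimple(hostList), new LinkedList<Storage>(), 0);

        // A broker submits one VM and one cloudlet (task) on behalf of the user.
        DatacenterBroker broker = new DatacenterBroker("Broker_0");
        List<Vm> vmList = new ArrayList<>();
        vmList.add(new Vm(0, broker.getId(), 1000, 1, 512, 1000, 10_000, "Xen",
                new CloudletSchedulerTimeShared()));
        List<Cloudlet> cloudletList = new ArrayList<>();
        Cloudlet cloudlet = new Cloudlet(0, 400_000, 1, 300, 300,
                new UtilizationModelFull(), new UtilizationModelFull(), new UtilizationModelFull());
        cloudlet.setUserId(broker.getId());
        cloudletList.add(cloudlet);
        broker.submitVmList(vmList);
        broker.submitCloudletList(cloudletList);

        // Run the discrete-event simulation and report finished cloudlets.
        CloudSim.startSimulation();
        CloudSim.stopSimulation();
        List<Cloudlet> finished = broker.getCloudletReceivedList();
        for (Cloudlet c : finished) {
            System.out.printf("Cloudlet %d finished at %.2f on VM %d%n",
                    c.getCloudletId(), c.getFinishTime(), c.getVmId());
        }
    }
}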
Harnessing Technology Schools Survey 2008: report 2: data
This report provides a detailed analysis of the data and methodologies adopted in the 2008 Harnessing Technology Schools Survey (HTSS) and includes copies of all research instruments used in the survey.
High-Performance Cloud Computing: A View of Scientific Applications
Scientific computing often requires the availability of a massive number of
computers for performing large-scale experiments. Traditionally, these needs
have been addressed by using high-performance computing solutions and installed
facilities such as clusters and supercomputers, which are difficult to set up,
maintain, and operate. Cloud computing provides scientists with a completely
new model of utilizing the computing infrastructure. Compute resources, storage
resources, as well as applications, can be dynamically provisioned (and
integrated within the existing infrastructure) on a pay-per-use basis. These
resources can be released when they are no longer needed. Such services are
often offered within the context of a Service Level Agreement (SLA), which
ensures the desired Quality of Service (QoS). Aneka, an enterprise Cloud
computing solution, harnesses the power of compute resources by relying on
private and public Clouds and delivers to users the desired QoS. Its flexible,
service-based infrastructure supports multiple programming paradigms that allow
Aneka to address a variety of different scenarios: from finance applications to
computational science. As examples of scientific computing in the Cloud, we
present a preliminary case study on using Aneka for the classification of gene
expression data and the execution of an fMRI brain imaging workflow.
Comment: 13 pages, 9 figures, conference paper
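The programming paradigms mentioned above include a task model in which many independent work units (for instance, one classification job per gene-expression sample) are farmed out across Cloud nodes. Purely as a conceptual sketch, and explicitly not the Aneka SDK itself, the plain-Java fragment below shows the same bag-of-tasks pattern on a local thread pool; the classifySample helper and the sample names are hypothetical stand-ins for the real classification step.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Conceptual bag-of-tasks sketch (NOT the Aneka SDK): each independent task
// classifies one hypothetical gene-expression sample; a local thread pool
// stands in for the Cloud nodes that a platform such as Aneka would provision.
public class BagOfTasksSketch {

    // Hypothetical per-sample classification step; a real case study would
    // invoke an actual classifier here.
    static String classifySample(String sampleId) {
        return sampleId + " -> class A";
    }

    public static void main(String[] args) throws Exception {
        List<String> samples = List.of("sample-001", "sample-002", "sample-003");

        ExecutorService pool = Executors.newFixedThreadPool(4); // stand-in for Cloud workers
        List<Future<String>> results = new ArrayList<>();
        for (String sample : samples) {
            results.add(pool.submit((Callable<String>) () -> classifySample(sample)));
        }
        for (Future<String> result : results) {
            System.out.println(result.get()); // gather results as tasks complete
        }
        pool.shutdown();
    }
}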
Systematizing Decentralization and Privacy: Lessons from 15 Years of Research and Deployments
Decentralized systems are a subset of distributed systems where multiple
authorities control different components and no authority is fully trusted by
all. This implies that any component in a decentralized system is potentially
adversarial. We review fifteen years of research on decentralization and
privacy, and provide an overview of key systems, as well as key insights for
designers of future systems. We show that decentralized designs can enhance
privacy, integrity, and availability but also require careful trade-offs in
terms of system complexity, properties provided, and degree of
decentralization. These trade-offs need to be understood and navigated by
designers. We argue that a combination of insights from cryptography,
distributed systems, and mechanism design, aligned with the development of
adequate incentives, is necessary to build scalable and successful
privacy-preserving decentralized systems.
Harnessing high-dimensional hyperentanglement through a biphoton frequency comb
Quantum entanglement is a fundamental resource for secure information
processing and communications, where hyperentanglement and high-dimensional
entanglement have separately been proposed as routes to high data capacity and
error resilience. The continuous-variable nature of energy-time entanglement
makes it an ideal candidate for efficient high-dimensional coding with minimal
limitations. Here we demonstrate the first simultaneous high-dimensional
hyperentanglement using a biphoton frequency comb to harness the full potential
in both the energy and time domains. The long-postulated Hong-Ou-Mandel quantum
revival is exhibited, with up to 19 time bins and 96.5% visibilities. We
further witness the high-dimensional energy-time entanglement through Franson
revivals, which are observed periodically at integer time bins with 97.8%
visibility. This qudit state is observed to simultaneously violate the
generalized Bell inequality by up to 10.95 standard deviations while exhibiting
recurrent Clauser-Horne-Shimony-Holt (CHSH) S-parameters of up to 2.76. Our
biphoton frequency comb provides a platform for photon-efficient quantum
communications towards the ultimate channel capacity through
energy-time-polarization high-dimensional encoding.
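For reference, the CHSH parameter quoted above is the standard two-setting Bell-test quantity; the following reminder of its definition and bounds is textbook material rather than anything specific to this paper:

  S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad
  |S| \le 2 \ \text{(local hidden variables)}, \qquad
  |S| \le 2\sqrt{2} \approx 2.83 \ \text{(Tsirelson bound)}.

Here E(x,y) denotes the correlation of measurement outcomes for settings x and y on the two photons, so the reported S-parameters of up to 2.76 exceed the classical bound of 2 while remaining below the quantum limit.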