Cloudbus Toolkit for Market-Oriented Cloud Computing
This keynote paper: (1) presents the 21st century vision of computing and
identifies various IT paradigms promising to deliver computing as a utility;
(2) defines the architecture for creating market-oriented Clouds and computing
atmosphere by leveraging technologies such as virtual machines; (3) provides
thoughts on market-based resource management strategies that encompass both
customer-driven service management and computational risk management to sustain
SLA-oriented resource allocation; (4) presents the work carried out as part of
our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a
Service software system containing SDK (Software Development Kit) for
construction of Cloud applications and deployment on private or public Clouds,
in addition to supporting market-oriented resource management; (ii)
internetworking of Clouds for dynamic creation of federated computing
environments for scaling of elastic applications; (iii) creation of 3rd party
Cloud brokering services for building content delivery networks and e-Science
applications and their deployment on capabilities of IaaS providers such as
Amazon along with Grid mashups; (iv) CloudSim supporting modelling and
simulation of Clouds for performance studies; (v) Energy Efficient Resource
Allocation Mechanisms and Techniques for creation and management of Green
Clouds; and (vi) pathways for future research.

Comment: 21 pages, 6 figures, 2 tables, Conference paper
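The SLA-oriented, market-based allocation described in point (3) can be illustrated with a toy greedy admission policy: serve the highest-paying requests first, and admit a request only if some VM can finish it within its SLA deadline. All names and the policy itself are a hypothetical sketch for illustration, not part of the actual Aneka/Cloudbus APIs:

```python
# Toy sketch of SLA-oriented, market-based resource allocation.
# Illustrative only; names are hypothetical, not the Cloudbus/Aneka API.
from dataclasses import dataclass

@dataclass
class Vm:
    mips: int                # processing capacity (million instructions/s)
    busy_until: float = 0.0  # time at which this VM becomes free

@dataclass
class Request:
    length: int      # workload size in million instructions
    deadline: float  # SLA deadline (seconds from now)
    price: float     # revenue earned if the SLA is met

def allocate(requests, vms):
    """Greedily admit the highest-paying requests whose SLA can be met."""
    accepted, revenue = [], 0.0
    for req in sorted(requests, key=lambda r: r.price, reverse=True):
        # Pick the VM that would finish this request earliest.
        vm = min(vms, key=lambda v: v.busy_until + req.length / v.mips)
        finish = vm.busy_until + req.length / vm.mips
        if finish <= req.deadline:  # admit only if the SLA holds (risk control)
            vm.busy_until = finish
            accepted.append(req)
            revenue += req.price
    return accepted, revenue
```

Rejecting requests whose deadline cannot be met is the simplest form of the computational risk management the abstract mentions: the provider forgoes revenue rather than incur an SLA violation.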
Enabling stream processing for people-centric IoT based on the fog computing paradigm
The world of machine-to-machine (M2M) communication is gradually moving from vertical, single-purpose solutions to multi-purpose, collaborative applications that interact across industry verticals, organizations, and people: a world of the Internet of Things (IoT). The dominant approach for delivering IoT applications relies on cloud-based IoT platforms that collect all the data generated by the sensing elements and centrally process the information to create real business value. In this paper, we present a system that follows the Fog Computing paradigm, in which the sensor resources, as well as the intermediate layers between embedded devices and cloud computing datacenters, participate by providing computational, storage, and control capabilities. We discuss the design aspects of our system and present a pilot deployment for evaluating its performance in a real-world environment. Our findings indicate that Fog Computing can address the ever-increasing amount of data that is inherent in an IoT world through effective communication among all elements of the architecture.
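The core fog-computing idea of processing near the source rather than shipping every reading to the cloud can be sketched as an edge node that reduces a raw sensor stream to per-window summaries, forwarding alerts only for anomalous values. This is a minimal hypothetical illustration of the paradigm, not the system described in the paper:

```python
# Minimal sketch of fog-style stream pre-processing at an edge node
# (hypothetical design, not the paper's system): aggregate raw readings
# into windows and forward only summaries plus anomaly alerts upstream.
from statistics import mean

def edge_process(readings, window=5, threshold=30.0):
    """Reduce a raw sensor stream to per-window summaries for the cloud."""
    uplink = []  # the messages that would actually be sent to the cloud
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summary = {"avg": round(mean(chunk), 2), "max": max(chunk)}
        if summary["max"] > threshold:  # flag anomalies for immediate action
            summary["alert"] = True
        uplink.append(summary)
    return uplink
```

With a window of 5, ten raw readings shrink to two uplink messages, which is the bandwidth reduction that makes the fog approach attractive for people-centric IoT workloads.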
Cold Storage Data Archives: More Than Just a Bunch of Tapes
The abundance of available sensor and derived data from large scientific
experiments, such as earth observation programs, radio astronomy sky surveys,
and high-energy physics already exceeds the storage hardware globally
fabricated per year. To that end, cold storage data archives are the---often
overlooked---spearheads of modern big data analytics in scientific,
data-intensive application domains. While high-performance data analytics has
received much attention from the research community, the growing number of
problems in designing and deploying cold storage archives has only received
very little attention.
In this paper, we take the first step towards bridging this gap in knowledge
by presenting an analysis of four real-world cold storage archives from three
different application domains. In doing so, we highlight (i) workload
characteristics that differentiate these archives from traditional,
performance-sensitive data analytics, (ii) design trade-offs involved in
building cold storage systems for these archives, and (iii) deployment
trade-offs with respect to migration to the public cloud. Based on our
analysis, we discuss several other important research challenges that need to
be addressed by the data management community.
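The deployment trade-off in point (iii) is, at its core, a cost comparison between on-premises media and a cloud archival tier, where cloud storage accrues monthly fees plus retrieval charges while tape is dominated by media and fixed operational costs. The sketch below uses made-up placeholder prices (not figures from the paper or any provider) purely to show the shape of such a model:

```python
# Illustrative cost model for the cold-storage cloud-migration trade-off.
# All prices are hypothetical placeholders, not provider quotes and not
# figures from the paper.

def archive_cost_cloud(tb, years, store_per_tb_month=1.0,
                       restore_per_tb=20.0, restores_per_year=0.1):
    """Total cost of keeping `tb` terabytes in a cloud archival tier."""
    storage = tb * store_per_tb_month * 12 * years   # monthly storage fees
    restores = tb * restore_per_tb * restores_per_year * years  # retrieval fees
    return storage + restores

def archive_cost_tape(tb, years, media_per_tb=5.0, fixed_per_year=10000.0):
    """Total cost of an on-premises tape archive (media + fixed operations)."""
    return tb * media_per_tb + fixed_per_year * years
```

Under these placeholder parameters, rarely accessed petascale archives favor tape, while small or frequently restored archives favor the cloud; the break-even point shifts with the restore rate, which is why workload characterization matters.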
Next Generation Cloud Computing: New Trends and Research Directions
The landscape of cloud computing has significantly changed over the last
decade. Not only have more providers and service offerings crowded the space,
but also cloud infrastructure that was traditionally limited to single provider
data centers is now evolving. In this paper, we firstly discuss the changing
cloud infrastructure and consider the use of infrastructure from multiple
providers and the benefit of decentralising computing away from data centers.
These trends have resulted in the need for a variety of new computing
architectures that will be offered by future cloud infrastructure. These
architectures are anticipated to impact areas, such as connecting people and
devices, data-intensive computing, the service space and self-learning systems.
Finally, we lay out a roadmap of challenges that will need to be addressed for
realising the potential of next generation cloud systems.

Comment: Accepted to Future Generation Computer Systems, 07 September 201
From “Sapienza” to “Sapienza, State Archives in Rome”. A looping effect bringing back to the original source communication and culture by innovative and low cost 3D surveying, imaging systems and GIS applications
Application of integrated low-cost measurement technologies, web GIS, computational photography techniques for communicating and sharing data, cloud computing systems, and big-data archiving. High-quality survey models, produced with multiple low-cost methods and technologies, as a container for sharing cultural and archival heritage: this is the aim guiding our research, described here in its primary applications. The SAPIENZA building, a XVI-century masterpiece that was the first unified headquarters of the University in Rome, has served since 1936, when the University moved to its newly built campus, as the main venue of the State Archives. With the collaboration of a group of students of the Faculty of Architecture, several integrated survey methods were successfully applied to the monument. The work began with a topographic survey, creating a reference on the ground and along the monument for the subsequent applications, followed by a GNSS RTK survey georeferencing points in the internal courtyard. Dense stereo-matching photogrammetry is nowadays an accepted method for generating accurate and scalable 3D survey models; because of its low cost it often substitutes for 3D laser scanning, and so it became our choice. A set of 360° shots was planned to create panoramic views of the double portico from the courtyard, plus additional single shots of some lateral spans and of the pillars facing the court, as a single operation with a double aim: to create linked panotours with hotspots to web-linked databases, and 3D textured, georeferenced surface models allowing the harmonic proportions of the classical architectural order to be studied. Free web-GIS platforms were used to load the work into Google Earth, and low-cost 3D prototypes of some representative parts were also produced.