Securely Launching Virtual Machines on Trustworthy Platforms in a Public Cloud
In this paper we consider the Infrastructure-as-a-Service (IaaS) cloud model, which allows cloud users to run their own virtual machines (VMs) on available cloud computing resources. IaaS gives enterprises the possibility to outsource their process workloads with minimal effort and expense. However, one major problem with existing approaches to cloud leasing is that users can only get contractual guarantees regarding the integrity of the offered platforms. The fact that IaaS users cannot themselves verify the platform integrity promised by the provider is a security risk that threatens the adoption of IaaS in general. In this paper we address this issue and propose a novel secure VM launch protocol using Trusted Computing techniques. This protocol allows IaaS users to securely bind a VM to a trusted computer configuration, such that the cleartext VM will only run on a platform that has been booted into a trustworthy state. This capability builds user confidence and can serve as an important enabler for creating trust in public clouds. We evaluate the feasibility of our proposed protocol via a full-scale system implementation and perform a system security analysis.
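The core idea of the abstract above — releasing the VM decryption key only on a platform whose measured boot state matches a trusted configuration — can be illustrated with a toy TPM-style seal/unseal sketch. This is not the paper's protocol: the `extend`/`seal_key` functions, component names, and key handling are illustrative stand-ins for real TPM PCR extension and sealing.

```python
import hashlib
import hmac
import os

def extend(pcr: bytes, component: bytes) -> bytes:
    # TPM-style PCR extend: new_pcr = H(old_pcr || H(component))
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

def measure_boot(components) -> bytes:
    # Hash-chain every boot component into a single platform measurement
    pcr = b"\x00" * 32
    for c in components:
        pcr = extend(pcr, c)
    return pcr

def seal_key(vm_key: bytes, expected_pcr: bytes, tpm_secret: bytes) -> bytes:
    # Wrap the VM key with a keystream bound to the expected platform state
    ks = hmac.new(tpm_secret, expected_pcr, hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(vm_key, ks))

def unseal_key(blob: bytes, current_pcr: bytes, tpm_secret: bytes) -> bytes:
    # Unwrapping only succeeds (yields the real key) if the current
    # measurement equals the one the key was sealed to
    ks = hmac.new(tpm_secret, current_pcr, hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(blob, ks))

trusted_boot = [b"bios-v2", b"bootloader-v5", b"hypervisor-v9"]  # hypothetical
tpm_secret = os.urandom(32)   # stands in for the TPM's storage root key
vm_key = os.urandom(32)       # key that encrypts the cleartext VM image

blob = seal_key(vm_key, measure_boot(trusted_boot), tpm_secret)

# A platform booted into the trusted state recovers the key ...
assert unseal_key(blob, measure_boot(trusted_boot), tpm_secret) == vm_key
# ... while a tampered boot chain yields garbage, so the cleartext VM
# never runs on an untrustworthy platform
tampered = [b"bios-v2", b"evil-bootloader", b"hypervisor-v9"]
assert unseal_key(blob, measure_boot(tampered), tpm_secret) != vm_key
```

The design point the sketch captures is that the binding is cryptographic, not contractual: no party has to trust the provider's promise, because the wrong platform state simply cannot derive the unwrapping keystream.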
FROM 3D SURVEYING DATA TO BIM TO BEM: THE INCUBE DATASET
In recent years, the improvement of sensors and methodologies for 3D reality-based surveying has exponentially enhanced the possibility of creating digital replicas of the real world. LiDAR technologies and photogrammetry are currently standard approaches for collecting 3D geometric information of indoor and outdoor environments at different scales. This information can potentially be part of a broader processing workflow that, starting from 3D surveyed data and through Building Information Model (BIM) generation, leads to more complex analyses of buildings’ features and behavior (Figure 1). However, creating BIM models, especially of historic and heritage assets (HBIM), is still resource-intensive and time-consuming due to the manual efforts required for data creation and enrichment. Improving 3D data processing, interoperability, and the automation of the BIM generation process are some of the trending research topics, and benchmark datasets are extremely helpful in evaluating newly developed algorithms and methodologies for these purposes. This paper introduces the InCUBE dataset, resulting from the activities of the recently funded EU InCUBE project, focused on unlocking EU building renovation through integrated strategies and processes for efficient built-environment management (including the use of innovative renewable energy technologies and digitalization). The dataset collects raw and processed data produced for the Italian demo site in the Santa Chiara district of Trento (Italy). The diversity of the shared data enables multiple possible uses, investigations, and developments, some of which are presented in this contribution.
Deploying Virtual Machines on Shared Platforms
In this report, we describe mechanisms for the secure deployment of virtual machines on shared platforms, looking into a telecommunication cloud use case, which is also presented in this report. The architecture we present focuses on the security requirements of the major stakeholders that are part of the scenario. This report comprehensively covers all major security aspects, including different security mechanisms and protocols, leveraging existing standards and the state of the art wherever applicable. In particular, our architecture uses TCG technologies for trust establishment in the deployment of operator virtual machines on shared resource platforms. We also propose a novel procedure for securely launching and cryptographically binding a virtual machine to a target platform, thereby protecting the operator virtual machine and its related credentials.
The medical science DMZ: a network design pattern for data-intensive medical science
Abstract:
Objective
We describe a detailed solution for maintaining high-capacity, data-intensive network flows (e.g., 10, 40, or 100+ Gbps) in a scientific, medical context while still adhering to security and privacy laws and regulations.
Materials and Methods
High-end networking, packet-filter firewalls, and network intrusion-detection systems.
Results
We describe a “Medical Science DMZ” concept as an option for secure, high-volume transport of large, sensitive datasets between research institutions over national research networks, and give 3 detailed descriptions of implemented Medical Science DMZs.
Discussion
The exponentially increasing amounts of “omics” data, high-quality imaging, and other rapidly growing clinical datasets have resulted in the rise of biomedical research “Big Data.” The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large datasets. Maintaining data-intensive flows that comply with the Health Insurance Portability and Accountability Act (HIPAA) and other regulations presents a new challenge for biomedical research. We describe a strategy that marries performance and security by borrowing from and redefining the concept of a Science DMZ, a framework that is used in physical sciences and engineering research to manage high-capacity data flows.
Conclusion
By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computing and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements.
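The central design trade-off behind a Science DMZ — replacing stateful, per-packet deep inspection with stateless filtering on the high-speed enclave, while delegating payload analysis to a passive intrusion-detection tap — can be sketched as a toy access-control evaluator. The rule set, host names, and port numbers below are hypothetical and are not drawn from any of the three deployments the paper describes.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    src: str      # remote collaborator network, simplified to an exact string
    dst: str      # local data-transfer node (DTN) inside the DMZ
    dport: int    # destination service port

# Stateless permit rules for the DMZ's data-transfer nodes: only known
# collaborator networks, only the bulk-transfer service port (all values
# hypothetical).
PERMIT = {
    ("131.243.0.0/16", "dtn1.example.edu", 2811),
    ("193.62.192.0/22", "dtn1.example.edu", 2811),
}

def permitted(flow: Flow) -> bool:
    """Stateless check: no connection tracking or payload inspection,
    so filtering can run at line rate on 10-100 Gbps links; deep
    inspection happens out-of-band on a passive IDS mirror port."""
    return (flow.src, flow.dst, flow.dport) in PERMIT

# Bulk transfer from a known collaborator is allowed at line rate
assert permitted(Flow("131.243.0.0/16", "dtn1.example.edu", 2811))
# Anything else (e.g., SSH from an arbitrary source) is dropped
assert not permitted(Flow("0.0.0.0/0", "dtn1.example.edu", 22))
```

The point of the sketch is that security for the data-intensive path comes from narrow, auditable reachability plus monitoring, rather than from a stateful firewall that would bottleneck multi-gigabit flows.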
Hybrid clouds for data-Intensive, 5G-Enabled IoT applications: an overview, key issues and relevant architecture
Hybrid cloud multi-access edge computing (MEC) deployments have been proposed as an efficient means to support Internet of Things (IoT) applications, relying on a plethora of nodes and data. In this paper, an overview of the area of hybrid clouds is given with respect to the relevant research areas, covering technologies and mechanisms for the formation of such MEC deployments and emphasizing several key issues that should be tackled by novel approaches, especially under the 5G paradigm. Furthermore, a decentralized hybrid cloud MEC architecture, realized as a Platform-as-a-Service (PaaS), is proposed, and its main building blocks and layers are thoroughly described. Aiming to offer a broad perspective on the business potential of such a platform, the stakeholder ecosystem is also analyzed. Finally, two use cases in the context of smart cities and mobile health are presented, aimed at showing how the proposed PaaS enables the development of respective IoT applications.
The European Commission's Case Against Microsoft: Fool Monti Kills Bill?
Cloud Computing Technology: Leveraging the Power of the Internet to Improve Business Performance
In recent years, Cloud Computing Technology (CCT) has emerged as a meaningful technology that can contribute to the operational efficiency of an IT platform by providing infrastructure and software solutions for the entire IT needs of an enterprise via the Internet. The cloud has revolutionized IT infrastructure. It is predicted that 2017 will mark the rapid proliferation of enterprises transitioning to cloud-based computing technology. The utilization of this innovative technology makes collaboration easier among companies and has the potential to create financial and operational benefits. This study discusses potential strategic benefits of this technology, highlights its evolving technologies and trends and their future impact, reviews the different phases necessary to deploy the technology, highlights key adoption factors, and surveys its potential application in different industries.
System Support For Stream Processing In Collaborative Cloud-Edge Environment
Stream processing is a critical technique for processing huge amounts of data in a real-time manner. Cloud computing has been used for stream processing due to its virtually unlimited computation resources. At the same time, we are entering the era of the Internet of Everything (IoE). The emerging edge computing paradigm benefits low-latency applications by leveraging computation resources in the proximity of data sources. Billions of sensors and actuators are being deployed worldwide, and the huge amounts of data generated by things are immersed in our daily life. It has become essential for organizations to be able to stream and analyze data, and to provide low-latency analytics on streaming data. However, cloud computing is inefficient for processing all data in a centralized environment in terms of network bandwidth cost and response latency. Although edge computing offloads computation from the cloud to the edge of the Internet, there is no data sharing and processing framework that efficiently utilizes computation resources in both the cloud and the edge. Furthermore, the heterogeneity of edge devices brings more difficulty to the development of collaborative cloud-edge applications.
To explore and attack the challenges of stream processing systems in collaborative cloud-edge environments, in this dissertation we design and develop a series of systems to support stream processing applications in hybrid cloud-edge analytics. Specifically, we develop a hierarchical and hybrid outlier detection model for multivariate time series streams that automatically selects the best model for different time series. We optimize one of the stream processing systems (i.e., Spark Streaming) to reduce its end-to-end latency. To facilitate the development of collaborative cloud-edge applications, we propose and implement a new computing framework, Firework, which allows stakeholders to share and process data by leveraging both the cloud and the edge. A vision-based cloud-edge application is implemented to demonstrate the capabilities of Firework. By combining all these studies, we provide comprehensive system support for stream processing in collaborative cloud-edge environments.
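The dissertation's hierarchical/hybrid outlier model is not reproduced here, but the kind of constant-memory, per-stream detection such systems build on can be sketched with a simple running z-score detector using Welford's online algorithm. All class names, thresholds, and sample readings are illustrative, not taken from the work above.

```python
import math

class StreamingZScore:
    """Toy single-variable streaming outlier detector: maintains a running
    mean and variance via Welford's online algorithm and flags points more
    than `k` standard deviations from the mean seen so far."""

    def __init__(self, k: float = 3.0):
        self.k = k
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0   # sum of squared deviations from the running mean

    def update(self, x: float) -> bool:
        # Test against the statistics of points seen *before* x
        outlier = False
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            outlier = std > 0 and abs(x - self.mean) > self.k * std
        # Welford's update: O(1) state per stream, which is what makes this
        # style of detector cheap enough to run on resource-poor edge nodes
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)
        return outlier

det = StreamingZScore(k=3.0)
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 10.3, 9.8, 10.1, 25.0]
flags = [det.update(x) for x in readings]
assert flags[-1] is True     # the 25.0 spike is flagged
assert not any(flags[:-1])   # in-range readings are not
```

A hybrid cloud-edge deployment would typically run cheap detectors like this at the edge and forward only flagged windows to the cloud for heavier, model-based analysis, which is the bandwidth/latency trade-off the abstract describes.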