
    Enabling and sharing storage space under a federated cloud environment

    To support the Portuguese scientific community, LIP (Laboratório de Instrumentação e Física Experimental de Partículas) has been operating a high-performance computing (HPC) infrastructure under a grant from the Infraestrutura Nacional de Computação Distribuída (INCD) program. LIP is now promoting another initiative for the same community: building a Cloud Computing (CC) service that orchestrates the three fundamental resources: compute, network, and storage. The main goal of this dissertation is to research, implement, benchmark, and adopt the most appropriate backend storage architecture for OpenStack, the cloud computing software chosen by LIP (following EGI, the European Grid Infrastructure) as the platform to be deployed in the new CC-INCD program. Our objectives are: a) to gain an understanding of OpenStack – its architecture and the way it works to offer an Infrastructure-as-a-Service (IaaS) platform; b) to identify candidates suitable for deployment as OpenStack storage backends, able to store templates of virtual machines (images, in OpenStack terminology), ISO images (CDs/DVDs), and ephemeral and persistent virtual disks for VM instances; c) to present a preliminary study of three file systems that are strong candidates for integration with OpenStack: NFS, Ceph, and GlusterFS; and d) to choose one candidate, integrate it with OpenStack, and perform an experimental evaluation.
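If Ceph were the backend ultimately chosen, wiring it into OpenStack's block-storage service (Cinder) amounts to a driver configuration of roughly the following shape. This is an illustrative sketch, not the dissertation's actual deployment; the pool, user, and backend names are placeholders.

```ini
# cinder.conf -- illustrative Ceph (RBD) backend for OpenStack block storage
[DEFAULT]
enabled_backends = ceph

[ceph]
volume_driver = cinder.volume.drivers.rbd.RBDDriver
volume_backend_name = ceph
rbd_pool = volumes
rbd_ceph_conf = /etc/ceph/ceph.conf
rbd_user = cinder
```

A similar configuration on the image service (Glance) would let VM templates and ISO images live in the same Ceph cluster, which is one reason Ceph is a strong candidate among the three file systems studied.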

    Towards distributed architecture for collaborative cloud services in community networks

    Internet and communication technologies have lowered the costs for communities to collaborate, leading to new services like user-generated content and social computing; through collaboration, collectively built infrastructures like community networks have also emerged. Community networks form when individuals and local organisations from a geographic area team up to create and run a community-owned IP network to satisfy the community’s demand for ICT, such as facilitating Internet access and providing services of local interest. The consolidation of today’s cloud technologies now offers the possibility of collectively built community clouds, building upon user-generated content and user-provided networks towards an ecosystem of cloud services. To address the limitations and enhance the utility of community networks, we propose a collaborative distributed architecture for building a community cloud system that employs resources contributed by the members of the community network for provisioning infrastructure and software services. Such an architecture needs to be tailored to the specific social, economic, and technical characteristics of community networks for community clouds to be successful and sustainable. Through real deployments of clouds in community networks and evaluation of application performance, we show that community clouds are feasible. Our results may encourage collaborative, innovative cloud-based services made possible with the resources of a community.

    HYBRID CLOUD METHODOLOGY FOR SAFE APPROVED DEDUPLICATIONS

    Previous systems cannot support differential-authorization duplicate checks, which many applications require. Recently, an architecture was proposed that consists of dual clouds for securely outsourcing data as well as arbitrary computations to an untrusted commodity cloud. With the advent of cloud computing, efficient and secure data deduplication has attracted much attention from the research community. Data deduplication is a specialized data compression technique generally introduced to eliminate duplicate copies of repeated storage data. In our work we solve the problem of deduplication with differential privileges in cloud computing by developing a hybrid cloud architecture consisting of a public cloud and a private cloud.

    Cold Storage Data Archives: More Than Just a Bunch of Tapes

    The abundance of available sensor and derived data from large scientific experiments, such as earth observation programs, radio astronomy sky surveys, and high-energy physics, already exceeds the storage hardware globally fabricated per year. To that end, cold storage data archives are the---often overlooked---spearheads of modern big data analytics in scientific, data-intensive application domains. While high-performance data analytics has received much attention from the research community, the growing number of problems in designing and deploying cold storage archives has received very little. In this paper, we take a first step towards bridging this gap in knowledge by presenting an analysis of four real-world cold storage archives from three different application domains. In doing so, we highlight (i) workload characteristics that differentiate these archives from traditional, performance-sensitive data analytics, (ii) design trade-offs involved in building cold storage systems for these archives, and (iii) deployment trade-offs with respect to migration to the public cloud. Based on our analysis, we discuss several other important research challenges that need to be addressed by the data management community.

    TOWARDS A NOVEL HYBRID APPROACH FOR REMOVING DUPLICATE COPIES OF REPEATED DATA

    Previous systems cannot support differential-authorization duplicate checks, which many applications require. Recently, an architecture was proposed that consists of twin clouds for securely outsourcing data as well as arbitrary computations to an untrusted commodity cloud. With the advent of cloud computing, efficient and secure data deduplication has attracted much attention from the research community. Data deduplication is a specialized data compression technique generally introduced to eliminate duplicate copies of repeated storage data. Distinct from established systems, the private cloud serves as a proxy that allows the data owner to securely execute the duplicate check with differential privileges; this architecture is practical and has attracted much consideration from researchers. In our work we solve the problem of deduplication with differential privileges in cloud computing by envisioning a hybrid cloud architecture consisting of a public cloud and a private cloud.
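The core of a privilege-aware duplicate check can be sketched in a few lines of Python. This is a minimal illustration of the idea, not the paper's actual scheme: the duplicate-check tag is derived from both the file contents and the uploader's privilege, so identical data uploaded under different privileges does not collide, while uploads under the same privilege are stored only once. The class and method names are hypothetical, and a real system would add convergent encryption and route tag generation through the private-cloud proxy.

```python
import hashlib


class DedupStore:
    """Toy privilege-scoped deduplicated store (illustrative only)."""

    def __init__(self):
        self.blocks = {}  # tag -> data; each unique (privilege, data) pair stored once

    def tag(self, data: bytes, privilege: str) -> str:
        # Duplicate-check tag depends on the privilege as well as the contents,
        # so users with different rights cannot confirm each other's duplicates.
        return hashlib.sha256(privilege.encode() + b"\x00" + data).hexdigest()

    def put(self, data: bytes, privilege: str) -> str:
        t = self.tag(data, privilege)
        if t not in self.blocks:   # duplicate check
            self.blocks[t] = data  # only the first copy is physically stored
        return t


store = DedupStore()
t1 = store.put(b"quarterly report", "finance")
t2 = store.put(b"quarterly report", "finance")  # duplicate: deduplicated
t3 = store.put(b"quarterly report", "hr")       # same bytes, different privilege
assert t1 == t2 and t1 != t3
```

With plain content hashing, the two privileges would share one copy; scoping the tag by privilege trades some storage savings for the differential-authorization property the paper targets.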

    Towards Knowledge in the Cloud

    Knowledge in the form of semantic data is becoming more and more ubiquitous, and the need arises for scalable, dynamic systems to support collaborative work with such distributed, heterogeneous knowledge. We extend the “data in the cloud” approach that is emerging today to “knowledge in the cloud”, with support for handling semantic information, organizing and finding it efficiently, and providing reasoning and quality support. Both the life sciences and emergency response fields are identified as strong potential beneficiaries of having “knowledge in the cloud”.

    Review of the environmental and organisational implications of cloud computing: final report.

    Cloud computing – where elastic computing resources are delivered over the Internet by external service providers – is generating significant interest within HE and FE. In the cloud computing business model, organisations or individuals contract with a cloud computing service provider on a pay-per-use basis to access data centres, application software or web services from any location. This provides an elasticity of provision which the customer can scale up or down to meet demand. This form of utility computing potentially opens up a new paradigm in the provision of IT to support administrative and educational functions within HE and FE. Further, the economies of scale and increasingly energy-efficient data centre technologies which underpin cloud services mean that cloud solutions may also have a positive impact on carbon footprints. In response to the growing interest in cloud computing within UK HE and FE, JISC commissioned the University of Strathclyde to undertake a Review of the Environmental and Organisational Implications of Cloud Computing in Higher and Further Education [19].

    Digital curation and the cloud

    Digital curation involves a wide range of activities, many of which could benefit from cloud deployment to a greater or lesser extent. These range from infrequent, resource-intensive tasks which benefit from the ability to rapidly provision resources to day-to-day collaborative activities which can be facilitated by networked cloud services. Associated benefits are offset by risks such as loss of data or service level, legal and governance incompatibilities, and transfer bottlenecks. There is considerable variability across both risks and benefits according to the service and deployment models being adopted and the context in which activities are performed. Some risks, such as legal liabilities, are mitigated by the use of alternative models, e.g. private clouds, but this is typically at the expense of benefits such as resource elasticity and economies of scale. The Infrastructure-as-a-Service model may provide a basis on which more specialised software services may be provided. There is considerable work to be done in helping institutions understand the cloud and its associated costs, risks and benefits, and how these compare to their current working methods, in order that the most beneficial uses of cloud technologies may be identified. Specific proposals, echoing recent work coordinated by EPSRC and JISC, are the development of advisory, costing and brokering services to facilitate appropriate cloud deployments, the exploration of opportunities for certifying or accrediting cloud preservation providers, and the targeted publicity of outputs from pilot studies to the full range of stakeholders within the curation lifecycle, including data creators and owners, repositories, institutional IT support professionals and senior managers.

    A MIX CLOUD TO ELIMINATE REPLICA DATA CERTIFIED SAFE APPROACH

    Previous systems cannot support differential-authorization duplicate checks, which many applications require. Recently, an architecture was proposed that comprised dual clouds for securely outsourcing data as well as arbitrary computations to an untrusted commodity cloud. With the development of cloud computing, efficient and secure data deduplication has attracted much attention from the research community. Data deduplication is a specialized data compression technique generally introduced to eliminate duplicate copies of repeated storage data. Unlike established systems, the private cloud acts as a proxy that allows the data owner to securely execute the duplicate check with differential privileges, so this architecture is useful and has attracted much consideration from researchers. In our work we solve the problem of deduplication with differential privileges in cloud computing by producing a hybrid cloud architecture comprising a public cloud and a private cloud.