
    Analysis of the age of data in data backup systems

    Cloud infrastructures are becoming a common platform for storage and workload operations in industry. With the increasing rate of data generation, the cloud storage industry has grown into a multi-billion-dollar business. Providers offer services under very strict service level agreements (SLAs) to ensure a high Quality of Service (QoS) for their clients; a breach of these SLAs results in heavy economic loss for the service provider. We study a queueing model of data backup systems with a focus on the age of data, roughly defined as the time for which data has not been backed up, and therefore a measure of uncertainty for the user. We precisely define this performance measure and compute the generating function of its distribution. It is critical that the tail probabilities be small, so that the system stays within its SLAs with high probability; we therefore also analyze the tail distribution of the age of data by performing a dominant-singularity analysis of its generating function. Our formulas can help service providers set system parameters adequately. (C) 2019 Elsevier B.V. All rights reserved.
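    The abstract does not reproduce the underlying queueing model, but the performance measure can be made concrete with a minimal simulation sketch. Here the age of data is taken to be the elapsed time since the oldest not-yet-backed-up item was generated, items are assumed to arrive as a Poisson process, and backups are assumed to run periodically and exhaustively; all of these specifics are illustrative assumptions, not the paper's model.

```python
import random

def simulate_age_of_data(arrival_rate=2.0, backup_interval=5.0,
                         horizon=1e5, seed=1):
    """Sketch of the 'age of data': the elapsed time since the oldest
    not-yet-backed-up item was generated (an assumed reading of the
    paper's measure). Poisson item generation, periodic exhaustive
    backups; ages are sampled at generation epochs."""
    random.seed(seed)
    t, next_backup = 0.0, backup_interval
    pending = []                     # generation times of unsaved items
    samples = []
    while t < horizon:
        t += random.expovariate(arrival_rate)    # next item generated
        while next_backup <= t:                  # backups since last item
            pending.clear()                      # exhaustive: save everything
            next_backup += backup_interval
        pending.append(t)
        samples.append(t - pending[0])           # age of oldest unsaved item
    return samples

ages = simulate_age_of_data()
for thr in (1.0, 3.0, 5.0):
    print(f"P(age > {thr}) ~ {sum(a > thr for a in ages) / len(ages):.4f}")
```

    The empirical tail probabilities printed at the end correspond to the quantities the paper characterizes analytically via the dominant singularity of the generating function.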

    A queueing-theoretic analysis of the threshold-based exhaustive data-backup scheduling policy

    We analyse the threshold-based exhaustive data-backup scheduling mechanism by means of a queueing-theoretic approach. Data packets that have not yet been backed up are modelled as customers waiting for service (backup). We obtain the probability generating function of the system content (backlog size) at random slot boundaries in steady state.
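    A discrete-time simulation makes the "system content at random slot boundaries" concrete. The sketch below assumes Bernoulli arrivals and single-slot service times, which are illustrative simplifications rather than the paper's assumptions: the server idles until the backlog reaches a threshold, then backs up exhaustively until the queue empties.

```python
import random
from collections import Counter

def threshold_exhaustive(arrival_prob=0.3, threshold=5, slots=200_000, seed=7):
    """Discrete-time sketch of the threshold-based exhaustive policy:
    the server idles until the backlog reaches `threshold`, then backs
    up one packet per slot until the queue is empty. Bernoulli arrivals
    and unit service times are illustrative simplifications."""
    random.seed(seed)
    backlog, serving = 0, False
    counts = Counter()
    for _ in range(slots):
        counts[backlog] += 1                  # system content at slot boundary
        if serving and backlog > 0:
            backlog -= 1                      # back up one packet this slot
        backlog += random.random() < arrival_prob
        if backlog >= threshold:
            serving = True                    # threshold hit: start backing up
        elif backlog == 0:
            serving = False                   # queue empty: go idle
    total = sum(counts.values())
    for n in sorted(counts)[:8]:
        print(f"P(content = {n}) ~ {counts[n] / total:.4f}")

threshold_exhaustive()
```

    The empirical mass function printed here corresponds to the distribution whose probability generating function the paper derives in closed form.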

    Fast and secure laptop backups with encrypted de-duplication

    Many people now store large quantities of personal and corporate data on laptops or home computers. These often have poor or intermittent connectivity and are vulnerable to theft or hardware failure. Conventional backup solutions are not well suited to this environment, and backup regimes are frequently inadequate. This paper describes an algorithm that takes advantage of data common between users to increase the speed of backups and reduce storage requirements. The algorithm supports client-side per-user encryption, which is necessary for confidential personal data. It also supports a unique feature that allows immediate detection of common subtrees, avoiding the need to query the backup system for every file. We describe a prototype implementation of this algorithm for Apple OS X and present an analysis of its potential effectiveness, using real data obtained from a set of typical users. Finally, we discuss the use of this prototype in conjunction with remote cloud storage, and present an analysis of the typical cost savings.
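    The abstract does not spell out the algorithm, but the standard way to reconcile cross-user de-duplication with per-user confidentiality is convergent encryption, in which the encryption key is derived from the content itself, so identical plaintexts always yield identical ciphertexts. The toy sketch below illustrates the idea; the SHA-256 "counter mode" keystream stands in for a real cipher and is not secure.

```python
import hashlib

def convergent_encrypt(chunk: bytes):
    """Toy convergent encryption: the key is a hash of the content, so
    identical chunks held by different users encrypt to identical
    ciphertexts, and the server can de-duplicate them without ever
    seeing the plaintext. The XOR keystream below stands in for a real
    cipher (e.g. AES) and is NOT secure -- illustration only."""
    key = hashlib.sha256(chunk).digest()          # content-derived key
    stream, counter = b"", 0
    while len(stream) < len(chunk):               # SHA-256 as a toy keystream
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    ciphertext = bytes(a ^ b for a, b in zip(chunk, stream))
    chunk_id = hashlib.sha256(ciphertext).hexdigest()   # dedup lookup key
    return chunk_id, key, ciphertext

# Two users with identical data produce the identical stored chunk:
id1, key1, ct1 = convergent_encrypt(b"shared corporate document")
id2, key2, ct2 = convergent_encrypt(b"shared corporate document")
assert id1 == id2 and ct1 == ct2    # server stores one copy; each user
                                    # keeps `key` to decrypt their copy
```

    The common-subtree feature can be understood along similar lines: if a directory's identifier is a hash over its children's identifiers (Merkle-tree style), a client that finds a directory identifier already present on the server can skip the entire subtree without querying for every file.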

    The role of large-scale energy storage design and dispatch in the power grid: A study of very high grid penetration of variable renewable resources

    We present results of an hourly simulation using hourly load data and the corresponding simulated output of wind and solar technologies distributed throughout the state of California. We examined how very high energy penetration of intermittent renewable systems into the electricity grid could be achieved. The study shows that the maximum threshold for the storage need is significantly less than the average daily demand: we found the required network energy storage to be of the order of 186 GWh / 22 GW (approximately 22% of California's average daily demand). Allowing energy dumping was shown to increase storage utilisation, thereby increasing grid penetration and reducing the required conventional backup capacity. Using the 186 GWh / 22 GW storage at 20% total energy loss, grid penetration was increased to approximately 85% of annual demand while the conventional backup capacity requirement was reduced to 35 GW. This capacity was sufficient to supply the year-round hourly demand, including a 59 GW peak demand, plus distribution losses of about 5.3%. We conclude that designing an efficient, least-cost grid may require the capability to capture diverse physical and operational policy scenarios of the future grid. © 2014 Elsevier Ltd
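    The dispatch logic implied by such a study can be sketched in a few lines: charge storage with renewable surplus, discharge up to a power limit to cover deficits, and meet any remainder from conventional backup. In the sketch below, the 186 GWh / 22 GW sizing and the 20% total loss follow the abstract, while everything else (units, the loss applied on charging, the toy input series) is an illustrative assumption.

```python
def dispatch(load, renewables, energy_cap=186e3, power_cap=22e3,
             efficiency=0.8):
    """Simplified hourly dispatch in MW / MWh: charge storage with
    renewable surplus (subject to energy and power limits), discharge
    to cover deficits, and meet the remainder with conventional backup.
    The 186 GWh / 22 GW sizing and 20% loss follow the abstract; the
    rest is an illustrative assumption."""
    soc, backup_peak, served_renewable = 0.0, 0.0, 0.0
    for l, r in zip(load, renewables):
        if r >= l:                                        # surplus hour
            charge = min(r - l, power_cap, (energy_cap - soc) / efficiency)
            soc += charge * efficiency                    # loss on charging
            served_renewable += l
        else:                                             # deficit hour
            discharge = min(l - r, power_cap, soc)
            soc -= discharge
            backup_peak = max(backup_peak, l - r - discharge)
            served_renewable += r + discharge
    return served_renewable / sum(load), backup_peak

# Toy two-day series purely to exercise the function; the study's actual
# inputs are a year of hourly California load, wind and solar data.
load = [30e3 + 8e3 * (h % 24 > 8) for h in range(48)]
gen = [40e3 * (6 <= h % 24 <= 18) for h in range(48)]
pen, peak = dispatch(load, gen)
print(f"penetration ~ {pen:.2f}, backup peak ~ {peak / 1e3:.1f} GW")
```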

    The Security Rule
