11,088 research outputs found

    Prospective: A Data-Driven Technique to Predict Web Service Response Time Percentiles

    Get PDF
    Delivering fast response times for user transactions is a critical requirement for Web services. Often, a Web service has Service Level Agreements (SLAs) with its users that quantify how quickly the service has to respond to a user transaction. Typically, SLAs stipulate requirements for Web service response time percentiles, e.g., a specified target for the 95th percentile of response time. Violating SLAs can have adverse consequences for a Web service operator. Consequently, operators require systematic techniques to predict Web service response time percentiles. Existing prediction techniques are very time-consuming since they often involve manual construction of queuing or machine learning models. To address this problem, we propose Prospective, a data-driven approach for predicting Web service response time percentiles. Given a specification for the workload expected at the Web service over a planning horizon, Prospective uses historical data to offer predictions for response time percentiles of interest. At the core of Prospective is a lightweight simulator that uses collaborative filtering to estimate the response time behaviour of the service based on behaviour observed historically. Results show that Prospective significantly outperforms other baseline techniques for a wide variety of workloads. In particular, the technique provides accurate estimates even for workload scenarios not directly observed in the historical data. We also show that Prospective can provide a Web service operator with accurate estimates of the types and numbers of Web service instances needed to avoid SLA violations.
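    The abstract does not spell out the simulator's internals, but the collaborative-filtering idea can be illustrated with a minimal nearest-neighbour sketch: pool response-time samples from the historical workloads most similar to the target workload and read off the requested percentile. The feature vectors, toy data, and the choice of Euclidean distance below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def predict_percentile(history, target_workload, q=95, k=3):
    """Estimate a response-time percentile for a target workload by
    pooling samples from the k most similar historical workloads.

    history: list of (workload_vector, response_time_samples) pairs
    target_workload: workload features (e.g. request rates per type)
    q: percentile of interest (e.g. 95 for the 95th percentile)
    """
    # Rank historical workloads by Euclidean distance to the target.
    dists = [np.linalg.norm(np.asarray(w, float) - np.asarray(target_workload, float))
             for w, _ in history]
    nearest = np.argsort(dists)[:k]

    # Pool response-time samples from the nearest neighbours and
    # read off the requested percentile.
    pooled = np.concatenate([np.asarray(history[i][1]) for i in nearest])
    return np.percentile(pooled, q)

# Toy usage: historical workloads as (requests/s per transaction type,
# observed response times in seconds).
history = [
    ([100, 20], [0.05, 0.07, 0.06, 0.12]),
    ([400, 80], [0.20, 0.35, 0.25, 0.60]),
    ([120, 25], [0.06, 0.08, 0.07, 0.15]),
]
print(predict_percentile(history, [110, 22], q=95, k=2))
```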

    TimeTrader: Exploiting Latency Tail to Save Datacenter Energy for On-line Data-Intensive Applications

    Get PDF
    Datacenters running on-line, data-intensive applications (OLDIs) consume significant amounts of energy. However, reducing their energy is challenging due to their tight response time requirements. A key aspect of OLDIs is that each user query goes to all or many of the nodes in the cluster, so that the overall time budget is dictated by the tail of the replies' latency distribution; replies see latency variations in both the network and compute. Previous work proposes to achieve load-proportional energy by slowing down the computation at lower datacenter loads based directly on response times (i.e., at lower loads, the proposal exploits the average slack in the time budget provisioned for the peak load). In contrast, we propose TimeTrader to reduce energy by exploiting the latency slack in the sub-critical replies which arrive before the deadline (e.g., 80% of replies are 3-4x faster than the tail). This slack is present at all loads and subsumes the previous work's load-related slack. While the previous work shifts the leaves' response time distribution to consume the slack at lower loads, TimeTrader reshapes the distribution at all loads by slowing down individual sub-critical nodes without increasing missed deadlines. TimeTrader exploits slack in both the network and compute budgets. Further, TimeTrader leverages Earliest Deadline First scheduling to largely decouple critical requests from the queuing delays of sub-critical requests, which can then be slowed down without hurting critical requests. A combination of real-system measurements and at-scale simulations shows that, without adding to missed deadlines, TimeTrader saves 15-19% and 41-49% energy at 90% and 30% loading, respectively, in a datacenter with 512 nodes, whereas previous work saves 0% and 31-37%.
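    As a rough illustration of the scheduling idea (not the paper's implementation), the sketch below serves requests in Earliest-Deadline-First order and stretches a request's service time, as a stand-in for slowing a node down to save energy, only when the request's slack absorbs the stretch, so no deadline is missed. The slow_factor parameter and the toy request set are assumptions.

```python
import heapq

def edf_with_slack(requests, slow_factor=1.5):
    """Toy EDF scheduler: serve requests by earliest deadline and
    stretch (i.e. run slowly to save energy) only those whose slack
    covers the stretched service time.

    requests: list of (deadline, service_time), all arriving at t = 0
    slow_factor: hypothetical stretch applied to sub-critical requests
    Returns (finish_times, number_of_missed_deadlines).
    """
    heap = list(requests)
    heapq.heapify(heap)              # orders by deadline (first field)
    now, finish_times, missed = 0.0, [], 0
    while heap:
        deadline, service = heapq.heappop(heap)
        stretched = service * slow_factor
        # Slow down only if the stretched job still meets its deadline.
        run = stretched if now + stretched <= deadline else service
        now += run
        finish_times.append(now)
        missed += now > deadline
    return finish_times, missed

# One tight (critical) request and four with generous deadlines
# (sub-critical); only the latter absorb the slowdown.
reqs = [(1.0, 0.2), (5.0, 0.2), (6.0, 0.2), (7.0, 0.2), (8.0, 0.2)]
print(edf_with_slack(reqs))
```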

    Quantifying the Impact of Replication on the Quality-of-Service in Cloud Databases

    No full text
    Cloud databases achieve high availability by automatically replicating data on multiple nodes. However, the overhead caused by the replication process can lead to an increase in the mean and variance of transaction response times, causing unforeseen impacts on the offered quality-of-service (QoS). In this paper, we propose a measurement-driven methodology to predict the impact of replication on Database-as-a-Service (DBaaS) environments. Our methodology uses operational data to parameterize a closed queueing network model of the database cluster together with a Markov model that abstracts the dynamic replication process. Experiments on Amazon RDS show that our methodology predicts response time mean and percentiles with errors of just 1% and 15%, respectively, under operational conditions that are significantly different from the ones used for model parameterization. We show that our modeling approach surpasses standard modeling methods and illustrate the applicability of our methodology for automated DBaaS provisioning.
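    The paper couples a closed queueing network with a Markov model of replication; only the queueing-network half is sketched here, using textbook exact Mean Value Analysis (MVA) for a single-class closed network. The three-station service demands and population below are hypothetical.

```python
def mva(service_demands, n_jobs):
    """Exact single-class Mean Value Analysis for a closed queueing
    network of fixed-rate queueing stations.

    service_demands: per-station service demand D_k (seconds per job)
    n_jobs: population N (e.g. concurrent transactions), N >= 1
    Returns (response_time, throughput, queue_lengths).
    """
    q = [0.0] * len(service_demands)       # mean queue length per station
    for n in range(1, n_jobs + 1):
        # An arriving job sees the queue left behind by n-1 jobs.
        r = [d * (1 + q[k]) for k, d in enumerate(service_demands)]
        R = sum(r)                         # system response time
        X = n / R                          # throughput (Little's law)
        q = [X * rk for rk in r]           # updated queue lengths
    return R, X, q

# Hypothetical 3-node replica cluster: demands in seconds per transaction.
print(mva([0.010, 0.015, 0.012], n_jobs=20))
```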

    Use and Communication of Probabilistic Forecasts

    Full text link
    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. Cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest often seem to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.
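    The two recommended summaries, a single percentile of the predictive distribution and the probability of an adverse event, are easy to make concrete. The sketch below assumes the forecast is available as an ensemble of samples; the river-level numbers and the flood threshold are invented for illustration.

```python
import numpy as np

def summarize_forecast(samples, threshold, q=90):
    """Reduce a probabilistic forecast to the two summaries the text
    recommends: one percentile of the predictive distribution and the
    probability of exceeding an adverse-event threshold.

    samples: draws from the predictive distribution (ensemble members)
    """
    samples = np.asarray(samples, float)
    percentile = np.percentile(samples, q)   # e.g. the 90th percentile
    p_exceed = np.mean(samples > threshold)  # P(adverse event)
    return percentile, p_exceed

# Hypothetical river-level ensemble (metres); flood threshold at 4.5 m.
ensemble = np.random.default_rng(0).normal(4.0, 0.4, size=500)
print(summarize_forecast(ensemble, threshold=4.5, q=90))
```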

    Citizens and Institutions as Information Prosumers. The Case Study of Italian Municipalities on Twitter

    Get PDF
    The aim of this paper is to address changes in public communication following the advent of Internet social networking tools and the emerging Web 2.0 technologies, which are providing new ways of sharing information and knowledge. In particular, public administrations are called upon to reinvent the governance of public affairs and to update the means for interacting with their communities. The paper develops an analysis of the distribution, diffusion and performance of the official profiles on Twitter adopted by the Italian municipalities (comuni) up to November 2013. It aims to identify the patterns of spatial distribution and the drivers of the diffusion of Twitter profiles, and to assess the performance of the profiles through an aggregated index, called the Twitter performance index (Twiperindex), which evaluates the profiles' activity with reference to the gravitational areas of the municipalities in order to enable comparisons between municipalities of different demographic sizes and functional roles. The results show that only a small portion of innovative municipalities have adopted Twitter to enhance e-participation and e-governance, and that the drivers of the diffusion seem to be related either to past experiences and existing conditions (e.g., civic networks, digital infrastructures) developed over time or to strong local community awareness. The best performances are achieved mainly by small and medium-sized municipalities. Of course, the phenomenon is very new and fluid, so this analysis should be considered a first step in ongoing research which aims to grasp the dynamics of these new means of public communication.
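    The paper's exact Twiperindex formula is not reproduced in the abstract. Purely as a hypothetical illustration of the normalization idea, activity scaled to the gravitational area's population so that municipalities of different sizes become comparable, consider:

```python
def twiperindex_sketch(tweets, catchment_population):
    """Hypothetical stand-in for the paper's Twiperindex (whose actual
    formula is not given in the abstract): tweet activity per 1,000
    residents of the municipality's gravitational area, so that small
    towns and large cities can be compared on the same scale."""
    return 1000.0 * tweets / catchment_population

# Toy comparison: a small town can outperform a big city per capita.
print(twiperindex_sketch(tweets=1200, catchment_population=15000))    # 80.0
print(twiperindex_sketch(tweets=9000, catchment_population=900000))   # 10.0
```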

    Performance benchmarks for scholarly metrics associated with fisheries and wildlife faculty

    Get PDF
    Research productivity and impact are often considered in professional evaluations of academics, and performance metrics based on publications and citations increasingly are used in such evaluations. To promote evidence-based and informed use of these metrics, we collected publication and citation data for 437 tenure-track faculty members at 33 research-extensive universities in the United States belonging to the National Association of University Fisheries and Wildlife Programs. For each faculty member, we computed 8 commonly used performance metrics based on numbers of publications and citations, and recorded covariates including academic age (time since Ph.D.), sex, percentage of appointment devoted to research, and the sub-disciplinary research focus. Standardized deviance residuals from regression models were used to compare faculty after accounting for variation in performance due to these covariates. We also aggregated residuals to enable comparison across universities. Finally, we tested for temporal trends in citation practices to assess whether the "law of constant ratios", used to enable comparison of performance metrics between disciplines that differ in citation and publication practices, applied to fisheries and wildlife sub-disciplines when mapped to Web of Science Journal Citation Report categories. Our regression models reduced deviance by 1/4 to 1/2. Standardized residuals for each faculty member, when combined across metrics as a simple average or weighted via factor analysis, produced similar results in terms of performance based on percentile rankings. Significant variation was observed in scholarly performance across universities, after accounting for the influence of covariates. In contrast to findings for other disciplines, normalized citation ratios for fisheries and wildlife sub-disciplines increased across years. Increases were comparable for all sub-disciplines except ecology. We discuss the advantages and limitations of our methods, illustrate their use when applied to new data, and suggest future improvements. Our benchmarking approach may provide a useful tool to augment detailed, qualitative assessment of performance.
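    The benchmarking idea, comparing faculty by the residual left once covariates are regressed out, can be sketched with a single covariate. The one-predictor ordinary-least-squares model, the log transform, and the toy data below are simplifying assumptions, not the authors' full model.

```python
import numpy as np

def standardized_residuals(covariate, metric):
    """Benchmark scholars by the residual of a performance metric after
    removing the effect of a covariate (here, academic age), in the
    spirit of the paper's regression-residual approach."""
    x = np.asarray(covariate, float)
    y = np.log1p(np.asarray(metric, float))      # tame the right skew
    X = np.column_stack([np.ones_like(x), x])    # intercept + slope
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta                         # what covariates miss
    return (resid - resid.mean()) / resid.std(ddof=1)

# Hypothetical faculty: years since Ph.D. and total citation counts.
age = [3, 8, 15, 22, 30]
cites = [120, 600, 1500, 2100, 4000]
print(standardized_residuals(age, cites))   # > 0 means above expectation
```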