
    Extended schools subsidy pathfinder evaluation: interim report


    Correction. Brownian models of open processing networks: canonical representation of workload

    Due to a printing error, the above-mentioned article [Annals of Applied Probability 10 (2000) 75--103, doi:10.1214/aoap/1019737665] had numerous equations appearing incorrectly in the print version. The entire article follows as it should have appeared. IMS apologizes to the author and the readers for this error. A recent paper by Harrison and Van Mieghem explained in general mathematical terms how one forms an "equivalent workload formulation" of a Brownian network model. Denoting by Z(t) the state vector of the original Brownian network, one has a lower-dimensional state descriptor W(t) = MZ(t) in the equivalent workload formulation, where M can be chosen as any basis matrix for a particular linear space. This paper considers Brownian models for a very general class of open processing networks, and in that context develops a more extensive interpretation of the equivalent workload formulation, thus extending earlier work by Laws on alternate routing problems. A linear program called the static planning problem is introduced to articulate the notion of "heavy traffic" for a general open network, and the dual of that linear program is used to define a canonical choice of the basis matrix M. To be specific, rows of the canonical M are alternative basic optimal solutions of the dual linear program. If the network data satisfy a natural monotonicity condition, the canonical matrix M is shown to be nonnegative, and another natural condition is identified which ensures that M admits a factorization related to the notion of resource pooling. Comment: Published at http://dx.doi.org/10.1214/105051606000000583 in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org)
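
    As a rough sketch of the static planning problem (the symbols x, R, A, and \lambda below are assumed notation, simplified from the paper's general setup): one chooses long-run processing rates x that meet the demand rates \lambda while loading every resource as lightly as possible,

        \begin{aligned}
        \min_{\rho,\,x}\ \ &\rho\\
        \text{s.t.}\ \ &Rx=\lambda, \qquad Ax \le \rho\,\mathbf{1}, \qquad x \ge 0,
        \end{aligned}

    where R maps processing rates to satisfied demands and A records each resource's capacity consumption. "Heavy traffic" then corresponds to an optimal value \rho^{*}=1, and if y_1,\dots,y_m denote the alternative basic optimal solutions of the dual program, the canonical basis matrix stacks them as rows, M=(y_1,\dots,y_m)^{\top}, yielding the workload process W(t)=MZ(t).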

    View Selection in Semantic Web Databases

    We consider the setting of a Semantic Web database, containing both explicit data encoded in RDF triples, and implicit data, implied by the RDF semantics. Based on a query workload, we address the problem of selecting a set of views to be materialized in the database, minimizing a combination of query processing, view storage, and view maintenance costs. Starting from an existing relational view selection method, we devise new algorithms for recommending view sets, and show that they scale significantly beyond the existing relational ones when adapted to the RDF context. To account for implicit triples in query answers, we propose a novel RDF query reformulation algorithm and an innovative way of incorporating it into view selection in order to avoid a combinatorial explosion in the complexity of the selection process. The interest of our techniques is demonstrated through a set of experiments. Comment: VLDB201
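
    To make the cost trade-off concrete, the following is a minimal greedy view-selection sketch in Python over a toy cost model; the data layout, the coverage test, and the weights are illustrative assumptions, not the paper's actual algorithms or its RDF reformulation machinery.

        # Toy greedy view selection: repeatedly add the candidate view that
        # most reduces the combined cost of query processing, view storage,
        # and view maintenance.
        def total_cost(selected, candidates, workload, w_store=0.1, w_maint=0.05):
            proc = 0.0
            for q in workload:
                # A query keeps its base cost unless some selected view
                # covers all of its triple patterns.
                covering = [candidates[v]["scan_cost"] for v in selected
                            if candidates[v]["patterns"] >= q["patterns"]]
                proc += min(covering, default=q["base_cost"])
            store = sum(candidates[v]["size"] for v in selected)
            maint = sum(candidates[v]["maint"] for v in selected)
            return proc + w_store * store + w_maint * maint

        def greedy_select(candidates, workload):
            selected = set()
            best = total_cost(selected, candidates, workload)
            while True:
                pick = None
                for v in set(candidates) - selected:
                    cost = total_cost(selected | {v}, candidates, workload)
                    if cost < best:
                        best, pick = cost, v
                if pick is None:
                    return selected, best
                selected.add(pick)

        candidates = {
            "v_person": {"patterns": {"?s rdf:type :Person", "?s :name ?n"},
                         "scan_cost": 1.0, "size": 4.0, "maint": 1.0},
            "v_city":   {"patterns": {"?s :livesIn ?c"},
                         "scan_cost": 1.0, "size": 2.0, "maint": 0.5},
        }
        workload = [
            {"patterns": {"?s rdf:type :Person"}, "base_cost": 10.0},
            {"patterns": {"?s :livesIn ?c"},      "base_cost": 6.0},
        ]
        print(greedy_select(candidates, workload))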

    Scheduling of data-intensive workloads in a brokered virtualized environment

    Providing performance predictability guarantees is increasingly important in cloud platforms, especially for data-intensive applications, for which performance depends greatly on the available rates of data transfer between the various computing/storage hosts underlying the virtualized resources assigned to the application. With the increased prevalence of brokerage services in cloud platforms, there is a need for resource management solutions that consider the brokered nature of these workloads, as well as the special demands of their intra-dependent components. In this paper, we present an offline mechanism for scheduling batches of brokered data-intensive workloads, which can be extended to an online setting. The objective of the mechanism is to decide on a packing of the workloads in a batch that minimizes the broker's incurred costs. Moreover, considering the brokered nature of such workloads, we define a payment model that provides incentives for these workloads to be scheduled as part of a batch, which we analyze theoretically. Finally, we evaluate the proposed scheduling algorithm and exemplify the fairness of the payment model in practical settings via trace-based experiments.
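
    As a minimal sketch of the packing step (the single host type, the first-fit-decreasing heuristic, and the flat rental price below are assumptions for illustration; the paper's mechanism and payment model are more elaborate):

        # First-fit-decreasing packing of a batch of workloads onto rented
        # hosts; the broker's cost is the rental price of the hosts opened.
        def pack_batch(demands, capacity, price):
            hosts = []  # remaining capacity of each opened host
            for d in sorted(demands, reverse=True):
                for i, free in enumerate(hosts):
                    if d <= free:
                        hosts[i] = free - d
                        break
                else:
                    hosts.append(capacity - d)  # open a new host
            return hosts, price * len(hosts)

        loads, cost = pack_batch([4, 8, 1, 4, 2, 1], capacity=10, price=3.0)
        print(f"hosts opened: {len(loads)}, broker cost: {cost}")

    A payment model in this spirit would then divide the batch cost among the packed workloads so that no workload pays more than it would renting a host alone, which is the incentive the abstract alludes to.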

    Evaluation of the organisation and delivery of patient-centred acute nursing care

    In 2002, a team of researchers from the School of Nursing, University of Salford, was commissioned by Bolton Hospitals NHS Trust to evaluate the delivery and organisation of patient-centred nursing care across the acute nursing wards within the Royal Bolton Hospital. The key driver for commissioning this study was two serious untoward incidents that occurred in 2000. Following investigation of both events, the Director of Nursing in post at that time believed that poor organisation and delivery of care may have been a contributory factor. Senior nurses in the Trust had also expressed concern that care might not be organised in a way that made best use of the skills available.

    Enabling Quality Control for Entity Resolution: A Human and Machine Cooperation Framework

    Even though many machine algorithms have been proposed for entity resolution, it remains very challenging to find a solution with quality guarantees. In this paper, we propose a novel HUman and Machine cOoperation (HUMO) framework for entity resolution (ER), which divides an ER workload between the machine and the human. HUMO enables a mechanism for quality control that can flexibly enforce both precision and recall levels. We introduce the optimization problem of HUMO, minimizing human cost given a quality requirement, and then present three optimization approaches: a conservative baseline one purely based on the monotonicity assumption of precision, a more aggressive one based on sampling, and a hybrid one that takes advantage of the strengths of both. Finally, we demonstrate through extensive experiments on real and synthetic datasets that HUMO can achieve high-quality results with a reasonable return on investment (ROI) in terms of human cost, and that it performs considerably better than the state-of-the-art alternatives in quality control. Comment: 12 pages, 11 figures. Camera-ready version of the paper submitted to ICDE 2018. In Proceedings of the 34th IEEE International Conference on Data Engineering (ICDE 2018)
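
    For intuition, here is a minimal Python sketch of the machine-human split (the fixed thresholds and the match-probability field are illustrative assumptions; HUMO optimizes the boundaries to minimize human cost subject to the precision and recall requirements):

        # Split an ER workload by machine match probability: the machine
        # auto-labels the confident extremes, humans inspect the middle band.
        def split_workload(pairs, low=0.2, high=0.8):
            machine_no  = [p for p, s in pairs if s < low]           # auto non-match
            human_zone  = [p for p, s in pairs if low <= s <= high]  # human review
            machine_yes = [p for p, s in pairs if s > high]          # auto match
            return machine_no, human_zone, machine_yes

        pairs = [("a-b", 0.05), ("a-c", 0.45), ("b-c", 0.92)]
        print(split_workload(pairs))

    Under the monotonicity assumption on precision, widening the human band trades extra human cost for tighter precision and recall guarantees, which is the trade-off HUMO's three optimization approaches navigate.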