
    A computer scientist looks at game theory

    I consider issues in distributed computation that should be of relevance to game theory. In particular, I focus on (a) representing knowledge and uncertainty, (b) dealing with failures, and (c) specification of mechanisms.
    Comment: To appear in Games and Economic Behavior. JEL classification numbers: D80, D8

    Stochastic scheduling on unrelated machines

    Two important characteristics encountered in many real-world scheduling problems are heterogeneous machines/processors and a certain degree of uncertainty about the actual sizes of jobs. The first characteristic entails machine-dependent processing times of jobs and is captured by the classical unrelated machine scheduling model. The second characteristic is adequately addressed by stochastic processing times of jobs as they are studied in classical stochastic scheduling models. While there is an extensive but separate literature for the two scheduling models, we study for the first time a combined model that takes both characteristics into account simultaneously. Here, the processing time of job $j$ on machine $i$ is governed by random variable $P_{ij}$, and its actual realization becomes known only upon job completion. With $w_j$ being the given weight of job $j$, we study the classical objective to minimize the expected total weighted completion time $E[\sum_j w_j C_j]$, where $C_j$ is the completion time of job $j$. By means of a novel time-indexed linear programming relaxation, we compute in polynomial time a scheduling policy with performance guarantee $(3+\Delta)/2+\epsilon$. Here, $\epsilon>0$ is arbitrarily small, and $\Delta$ is an upper bound on the squared coefficient of variation of the processing times. We show that the dependence of the performance guarantee on $\Delta$ is tight, as we obtain a $\Delta/2$ lower bound for the type of policies that we use. When jobs also have individual release dates $r_{ij}$, our bound is $(2+\Delta)+\epsilon$. Via $\Delta=0$, the currently best known bounds for deterministic scheduling are contained as a special case.
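    As a quick sanity check on the stated guarantees, setting $\Delta = 0$ (deterministic processing times, i.e. zero squared coefficient of variation) specializes the abstract's two bounds directly:

    ```latex
    % Specializing the stochastic guarantees to the deterministic case \Delta = 0:
    \left.\frac{3+\Delta}{2} + \epsilon\,\right|_{\Delta=0} = \frac{3}{2} + \epsilon
    \quad\text{(without release dates)},
    \qquad
    \left.(2+\Delta) + \epsilon\,\right|_{\Delta=0} = 2 + \epsilon
    \quad\text{(with release dates } r_{ij}\text{)}.
    ```

    This is the sense in which the deterministic bounds are "contained as a special case": the arithmetic follows immediately from the formulas in the abstract.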

    Computer Science and Game Theory: A Brief Survey

    There has been a remarkable increase in work at the interface of computer science and game theory in the past decade. In this article I survey some of the main themes of work in the area, with a focus on the work in computer science. Given the length constraints, I make no attempt at being comprehensive, especially since other surveys are also available, and a comprehensive survey book will appear shortly.
    Comment: To appear in the Palgrave Dictionary of Economics

    Active Decisions and Pro-social Behavior: A Field Experiment on Blood Donation

    In this paper, we propose a decision framework where people are individually asked to either actively consent or dissent to some pro-social behavior. We hypothesize that confronting individuals with the choice of engaging in a specific pro-social behavior contributes to the formation of issue-specific altruistic preferences while simultaneously involving a commitment. The hypothesis is tested in a large-scale field experiment on blood donation. We find that this "active-decision" intervention substantially increases the actual donation behavior of people who have not fully formed preferences beforehand.
    Keywords: active decision, pro-social behavior, field experiment, blood donation

    The science of clinical practice: disease diagnosis or patient prognosis? Evidence about "what is likely to happen" should shape clinical practice.

    BACKGROUND: Diagnosis is the traditional basis for decision-making in clinical practice. Evidence is often lacking about future benefits and harms of these decisions for patients diagnosed with and without disease. We propose that a model of clinical practice focused on patient prognosis and predicting the likelihood of future outcomes may be more useful. DISCUSSION: Disease diagnosis can provide crucial information for clinical decisions that influence outcome in serious acute illness. However, the central role of diagnosis in clinical practice is challenged by evidence that it does not always benefit patients and that factors other than disease are important in determining patient outcome. The concept of disease as a dichotomous 'yes' or 'no' is challenged by the frequent use of diagnostic indicators with continuous distributions, such as blood sugar, which are better understood as contributing information about the probability of a patient's future outcome. Moreover, many illnesses, such as chronic fatigue, cannot usefully be labelled from a disease-diagnosis perspective. In such cases, a prognostic model provides an alternative framework for clinical practice that extends beyond disease and diagnosis and incorporates a wide range of information to predict future patient outcomes and to guide decisions to improve them. Such information embraces non-disease factors and genetic and other biomarkers which influence outcome. SUMMARY: Patient prognosis can provide the framework for modern clinical practice to integrate information from the expanding biological, social, and clinical database for more effective and efficient care.

    Rationale, design and conduct of a randomised controlled trial evaluating a primary care-based complex intervention to improve the quality of life of heart failure patients: HICMan (Heidelberg Integrated Case Management): study protocol

    Background: Chronic congestive heart failure (CHF) is a complex disease with rising prevalence, compromised quality of life (QoL), unplanned hospital admissions, high mortality and therefore a high burden of illness. The delivery of care for these patients has been criticized, and new strategies addressing crucial domains of care have been shown to be effective on patients' health outcomes, although these trials were conducted in secondary care or in highly organised Health Maintenance Organisations. It remains unclear whether comprehensive primary care-based case management by the treating general practitioner (GP) can improve patients' QoL. Methods/Design: HICMan is a randomised controlled trial with patients as the unit of randomisation. The aim is to evaluate a structured, standardized and comprehensive complex intervention for patients with CHF in a 12-month follow-up trial. Patients in the intervention group receive specific patient leaflets and documentation booklets as well as regular monitoring and screening by a previously trained practice nurse, who gives feedback to the GP depending on urgency. Monitoring and screening address aspects of disease-specific self-management, (non-)pharmacological adherence, and psychosomatic and geriatric comorbidity. GPs are invited to provide tailored structured counselling 4 times during the trial and receive additional feedback on pharmacotherapy relevant to prognosis (data from the baseline documentation). Patients in the control group receive usual care from their GPs, who were introduced to guideline-oriented management and a tailored health counselling concept. The main outcome measure for patients' QoL is the physical functioning scale of the SF-36 health questionnaire at 12-month follow-up. Secondary outcomes are disease-specific QoL measured by the Kansas City Cardiomyopathy Questionnaire (KCCQ), depression and anxiety disorders (PHQ-9, GAD-7), adherence (EHFScBS and SANA), quality of care measured by an adapted version of the Patient Assessment of Chronic Illness Care questionnaire (PACIC), and NT-proBNP. In addition, comprehensive clinical data are collected on health status, comorbidity, medication and health care utilisation. Discussion: As the targeted patient group is mostly cared for and treated by GPs, a comprehensive primary care-based guideline implementation including somatic, psychosomatic and organisational aspects of the delivery of care (HICMan) is a promising intervention applying proven strategies for optimal care. Trial registration: Current Controlled Trials ISRCTN30822978

    On the Measurement of Privacy as an Attacker's Estimation Error

    A wide variety of privacy metrics have been proposed in the literature to evaluate the level of protection offered by privacy-enhancing technologies. Most of these metrics are specific to concrete systems and adversarial models, and are difficult to generalize or translate to other contexts. Furthermore, a better understanding of the relationships between the different privacy metrics is needed to enable a more grounded and systematic approach to measuring privacy, as well as to assist system designers in selecting the most appropriate metric for a given application. In this work we propose a theoretical framework for privacy-preserving systems, endowed with a general definition of privacy in terms of the estimation error incurred by an attacker who aims to disclose the private information that the system is designed to conceal. We show that our framework permits interpreting and comparing a number of well-known metrics under a common perspective. The arguments behind these interpretations are based on fundamental results from information theory, probability theory and Bayesian decision theory.
    Comment: This paper has 18 pages and 17 figures
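    The core idea of the abstract — privacy quantified as the estimation error of an attacker trying to recover concealed information — can be illustrated with a toy example. This is a minimal sketch, not from the paper: it assumes a hypothetical Gaussian setting in which a private value x is released as y = x + noise, and the attacker applies the minimum-mean-squared-error (MMSE) estimate. The privacy metric is then the attacker's mean squared error.

    ```python
    import random

    def attacker_mse(s2x, s2n, trials=100_000, seed=0):
        """Monte Carlo estimate of an attacker's mean squared error.

        Hypothetical toy model: private value x ~ N(0, s2x) is released
        as y = x + n with noise n ~ N(0, s2n). The attacker uses the
        Bayes-optimal (MMSE) estimate for this Gaussian setting,
        x_hat = s2x / (s2x + s2n) * y. The returned MSE serves as the
        privacy metric: larger error means more privacy.
        """
        rng = random.Random(seed)
        total = 0.0
        for _ in range(trials):
            x = rng.gauss(0.0, s2x ** 0.5)          # private value
            y = x + rng.gauss(0.0, s2n ** 0.5)      # noisy release
            x_hat = (s2x / (s2x + s2n)) * y         # attacker's MMSE estimate
            total += (x - x_hat) ** 2
        return total / trials

    # More release noise -> larger attacker error -> more privacy.
    low_noise = attacker_mse(s2x=1.0, s2n=0.1)
    high_noise = attacker_mse(s2x=1.0, s2n=1.0)
    ```

    In this Gaussian setting the metric has the closed form s2x * s2n / (s2x + s2n), so the simulation can be checked against theory; the framework in the paper is far more general, covering other estimators, loss functions and adversarial models.
    
    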