
    Sharing Non-Anonymous Costs of Multiple Resources Optimally

    In cost sharing games, the existence and efficiency of pure Nash equilibria fundamentally depend on the method that is used to share the resources' costs. We consider a general class of resource allocation problems in which a set of resources is used by a heterogeneous set of selfish users. The cost of a resource is a (non-decreasing) function of the set of its users. Under the assumption that the costs of the resources are shared by uniform cost sharing protocols, i.e., protocols that use only local information of the resource's cost structure and its users to determine the cost shares, we exactly quantify the inefficiency of the resulting pure Nash equilibria. Specifically, we show tight bounds on the prices of stability and anarchy for games with only submodular and only supermodular cost functions, respectively, and an asymptotically tight bound for games with arbitrary set functions. While all our upper bounds are attained for the well-known Shapley cost sharing protocol, our lower bounds hold for arbitrary uniform cost sharing protocols and are even valid for games with anonymous costs, i.e., games in which the cost of each resource only depends on the cardinality of the set of its users.
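    The Shapley cost sharing protocol mentioned above charges each user its average marginal contribution to the resource's cost over all arrival orders. A minimal sketch in Python, assuming the resource's cost is supplied as a set function (the `cost` callable and its interface are illustrative, not taken from the paper):

```python
from itertools import permutations

def shapley_cost_shares(users, cost):
    """Shapley cost shares for a single resource.
    `cost` maps a frozenset of users to the resource's cost
    (hypothetical interface; any non-decreasing set function works)."""
    shares = {u: 0.0 for u in users}
    perms = list(permutations(users))
    for order in perms:
        so_far = frozenset()
        for u in order:
            # marginal cost of u joining the users already present
            shares[u] += cost(so_far | {u}) - cost(so_far)
            so_far = so_far | {u}
    return {u: s / len(perms) for u, s in shares.items()}

# anonymous cost: depends only on the cardinality of the user set
shares = shapley_cost_shares(["a", "b", "c"], lambda S: len(S) ** 0.5)
# each share ≈ 0.577, i.e. cost(3)/3
```

    For anonymous costs, where the cost depends only on |S|, symmetry makes every Shapley share equal to cost(n)/n, as the example shows.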

    Strategic Payments in Financial Networks

    In their seminal work on systemic risk in financial markets, Eisenberg and Noe [Larry Eisenberg and Thomas Noe, 2001] proposed and studied a model with n firms embedded into a network of debt relations. We analyze this model from a game-theoretic point of view. Every firm is a rational agent in a directed graph that has an incentive to allocate payments in order to clear as much of its debt as possible. Each edge is weighted and describes a liability between the firms. We consider several variants of the game that differ in the permissible payment strategies. We study the existence and computational complexity of pure Nash and strong equilibria, and we provide bounds on the (strong) prices of anarchy and stability for a natural notion of social welfare. Our results highlight the power of financial regulation: if payments of insolvent firms can be centrally assigned, a socially optimal strong equilibrium can be found in polynomial time. In contrast, worst-case strong equilibria can be a factor of Ω(n) away from optimal, and, in general, computing a best response is an NP-hard problem. For more restrictive sets of strategies, we show that pure equilibria might not exist, and deciding their existence as well as computing them if they exist constitute NP-hard problems.
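    The Eisenberg-Noe clearing condition underlying this model can be computed by fixed-point iteration: each firm pays the minimum of its total liabilities and the assets it holds, including incoming payments. A sketch of the standard model (the data layout and function name are illustrative, not the authors' code):

```python
def clearing_vector(liabilities, external, tol=1e-10):
    """Eisenberg-Noe clearing payments by fixed-point iteration.
    liabilities[i][j] = debt of firm i to firm j,
    external[i]       = firm i's outside assets."""
    n = len(external)
    total = [sum(row) for row in liabilities]
    # relative liability matrix: fraction of i's payments going to j
    pi = [[liabilities[i][j] / total[i] if total[i] > 0 else 0.0
           for j in range(n)] for i in range(n)]
    p = total[:]  # start from full payment; iteration decreases to the fixed point
    while True:
        assets = [external[i] + sum(p[j] * pi[j][i] for j in range(n))
                  for i in range(n)]
        new_p = [min(total[i], assets[i]) for i in range(n)]
        if max(abs(new_p[i] - p[i]) for i in range(n)) < tol:
            return new_p
        p = new_p

# firm 0 owes 2 to firm 1 but holds only 1 in external assets
p = clearing_vector([[0, 2], [0, 0]], external=[1, 0])
# p ≈ [1.0, 0.0]: firm 0 clears only half of its debt
```

    Starting from full payment, this iteration converges to the greatest clearing vector; the game-theoretic variants in the paper then ask how firms strategically split such payments.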

    Diabetic microangiopathy in Type 1 (insulin-dependent) diabetic patients after successful pancreatic and kidney or solitary kidney transplantation

    To evaluate the beneficial effect of pancreatic grafting on peripheral microcirculation and long-term clinical outcome, we compared data from 28 Type 1 (insulin-dependent) diabetic patients given either a simultaneous pancreas and kidney graft or a solitary kidney graft (n=17). Peripheral microcirculation was assessed by transcutaneous oxygen pressure measurement (including the reoxygenation potential after blood flow occlusion), and erythrocyte flow velocity by a non-contact laser speckle method. All measured parameters differed significantly between diabetic and control subjects over a mean follow-up of 49 months (simultaneous pancreas and kidney transplantation) and 43 months (solitary kidney transplantation). Patients after simultaneous pancreas and kidney transplantation showed an improvement in transcutaneous oxygen pressure (rise from 46±2 mmHg to 63±3 mmHg), reoxygenation time (fall from 224±12 s to 114±6 s) and laser speckle measurement (rise from 4.2±1.7 to 5.6±1.8 relative units), whereas the control group with solitary kidney transplantation showed no such improvement. The improvement in microcirculation was more pronounced in patients with better microvascular preconditions. The results confirm that diabetic microangiopathy is positively influenced by pancreatic transplantation.

    Network Investment Games with Wardrop Followers

    We study a two-sided network investment game consisting of two sets of players, called providers and users. The game is set in two stages. In the first stage, providers aim to maximize their profit by investing in bandwidth of cloud computing services. The investments of the providers yield a set of usable services for the users. In the second stage, each user wants to process a task and therefore selects a bundle of services so as to minimize the total processing time. We assume the total processing time to be separable over the chosen services and the processing time of each service to depend on the utilization of the service and the installed bandwidth. We provide insights into how competition between providers affects the total costs of the users, and we show that, when analyzing the set of subgame perfect Nash equilibria, every game on a series-parallel graph can be reduced to an equivalent single-edge game.
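    In the second stage, the users settle into a Wardrop equilibrium: load spreads so that no user can reduce its processing time by switching services. For two parallel services with affine processing times this split equalizes the two times; a small illustrative sketch (the affine form a_i·x + b_i and all parameters are assumptions for the example, not the paper's model):

```python
def wardrop_two_links(d, a1, b1, a2, b2):
    """Wardrop equilibrium split of demand d over two parallel services
    with affine processing times a_i * x + b_i (followers' stage only;
    the providers' investment stage is not modeled in this sketch)."""
    # interior equilibrium: a1*x + b1 == a2*(d - x) + b2
    x = (a2 * d + b2 - b1) / (a1 + a2)
    x = min(max(x, 0.0), d)  # clamp if one service carries all demand
    return x, d - x

x1, x2 = wardrop_two_links(d=6, a1=1, b1=0, a2=2, b2=0)
# x1 = 4, x2 = 2: both services end up with processing time 4
```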

    Robust Flows over Time: Models and Complexity Results

    We study dynamic network flows with uncertain input data under a robust optimization perspective. In the dynamic maximum flow problem, the goal is to maximize the flow reaching the sink within a given time horizon T, while flow requires a certain travel time to traverse an edge. In our setting, we account for uncertain travel times of flow. We investigate maximum flows over time under the assumption that at most Γ travel times may be prolonged simultaneously due to delay. We develop and study a mathematical model for this problem. As the dynamic robust flow problem generalizes the static version, it is NP-hard to compute an optimal flow. However, our dynamic version is considerably more complex than the static version. We show that it is NP-hard to verify feasibility of a given candidate solution. Furthermore, we investigate temporally repeated flows and show that, in contrast to the non-robust case (that is, without uncertainties), they no longer provide optimal solutions for the robust problem, but rather yield a worst-case optimality gap of at least T. We finally show that the optimality gap is at most O(ηk log T), where η and k are newly introduced instance characteristics, and we provide a matching lower bound instance with optimality gap Ω(log T) and η = k = 1. The results obtained in this paper yield a first step towards understanding robust dynamic flow problems with uncertain travel times.
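    A temporally repeated flow pumps a static flow along its path decomposition for as long as the horizon permits; under the usual continuous-time convention, a path with rate f_P and travel time τ_P ships f_P·(T − τ_P) units. A hypothetical helper illustrating this value computation (not the paper's robust model, which additionally delays up to Γ travel times):

```python
def temporally_repeated_value(paths, T):
    """Value shipped by a temporally repeated flow with horizon T.
    `paths` holds (rate, travel_time) pairs from a static path
    decomposition (continuous-time convention; illustrative sketch)."""
    return sum(rate * (T - tau) for rate, tau in paths if tau <= T)

# two paths: rate 2 with travel time 3, rate 1 with travel time 5
value = temporally_repeated_value([(2, 3), (1, 5)], T=10)  # 2*7 + 1*5 = 19
```

    The robust difficulty comes precisely from the travel times τ_P no longer being fixed, which is why these flows lose their optimality in the uncertain setting.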

    A Greedy Algorithm for the Social Golfer and the Oberwolfach Problem

    Inspired by the increasing popularity of Swiss-system tournaments in sports, we study the problem of predetermining the number of rounds that can be guaranteed in a Swiss-system tournament. Matches of these tournaments are usually determined in a myopic round-based way, dependent on the results of previous rounds. Together with the hard constraint that no two players meet more than once during the tournament, at some point it might become infeasible to schedule a next round. For tournaments with n players and match sizes of k ≥ 2 players, we prove that we can always guarantee ⌊n/(k(k−1))⌋ rounds, and we show that this bound is tight. This provides a simple polynomial-time constant-factor approximation algorithm for the social golfer problem. We extend the results to the Oberwolfach problem and show that a simple greedy approach guarantees at least ⌊(n+4)/6⌋ rounds, which yields a polynomial-time 1/(3+ε)-approximation algorithm for any fixed ε > 0 for the Oberwolfach problem. Assuming that El-Zahar's conjecture is true, we improve the bound on the number of rounds to be essentially tight.
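    The flavor of such greedy scheduling is easy to sketch: in each round, repeatedly fill groups of size k with players who have not yet met, and stop once a full round can no longer be completed. This is a naive illustration only; the paper's algorithm is more careful and is what achieves the ⌊n/(k(k−1))⌋ guarantee:

```python
from itertools import combinations

def greedy_rounds(n, k):
    """Naive greedy Swiss-style scheduler (illustrative sketch):
    each round seats players into groups of size k so that no pair
    meets twice; stops when a full round cannot be formed."""
    met = set()  # unordered pairs that have already played together
    rounds = []
    while True:
        remaining = list(range(n))
        groups = []
        while len(remaining) >= k:
            group = [remaining.pop(0)]
            for p in remaining[:]:
                if len(group) == k:
                    break
                if all(frozenset((p, q)) not in met for q in group):
                    group.append(p)
                    remaining.remove(p)
            if len(group) < k:  # could not complete this group
                break
            groups.append(group)
        if len(groups) * k < n - (n % k):  # round leaves players unseated
            return rounds
        for g in groups:
            met.update(frozenset(pair) for pair in combinations(g, 2))
        rounds.append(groups)

# n = 4 players, matches of size k = 2: a full round-robin of 3 rounds
rounds = greedy_rounds(4, 2)
```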

    The effects of intelligence and education on the development of dementia

    A number of recent epidemiological studies have shown that the prevalence and incidence of dementia are increased in population strata with low compared to high levels of education. This has been explained as a consequence of a greater 'brain reserve capacity' in people with a high level of education. Theoretically, however, brain reserve capacity is better reflected by intelligence than by level of education. Thus, the emergence of dementia should be better predicted by low pre-morbid intelligence than by low education. This prediction was tested in a population-based sample of elderly subjects (N = 2063; age range 65-84; Amsterdam Study of the Elderly) who were followed over 4 years. Dementia was diagnosed using the Geriatric Mental State examination (GMS). Pre-morbid intelligence was measured using the Dutch Adult Reading Test (DART), a short reading test which gives a good estimate of verbal intelligence and is relatively insensitive to brain dysfunction. The effects of age, gender, occupational level, number of diseases affecting the central nervous system, and family history of dementia or extreme forgetfulness were also examined. Logistic regression analysis showed that low DART-IQ predicted incident dementia better than low level of education. A high occupational level (having been in charge of subordinates) had a protective effect. This result supports the brain reserve theory. It also indicates that low pre-morbid intelligence is an important risk factor for cognitive decline and dementia. Use of reading ability tests is to be preferred over years of education as an estimator of pre-morbid cognitive level in (epidemiological) dementia research.

    Competitive Packet Routing with Priority Lists

    In competitive packet routing games, packets are routed selfishly through a network, and scheduling policies at the edges determine which packets are forwarded first when an edge lacks the capacity to forward all of them at once. We analyze the impact of priority lists on the worst-case quality of pure Nash equilibria. A priority list is an ordered list of players that may or may not depend on the edge; whenever the number of packets entering an edge exceeds its inflow capacity, packets are processed in list order. We derive several new bounds on the price of anarchy and the price of stability for global and local priority policies. We also consider the complexity of computing an optimal priority list. It turns out that even for very restricted cases, e.g., routing on a tree, computing an optimal priority list is APX-hard.
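    The effect of a priority list on a single edge is simple to state: when more packets arrive than the inflow capacity allows, the highest-ranked waiting packets are forwarded and the rest are delayed by a step. A minimal illustrative sketch (the names and interface are hypothetical, not from the paper):

```python
def forward_order(waiting, priority, capacity):
    """One time step on one edge under a priority list:
    forward the `capacity` highest-priority waiting packets,
    delay the rest (illustrative sketch)."""
    rank = {p: i for i, p in enumerate(priority)}  # lower rank = earlier in list
    queue = sorted(waiting, key=lambda p: rank[p])
    return queue[:capacity], queue[capacity:]

sent, delayed = forward_order(["x", "y", "z"],
                              priority=["z", "x", "y"], capacity=2)
# 'z' and 'x' are forwarded; 'y' waits one step
```

    A global policy uses the same list on every edge, whereas a local policy may reorder the list per edge; the bounds in the abstract distinguish exactly these two cases.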