
    The Complexity of Rationalizing Network Formation

    We study the complexity of rationalizing network formation. In this problem we fix an underlying model describing how selfish parties (the vertices) produce a graph by making individual decisions to form or not form incident edges. The model is equipped with a notion of stability (or equilibrium), and we observe a set of "snapshots" of graphs that are assumed to be stable. From this we would like to infer some unobserved data about the system: edge prices, or how much each vertex values short paths to each other vertex. We study two rationalization problems arising from the network formation model of Jackson and Wolinsky [14]. When the goal is to infer edge prices, we observe that the rationalization problem is easy. The problem remains easy even when rationalizing prices do not exist and we instead wish to find prices that maximize the stability of the system. In contrast, when the edge prices are given and the goal is instead to infer valuations of each vertex by each other vertex, we prove that the rationalization problem becomes NP-hard. Our proof exposes a close connection between rationalization problems and the Inequality-SAT (I-SAT) problem. Finally, and most significantly, we prove that an approximation version of this NP-complete rationalization problem is NP-hard to approximate to within better than a 1/2 ratio. This shows that the trivial algorithm of setting everyone's valuations to infinity (which rationalizes all the edges present in the input graphs) or to zero (which rationalizes all the non-edges present in the input graphs) is the best possible, assuming P ≠ NP. To do this we prove a tight (1/2 + δ)-approximation hardness for a variant of I-SAT in which all coefficients are non-negative. This in turn follows from a tight hardness result for MAX-LIN_(R_+) (linear equations over the reals, with non-negative coefficients), which we prove by a (non-trivial) modification of the recent result of Guruswami and Raghavendra [10], which achieved tight hardness for this problem without the non-negativity constraint. Our technical contributions regarding the hardness of I-SAT and MAX-LIN_(R_+) may be of independent interest, given the generality of these problems.
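    To make the setting concrete, here is a minimal Python sketch, assuming the standard Jackson and Wolinsky connections model the abstract refers to: vertex i values vertex j at w[i][j], that value decays by a factor delta per hop along the shortest path, and i pays c[i][j] for each edge it is incident to. A snapshot is pairwise stable when no endpoint prefers to drop one of its edges and no missing edge would be added by mutual consent. The code and all names are illustrative, not taken from the paper.

```python
# A minimal sketch of the pairwise-stability check in the Jackson-Wolinsky
# connections model, written to make the abstract concrete; it is not the
# authors' algorithm, and delta, w, c and all function names are illustrative.
from itertools import combinations

def bfs_distances(n, edges, source):
    """Hop distances from `source` in the undirected graph on vertices 0..n-1."""
    adj = {i: set() for i in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    dist, frontier = {source: 0}, [source]
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    nxt.append(v)
        frontier = nxt
    return dist

def utility(i, n, edges, delta, w, c):
    """u_i(g) = sum_j delta**dist(i,j) * w[i][j] - sum of c[i][j] over i's edges.
    Vertices unreachable from i contribute nothing (delta**infinity -> 0)."""
    dist = bfs_distances(n, edges, i)
    benefit = sum(delta ** dist[j] * w[i][j] for j in dist if j != i)
    cost = sum(c[i][b if a == i else a] for a, b in edges if i in (a, b))
    return benefit - cost

def is_pairwise_stable(n, edges, delta, w, c):
    """Stable iff no endpoint gains by deleting one of its edges, and no absent
    edge would strictly benefit one endpoint without hurting the other."""
    g = {frozenset(e) for e in edges}
    u = lambda i, es: utility(i, n, [tuple(e) for e in es], delta, w, c)
    for e in g:                                 # profitable deletions?
        a, b = tuple(e)
        if u(a, g) < u(a, g - {e}) or u(b, g) < u(b, g - {e}):
            return False
    for a, b in combinations(range(n), 2):      # mutually agreeable additions?
        e = frozenset((a, b))
        if e in g:
            continue
        gain_a = u(a, g | {e}) - u(a, g)
        gain_b = u(b, g | {e}) - u(b, g)
        if (gain_a > 0 and gain_b >= 0) or (gain_b > 0 and gain_a >= 0):
            return False
    return True
```

    The rationalization problems the abstract studies run this check in reverse: given stable snapshots, find prices c (easy) or valuations w (NP-hard) under which every snapshot passes.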

    Computable Rationality, NUTS, and the Nuclear Leviathan

    This paper explores how the Leviathan that projects power through nuclear arms exercises a unique nuclearized sovereignty. In the case of nuclear superpowers, this sovereignty extends to wielding the power to destroy human civilization as we know it across the globe. Nuclearized sovereignty depends on a hybrid form of power encompassing human decision-makers in a hierarchical chain of command, and all of the technical and computerized functions necessary to maintain command and control at every moment of the sovereign's existence: this sovereign power cannot sleep. This article analyzes how the form of rationality that informs this hybrid exercise of power historically developed to be computable. By definition, computable rationality must be able to function without any intelligible grasp of the context or the comprehensive significance of decision-making outcomes. Thus, maintaining nuclearized sovereignty requires the ability to execute momentous life-and-death decisions without the type of sentience we usually associate with ethical individual and collective decisions.

    The computational complexity of rationalizing Pareto optimal choice behavior

    We consider a setting where a coalition of individuals chooses one or several alternatives from each set in a collection of choice sets. We examine the computational complexity of Pareto rationalizability. Pareto rationalizability requires that we can endow each individual in the coalition with a preference relation such that the observed choices are Pareto efficient. We differentiate between the situation where the choice function is considered to select all Pareto optimal alternatives from a choice set and the situation where its choices only contain one or several Pareto optimal alternatives. In the former case we find that Pareto rationalizability is an NP-complete problem. For the latter case we demonstrate that, if we have no additional information on the individual preference relations, then all choice behavior is Pareto rationalizable. However, if we have such additional information, then Pareto rationalizability is again NP-complete. Our results are valid for any coalition of size greater than or equal to two.
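    As a concrete illustration of the certificate being sought, the Python sketch below (not from the paper; all names are illustrative) covers the second interpretation above: given candidate preference relations for the coalition members, encoded as numeric scores, it checks that every observed choice is Pareto optimal within its choice set. The hardness results concern finding such preferences, not checking them.

```python
# Illustrative sketch (not the authors' construction): verify that, under given
# candidate preferences, every observed choice is Pareto optimal in its choice set.
# `preferences` maps each individual to a dict assigning higher numbers to more
# preferred alternatives; `observations` is a list of (choice_set, chosen) pairs.

def pareto_dominates(x, y, preferences):
    """x Pareto-dominates y if everyone weakly prefers x and someone strictly does."""
    weakly = all(pref[x] >= pref[y] for pref in preferences.values())
    strictly = any(pref[x] > pref[y] for pref in preferences.values())
    return weakly and strictly

def choices_are_pareto_rational(observations, preferences):
    """Check that no chosen alternative is Pareto-dominated within its choice set."""
    for choice_set, chosen in observations:
        for y in chosen:
            if any(pareto_dominates(x, y, preferences) for x in choice_set):
                return False
    return True

# Example: two individuals, one choice set {a, b, c}, observed choice {a, b}.
prefs = {
    1: {"a": 3, "b": 1, "c": 2},
    2: {"a": 1, "b": 3, "c": 2},
}
obs = [({"a", "b", "c"}, {"a", "b"})]
print(choices_are_pareto_rational(obs, prefs))  # True: neither a nor b is dominated
```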

    Beyond Dualisms in Methodology: An Integrative Design Research Medium "MAPS" and some Reflections

    Design research is an academic issue and increasingly an essential success factor for industrial, organizational and social innovation. The fierce rejection of 1st generation design methods in the early 1970s resulted in the postmodernist attitude of "no methods", and subsequently, after more than a decade, in the strong adoption of scientific methods, or "the" scientific method, for design research. The current situation regarding methodology is characterized by unproductive dualisms such as scientific methods vs. designerly methods, normative methods vs. descriptive methods, research vs. design. The potential of the early (1st generation) methods is neglected and the practical usefulness of design research is impeded. The suggestion for 2nd generation methods as discussed by Rittel and others has hardly been taken up in design. The development of a methodological tool / medium for research through design – MAPS – (which is the central part of the paper) presents the cause and catalyst for some reflections about the usability / desirability / usefulness of methodical support for the design (research) process. Keywords: Integrative Design Research Medium, Research Through Design, MAPS, Methodology

    Functional impairment of human resident cardiac stem cells by the cardiotoxic antineoplastic agent trastuzumab

    Trastuzumab (TZM), a monoclonal antibody against the ERBB2 protein, increases survival in ERBB2-positive breast cancer patients. Its clinical use, however, is limited by cardiotoxicity. We sought to evaluate whether TZM cardiotoxicity involves inhibition of human adult cardiac-derived stem cells, in addition to previously reported direct adverse effects on cardiomyocytes. To test this idea, we exposed human cardiosphere-derived cells (hCDCs), a natural mixture of cardiac stem cells and supporting cells that has been shown to exert potent regenerative effects, to TZM and tested the effects in vitro and in vivo. We found that ERBB2 mRNA and protein are expressed in hCDCs at levels comparable to those in human myocardium. Although clinically relevant concentrations of TZM had no effect on proliferation, apoptosis, or size of the c-kit-positive hCDC subpopulation, in vitro assays demonstrated diminished potential for cardiogenic differentiation and impaired ability to form microvascular networks in TZM-treated cells. The functional benefit of hCDCs injected into the border zone of acutely infarcted mouse hearts was abrogated by TZM: infarcted animals treated with TZM + hCDCs had a lower ejection fraction, thinner infarct scar, and reduced capillary density in the infarct border zone compared with animals that received hCDCs alone (n = 12 per group). Collectively, these results indicate that TZM inhibits the cardiomyogenic and angiogenic capacities of hCDCs in vitro and abrogates the morphological and functional benefits of hCDC transplantation in vivo. Thus, TZM impairs the function of human resident cardiac stem cells, potentially contributing to TZM cardiotoxicity.

    Rethinking the patient: using Burden of Treatment Theory to understand the changing dynamics of illness

    Background: In this article we outline Burden of Treatment Theory, a new model of the relationship between sick people, their social networks, and healthcare services. Health services face the challenge of growing populations with long-term and life-limiting conditions, and they have responded by delegating to sick people and their networks routine work aimed at managing symptoms, and at retarding - and sometimes preventing - disease progression. This is the new proactive work of patient-hood for which patients are increasingly accountable: founded on ideas about self-care, self-empowerment, and self-actualization, and on new technologies and treatment modalities which can be shifted from the clinic into the community. These place new demands on sick people, which they may experience as burdens of treatment. Discussion: As the burdens accumulate, some patients are overwhelmed, and the consequences are likely to be poor healthcare outcomes for individual patients, increasing strain on caregivers, and rising demand and costs of healthcare services. In the face of these challenges we need to better understand the resources that patients draw upon as they respond to the demands of both burdens of illness and burdens of treatment, and the ways that resources interact with healthcare utilization. Summary: Burden of Treatment Theory is oriented to understanding how capacity for action interacts with the work that stems from healthcare. Burden of Treatment Theory is a structural model that focuses on the work that patients and their networks do. It thus helps us understand variations in healthcare utilization and adherence in different healthcare settings and clinical contexts.