
    Near-Optimal Scheduling for LTL with Future Discounting

    We study the search problem for optimal schedulers for linear temporal logic (LTL) with future discounting. The logic, introduced by Almagor, Boker and Kupferman, is a quantitative variant of LTL in which an event in the far future makes only a discounted contribution to a truth value (a real number in the unit interval [0, 1]). The precise problem we study, which arises naturally e.g. when searching for a scheduler that recovers from an internal error state as soon as possible, is the following: given a Kripke frame, a formula and a number in [0, 1] called a margin, find a path of the Kripke frame that is optimal with respect to the formula up to the prescribed margin (a truly optimal path may not exist). We present an algorithm for the problem; it works even in the extended setting with propositional quality operators, a setting where (threshold) model checking is known to be undecidable.
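    The core idea of the discounting semantics can be illustrated with a minimal sketch (not the paper's algorithm): under an exponential discounting function eta(i) = lambda**i, the value of a discounted "eventually" on a path is the supremum over positions of the discounted satisfaction at that position. The trace encoding and function names below are illustrative assumptions.

```python
def discounted_eventually(trace, prop, lam=0.5):
    """Value of discounted 'eventually prop': sup over positions i of
    eta(i) * [prop holds at position i], with eta(i) = lam**i."""
    return max((lam ** i) * (1.0 if prop in state else 0.0)
               for i, state in enumerate(trace))

# Example: 'p' first holds at position 2, so the value is 0.5**2 = 0.25;
# satisfying 'p' later would only earn a smaller, more heavily discounted value.
trace = [set(), {"q"}, {"p"}, {"p", "q"}]
print(discounted_eventually(trace, "p"))  # 0.25
```

    This makes concrete why an optimal path may not exist and why a margin is needed: values form an infinite set in [0, 1] whose supremum need not be attained by any single path.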

    A methodology pruning the search space of six compiler transformations by addressing them together as one problem and by exploiting the hardware architecture details

    Today’s compilers offer a plethora of optimizing transformations, and the choice, ordering and parameterization of these transformations have a large impact on performance. Choosing the correct order and parameters of optimizations has been a long-standing problem in compilation research that until now remains unsolved; optimizing each sub-problem separately gives a different schedule/binary for each sub-problem, and these schedules cannot coexist, as refining one degrades the others. Researchers have tried to solve this problem with iterative compilation techniques, but the search space is so large that it cannot be searched even on modern supercomputers. Moreover, compiler transformations do not take hardware architecture details and data reuse into account in an efficient way. In this paper, a new iterative compilation methodology is presented which reduces the search space of six compiler transformations by addressing the above problems; the search space is reduced by many orders of magnitude, so an efficient solution can now be found. The transformations are the following: loop tiling (including the number of tiling levels), loop unrolling, register allocation, scalar replacement, loop interchange and data array layouts. The search space is reduced (a) by addressing the aforementioned transformations together as one problem rather than separately, and (b) by taking into account the target hardware architecture details (e.g., cache size and associativity) and algorithm characteristics (e.g., data reuse). The proposed methodology has been evaluated against iterative compilation and the gcc/icc compilers, on both embedded and general-purpose processors; it achieves significant performance gains at many orders of magnitude lower compilation time.
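    One pruning idea from the abstract can be sketched as follows: rather than searching all tile sizes, keep only tilings whose working set fits in the cache. The cache size, element width and the three-array working-set model (as for a tiled matrix multiply) are assumptions for illustration, not the paper's exact feasibility conditions.

```python
def feasible_tiles(n, cache_bytes=32 * 1024, elem=8, arrays=3):
    """Tile sizes T dividing n whose working set arrays*T*T*elem fits in cache."""
    candidates = [t for t in range(2, n + 1) if n % t == 0]
    return [t for t in candidates if arrays * t * t * elem <= cache_bytes]

# For n = 1024 there are 10 candidate divisors, but a 32 KiB cache with
# 8-byte elements and three live tiles leaves only 5 feasible sizes.
full = [t for t in range(2, 1025) if 1024 % t == 0]
kept = feasible_tiles(1024)
print(len(full), len(kept))  # 10 5
```

    Combining such constraints across all six transformations at once, instead of per transformation, is what shrinks the joint search space by orders of magnitude.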

    Racism and hate speech – A critique of Scanlon’s Contractual Theory

    The First Amendment is an important value in American liberal polity. Under this value, racism, hate speech and offensive speech are protected speech. This article scrutinizes one of the clearest representatives of the American liberal polity, Thomas Scanlon, and tracks the developments in his theory over the years. It is argued that Scanlon’s arguments downplay the tangible harm that speech might inflict on its target victim audience. Scanlon’s distinction between participant interests, audience interests, and the interests of bystanders is put under close scrutiny. The article criticizes viewpoint neutrality and suggests a balancing approach, further arguing that democracy is required to develop protective mechanisms against harm-facilitating speech as well as profound offences. Both should be taken most seriously.

    A weakness measure for GR(1) formulae

    In spite of the theoretical and algorithmic developments for system synthesis in recent years, little effort has been dedicated to quantifying the quality of the specifications used for synthesis. When dealing with unrealizable specifications, finding the weakest environment assumptions that would ensure realizability is typically a desirable property; in such a context the weakness of the assumptions is a major quality parameter. The question of whether one assumption is weaker than another is commonly interpreted using implication or, equivalently, language inclusion. However, this interpretation does not provide any further insight into the weakness of assumptions when implication does not hold. To our knowledge, the only measure capable of comparing two formulae in this case is entropy, but even it fails to provide a sufficiently refined notion of weakness in the case of GR(1) formulae, a subset of linear temporal logic formulae which is of particular interest in controller synthesis. In this paper we propose a more refined measure of weakness based on the Hausdorff dimension, a concept that captures the notion of size of the omega-language satisfying a linear temporal logic formula. We identify the conditions under which this measure is guaranteed to distinguish between weaker and stronger GR(1) formulae. We evaluate our proposed weakness measure in the context of computing GR(1) assumption refinements.
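    To give a feel for the measure: for omega-languages recognised by a strongly connected safety automaton over an alphabet Sigma, the Hausdorff dimension can be computed as log(rho)/log(|Sigma|), where rho is the spectral radius of the automaton's transition-count matrix. This is a hedged sketch of that standard construction, not the paper's method; the example automaton (over {a, b}, forbidding two consecutive b's) is an illustrative assumption.

```python
import math

def spectral_radius(M, iters=200):
    """Power iteration on a nonnegative square matrix M (list of rows)."""
    n = len(M)
    v = [1.0] * n
    rho = 0.0
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        rho = max(w)
        v = [x / rho for x in w]
    return rho

# States: 0 = last letter was 'a' (or start), 1 = last letter was 'b'.
# Transition counts: from state 0 we may read a or b; from state 1 only a.
M = [[1, 1],
     [1, 0]]
dim = math.log(spectral_radius(M)) / math.log(2)
print(round(dim, 4))  # log2 of the golden ratio, about 0.6942
```

    A strictly weaker assumption admits more behaviours, hence a larger language and (under the identified conditions) a larger dimension, which is what lets the measure rank formulae even when implication fails.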

    Taxonomy and structure of the Romanian personality lexicon

    We identified 1746 personality-relevant trait-adjectives in a Romanian dictionary, of which 412 were classified as descriptors of dispositions by 10 judges. Self-ratings were collected from 515 participants on those 412 adjectives, and the ratings were factored using principal components analysis. Solutions with different numbers of factors were analysed. The two- and three-factor solutions, respectively, confirmed the Big Two and Big Three of personality traits. A five-factor solution reflected the Big Five model with a fifth factor emphasising Rebelliousness versus Conventionality. The five-factor solution was related to the International Personality Item Pool-Big Five scales, and the highest correlations were indeed between the corresponding factors and scales. A six-factor solution was indicative of the six-factor model as expressed in the HEXACO model, yet with a weak Honesty-Humility factor. An additional analysis with self-ratings from 218 participants on marker scales for the six-factor solution and on the six scales of the HEXACO did not produce a clear one-to-one correspondence between the two sets of scales, confirming that the six-factor model was only partially recovered.

    Kinetic regulation of multi-ligand binding proteins

    Background: Second messengers, such as calcium, regulate the activity of multisite binding proteins in a concentration-dependent manner. For example, calcium binding has been shown to induce conformational transitions in the calcium-dependent protein calmodulin under steady-state conditions. However, intracellular concentrations of these second messengers are often subject to rapid change. The mechanisms underlying dynamic ligand-dependent regulation of multisite proteins require further elucidation. Results: In this study, a computational analysis of multisite protein kinetics in response to rapid changes in ligand concentrations is presented. Two major physiological scenarios are investigated: i) ligand is abundant, so ligand-multisite protein binding does not affect the free ligand concentration; ii) total ligand concentration is of the same order of magnitude as that of the interacting multisite protein and does not change, so buffering effects significantly influence the amount of free ligand. For each of these scenarios, the influence of the number of binding sites, the temporal behaviour of intermediate, apo- and fully saturated conformations, and the multisite regulatory effects on target proteins are investigated. Conclusions: The developed models allow for a novel and accurate interpretation of concentration- and pressure-jump-dependent kinetic experiments. The presented model makes predictions for the temporal distribution of multisite protein conformations in complex with variable numbers of ligands. Furthermore, it derives the characteristic time and the dynamics of the kinetic responses elicited by a ligand concentration change as a function of ligand concentration and the number of ligand binding sites. Effector proteins regulated by multisite ligand binding are shown to depend on ligand concentration in a highly nonlinear fashion.
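    Scenario (i) above can be sketched with a minimal model: ligand L is abundant, so binding does not deplete it, and a protein with two identical, independent sites steps through sequential occupancy states P0 <-> P1 <-> P2. All rate constants, the Euler step and the two-site choice are illustrative assumptions, not the paper's fitted model.

```python
def simulate(L, kon=1.0, koff=0.1, t_end=50.0, dt=1e-3):
    """Occupancies [P0, P1, P2] after a concentration jump to ligand level L."""
    p = [1.0, 0.0, 0.0]                      # start fully unbound (apo)
    for _ in range(int(t_end / dt)):
        # statistical factors for 2 identical sites: 2 ways to bind from P0,
        # 2 ways to unbind from P2
        f01, f12 = 2 * kon * L * p[0], kon * L * p[1]
        b10, b21 = koff * p[1], 2 * koff * p[2]
        p = [p[0] + dt * (b10 - f01),
             p[1] + dt * (f01 + b21 - b10 - f12),
             p[2] + dt * (f12 - b21)]
    return p

p = simulate(L=1.0)
print(round(p[2], 3))  # at high ligand the fully saturated form dominates
```

    Even this toy model shows the nonlinearity the abstract refers to: the fully saturated fraction approaches theta**2 for per-site occupancy theta = kon*L/(kon*L + koff), so the response to ligand is steeper than for a single-site protein.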

    Bioethical implications of end-of-life decision-making in patients with dementia: a tale of two societies

    End-of-life decision-making in patients with dementia is a complex topic. Belgium and the Netherlands have been at the forefront of legislative advancement and progressive societal changes concerning the perspectives toward physician-assisted death (PAD). Careful consideration of clinical and social aspects is essential during the end-of-life decision-making process in patients with dementia. Geriatric assent provides the physician, the patient and his family the opportunity to end life with dignity. Unbearable suffering, decisional competence, and awareness of memory deficits are among the clinical considerations that physicians should incorporate during the end-of-life decision-making process. However, as other societies introduce legislature granting the right of PAD, new social determinants should be considered; Mexico City is an example. Current perspectives regarding advance euthanasia directives (AED) and PAD in patients with dementia are evolving. A new perspective that hinges on the role of the family and geriatric assent should help culturally heterogeneous societies in the transition of their public health care policies regarding end-of-life choices.

    The World Federation of ADHD International Consensus Statement: 208 evidence-based conclusions about the disorder

    Background: Misconceptions about ADHD stigmatize affected people, reduce the credibility of providers, and prevent or delay treatment. To challenge misconceptions, we curated findings with a strong evidence base. Methods: We reviewed studies with more than 2000 participants, or meta-analyses drawing on five or more studies or 2000 or more participants. We excluded meta-analyses that did not assess publication bias, except for meta-analyses of prevalence. For network meta-analyses we required comparison-adjusted funnel plots. We excluded treatment studies with waiting-list or treatment-as-usual controls. From this literature, we extracted evidence-based assertions about the disorder. Results: We generated 208 empirically supported statements about ADHD. The status of the included statements as empirically supported is approved by 80 authors from 27 countries and 6 continents. The contents of the manuscript are endorsed by 366 people who have read this document and agree with its contents. Conclusions: Many findings in ADHD are supported by meta-analyses. These allow for firm statements about the nature, course, outcomes, causes, and treatments of the disorder that are useful for reducing misconceptions and stigma.