
    Two-Stage Technique for LTLf Synthesis Under LTL Assumptions

    In synthesis, assumptions are constraints on the environment that rule out certain environment behaviors. A key observation is that even if we consider systems with LTLf goals on finite traces, assumptions need to be expressed over infinite traces, using LTL on infinite traces, since the decision to stop the trace is controlled by the agent. To solve synthesis of LTLf goals under LTL assumptions, we could reduce the problem to LTL synthesis. Unfortunately, while synthesis in LTLf and in LTL have the same worst-case complexity (both are 2EXPTIME-complete), the algorithms available for LTL synthesis perform much worse in practice than those for LTLf synthesis. Recently, it has been shown that for basic forms of fairness and stability assumptions we can avoid such a detour to LTL and keep the simplicity of LTLf synthesis. In this paper, we generalize these results and show how to effectively handle arbitrary LTL assumptions. Specifically, we devise a two-stage technique for solving LTLf synthesis under general LTL assumptions and show empirically that this technique performs much better than standard LTL synthesis.
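    The distinction the abstract draws rests on LTLf semantics: formulas are evaluated on finite traces, where Next is false at the last position. The following minimal evaluator is an illustrative sketch (the tuple-based formula encoding and trace representation are assumptions made here, not the paper's notation):

    ```python
    # Minimal LTLf (LTL over finite traces) evaluator, for illustration only.
    # A trace is a list of sets of atomic propositions; a formula is a nested
    # tuple: ('ap', p), ('not', f), ('and', f, g), ('next', f),
    # ('until', f, g), ('even', f) for F, ('glob', f) for G.

    def holds(f, trace, i=0):
        """Check whether formula f holds on the trace suffix starting at i."""
        op = f[0]
        if op == 'ap':
            return f[1] in trace[i]
        if op == 'not':
            return not holds(f[1], trace, i)
        if op == 'and':
            return holds(f[1], trace, i) and holds(f[2], trace, i)
        if op == 'next':
            # Finite-trace semantics: X f is false at the last position.
            return i + 1 < len(trace) and holds(f[1], trace, i + 1)
        if op == 'until':
            return any(holds(f[2], trace, k) and
                       all(holds(f[1], trace, j) for j in range(i, k))
                       for k in range(i, len(trace)))
        if op == 'even':   # F f  ==  true U f
            return any(holds(f[1], trace, k) for k in range(i, len(trace)))
        if op == 'glob':   # G f
            return all(holds(f[1], trace, k) for k in range(i, len(trace)))
        raise ValueError(f"unknown operator {op!r}")

    trace = [{'req'}, set(), {'grant'}]
    print(holds(('even', ('ap', 'grant')), trace))   # True: grant eventually holds
    print(holds(('glob', ('ap', 'req')), trace))     # False: req fails at position 1
    ```

    Assumptions such as fairness (e.g. GF p), by contrast, only have meaningful semantics on infinite traces, which is why they must be stated in LTL rather than LTLf.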

    How Rich is the Population of Binary Radio Pulsars with Black Holes?

    Using the "Scenario Machine", we have carried out a population synthesis of radio pulsars in binaries with black holes (BH+Psr) under the widest range of assumptions about stellar mass loss during evolution, the mass-ratio distribution of binary stars, kick velocity, and the envelope mass lost during collapse. Our purpose is to show that, for any plausible parameters of the evolutionary scenario, the BH+Psr population should be abundant in the Galaxy. It is shown that in all models, including those based on the evolutionary calculations of Heger et al. (2002), Woosley et al. (2002), and Heger et al. (2003), the expected number of black holes paired with radio pulsars is large enough for such systems to be discovered within the next few years. Comment: 8 pages, 4 figures, accepted to MNRAS
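    The population-synthesis method the abstract relies on is Monte Carlo: draw binaries from assumed distributions and count those surviving the selection criteria. The toy sketch below is purely illustrative; every distribution, scale, and threshold in it is invented for this example and is far simpler than the full binary-evolution modeling of the "Scenario Machine":

    ```python
    import random

    # Toy Monte Carlo population-synthesis sketch (illustrative only): draw
    # binaries from assumed distributions and count those passing hypothetical
    # BH+PSR cuts. All numbers below are invented for illustration.
    random.seed(0)

    def draw_binary():
        m1 = random.paretovariate(1.35) * 8.0   # primary mass (Msun), toy IMF tail
        q = random.uniform(0.1, 1.0)            # mass ratio m2/m1
        kick = random.expovariate(1 / 200.0)    # natal kick (km/s), toy scale
        return m1, m1 * q, kick

    def is_bh_psr(m1, m2, kick):
        # Toy criteria: primary massive enough to form a BH, secondary massive
        # enough for a NS, and the kick small enough not to unbind the system.
        return m1 > 25.0 and m2 > 8.0 and kick < 150.0

    n = 100_000
    count = sum(is_bh_psr(*draw_binary()) for _ in range(n))
    print(f"BH+PSR candidates per {n} binaries: {count}")
    ```

    Varying the assumed distributions and cuts across a grid, as the paper does with its scenario parameters, then shows how robust the predicted abundance is.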

    A weakness measure for GR(1) formulae

    In spite of the theoretical and algorithmic developments for system synthesis in recent years, little effort has been dedicated to quantifying the quality of the specifications used for synthesis. When dealing with unrealizable specifications, finding the weakest environment assumptions that would ensure realizability is typically a desirable property; in this context, the weakness of the assumptions is a major quality parameter. The question of whether one assumption is weaker than another is commonly interpreted using implication or, equivalently, language inclusion. However, this interpretation does not provide any further insight into the weakness of assumptions when implication does not hold. To our knowledge, the only measure capable of comparing two formulae in this case is entropy, but even it fails to provide a sufficiently refined notion of weakness for GR(1) formulae, a fragment of linear temporal logic of particular interest in controller synthesis. In this paper we propose a more refined measure of weakness based on the Hausdorff dimension, a concept that captures the notion of size of the omega-language satisfying a linear temporal logic formula. We identify the conditions under which this measure is guaranteed to distinguish between weaker and stronger GR(1) formulae. We evaluate our proposed weakness measure in the context of computing GR(1) assumption refinements.
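    Entropy-style size measures for regular languages, which the abstract contrasts with the Hausdorff dimension, are typically obtained from the spectral radius of an automaton's adjacency matrix: the number of distinct length-n words grows like the radius to the n-th power. The sketch below illustrates only this generic growth-rate computation, not the paper's algorithm; the 3-state automaton and binary alphabet are hypothetical:

    ```python
    import numpy as np

    # Illustrative growth-rate computation for a regular language: the number
    # of accepted length-n words of a trimmed deterministic automaton grows as
    # radius**n, where radius is the spectral radius of its adjacency matrix.
    # Entropy-style measures are then normalised logarithms of that radius.
    # This 3-state adjacency matrix (edge counts between states) is invented.
    A = np.array([[1, 1, 0],
                  [0, 1, 1],
                  [1, 0, 1]], dtype=float)

    radius = max(abs(np.linalg.eigvals(A)))      # spectral radius
    measure = np.log2(radius) / np.log2(2)       # normalise by log |alphabet|, |Sigma| = 2
    print(round(float(measure), 3))              # 1.0: the language has full growth rate
    ```

    The paper's point is that such a single scalar can coincide for two non-equivalent formulae; its Hausdorff-dimension-based measure is designed to discriminate GR(1) assumptions more finely.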

    The New Keynesian Approach to Business Cycle Theory: Nominal and Real Rigidities

    At the heart of the Neoclassical synthesis lies the assumption that prices do not adjust instantly to equilibrate supply and demand. Under these circumstances, once the synthesis failed, economists naturally started to investigate whether the imperfect adjustment of prices could be logically inferred from realistic assumptions regarding the microeconomic environment, and subsequent research led to a variety of new non-walrasian theories regarding the functioning of markets. Thus, the non-walrasian analyses of the labour market suggested that wages could perform functions other than equilibrating labour supply and demand. For instance, in models focused on labour contracts, wages are regarded as an "insurance" provided by the employer to the workers, while in efficiency wage models, wages are determinants of labour productivity. Such models have the ability to account for unemployment, but they are not able to explain the failure of the classical dichotomy. The paper aims to investigate the theoretical progress achieved during the past three decades, to clarify nominal and real rigidities and evaluate their impact on the business cycle, and finally, to identify the theoretical aspects which need further analysis and refinement.
    Keywords: nominal rigidities, real rigidities, menu costs, efficiency wages, near rationality.