
    Measuring Communication in Parallel Communicating Finite Automata

    Systems of deterministic finite automata communicating by sending their states upon request are investigated when the amount of communication is restricted. The computational power and decidability properties are studied for the case of returning centralized systems in which the number of necessary communications during the computations of the system is bounded by a function depending on the length of the input. It is proved that an infinite hierarchy of language families exists, depending on the number of messages sent during their most economical recognitions. Moreover, several properties are shown not to be semi-decidable for the systems under consideration.
    Comment: In Proceedings AFL 2014, arXiv:1405.527
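
    As a purely illustrative sketch (the automata, alphabet, and acceptance condition below are invented, not taken from the paper), the following Python snippet mimics a returning centralized system: a master automaton and one non-master component read the input in parallel, the master requests the component's state on certain symbols, the queried component returns to its initial state after answering, and the number of communications is counted.

    # Illustrative sketch of a "returning centralized" system of two DFAs:
    # the master may request the non-master component's current state, and the
    # queried component resets to its initial state after answering. The number
    # of communications is counted. Everything here is a toy example.

    class DFA:
        def __init__(self, start, transitions, accepting):
            self.start = start
            self.state = start
            self.transitions = transitions      # dict: (state, symbol) -> state
            self.accepting = accepting

        def step(self, symbol):
            self.state = self.transitions[(self.state, symbol)]

    def run_returning_centralized(word):
        # Component tracks the parity of 'a's; the master queries it on every 'b'.
        component = DFA("even",
                        {("even", "a"): "odd", ("odd", "a"): "even",
                         ("even", "b"): "even", ("odd", "b"): "odd"},
                        accepting={"even"})
        master = DFA("q0",
                     {("q0", "a"): "q0", ("q0", "b"): "q1",
                      ("q1", "a"): "q1", ("q1", "b"): "q1"},
                     accepting={"q1"})
        communications = 0
        last_answer = None
        for symbol in word:
            master.step(symbol)
            component.step(symbol)
            if symbol == "b":                       # master requests the component's state
                communications += 1
                last_answer = component.state
                component.state = component.start   # "returning": reset after answering
        accepted = master.state in master.accepting and last_answer == "even"
        return accepted, communications

    print(run_returning_centralized("aab"))   # -> (True, 1)

    The restriction studied in the paper corresponds to bounding such a communication counter by a function of the input length.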

    Reliability of Public Private Partnership Projects under Assumptions of Cash Flow Volatility

    This paper focuses on dynamic financial modelling of recurring cash flow items in PPP projects in the operating stage and on the risks associated with the volatility of these cash flows. Since we concentrate on so-called government-pays schemes, only cash outflows are considered, such as operating costs, repair and maintenance expenses, and administration costs, whereas the revenue side is treated as not at risk. We show different approaches to modelling the uncertainty of recurring operating expenses and explain how to interpret the results. Our analysis is based on the mathematical framework of stochastic processes, which in finance are used in particular to describe the evolution of price series in capital markets. We apply them to generate variable trajectories of operating costs and integrate these trajectories into a stochastic simulation of the financial model.
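
    As a minimal sketch of this kind of analysis (assuming, as one possible modelling choice rather than the paper's, that annual operating costs follow a geometric Brownian motion; all parameter values are illustrative), one could generate cost trajectories and summarise the resulting cash-outflow risk as follows:

    # Hedged sketch: Monte Carlo simulation of recurring operating costs modelled
    # as a geometric Brownian motion (one possible stochastic process among the
    # approaches the paper discusses). All parameter values are illustrative.
    import numpy as np

    rng = np.random.default_rng(42)

    def simulate_cost_paths(c0=1.0e6, mu=0.02, sigma=0.10, years=25, n_paths=10_000):
        """Return an (n_paths, years) array of annual operating-cost trajectories."""
        dt = 1.0
        log_increments = rng.normal((mu - 0.5 * sigma**2) * dt,
                                    sigma * np.sqrt(dt),
                                    size=(n_paths, years))
        return c0 * np.exp(np.cumsum(log_increments, axis=1))

    def discounted_total(paths, rate=0.04):
        """Present value of each cost trajectory at a flat discount rate."""
        years = paths.shape[1]
        factors = (1.0 + rate) ** -np.arange(1, years + 1)
        return paths @ factors

    paths = simulate_cost_paths()
    pv = discounted_total(paths)
    print(f"mean PV of operating costs:  {pv.mean():,.0f}")
    print(f"95th percentile (cost risk): {np.percentile(pv, 95):,.0f}")

    The spread of the discounted totals, rather than a single deterministic estimate, is what quantifies the cash-flow risk borne in a government-pays scheme.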

    Detecting an exciton crystal by statistical means

    We investigate an ensemble of excitons in a coupled quantum well excited via an applied laser field. Using an effective disordered quantum Ising model, we perform a numerical simulation of the experimental procedure and calculate the probability distribution function P(M) to create M excitons, as well as their correlation function. It shows clear evidence of the existence of two phases corresponding to a liquid and a crystal phase. We demonstrate that not only the correlation function but also the distribution P(M) is very well suited to monitor this transition.
    Comment: 5 pages, 5 figures
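
    The statistical diagnostic can be illustrated schematically (the toy lattice model and parameters below are invented for illustration and are not the disordered quantum Ising model used in the paper): sample many shots, record the exciton number M in each, and compare the width of the resulting counting distribution P(M) for an uncorrelated, liquid-like filling and a blockaded, crystal-like filling.

    # Schematic illustration (not the paper's model): exciton-number statistics on
    # a small 1D lattice for (i) independent, liquid-like occupation and (ii) a
    # crude nearest-neighbour blockade standing in for crystal-like order.
    import numpy as np

    rng = np.random.default_rng(0)
    N_SITES, N_SHOTS, P_OCC = 40, 20_000, 0.5

    def shot_liquid():
        # Each site is occupied independently with probability P_OCC.
        return rng.random(N_SITES) < P_OCC

    def shot_crystal():
        # Same filling attempt, but an occupation is rejected whenever the left
        # neighbour is already occupied, suppressing relative number fluctuations.
        occ = np.zeros(N_SITES, dtype=bool)
        for i in range(N_SITES):
            if rng.random() < P_OCC and not (i > 0 and occ[i - 1]):
                occ[i] = True
        return occ

    def counting_statistics(shot_fn):
        counts = np.array([shot_fn().sum() for _ in range(N_SHOTS)])
        p_of_m = np.bincount(counts, minlength=N_SITES + 1) / N_SHOTS   # estimate of P(M)
        return counts.mean(), counts.var(), p_of_m

    for name, fn in [("liquid-like", shot_liquid), ("crystal-like", shot_crystal)]:
        mean_m, var_m, _ = counting_statistics(fn)
        print(f"{name:12s}  <M> = {mean_m:5.2f}   Var(M)/<M> = {var_m / mean_m:4.2f}")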

    Requirements Quality Is Quality in Use

    The quality of requirements engineering artifacts is widely considered a success factor for software projects. Currently, the definition of high-quality or good RE artifacts is often provided through normative references, such as quality standards, textbooks, or generic guidelines. We see various problems with such normative references: (1) it is hard to ensure that the contained rules are complete, (2) the contained rules are not context-dependent, and (3) the standards lack precise reasoning for why certain criteria are considered bad quality. To change this understanding, we postulate that creating an RE artifact is rarely an end in itself, but rather a means to understand and reach the project's goals. Following this line of thought, the purpose of an RE artifact is to support the stakeholders in whatever activities they are performing in the project. This purpose must define high-quality RE artifacts. To express this view, we contribute an activity-based RE quality meta model and show applications of this paradigm. Lastly, we describe the impacts of this view on research and practice.
    BMBF, 01IS15003, Q-Effekt: Qualitätssicherungsmaßnahmen effektiv steuer
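
    One way to picture the activity-based view (a hypothetical sketch with invented entity names, not the meta model contributed by the paper): the quality of an artifact is not an intrinsic score but the aggregated impact of its properties on the activities that use it.

    # Hypothetical sketch of an activity-based quality view: an artifact's quality
    # is not an intrinsic score but the aggregated impact of its properties on the
    # activities that use it. Entity names are invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class QualityFactor:            # a property of the artifact, e.g. ambiguous wording
        name: str

    @dataclass
    class Activity:                 # something stakeholders do with the artifact
        name: str
        impact: dict = field(default_factory=dict)   # factor name -> effect (<0 hinders)

    @dataclass
    class Artifact:
        name: str
        factors: list = field(default_factory=list)

    def quality_in_use(artifact, activities):
        """Aggregate quality of an artifact relative to the activities it must support."""
        return sum(act.impact.get(f.name, 0) for act in activities for f in artifact.factors)

    ambiguity = QualityFactor("ambiguous wording")
    spec = Artifact("requirements document", factors=[ambiguity])
    verification = Activity("verify requirement", impact={"ambiguous wording": -3})
    task_mgmt = Activity("agile task management", impact={"ambiguous wording": 0})

    # The same artifact scores differently depending on which activities matter.
    print(quality_in_use(spec, [verification]))   # -3: ambiguity hinders verification
    print(quality_in_use(spec, [task_mgmt]))      #  0: irrelevant for task management

    The point of the paradigm is exactly this relativity: the same artifact property can be harmful for one activity and irrelevant for another.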

    Good RE artifacts? I know it when I use it!

    The definition of high-quality or good RE artifacts is often provided through normative references, such as quality standards or textbooks (e.g., ISO/IEEE/IEC-29148). We see various problems with such normative references.

    Quality standards are incomplete. Several quality standards describe quality through a set of abstract criteria. When analyzing these characteristics in detail, we see that there are two different types of criteria: some criteria, such as ambiguity, consistency, completeness, and singularity, are factors that describe properties of the RE artifact itself. In contrast, feasibility, traceability, and verifiability state that activities can be performed with the artifact. This is a small, yet important difference: while the former can be assessed by analyzing just the artifact by itself, the latter describe a relationship of the artifact to the context of its usage. Yet this usage context is incompletely represented in the quality standards: for example, why is it important that requirements can be implemented (feasible in the terminology of ISO-29148) and verified, but other activities, such as maintenance, are not part of the quality model? Therefore, we argue that normative standards do not take all activities into account systematically and are thus missing relevant quality factors.

    Quality standards are only implicitly context-dependent. One could go even further and ask about the value of some artifact-based properties, such as singularity. A normative approach does not provide such rationales. This is different for activity-based properties, such as verifiability, since these properties are defined through their usage: if we need to verify the requirements, properties of the artifact that increase verifiability are important. If we do not need to verify a requirement, e.g., because we use the artifacts only for task management in an agile process, these properties might not be relevant. This example shows that, in contrast to the normative definition of quality in RE standards, RE quality usually depends on the context.

    Quality standards lack precise reasoning. For most of the aforementioned criteria, the standards remain abstract and vague. For some criteria, such as ambiguity, the standards provide detailed lists of factors to avoid. However, these factors have an imprecise relation to the abstract criteria mentioned above, and, consequently, the harm that they might potentially cause remains unclear.

    Liberalization strategies for free trade in services

    Trade in services is being dealt with in GATT negotiations for the first time in the present Uruguay Round. The discussion on the proper liberalization instrument to be applied to trade in services is highly controversial. This paper attempts to clarify the discussion and outline rational policy options.

    Multiple endocrine neoplasia type 2: achievements and current challenges

    Incremental advances in medical technology, such as the development of sensitive hormonal assays for routine clinical care, are the drivers of medical progress. This principle is exemplified by the creation of the concept of multiple endocrine neoplasia type 2, encompassing medullary thyroid cancer, pheochromocytoma, and primary hyperparathyroidism, which did not emerge before the early 1960s.