
    On cascade products of answer set programs

    Describing complex objects by elementary ones is a common strategy in mathematics and science in general. In their seminal 1965 paper, Kenneth Krohn and John Rhodes showed that every finite deterministic automaton can be represented (or "emulated") by a cascade product of very simple automata. This led to an elegant algebraic theory of automata based on finite semigroups (Krohn-Rhodes Theory). Surprisingly, by relating logic programs and automata, we show in this paper that Krohn-Rhodes Theory is applicable in Answer Set Programming (ASP). More precisely, we recast the concept of a cascade product in terms of ASP and prove that every program can be represented by a product of very simple programs, the reset and standard programs. Roughly, this implies that the reset and standard programs are the basic building blocks of ASP with respect to the cascade product. In a broader sense, this paper is a first step towards an algebraic theory of products and networks of nonmonotonic reasoning systems based on Krohn-Rhodes Theory, aimed at important open issues in ASP and AI in general.
    Comment: Appears in Theory and Practice of Logic Programming
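
    For readers unfamiliar with the construction the paper lifts to ASP, the classical cascade product of two automata lets the second automaton observe both the input letter and the current state of the first. A sketch of the standard automata-theoretic definition follows; the ASP-specific version in the paper may differ in detail:

        A = (Q_A, \Sigma, \delta_A), \qquad B = (Q_B,\ Q_A \times \Sigma,\ \delta_B)
        \delta\big((q_A, q_B), \sigma\big) = \big(\delta_A(q_A, \sigma),\ \delta_B(q_B, (q_A, \sigma))\big)

    The product runs on the state set Q_A x Q_B, and the Krohn-Rhodes theorem says every finite automaton is emulated by a cascade of very simple (permutation-reset) components, mirroring the role the reset and standard programs play here.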

    Monte Carlo Generators

    The structure of events in high-energy collisions is complex and not predictable from first principles. Event generators allow the problem to be subdivided into more manageable pieces, some of which can be described from first principles, while others need to be based on appropriate models with parameters tuned to data. In these lectures we provide an overview, discuss how matrix elements are used, introduce the machinery for initial- and final-state parton showers, explain how matrix elements and parton showers can be combined for optimal accuracy, introduce the concept of multiple parton-parton interactions, comment briefly on the hadronization issue, and provide an outlook for the future.
    Comment: 23 pages, lectures presented at the 2006 European School of High-Energy Physics, Aronsborg, Sweden, 18 June - 1 July 2006
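
    As a toy illustration of the parton-shower machinery mentioned above, the sketch below generates successive emission scales by inverting the Sudakov (no-emission) form factor. It is a minimal sketch under made-up assumptions: the kernel C/t and all constants are hypothetical stand-ins for real QCD splitting functions, running coupling, and kinematics.

        import random

        # Toy final-state shower: emissions ordered in an evolution scale t,
        # with a made-up branching kernel f(t) = C/t.  The Sudakov factor is
        # then Delta(t0, t) = (t/t0)**C, so setting Delta = r for uniform r
        # gives the next scale by direct inversion.
        C = 0.3        # hypothetical effective splitting strength
        T_MAX = 100.0  # starting evolution scale, arbitrary units
        T_MIN = 1.0    # shower cutoff

        def next_scale(t):
            """Sample the next emission scale t' < t from Delta(t, t') = r."""
            return t * random.random() ** (1.0 / C)

        def shower():
            """Return the descending list of emission scales for one event."""
            t, emissions = T_MAX, []
            while True:
                t = next_scale(t)
                if t < T_MIN:  # evolution reached the cutoff: stop showering
                    break
                emissions.append(t)
            return emissions

        print(shower())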

    Towards modular verification of pathways: fairness and assumptions

    Modular verification is a technique used to address the state explosion problem often encountered when verifying properties of complex systems such as concurrent interactive systems. The modular approach is based on the observation that properties of interest often concern a rather small portion of the system. As a consequence, reduced models can be constructed which approximate the overall system behaviour, thus allowing more efficient verification. Biochemical pathways can be seen as complex concurrent interactive systems. Consequently, verification of their properties is often computationally very expensive and could take advantage of the modular approach. In this paper we report preliminary results on the development of a modular verification framework for biochemical pathways. We view biochemical pathways as concurrent systems of reactions competing for molecular resources. A modular verification technique could be based on reduced models containing only the reactions involving molecular resources of interest. For a proper description of the system behaviour we argue that it is essential to consider a suitable notion of fairness, which is a well-established notion in concurrency theory but novel in the field of pathway modelling. We propose a modelling approach that includes fairness and we identify the assumptions under which verification of properties can be done in a modular way. We prove the correctness of the approach and demonstrate it on the model of the EGF receptor-induced MAP kinase cascade by Schoeberl et al.
    Comment: In Proceedings MeCBIC 2012, arXiv:1211.347
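
    The fairness notions referred to above are the standard ones from concurrency theory, stated here in LTL for a reaction r (the paper's pathway-specific notion may be formulated differently):

        \text{weak fairness:}\quad \mathbf{FG}\,\mathit{enabled}(r) \rightarrow \mathbf{GF}\,\mathit{taken}(r)
        \text{strong fairness:}\quad \mathbf{GF}\,\mathit{enabled}(r) \rightarrow \mathbf{GF}\,\mathit{taken}(r)

    Informally: a reaction that is continuously (or infinitely often) enabled, for instance because its molecular resources keep becoming available, must eventually fire, ruling out runs in which one reaction monopolizes a shared resource forever.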

    Physics of Extremely High Energy Cosmic Rays

    Over the last third of the century, a few tens of events detected by ground-based cosmic ray detectors have opened a new window in the field of high-energy astrophysics. These events have macroscopic energies, unobserved sources, an unknown chemical composition, and a production and transport mechanism yet to be explained. With a flux as low as one particle per century per square kilometer, only dedicated detectors with huge apertures can bring in the high-quality and statistically significant data needed to answer those questions. In this article, we review the present status of the field from both an experimental and a theoretical point of view. Special attention is given to the next generation of detectors devoted to the thorough exploration of the highest energy ranges.
    Comment: 43 pages, 12 figures, submitted to International Journal of Modern Physics
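
    To see why huge apertures are needed, a back-of-envelope estimate with the flux quoted above (the 3000 km^2 aperture is an illustrative figure of the order of a modern ground array, not a number from the article):

        N \approx \Phi \cdot A \cdot T
          = \frac{1~\text{particle}}{\mathrm{km}^2 \cdot \text{century}} \times 3000~\mathrm{km}^2 \times 1~\text{yr}
          \approx 30~\text{events per year}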

    MANCaLog: A Logic for Multi-Attribute Network Cascades (Technical Report)

    The modeling of cascade processes in multi-agent systems in the form of complex networks has in recent years become an important topic of study due to its many applications: the adoption of commercial products, the spread of disease, the diffusion of ideas, etc. In this paper, we begin by identifying seven desiderata that a framework for modeling such processes should satisfy: the ability to represent attributes of both nodes and edges, an explicit representation of time, the ability to represent non-Markovian temporal relationships, representation of uncertain information, the ability to represent competing cascades, allowance of non-monotonic diffusion, and computational tractability. We then present the MANCaLog language, a formalism based on logic programming that satisfies all of these desiderata, and focus on algorithms for finding minimal models (from which the outcome of cascades can be obtained) as well as on how this formalism can be applied in real-world scenarios. We are not aware of any other formalism in the literature that meets all of the above requirements.
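
    As a loose illustration of the kind of computation involved (this is NOT MANCaLog's actual syntax or semantics; all names and the update rule below are hypothetical), the sketch iterates a cascade to a fixed point on a network whose nodes carry an uncertain adoption attribute as an interval [lo, hi]:

        from typing import Dict, List, Tuple

        Interval = Tuple[float, float]  # [lower, upper] adoption belief

        def cascade_fixpoint(adopt: Dict[str, Interval],
                             edges: List[Tuple[str, str, float]]) -> Dict[str, Interval]:
            """Iterate until stable: an edge (u, v) with weight w lets u's
            lower bound raise v's lower bound (capped by v's upper bound)."""
            state = dict(adopt)
            changed = True
            while changed:
                changed = False
                for u, v, w in edges:
                    lo_v, hi_v = state[v]
                    new_lo = min(hi_v, max(lo_v, w * state[u][0]))
                    if new_lo > lo_v + 1e-9:
                        state[v] = (new_lo, hi_v)
                        changed = True
            return state

        # Node 'a' is a seed adopter; influence propagates along weighted edges.
        init = {"a": (1.0, 1.0), "b": (0.0, 1.0), "c": (0.0, 1.0)}
        print(cascade_fixpoint(init, [("a", "b", 0.8), ("b", "c", 0.5)]))

    The fixed point reached this way plays the role of the minimal model from which the cascade outcome is read off, although MANCaLog itself also handles edge attributes, explicit time, and non-monotonic diffusion, which this toy omits.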

    Identifying influencers in a social network: the value of real referral data

    Individuals influence each other through social interactions, and marketers aim to leverage this interpersonal influence to attract new customers. It remains a challenge to identify those customers in a social network who have the most influence on their social connections. A common approach to the influence maximization problem is to simulate influence cascades through the network, based on the existence of links in the network, using diffusion models. Our study contributes to the literature by evaluating these principles using real-life referral behaviour data. A new ranking metric, called Referral Rank, is introduced that builds on the game-theoretic concept of the Shapley value to assign each individual in the network a value that reflects the likelihood of referring new customers. We also explore whether these methods can be further improved by looking beyond the one-hop neighbourhood of the influencers. Experiments on a large telecommunication data set and a referral data set demonstrate that using traditional simulation-based methods to identify influencers in a social network can lead to suboptimal decisions, as the results overestimate actual referral cascades. We also find that looking at the influence of the two-hop neighbours of the customers improves the influence spread and product adoption. Our findings suggest that companies can take two actions to improve their decision support systems for identifying influential customers: (1) improve the data by incorporating data that reflects the actual referral behaviour of the customers, or (2) extend the method by looking at the influence of the connections in the two-hop neighbourhood of the customers.
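
    A Shapley-value-based ranking like the one described can be approximated by Monte Carlo permutation sampling. The sketch below is an illustration, not the paper's Referral Rank: its characteristic function simply counts nodes reachable from a coalition, whereas the actual metric is built from real referral behaviour.

        import random
        from typing import Dict, List, Set

        def reached(graph: Dict[str, List[str]], seeds: Set[str]) -> int:
            """Toy characteristic function v(S): nodes reachable from S."""
            seen, stack = set(seeds), list(seeds)
            while stack:
                for nxt in graph.get(stack.pop(), []):
                    if nxt not in seen:
                        seen.add(nxt)
                        stack.append(nxt)
            return len(seen)

        def shapley_rank(graph: Dict[str, List[str]], samples: int = 1000):
            """Average each node's marginal contribution over random orders."""
            nodes, value = list(graph), {n: 0.0 for n in graph}
            for _ in range(samples):
                random.shuffle(nodes)
                coalition, prev = set(), 0
                for n in nodes:
                    coalition.add(n)
                    cur = reached(graph, coalition)
                    value[n] += (cur - prev) / samples
                    prev = cur
            return sorted(value.items(), key=lambda kv: -kv[1])

        g = {"a": ["b", "c"], "b": ["d"], "c": [], "d": []}
        print(shapley_rank(g, samples=200))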

    Measurement Invariance and Response Bias: A Stochastic Frontier Approach

    The goals of the present paper were to assess measurement invariance using a common econometric method and to illustrate the approach with self-reported measures of parenting behaviors before and after a family intervention. Most recent literature on measurement invariance (MI) in psychological research (1) explores the use of structural equation modeling (SEM) and confirmatory factor analysis to identify measurement invariance, and (2) tests for measurement invariance across groups rather than across time. We use an econometric method, Stochastic Frontier Estimation (SFE), to identify response bias and covariates of response bias, both across individuals at a single point in time and across two measurement occasions (before and after participation in a family intervention). We examined the effects of participant demographics (N = 1437) on response bias; gender and race/ethnicity were related to the magnitude of bias and to changes in bias across time, and bias was lower at posttest than at pretest. We discuss the analytic advantages and disadvantages of SFE relative to SEM approaches and note that the technique may be particularly useful in addressing the problem of “response shift bias” or “recalibration” in program evaluation, that is, a shift in metric from before to after an intervention which is caused by the intervention itself and may lead to underestimates of program effects.
    Keywords: measurement invariance, measurement equivalence, response bias, response-shift bias, stochastic frontier analysis
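
    For reference, the standard stochastic frontier specification behind SFE decomposes the error into symmetric noise and a one-sided term; in the response-bias reading, the one-sided term u_i is interpreted as bias rather than inefficiency. The half-normal law for u_i is the common default and an assumption here, as is the sign of the term, which depends on whether bias inflates or deflates reports:

        y_i = x_i'\beta + v_i - u_i, \qquad v_i \sim N(0, \sigma_v^2), \qquad u_i \sim \big|N(0, \sigma_u^2)\big|,\ u_i \ge 0

    Covariates of response bias can then enter by letting the distribution of u_i (e.g., \sigma_u^2) depend on demographics such as gender and race/ethnicity.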