
    On the Expressive Power of Multiple Heads in CHR

    Full text link
    Constraint Handling Rules (CHR) is a committed-choice declarative language that was originally designed for writing constraint solvers and is nowadays a general-purpose language. CHR programs consist of multi-headed guarded rules that rewrite constraints into simpler ones until a solved form is reached. Much empirical evidence suggests that multiple heads augment the expressive power of the language; however, no formal result in this direction has been proved so far. In the first part of this paper we analyze the Turing completeness of CHR with respect to the underlying constraint theory. We prove that if the constraint theory is powerful enough, then restricting to single-headed rules does not affect the Turing completeness of the language. On the other hand, unlike the multi-headed language, the single-headed CHR language is not Turing complete when the underlying signature (for the constraint theory) does not contain function symbols. In the second part we prove that, no matter which constraint theory is considered, under some reasonable assumptions it is not possible to encode the CHR language (with multi-headed rules) into a single-headed one while preserving the semantics of programs. We also show that, under some stronger assumptions, increasing the number of atoms allowed in the head of a rule augments the expressive power of the language. These results provide a formal proof of the claim that multiple heads augment the expressive power of the CHR language.
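    For readers unfamiliar with CHR, a minimal sketch (not taken from the paper) of the classic less-than-or-equal solver illustrates what multi-headed guarded rules look like; it assumes SWI-Prolog with the chr library:

        :- use_module(library(chr)).
        :- chr_constraint leq/2.

        % single-headed simplification rule: leq(X,X) is trivially true and is removed
        reflexivity  @ leq(X,X) <=> true.
        % two-headed simplification rule: matches two leq/2 constraints at once
        antisymmetry @ leq(X,Y), leq(Y,X) <=> X = Y.
        % two-headed propagation rule: adds the transitive consequence
        transitivity @ leq(X,Y), leq(Y,Z) ==> leq(X,Z).

    The antisymmetry and transitivity rules each match two constraints simultaneously; the paper's question is whether such multi-headed matching can, in general, be compiled away into single-headed rules.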

    Incremental Predictive Process Monitoring: How to Deal with the Variability of Real Environments

    Full text link
    A characteristic of existing predictive process monitoring techniques is that they first construct a predictive model from past process executions and then use it to predict the future of new ongoing cases, without the possibility of updating it with new cases once they complete their execution. This can make predictive process monitoring too rigid to deal with the variability of processes operating in real environments, which continuously evolve and/or exhibit new variant behaviors over time. As a solution to this problem, we propose the use of algorithms that allow the incremental construction of the predictive model. These incremental learning algorithms update the model whenever new cases become available, so that the predictive model evolves over time to fit the current circumstances. The algorithms have been implemented using different case encoding strategies and evaluated on a number of real and synthetic datasets. The results provide first evidence of the potential of incremental learning strategies for predictive process monitoring in real environments, and of the impact of different case encoding strategies in this setting.
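    As a rough sketch of the idea (not the authors' implementation), an incremental classifier can be refreshed each time a case completes instead of being retrained from scratch; the example below uses scikit-learn's SGDClassifier with partial_fit, and encode_case is a hypothetical stand-in for a case encoding strategy:

        # Minimal sketch of incremental predictive model updating (illustrative only).
        import numpy as np
        from sklearn.linear_model import SGDClassifier

        model = SGDClassifier(loss="log_loss")       # supports incremental updates
        classes = np.array([0, 1])                   # e.g. 0 = violation, 1 = fulfillment

        def encode_case(case):
            # Hypothetical case encoding: turn a case into a fixed-length feature vector.
            return np.asarray(case["features"], dtype=float)

        def on_case_completed(case, outcome):
            # Update the model with the newly completed case.
            x = encode_case(case).reshape(1, -1)
            model.partial_fit(x, np.array([outcome]), classes=classes)

        def predict_running_case(case):
            # Score an ongoing case with the current, continuously updated model.
            return model.predict_proba(encode_case(case).reshape(1, -1))[0, 1]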

    How to manage the development of complex organizations… and beyond

    Get PDF
    The article starts with a synthesis of the traditional theory of organization. This vision of the organization is inspired by the classical view of the world, and we summarize it with the "metaphor of the machine". Like gears of a machine, people are treated as systems of skills that can be analyzed and used to achieve organizational goals. This vision is too primitive, because it is impossible to design procedures that prescribe all behaviors and measure the ability of each person in an absolute manner. People have an undeniable autonomy that leads them to create a so-called informal organization, which evolves autonomously. "To manage" does not mean to make an organization function; rather, it means to manage the evolution process itself. We have developed a methodology to govern the self-evolution processes of informal organizations. We present a specific case history from one of the major Italian metallurgical companies, aimed at increasing the level of safety and improving the management of human factors.

    Clustering-Based Predictive Process Monitoring

    Full text link
    Business process enactment is generally supported by information systems that record data about process executions, which can be extracted as event logs. Predictive process monitoring is concerned with exploiting such event logs to predict how running (uncompleted) cases will unfold up to their completion. In this paper, we propose a predictive process monitoring framework for estimating the probability that a given predicate will be fulfilled upon completion of a running case. The predicate can be, for example, a temporal logic constraint or a time constraint, or any predicate that can be evaluated over a completed trace. The framework takes into account both the sequence of events observed in the current trace and the data attributes associated with these events. The prediction problem is approached in two phases. First, prefixes of previous traces are clustered according to control-flow information. Second, a classifier is built for each cluster using event data to discriminate between fulfillments and violations. At runtime, a prediction is made on a running case by mapping it to a cluster and applying the corresponding classifier. The framework has been implemented in the ProM toolset and validated on a log pertaining to the treatment of cancer patients in a large hospital.
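    A compact sketch of the two-phase scheme described above (illustrative, not the ProM implementation) clusters encoded trace prefixes and then trains one classifier per cluster; encode_prefix is a hypothetical control-flow/data encoding:

        # Illustrative two-phase predictive monitoring: clustering + per-cluster classifiers.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.ensemble import RandomForestClassifier

        def encode_prefix(prefix):
            # Hypothetical encoding of a trace prefix (control flow + event data).
            return np.asarray(prefix, dtype=float)

        def train(prefixes, outcomes, n_clusters=5):
            X = np.vstack([encode_prefix(p) for p in prefixes])
            y = np.asarray(outcomes)                       # 1 = fulfillment, 0 = violation
            clusterer = KMeans(n_clusters=n_clusters, n_init=10).fit(X)
            classifiers = {}
            for c in range(n_clusters):
                mask = clusterer.labels_ == c
                # Assumes each cluster contains both outcome classes.
                classifiers[c] = RandomForestClassifier().fit(X[mask], y[mask])
            return clusterer, classifiers

        def predict(clusterer, classifiers, running_prefix):
            x = encode_prefix(running_prefix).reshape(1, -1)
            c = int(clusterer.predict(x)[0])               # map the running case to a cluster
            return classifiers[c].predict_proba(x)[0, 1]   # probability of fulfillment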

    Sine-Gordon soliton as a model for Hawking radiation of moving black holes and quantum soliton evaporation

    Get PDF
    The intriguing connection between black hole evaporation and the physics of solitons is opening novel roads to finding observable phenomena. It is known from the inverse scattering transform that velocity is a fundamental parameter in soliton theory. Taking this into account, the study of Hawking radiation from a moving soliton becomes increasingly relevant. However, a theoretical framework for the description of this phenomenon is still lacking. Here, we adopt a soliton geometrization technique to study the quantum emission of a moving soliton in a one-dimensional model. Representing a black hole by the one-soliton solution of the sine-Gordon equation, we consider the Hawking emission spectrum of a quantized massless scalar field on the soliton-induced metric. We study the relation between the soliton velocity and the black hole temperature. Our results point to a new scenario for the detection of new physics in the quantum gravity panorama.
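    For reference (not part of the abstract), the sine-Gordon equation and its standard one-soliton (kink) solution, in dimensionless units with velocity v and offset x_0, read

        \phi_{tt} - \phi_{xx} + \sin\phi = 0,
        \qquad
        \phi(x,t) = 4\arctan\!\left[\exp\!\left(\frac{x - vt - x_0}{\sqrt{1 - v^2}}\right)\right],

    which makes explicit the velocity parameter whose relation to the black hole temperature the paper studies.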

    Explain, Adapt and Retrain: How to improve the accuracy of a PPM classifier through different explanation styles

    Full text link
    Recent papers have introduced a novel approach to explain why a Predictive Process Monitoring (PPM) model for outcome-oriented predictions provides wrong predictions. Moreover, they have shown how to exploit the explanations, obtained using state-of-the-art post-hoc explainers, to identify in a semi-automated way the most common features that induce a predictor to make mistakes and, in turn, to reduce the impact of those features and increase the accuracy of the predictive model. This work starts from the assumption that frequent control-flow patterns in event logs may represent important features that characterize, and therefore explain, a certain prediction. Therefore, in this paper, we (i) employ a novel encoding able to leverage DECLARE constraints in Predictive Process Monitoring and compare its effectiveness with state-of-the-art Predictive Process Monitoring encodings, in particular for the task of outcome-oriented predictions; (ii) introduce a completely automated pipeline for the identification of the most common features inducing a predictor to make mistakes; and (iii) show the effectiveness of the proposed pipeline in increasing the accuracy of the predictive model by validating it on different real-life datasets.
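    To make the idea of a constraint-based encoding concrete, here is a small illustrative sketch (not the paper's encoding) that turns each trace into boolean features for two hypothetical DECLARE-style templates, existence(a) and response(a, b):

        # Illustrative DECLARE-style trace encoding (simplified, not the paper's method).
        def existence(trace, a):
            # existence(a): activity a occurs at least once in the trace.
            return int(a in trace)

        def response(trace, a, b):
            # response(a, b): every occurrence of a is eventually followed by b.
            return int(all(b in trace[i + 1:] for i, act in enumerate(trace) if act == a))

        def encode_trace(trace, activities):
            features = {}
            for a in activities:
                features[f"existence({a})"] = existence(trace, a)
                for b in activities:
                    if a != b:
                        features[f"response({a},{b})"] = response(trace, a, b)
            return features

        # Example: one trace given as a list of activity labels.
        print(encode_trace(["register", "check", "approve"], ["register", "check", "approve"]))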

    A horse, a horse, my kingdom for a horse. Saddle thrombosis of carotid bifurcation in acute stroke

    Get PDF
    Background: Saddle thrombosis is detected less frequently in the carotid arteries than in peripheral arterial embolism. The clot and the patency of the distal vessel must be promptly recognized in these cases, because if the carotid vessel is open distally, there may be a chance for a successful emergency surgical procedure to remove the thrombus. On conventional static imaging, mobile floating thrombi may be difficult to differentiate from thrombosis on complicated carotid lesions of atherosclerotic origin. High-resolution ultrasound (US), with its unique capability of real-time imaging, adds fundamental data for the interpretation of the findings. Methods: Carotid ultrasound was performed in acute stroke patients with high-resolution probes. Real-time clips were analyzed and the imaging is presented. Results: Saddle carotid bifurcation thrombosis of cardiac origin was identified in 2 patients with acute homolateral ischemic stroke, with prompt successful surgical removal in one case. Moreover, we present an example of a thrombus attached to the ruptured surface of a complicated atherosclerotic plaque in an acutely symptomatic stroke patient who underwent successful emergency surgery. Conclusions: Early high-resolution ultrasound with real-time imaging can easily identify the peculiar characteristics of vulnerable carotid disease in the acute stroke phase. Different clinical implications result from the early identification of these different conditions, modifying the therapeutic strategies.

    From delocalisation to backshoring? Evidence from Italian industrial districts

    Get PDF
    In recent decades, industrial districts (IDs) have experienced intense delocalisation to low-cost countries, with implications for the districts' internal structure. Recent studies, however, highlight the advantages of relocating manufacturing back to home countries. This paper investigates ID firms' production-location strategies and backshoring decisions. The results from a survey of 259 firms in eight Italian IDs show that firms that delocalised production do not change their strategies over time and make limited recourse to backshoring. ID production is still important to guarantee product quality and access to specialised know-how.

    Networks as mediating variables: a Bayesian latent space approach

    Get PDF
    The use of network analysis to investigate social structures has recently risen due to the high availability of data and the numerous insights it can provide in different fields. Most analyses focus on the topological characteristics of networks and the estimation of relationships between the nodes. We adopt a different perspective by considering the whole network as a random variable conveying the effect of an exposure on a response. This point of view represents a classical mediation setting, where the interest lies in estimating the indirect effect, that is, the effect propagated through the mediating variable. We introduce a latent space model that maps the network into a space of smaller dimension by considering the hidden positions of the units in the network. The coordinates of each node are used as mediators in the relationship between the exposure and the response. We further extend mediation analysis in the latent space framework by using Generalised Linear Models instead of the linear models previously adopted in the literature, taking an approach based on derivatives to obtain the effects of interest. A Bayesian approach allows us to obtain the entire distribution of the indirect effect, which is generally unknown, and to compute the corresponding highest density interval, which gives accurate and interpretable bounds for the mediated effect. Finally, an application to social interactions among a group of adolescents and their attitude toward substance use is presented.
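    A minimal sketch of the Bayesian mediation idea (deliberately simplified to a single observed mediator standing in for a latent network coordinate, and not the authors' model) shows how the posterior of the indirect effect and its highest density interval can be obtained; variable names and priors below are assumptions for illustration:

        # Simplified Bayesian mediation: exposure x -> mediator m -> response y.
        import numpy as np
        import pymc as pm
        import arviz as az

        rng = np.random.default_rng(0)
        n = 200
        x = rng.normal(size=n)                         # exposure
        m = 0.5 * x + rng.normal(scale=0.5, size=n)    # mediator (stand-in for a latent coordinate)
        y = 0.3 * m + 0.2 * x + rng.normal(scale=0.5, size=n)

        with pm.Model():
            a = pm.Normal("a", 0.0, 1.0)               # exposure -> mediator
            b = pm.Normal("b", 0.0, 1.0)               # mediator -> response
            c = pm.Normal("c", 0.0, 1.0)               # direct effect
            sigma_m = pm.HalfNormal("sigma_m", 1.0)
            sigma_y = pm.HalfNormal("sigma_y", 1.0)
            pm.Normal("m_obs", mu=a * x, sigma=sigma_m, observed=m)
            pm.Normal("y_obs", mu=b * m + c * x, sigma=sigma_y, observed=y)
            pm.Deterministic("indirect", a * b)        # mediated (indirect) effect
            idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

        # Highest density interval for the indirect effect.
        print(az.hdi(idata, var_names=["indirect"]))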

    Design of hybrid gels based on gellan-cholesterol derivative and P90G liposomes for drug depot applications

    Get PDF
    Gels are extensively studied in the drug delivery field because of their potential benefits in therapeutics. Depot gel systems fall within this area, and interest in their development has focused on long-lasting, biocompatible, and resorbable delivery devices. The present work describes a new class of hybrid gels that stem from the interaction between liposomes based on the phospholipid P90G and the cholesterol derivative of the polysaccharide gellan. The mechanical properties of these gels and the delivery profiles of the anti-inflammatory model drug diclofenac embedded in such systems confirm the suitability of these hybrid gels as good candidates for drug depot applications.