
    Conformance Checking Based on Multi-Perspective Declarative Process Models

    Process mining is a family of techniques that aim at analyzing business process execution data recorded in event logs. Conformance checking is a branch of this discipline embracing approaches for verifying whether the behavior of a process, as recorded in a log, is in line with some expected behavior provided in the form of a process model. The majority of these approaches require the input process model to be procedural (e.g., a Petri net). However, in turbulent environments, characterized by high variability, the process behavior is less stable and predictable. In these environments, procedural process models are less suitable for describing a business process. Declarative specifications, working under an open-world assumption, allow the modeler to express several possible execution paths as a compact set of constraints. Any process execution that does not contradict these constraints is allowed. One of the open challenges in the context of conformance checking with declarative models is the capability of supporting multi-perspective specifications. In this paper, we close this gap by providing a framework for conformance checking based on MP-Declare, a multi-perspective version of the declarative process modeling language Declare. The approach has been implemented in the process mining tool ProM and has been evaluated in three real-life case studies.
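To make the idea concrete, here is a minimal sketch (not the ProM implementation described in the paper) of checking an MP-Declare-style "response" constraint over a single trace. Events, payloads, and the data condition are illustrative assumptions; the data condition on the activation payload is what makes the constraint multi-perspective rather than purely control-flow.

```python
# Hypothetical sketch: checking a Declare "response(activation, target)"
# constraint over one trace, where only activations satisfying a data
# condition are considered (the multi-perspective part).

def check_response(trace, activation, target, data_cond=lambda payload: True):
    """Return True iff every qualifying activation is eventually
    followed by a target event later in the trace."""
    pending = 0
    for activity, payload in trace:
        if activity == activation and data_cond(payload):
            pending += 1
        elif activity == target and pending > 0:
            pending = 0  # one later target fulfils all pending activations
    return pending == 0  # no activation left without a subsequent target

# Illustrative trace: high-value submissions must eventually be approved.
trace = [("submit", {"amount": 900}),
         ("review", {}),
         ("submit", {"amount": 50}),
         ("approve", {})]

ok = check_response(trace, "submit", "approve",
                    data_cond=lambda p: p.get("amount", 0) > 100)
```

Here `ok` is `True`: the only submission above the threshold is later followed by an approval, while the low-value one is never activated by the data condition.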

    A multi-agent simulation approach to sustainability in tourism development

    In recent decades, the increasing ease of travel and the simultaneous fall in transportation costs have strongly increased tourist flows. As a consequence, many destinations, especially those rich in natural resources but unable or unready to sustain large tourist flows, face serious problems of sustainability and Tourism Carrying Capacity (TCC). It is now widely recognized that every tourist destination should plan effective and proactive policies to protect its cultural, environmental, and social resources. To support policy definition, it may be useful to measure the Tourism Carrying Capacity, but the literature has highlighted that this is not an easy task, for several reasons: among others, the complexity and dynamicity of the concept, the absence of a universally accepted definition, and the impossibility of assigning an objective scientific value and applying a rigorous analysis. More recently, therefore, an alternative, or even complementary, interpretation of TCC has developed, called LAC (Limits of Acceptable Change), where the focus shifts from "How much use can an area tolerate?" to "How much change is acceptable?", aiming at evaluating the costs and benefits of alternative tourism management actions. The aim of the paper is to present an innovative framework based on the LAC approach - MABSiT, Mobile Agent Behavior Simulation in Tourism - developed by the authors, which is composed of five modules: data elaboration, DBMS, ad-hoc maps, agents, and ontology. Its modular structure makes it easy to study the interactions among the components and to observe the behavior of single agents. In aggregate form, it is possible to define group dynamics, where one possible effect is the influence on the variation of agents' perceived satisfaction with the surrounding environment.
The paper is structured as follows: an introduction is followed by a literature review; then the methodology and the framework are presented and applied to a case study: Vieste, a well-known seaside destination in the South of Italy characterized by severe seasonality problems in the summer. Finally, conclusions and policy recommendations are drawn.
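The agent-based idea behind a LAC-style analysis can be sketched in a few lines. This is a toy model, not the MABSiT framework itself: all class names, parameters, and thresholds below are illustrative assumptions. Each agent's satisfaction erodes only once crowding exceeds an acceptable-change threshold, which is the behavioral core of shifting from "how much use" to "how much change".

```python
import random

class Tourist:
    """Toy agent whose satisfaction reacts to crowding (illustrative)."""
    def __init__(self):
        self.satisfaction = 1.0

    def step(self, crowding, threshold):
        # Satisfaction erodes only once crowding exceeds the LAC threshold.
        if crowding > threshold:
            self.satisfaction = max(
                0.0, self.satisfaction - 0.1 * (crowding - threshold))

def simulate(n_agents=100, capacity=60, steps=10, seed=42):
    """Run the toy simulation and return average final satisfaction."""
    random.seed(seed)
    agents = [Tourist() for _ in range(n_agents)]
    for _ in range(steps):
        # Each agent visits on a given day with probability 0.8 (assumed).
        present = sum(1 for _ in agents if random.random() < 0.8)
        crowding = present / capacity
        for a in agents:
            a.step(crowding, threshold=1.0)
    return sum(a.satisfaction for a in agents) / n_agents

avg = simulate()  # with 100 agents and capacity 60, satisfaction declines
```

With visitor numbers well below capacity the threshold is never crossed and satisfaction stays at its maximum; pushing flows past capacity makes aggregate satisfaction fall, which is the kind of group dynamic the framework is meant to expose.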

    From zero to hero: A process mining tutorial

    Process mining is an emerging area that synergically combines model-based and data-oriented analysis techniques to obtain useful insights into how business processes are executed within an organization. This tutorial aims at providing an introduction to the key analysis techniques in process mining that allow decision makers to discover process models from data, compare expected and actual behaviors, and enrich models with key information about the actual process executions. In addition, the tutorial presents concrete tools and provides practical skills for applying process mining in a variety of application domains, including software development.
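A first step that introductory process mining material typically walks through is extracting the directly-follows relation from an event log, the raw material of most discovery algorithms. A minimal sketch, with an invented three-trace log:

```python
from collections import Counter

def directly_follows(log):
    """Count how often activity a is immediately followed by activity b
    across all traces of the log (a list of lists of activity names)."""
    dfg = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Illustrative event log: each inner list is one case.
log = [["register", "check", "approve"],
       ["register", "check", "reject"],
       ["register", "approve"]]

dfg = directly_follows(log)
# e.g. dfg[("register", "check")] == 2
```

The resulting counts can be rendered as a directly-follows graph, from which discovery algorithms derive richer models such as Petri nets.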

    Clustering-Based Predictive Process Monitoring

    Business process enactment is generally supported by information systems that record data about process executions, which can be extracted as event logs. Predictive process monitoring is concerned with exploiting such event logs to predict how running (uncompleted) cases will unfold up to their completion. In this paper, we propose a predictive process monitoring framework for estimating the probability that a given predicate will be fulfilled upon completion of a running case. The predicate can be, for example, a temporal logic constraint or a time constraint, or any predicate that can be evaluated over a completed trace. The framework takes into account both the sequence of events observed in the current trace and the data attributes associated with these events. The prediction problem is approached in two phases. First, prefixes of previous traces are clustered according to control-flow information. Second, a classifier is built for each cluster, using event data to discriminate between fulfillments and violations. At runtime, a prediction is made on a running case by mapping it to a cluster and applying the corresponding classifier. The framework has been implemented in the ProM toolset and validated on a log pertaining to the treatment of cancer patients in a large hospital.
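The two-phase structure can be sketched very crudely. The paper uses real clustering and classification algorithms; in this toy version a prefix's "cluster" is just its last activity, and the per-cluster "classifier" is a majority vote over observed outcomes. All trace data below is invented.

```python
from collections import defaultdict, Counter

def train(prefixes, outcomes):
    """Phase 1 + 2 (toy): group prefixes by control flow (here, crudely,
    the last activity) and tally outcomes per group."""
    clusters = defaultdict(Counter)
    for prefix, outcome in zip(prefixes, outcomes):
        clusters[prefix[-1]][outcome] += 1
    return clusters

def predict(clusters, running_prefix):
    """Runtime: map the running case to its cluster, apply its classifier."""
    votes = clusters.get(running_prefix[-1])
    if not votes:
        return None  # unseen cluster: no prediction
    return votes.most_common(1)[0][0]

prefixes = [["a", "b"], ["a", "c"], ["x", "b"]]
outcomes = ["fulfilled", "violated", "fulfilled"]
model = train(prefixes, outcomes)
pred = predict(model, ["z", "b"])  # cluster "b": majority is "fulfilled"
```

Replacing the last-activity grouping with a genuine prefix-clustering step and the majority vote with a classifier over event data attributes recovers the shape of the framework described above.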

    LTLf and LDLf Monitoring: A Technical Report

    Runtime monitoring is one of the central tasks in providing operational decision support to running business processes, checking on-the-fly whether they comply with constraints and rules. We study runtime monitoring of properties expressed in LTL on finite traces (LTLf) and in its extension LDLf. LDLf is a powerful logic that captures all of monadic second-order logic on finite traces, and is obtained by combining regular expressions and LTLf, adopting the syntax of propositional dynamic logic (PDL). Interestingly, in spite of its greater expressivity, LDLf has exactly the same computational complexity as LTLf. We show that LDLf is able to capture, in the logic itself, not only the constraints to be monitored, but also the de-facto standard RV-LTL monitors. This makes it possible to declaratively capture monitoring metaconstraints and to check them by relying on usual logical services instead of ad-hoc algorithms. This, in turn, makes it possible to flexibly monitor constraints depending on the monitoring state of other constraints, e.g., "compensation" constraints that are only checked when others are detected to be violated. In addition, we devise a direct translation of LDLf formulas into nondeterministic automata, avoiding the detour through Büchi or alternating automata, and we use it to implement a monitoring plug-in for the ProM suite.
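To illustrate what an RV-style monitor reports, here is a minimal sketch for a single "response(a, b)" constraint evaluated event by event. A full LDLf monitor would translate the formula into an automaton and could also emit the two permanent RV-LTL verdicts; here the states are hard-coded for this one pattern and only the two "current" verdicts arise, since on an open trace a response constraint can always still recover or be re-activated.

```python
# Two of the four RV-LTL verdicts (the other two, permanently satisfied /
# permanently violated, require knowing the constraint can never change
# verdict again, which an automaton-based monitor would detect).
CURRENTLY_SATISFIED = "currently satisfied"  # holds if the trace ended now
CURRENTLY_VIOLATED = "currently violated"    # fails if it ended now, but may recover

def monitor_response(events, activation="a", target="b"):
    """Emit a verdict after each event for response(activation, target):
    every activation must eventually be followed by a target."""
    pending = False
    verdicts = []
    for e in events:
        if e == activation:
            pending = True      # a new obligation is opened
        elif e == target:
            pending = False     # all open obligations are discharged
        verdicts.append(CURRENTLY_VIOLATED if pending
                        else CURRENTLY_SATISFIED)
    return verdicts

verdicts = monitor_response(["a", "c", "b", "a"])
```

On this trace the monitor flips to "currently violated" after each `a`, recovers on `b`, and ends violated again: exactly the evolving state that metaconstraints (e.g. compensation) can react to.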

    Incremental Predictive Process Monitoring: How to Deal with the Variability of Real Environments

    A characteristic of existing predictive process monitoring techniques is that they first construct a predictive model based on past process executions and then use it to predict the future of new ongoing cases, without the possibility of updating it with new cases when they complete their execution. This can make predictive process monitoring too rigid to deal with the variability of processes working in real environments that continuously evolve and/or exhibit new variant behaviors over time. As a solution to this problem, we propose the use of algorithms that allow the incremental construction of the predictive model. These incremental learning algorithms update the model whenever new cases become available, so that the predictive model evolves over time to fit the current circumstances. The algorithms have been implemented using different case encoding strategies and evaluated on a number of real and synthetic datasets. The results provide first evidence of the potential of incremental learning strategies for predictive process monitoring in real environments, and of the impact of different case encoding strategies in this setting.
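The incremental idea, and the role of the case encoding, can be sketched with a toy predictor. The paper evaluates real incremental learners and several encodings; here the model is a frequency table updated with every completed case, and the encoding is a simple "set of activities" abstraction. All names and data are illustrative.

```python
from collections import defaultdict, Counter

class IncrementalOutcomePredictor:
    """Toy model that is updated case by case instead of trained once."""

    def __init__(self):
        self.stats = defaultdict(Counter)  # encoded case -> outcome counts

    def encode(self, case):
        # Toy "set" encoding: which activities occurred, ignoring order.
        return frozenset(case)

    def update(self, completed_case, outcome):
        # Called whenever a case completes, so the model keeps evolving.
        self.stats[self.encode(completed_case)][outcome] += 1

    def predict(self, running_case):
        votes = self.stats.get(self.encode(running_case))
        return votes.most_common(1)[0][0] if votes else None

p = IncrementalOutcomePredictor()
p.update(["a", "b"], "ok")
first = p.predict(["b", "a"])   # set encoding ignores order -> "ok"
p.update(["a", "b"], "late")
p.update(["a", "b"], "late")    # as variant behavior accumulates...
later = p.predict(["a", "b"])   # ...the prediction drifts to "late"
```

A batch-trained model would have frozen the first answer; the incremental updates let the prediction follow the process as it changes, which is the point of the approach.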

    How Current Direct-Acting Antiviral and Novel Cell Culture Systems for HCV are Shaping Therapy and Molecular Diagnosis of Chronic HCV Infection

    We have entered a new era of hepatitis C virus (HCV) therapy in which elimination of infection and disease is a real possibility. HCV cell culture models were instrumental for identifying therapeutic targets, testing candidate drugs, and profiling therapeutic strategies. Here we describe current and novel cell culture systems for HCV that allow investigation of the HCV life cycle and of the virus-host interactions required for replication and propagation. The development of protocols to grow infectious virus in culture and to generate hepatocyte cell lines from specific individuals holds great promise for investigating the mechanisms exploited by the virus to spread the infection and the host factors critical for HCV replication and propagation, or for resistance to infection. Since host factors are presumably conserved and interact equally with different HCV isolates and genotypes, the development of drugs targeting host factors essential for virus replication holds great promise for further increasing treatment efficacy. Refocusing of therapeutic goals has also impacted in vitro diagnosis. The primary goal of anti-HCV therapy is to achieve a sustained virologic response (SVR), defined as "undetectable" HCV RNA genome in the serum or plasma at 12 to 24 weeks after the end of treatment. Use of direct-acting antiviral agents has substantially changed the viral load threshold used to define SVR and has led to a reassessment, as discussed herein, of result interpretation and of the requirements of clinically approved, quantitative molecular assays.

    The CLIL Virtual Tour


    Analysis of food supplement with unusual raspberry ketone content

    In recent years the food supplement market has grown constantly, including slimming and anti-obesity products. The case of raspberry ketone (RK) is reported here. HPTLC and HPLC-DAD analyses of a marketed product containing raspberry juice evidenced an abnormal quantity of RK, not consistent with the natural content of the juice. The reported data confirm the need for adequate controls on marketed food supplements and for complete adherence between the labelling and the actual composition of the product. Practical Applications: determining the natural origin of raspberry-based food supplements and assuring consumer safety.