
    Discovering Petri Net Models of Discrete-Event Processes by Computing T-Invariants

    This paper addresses the problem of discovering a Petri net (PN) from a long event sequence representing the behavior of a discrete-event process. A method for building a 1-bounded PN able to execute the event sequence S is presented; it is based on determining causality and concurrency relations between events and computing the t-invariants. This novel method determines the structure and the initial marking of an ordinary PN that reproduces the behavior in S. The algorithms derived from the method are efficient and have been implemented and tested on numerous examples of diverse complexity. Note to Practitioners—Model discovery is useful for reverse engineering of ill-known systems. The algorithms proposed in this paper build 1-bounded PN models, which are powerful enough to describe many discrete-event processes in industry. The efficiency of the method allows very long sequences to be processed, so an automated modeling tool can be developed for dealing with data issued from real systems.
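    The core notion of a 1-bounded PN executing an event sequence can be sketched in a few lines. This is only an invented illustration of the replay semantics, not the paper's discovery algorithm: the places, the toy net, and the sequence below are all assumptions.

```python
# Hypothetical sketch of 1-bounded Petri net replay. Because no place ever
# holds more than one token, a marking can be represented as a plain set of
# marked places. The net structure and names here are invented for
# illustration; they are not taken from the paper.

def fire(marking, pre, post):
    """Fire a transition if enabled; return the new marking or None.

    `pre`/`post` are the sets of places the transition consumes/produces.
    """
    if not pre <= marking:            # not all input places are marked
        return None
    return (marking - pre) | post

def replays(sequence, net, initial_marking):
    """Check that the net can execute the whole event sequence S."""
    marking = initial_marking
    for event in sequence:
        pre, post = net[event]
        marking = fire(marking, pre, post)
        if marking is None:
            return False
    return True

# Toy net: transition "a" splits into two concurrent branches ("b", "c"),
# and "d" joins them back to the initial marking.
net = {
    "a": ({"p0"}, {"p1", "p2"}),
    "b": ({"p1"}, {"p3"}),
    "c": ({"p2"}, {"p4"}),
    "d": ({"p3", "p4"}, {"p0"}),
}

print(replays(["a", "b", "c", "d", "a", "c", "b", "d"], net, {"p0"}))  # True
```

    Note that the sequence interleaves "b" and "c" in both orders; the concurrency relation between them is exactly what a single marked net captures and what a discovery method must detect from the log.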

    Big Data and Causality

    Causality analysis remains one of the fundamental research questions and the ultimate objective of a tremendous number of scientific studies. In line with the rapid progress of science and technology, the age of big data has significantly influenced causality analysis across various disciplines, especially over the last decade, because the complexity and difficulty of identifying causality among big data have dramatically increased. Data mining, the process of uncovering hidden information from big data, is now an important tool for causality analysis and has been extensively exploited by scholars around the world. The primary aim of this paper is to provide a concise review of causality analysis in big data. To this end, the paper reviews recent significant applications of data mining techniques in causality analysis, covering a substantial quantity of research to date, presented in chronological order with an overview table of data mining applications in the causality analysis domain as a reference directory.

    EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data; and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.

    Causal modeling and prediction over event streams

    In recent years, there has been a growing need for causal analysis in many modern stream applications such as web page click monitoring, patient health care monitoring, stock market prediction, electric grid monitoring, and network intrusion detection systems. The detection and prediction of causal relationships help in monitoring, planning, decision making, and prevention of unwanted consequences. An event stream is a continuous, unbounded sequence of event instances. The availability of a large amount of continuous data, together with high data throughput, poses new challenges for causal modeling over event streams: (1) the need for incremental causal inference on unbounded data, (2) the need for fast causal inference on high-throughput data, and (3) the need for real-time prediction of effects from the events seen so far. This dissertation research addresses these three problems by focusing on the temporal precedence information that is readily available in event streams: (1) an incremental causal model that updates the causal network with each new batch of events, instead of storing the complete set of events seen so far and rebuilding the causal network from scratch; (2) a fast causal model that speeds up causal network inference; and (3) a real-time top-k predictive query processing mechanism that finds the k most probable effects with the highest scores, using a run-time causal inference mechanism that handles cyclic causal relationships. The motivation, related work, proposed approaches, and results are presented for each of the three problems.
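    The incremental use of temporal precedence can be illustrated with a bare-bones counter that folds in each new batch without revisiting old events. This is an invented sketch of the general idea, not the dissertation's causal model: the windowing scheme, event names, and scoring are all assumptions.

```python
# Illustrative sketch: incrementally count "A precedes B within a small
# window" relations over batches of an event stream, then answer top-k
# effect queries from the accumulated counts. All names are invented.
from collections import defaultdict

class PrecedenceCounter:
    def __init__(self, window=2):
        self.window = window            # how many recent events count as "causes"
        self.counts = defaultdict(int)  # (earlier_event, later_event) -> count
        self.tail = []                  # last `window` events, kept across batches

    def update(self, batch):
        """Fold a new batch into the counts; old events are never re-read."""
        for event in batch:
            for earlier in self.tail:
                self.counts[(earlier, event)] += 1
            self.tail = (self.tail + [event])[-self.window:]

    def top_k_effects(self, cause, k=1):
        """The k effects most frequently observed shortly after `cause`."""
        scored = [(cnt, eff) for (c, eff), cnt in self.counts.items() if c == cause]
        return [eff for cnt, eff in sorted(scored, reverse=True)[:k]]

pc = PrecedenceCounter(window=2)
pc.update(["login", "query", "error"])
pc.update(["login", "query", "error"])   # second batch only updates counts
print(pc.top_k_effects("query", k=1))    # ['error']
```

    Keeping only the window `tail` between batches is what makes the update incremental: memory does not grow with the stream, only with the alphabet of event pairs.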

    Una nueva capa de protección a través de súper alarmas con capacidad de diagnóstico

    An alarm management methodology can be formulated as a discrete-event sequence recognition problem in which time patterns are used to identify the safe condition of the process, especially during the start-up and shutdown stages. Industrial plants, particularly in the petrochemical, energy, and chemical sectors, require a combined treatment of all the events that can result in a catastrophic accident. This document introduces a new layer of protection (the super-alarm) for industrial processes, based on a diagnostic stage. Alarms and the actions of the standard operating procedure are treated as discrete events involved in sequences; the diagnostic stage corresponds to recognizing a special situation when these sequences occur. This provides operators with pertinent information about the normal or abnormal situations induced by the flow of alarms. Chronicle-Based Alarm Management (CBAM) is the methodology used to build the chronicles that allow the super-alarms to be generated. Furthermore, a case study from the petrochemical sector using CBAM is presented, building the chronicles of the normal start-up, abnormal start-up, and normal shutdown scenarios. Finally, scenario validation is performed for an abnormal start-up, showing how a super-alarm is generated.
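    The recognition step can be pictured as matching a chronicle, i.e. an ordered set of events with time constraints between their occurrences, against a timestamped alarm log. The chronicle, timings, and greedy matcher below are invented simplifications; real CBAM chronicles are considerably richer.

```python
# Hedged sketch: greedy recognition of a simple "chronicle" over a
# timestamped alarm/action log. Each chronicle item is (event, min_dt,
# max_dt): the event must occur between min_dt and max_dt seconds after
# the previously matched event. All names and timings are invented.

def matches_chronicle(log, chronicle):
    """Return True if the log contains the chronicle's event pattern."""
    prev_time = None
    i = 0                                   # index of next chronicle item
    for time, event in log:
        if i == len(chronicle):
            break
        name, lo, hi = chronicle[i]
        if event != name:
            continue                        # unrelated event, keep scanning
        if prev_time is None or lo <= time - prev_time <= hi:
            prev_time = time
            i += 1
    return i == len(chronicle)

# Invented "normal start-up" chronicle: pump alarm, then a valve action
# 5-30 s later, then pressure-ok 10-60 s after that.
chronicle = [("pump_on", 0, 0), ("valve_open", 5, 30), ("pressure_ok", 10, 60)]
log = [(0, "pump_on"), (12, "valve_open"), (40, "pressure_ok")]
print(matches_chronicle(log, chronicle))  # True
```

    A super-alarm would then be raised when the log matches an abnormal-scenario chronicle rather than the normal one; the time windows are what distinguish, say, a normal start-up from a dangerously fast one.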

    Towards Correlated Sequential Rules

    The goal of high-utility sequential pattern mining (HUSPM) is to efficiently discover profitable or useful sequential patterns in a large number of sequences. However, simply being aware of utility-eligible patterns is insufficient for making predictions. To compensate for this deficiency, high-utility sequential rule mining (HUSRM) is designed to explore the confidence or probability of predicting the occurrence of consequent sequential patterns based on the appearance of premise sequential patterns. It has numerous applications, such as product recommendation and weather prediction. However, the existing algorithm, known as HUSRM, is limited to extracting all eligible rules and neglects the correlation between the generated sequential rules. To address this issue, we propose a novel algorithm called correlated high-utility sequential rule miner (CoUSR) to integrate the concept of correlation into HUSRM. The proposed algorithm requires not only that each rule be correlated but also that the patterns in the antecedent and consequent of the high-utility sequential rule be correlated. The algorithm adopts a utility-list structure to avoid multiple database scans. Additionally, several pruning strategies are used to improve the algorithm's efficiency and performance. Subsequent experiments on several real-world datasets demonstrate that CoUSR is effective and efficient in terms of operation time and memory consumption.
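    The requirement that a sequential rule be both confident and correlated can be illustrated with plain support-based measures. This sketch deliberately ignores utilities and utility-lists, so it shows only the correlation idea (here via lift), not the CoUSR algorithm; the data and the lift threshold are invented assumptions.

```python
# Simplified, invented illustration: score a sequential rule X => Y by its
# confidence and by the lift between X and Y, keeping the rule only when
# both are high. Real HUSRM/CoUSR additionally weighs item utilities.

def contains_in_order(sequence, pattern):
    """True if `pattern` occurs as a subsequence of `sequence`."""
    it = iter(sequence)
    # `item in it` consumes the iterator, so order is enforced.
    return all(item in it for item in pattern)

def rule_stats(sequences, antecedent, consequent):
    """Support-based confidence and lift of `antecedent => consequent`."""
    n = len(sequences)
    sup_x = sum(contains_in_order(s, antecedent) for s in sequences) / n
    sup_y = sum(contains_in_order(s, consequent) for s in sequences) / n
    sup_xy = sum(contains_in_order(s, antecedent + consequent) for s in sequences) / n
    confidence = sup_xy / sup_x if sup_x else 0.0
    lift = sup_xy / (sup_x * sup_y) if sup_x and sup_y else 0.0
    return confidence, lift

sequences = [["a", "b"], ["a", "b"], ["c"], ["a", "d"]]
conf, lift = rule_stats(sequences, ["a"], ["b"])
print(round(conf, 3), round(lift, 3))  # 0.667 1.333
```

    A rule with high confidence but lift near or below 1 is one whose consequent is simply frequent everywhere; filtering on correlation removes exactly those spurious rules.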

    Trading strategy and behavior of various investor types between spot and futures market: evidence from Thailand

    In a rational, efficient market, returns on derivative and underlying securities should be perfectly contemporaneously correlated. Owing to market imperfections, however, one of the two markets may reflect information faster. This thesis analyzes the lead-lag relationship between the spot and futures markets in Thailand, using the SET50 index and its futures contract. Various econometric tools, such as unit root tests and the error-correction model (ECM), were employed. The augmented Dickey-Fuller tests showed that both series are stationary after first differencing, and Granger causality tests indicated unidirectional relationships between the markets. On daily observations, the results show price discovery for the futures index: lagged changes in the spot price lead changes in the futures price. Alternatively, the TDEX is used instead of the SET50 index to examine changes in the lead-lag relationship, and the results confirm a leading effect between the TDEX and SET50 index futures. The ECM, which utilizes a traditional linear model, is found to be the best forecasting model; a trading strategy based on it can outperform the market even after allowing for transaction costs. Moreover, this thesis studies the trading patterns of each investor type (foreign, institutional, and individual investors) through detailed records of trading activity, trading volume, and trading value, employing a unique data set of daily aggregated purchases and sales on the Stock Exchange of Thailand (SET) and Thailand's derivatives market. The buying and selling investment flows of the three groups rank as follows: the majority trader on the SET is the individual investor, followed by the foreign investor and then the institutional investor.
    The corresponding ranking in Thailand's derivatives market is the individual investor, then the institutional investor, with the foreign investor as the minority trader. The results provide empirical evidence that foreign investors were net buyers, whereas institutional and individual investors were net sellers of equities in both the spot and futures markets of Thailand. Regarding feedback trading, the results show that in both markets foreign investors are positive-feedback (momentum) traders, while individual investors tend to be contrarian, i.e. negative-feedback, traders; institutional investors' trading pattern in both markets is rather mixed. Furthermore, foreign investors' herding is positively correlated with institutional traders in the spot market but negatively correlated with institutional investors in the futures market, and it is negatively correlated with individual investors in both markets. Institutional investors' trade flow is positively correlated with individual investors in the futures market, whereas it is negatively correlated with them in the spot market. In addition, this thesis studies the trading performance of the various investor types on the SET and Thailand's derivatives market. The results reveal that different investor types perform differently. Foreign investors, who are more likely to have an information advantage over the other types, make minor overall net trading gains in the futures market; their gains arise from good market timing, but they tend to incur large losses in the spot market from negative price spreads between sell and buy prices.
    Individual investors in the spot market earn positive returns: they succeed on the price spread, whereas their market-timing return is poor. Moreover, individuals make losses on their trades in the futures market. Institutional investors, by contrast, make overall net trading gains from positive price spreads between sell and buy prices in both the spot and futures markets. The differences in performance might be due to the mixed effect of trading gains and losses arising from trades between investor types with different backgrounds.
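    The lead-lag notion at the heart of the thesis can be pictured with a naive check that correlates futures returns with spot returns at various lags. This is only an invented illustration on synthetic data; the thesis itself relies on unit-root tests, Granger causality, and an error-correction model estimated on SET50 data.

```python
# Illustrative sketch, not the thesis's econometrics: correlate futures
# returns with spot returns lagged by k periods. A peak at k > 0 suggests
# the spot market leads the futures market. The series below is synthetic.

def correlation(x, y):
    """Plain Pearson correlation of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def lead_lag(spot_returns, fut_returns, max_lag=3):
    """Correlation of futures returns with spot returns lagged by k periods."""
    out = {}
    for k in range(max_lag + 1):
        x = spot_returns[: len(spot_returns) - k] if k else spot_returns
        y = fut_returns[k:]
        out[k] = correlation(x, y)
    return out

# Synthetic example in which the futures simply echoes the spot one period
# late, so the spot leads the futures by exactly one lag.
spot = [0.1, -0.2, 0.3, 0.05, -0.1, 0.2, -0.15, 0.25]
fut = [0.0] + spot[:-1]
lags = lead_lag(spot, fut, max_lag=2)
print(max(lags, key=lags.get))  # 1
```

    In real data the peak is far less clean, which is why formal Granger causality tests and an ECM (which also accounts for the cointegrating relation between the price levels) are used instead of raw correlations.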