710,479 research outputs found

    Jointly Multiple Events Extraction via Attention-based Graph Information Aggregation

    Event extraction is of practical utility in natural language processing. In the real world, it is common for multiple events to appear in the same sentence, and extracting them is more difficult than extracting a single event. Previous work that models the associations between events with sequential methods suffers from low efficiency in capturing very long-range dependencies. In this paper, we propose a novel Jointly Multiple Events Extraction (JMEE) framework that jointly extracts multiple event triggers and arguments by introducing syntactic shortcut arcs to enhance information flow and attention-based graph convolution networks to model graph information. Experimental results demonstrate that our proposed framework achieves competitive results compared with state-of-the-art methods. Comment: accepted by EMNLP 201
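The aggregation step described above (attention-weighted graph convolution over syntactic shortcut arcs) can be sketched as follows. This is a toy single layer, not the paper's implementation; the function name, parameter shapes, and the tanh attention scoring are illustrative assumptions.

```python
import numpy as np

def attention_gcn_layer(H, A, W, a):
    """One attention-weighted graph-convolution step over syntactic arcs.
    H: (n, d) token states; A: (n, n) 0/1 adjacency from dependency
    (shortcut) arcs; W: (d, d) projection; a: (2d,) attention vector.
    All names and shapes are illustrative, not the paper's exact design."""
    n = H.shape[0]
    A = A + np.eye(n)            # self-loops so every node keeps its own state
    Z = H @ W
    scores = np.full((n, n), -np.inf)
    for i in range(n):
        for j in range(n):
            if A[i, j] > 0:      # score only arcs that exist
                scores[i, j] = np.tanh(np.concatenate([Z[i], Z[j]]) @ a)
    alpha = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)  # softmax over each node's arcs
    return np.maximum(alpha @ Z, 0.0)          # ReLU-aggregated node states
```

Stacking a few such layers lets trigger and argument representations exchange information along dependency paths far shorter than the raw token sequence.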

    Variational Inference of Joint Models using Multivariate Gaussian Convolution Processes

    We present a non-parametric prognostic framework for individualized event prediction based on joint modeling of both longitudinal and time-to-event data. Our approach exploits a multivariate Gaussian convolution process (MGCP) to model the evolution of longitudinal signals and a Cox model to map time-to-event data to longitudinal data modeled through the MGCP. Taking advantage of the unique structure imposed by convolved processes, we provide a variational inference framework to simultaneously estimate the parameters of the joint MGCP-Cox model. This significantly reduces computational complexity and safeguards against model overfitting. Experiments on synthetic and real-world data show that the proposed framework outperforms state-of-the-art approaches built on two-stage inference and strong parametric assumptions.
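The shared-latent construction behind a convolution process can be illustrated with a small simulation: every output channel convolves the same latent white-noise process with its own smoothing kernel, which is what couples the outputs. The Gaussian kernel, grid resolution, and function name below are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def mgcp_draw(times, widths, rng):
    """Toy draw from a Gaussian convolution process. Each output convolves
    one shared latent white-noise process with its own Gaussian kernel,
    inducing cross-output correlation. Returns (len(widths), len(times))."""
    grid = np.linspace(times.min() - 3.0, times.max() + 3.0, 400)
    du = grid[1] - grid[0]
    latent = rng.standard_normal(grid.size) / np.sqrt(du)  # shared white noise
    outs = []
    for w in widths:
        k = np.exp(-0.5 * ((times[:, None] - grid[None, :]) / w) ** 2)
        k /= np.sqrt(2.0 * np.pi) * w            # normalized smoothing kernel
        outs.append(k @ latent * du)             # y(t) = integral k(t-u) x(u) du
    return np.vstack(outs)
```

Because both channels smooth the same latent draw, their sample paths are strongly correlated, which is the structure the joint MGCP-Cox model exploits.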

    Deep Recurrent Survival Analysis

    Survival analysis is an active area of statistical research for modeling time-to-event information while handling data censorship, and it has been widely used in applications such as clinical research, information systems, and other fields with survivorship bias. Many methods have been proposed for survival analysis, ranging from traditional statistical methods to machine learning models. However, existing methodologies either rely on counting-based statistics over segmented data or presuppose a particular event probability distribution over time. Moreover, few works consider sequential patterns within the feature space. In this paper, we propose a Deep Recurrent Survival Analysis model that combines deep learning, for conditional probability prediction at a fine-grained level of the data, with survival analysis, for handling the censorship. By capturing time dependency through modeling the conditional probability of the event for each sample, our method predicts the likelihood of the true event occurrence and estimates the survival rate over time, i.e., the probability of the non-occurrence of the event, for the censored data. Meanwhile, without assuming any specific form of the event probability distribution, our model shows great advantages over previous works in fitting various sophisticated data distributions. In experiments on three real-world tasks from different fields, our model significantly outperforms state-of-the-art solutions under various metrics. Comment: AAAI 2019. Supplemental material, slides, code: https://github.com/rk2900/drs
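The probability bookkeeping described above (per-step conditional event probabilities turned into a first-event distribution and a survival curve, with no parametric assumption) can be sketched in a few lines. The hazards here would come from a recurrent head in a DRSA-style model; the function name and toy values are illustrative.

```python
import numpy as np

def survival_from_hazards(h):
    """Turn per-step conditional hazards h_t = P(event at step t | survived
    so far) into the first-event distribution p_t and survival curve S_t.
    Illustrative sketch, not the paper's implementation."""
    h = np.asarray(h, dtype=float)
    S_upto = np.concatenate([[1.0], np.cumprod(1.0 - h)])  # S_0, S_1, ...
    p = S_upto[:-1] * h        # p_t = S_{t-1} * h_t (first event at step t)
    S = S_upto[1:]             # S_t = prod_{u <= t} (1 - h_u)
    return p, S
```

For a censored sample observed alive through step t, the likelihood contribution is simply S_t, while an uncensored sample observed to fail at step t contributes p_t; the two quantities always satisfy sum(p) + S[-1] = 1.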

    Semantic Modeling of Analytic-based Relationships with Direct Qualification

    Successfully modeling the state- and analytics-based semantic relationships of documents enhances their representation, importance, relevancy, provenance, and priority. These attributes are the core elements that form the machine-based knowledge representation of documents. However, modeling document relationships that can change over time can be inelegant, limited, complex, or overly burdensome for semantic technologies. In this paper, we present Direct Qualification (DQ), an approach for modeling any semantically referenced document, concept, or named graph with results from associated applied analytics. The proposed approach supplements the traditional subject-object relationship by providing a third leg: the qualification of how and why the relationship exists. To illustrate, we show a prototype of an event-based system with a realistic use case applying DQ to relevancy analytics from PageRank and Hyperlink-Induced Topic Search (HITS). Comment: Proceedings of the 2015 IEEE 9th International Conference on Semantic Computing (IEEE ICSC 2015)
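The "third leg" idea can be sketched as a relation record that carries its own qualification: which analytic produced it, the score that justifies it, and when it was computed. The field names and the example identifiers below are illustrative assumptions, not the DQ vocabulary's actual terms.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class QualifiedRelation:
    """A subject-predicate-object relation plus a qualification leg
    recording how and why it holds. Field names are illustrative."""
    subject: str
    predicate: str
    obj: str
    analytic: str      # how: the applied analytic that derived the relation
    score: float       # why: the analytic's result backing the relation
    computed_at: str   # provenance timestamp (relations can change over time)

# hypothetical example: a PageRank-derived relevancy relation
rel = QualifiedRelation("doc:42", "rel:moreRelevantThan", "doc:17",
                        analytic="PageRank", score=0.031,
                        computed_at="2015-02-07T12:00:00Z")
```

Because the qualification travels with the relation, re-running the analytic later simply yields a new qualified relation rather than silently overwriting the old one.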

    Necessary and Sufficient Conditions on Partial Orders for Modeling Concurrent Computations

    Partial orders are used extensively for modeling and analyzing concurrent computations. In this paper, we define two properties of partially ordered sets, width-extensibility and interleaving-consistency, and show that a partial order can be a valid state-based model (1) of some synchronous concurrent computation iff it is width-extensible, and (2) of some asynchronous concurrent computation iff it is width-extensible and interleaving-consistent. We also show a duality between the event-based and state-based models of concurrent computations, and give algorithms to convert models between the two domains. When applied to the problem of checkpointing, our theory leads to a better understanding of some existing results and algorithms in the field. It also leads to efficient detection algorithms for predicates whose evaluation requires knowledge of states from all the processes in the system.
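The event-to-state direction of the duality can be made concrete: the global states of the state-based model are exactly the down-closed subsets (consistent cuts) of the event poset. A brute-force enumeration, purely for illustration, might look like this; the function name and encoding of the order are assumptions.

```python
from itertools import combinations

def consistent_cuts(events, leq):
    """Enumerate the down-closed subsets of a finite poset of events.
    Each such subset is a consistent cut, i.e. a global state of the
    state-based model dual to the event-based partial order.
    leq(a, b) means a happened-before-or-equals b. Brute force only."""
    cuts = []
    for r in range(len(events) + 1):
        for sub in combinations(events, r):
            s = set(sub)
            # a cut is valid iff it contains every predecessor of its events
            if all(p in s for e in s for p in events if leq(p, e)):
                cuts.append(s)
    return cuts
```

For the poset with a before b and c concurrent with both, the valid cuts are {}, {a}, {c}, {a, b}, {a, c}, and {a, b, c}; subsets containing b without a are rejected.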

    State-space based mass event-history model I: many decision-making agents with one target

    A dynamic decision-making system that includes a mass of indistinguishable agents can manifest impressive heterogeneity. This kind of nonhomogeneity is postulated to result from macroscopic behavioral tactics employed by almost all involved agents. A State-Space Based (SSB) mass event-history model is developed here to explore the potential existence of such macroscopic behaviors. By imposing an unobserved internal state-space variable on the system, each individual's event history becomes a composition of a common state duration and an individual-specific time to action. With the common-state modeling of the macroscopic behavior, parametric statistical inferences are derived under the current-status data structure and conditional independence assumptions. Identifiability and computation-related problems are also addressed. From the dynamic perspective of system-wise heterogeneity, this SSB mass event-history model is shown to be very distinct from a random-effect model via Principal Component Analysis (PCA) in a numerical experiment. Real data on the mass invasion by two species of parasitic nematode of two species of host larvae are also analyzed. The analysis results are not only coherent with the biology of the nematode as a parasite, but also include new quantitative interpretations. Comment: Published at http://dx.doi.org/10.1214/08-AOAS189 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
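The core decomposition above (each event time equals a common state duration plus an individual-specific time to action) can be illustrated with a toy simulation. The exponential distributions, rate values, and function name are illustrative assumptions, not the paper's parametric specification.

```python
import numpy as np

def simulate_ssb_times(n_agents, rng, state_rate=1.0, action_rate=2.0):
    """Toy simulation of the SSB decomposition: every agent's event time is
    a COMMON state duration S (the shared macroscopic behavior) plus an
    individual-specific time to action A_i. Distributions are illustrative."""
    S = rng.exponential(1.0 / state_rate)                  # shared by the mass
    A = rng.exponential(1.0 / action_rate, size=n_agents)  # individual-specific
    return S + A
```

Because S shifts every agent's time by the same amount within one realization of the system, repeated realizations show the system-wise heterogeneity that distinguishes this construction from an ordinary per-individual random effect.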

    The heavy quark search at the LHC

    We explore further the discovery potential for heavy quarks at the LHC, with emphasis on the t' and b' of a sequential fourth family associated with electroweak symmetry breaking. We consider QCD multijets, tt̄ + jets, W + jets, and single-t backgrounds, using event generation based on improved matrix elements and low sensitivity to the modeling of initial-state radiation. We exploit a jet mass technique for the identification of hadronically decaying W's and t's, to be used in the reconstruction of the t' or b' mass. This, along with other aspects of event selection, can reduce backgrounds to very manageable levels. It even allows a search for both t' and b' in the absence of b-tagging, of interest for the early running of the LHC. A heavy quark mass of order 600 GeV is motivated by the connection to electroweak symmetry breaking, but our analysis is relevant for any new heavy quarks with weak decay modes. Comment: 12 pages, 7 figures
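A jet-mass-based tag of the kind described above amounts to checking whether a jet's reconstructed mass falls in a window around the W or top mass. The window edges below are illustrative assumptions, not the paper's actual cuts.

```python
def tag_by_jet_mass(mass_gev):
    """Classify a single fat jet by its reconstructed mass, sketching a
    jet-mass tag for hadronically decaying W's and t's. Window edges in
    GeV are illustrative, not the paper's selection."""
    if 65.0 <= mass_gev <= 95.0:
        return "W"        # window around m_W ~ 80 GeV
    if 140.0 <= mass_gev <= 210.0:
        return "t"        # window around m_t ~ 173 GeV
    return "light"
```

Tagged W and t candidates can then be combined with the remaining jets to reconstruct a t' or b' mass peak without relying on b-tagging.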