
    Time Aware Knowledge Extraction for Microblog Summarization on Twitter

    Microblogging services like Twitter and Facebook collect millions of pieces of user-generated content every moment about trending news, occurring events, and so on. Nevertheless, finding information of interest within this huge amount of posts, which are often noisy and redundant, is a real challenge. Social media analytics services have attracted increasing attention from both research and industry. In particular, the dynamic context of microblogging requires managing not only the meaning of information but also the evolution of knowledge over time. This work defines the Time Aware Knowledge Extraction (TAKE) methodology, which relies on a temporal extension of Fuzzy Formal Concept Analysis. Specifically, a microblog summarization algorithm has been defined that filters the concepts organized by TAKE into a time-dependent hierarchy. The algorithm addresses topic-based summarization on Twitter. Besides taking the timing of concepts into account, another distinguishing feature of the proposed framework is the ability to produce more or less detailed summaries according to the user's needs, with good levels of quality and completeness, as highlighted in the experimental results.

    Comment: 33 pages, 10 figures
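Classical Formal Concept Analysis, which TAKE extends with fuzziness and time, derives (extent, intent) concept pairs from a binary object-attribute context. A minimal sketch in Python; the tweet/term context and the brute-force enumeration are illustrative, not the paper's algorithm:

```python
from itertools import combinations

def close(context, objects, attrs, obj_set):
    """Closure operator: attributes common to obj_set, then all objects
    that carry every one of those attributes."""
    common_attrs = set(attrs)
    for o in obj_set:
        common_attrs &= context[o]
    extent = {o for o in objects if common_attrs <= context[o]}
    return frozenset(extent), frozenset(common_attrs)

def formal_concepts(context):
    """Enumerate all (extent, intent) concepts of a binary context.
    Brute force over object subsets -- fine only for tiny contexts."""
    objects = list(context)
    attrs = set().union(*context.values())
    concepts = set()
    for r in range(len(objects) + 1):
        for subset in combinations(objects, r):
            concepts.add(close(context, objects, attrs, subset))
    return concepts

# Toy context: tweets (objects) x salient terms (attributes)
context = {
    "t1": {"flood", "rescue"},
    "t2": {"flood", "donate"},
    "t3": {"rescue"},
}
concepts = formal_concepts(context)
```

The brute-force enumeration is exponential in the number of objects; practical FCA implementations use dedicated algorithms such as NextClosure instead.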

    Refactorings of Design Defects using Relational Concept Analysis

    Software engineers often need to identify and correct design defects, i.e., recurring design problems that hinder development and maintenance by making programs harder to comprehend and/or evolve. While the detection of design defects is an actively researched area, their correction, mainly a manual and time-consuming activity, is yet to be extensively investigated for automation. In this paper, we propose an automated approach for suggesting defect-correcting refactorings using relational concept analysis (RCA). The added value of RCA consists in exploiting the links between formal objects, which abound in a software re-engineering context. We validated our approach on instances of the Blob design defect taken from four different open-source programs.
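One intuition behind concept-analysis-driven refactoring is that classes sharing many members are candidates for an extract-superclass refactoring. A toy sketch of that idea in Python; the class names, members, and threshold are hypothetical, and this is not the paper's RCA procedure:

```python
from itertools import combinations

def shared_member_groups(class_members):
    """For each pair of classes, list the members they share; a large
    overlap suggests an 'extract superclass' refactoring, one core idea
    behind concept-analysis-based defect correction."""
    suggestions = []
    for a, b in combinations(sorted(class_members), 2):
        shared = class_members[a] & class_members[b]
        if len(shared) >= 2:  # arbitrary threshold for this sketch
            suggestions.append((a, b, sorted(shared)))
    return suggestions

# Hypothetical Blob-like classes with duplicated responsibilities
classes = {
    "OrderManager": {"load", "save", "validate", "report"},
    "CustomerManager": {"load", "save", "notify"},
    "InvoicePrinter": {"report"},
}
suggestions = shared_member_groups(classes)
```

Full RCA goes further by iterating concept analysis over several linked contexts (classes, methods, fields) rather than a single pairwise comparison.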

    Ontology mapping: the state of the art

    Ontology mapping is seen as a solution provider in today's landscape of ontology research. As the number of ontologies that are made publicly available and accessible on the Web increases steadily, so does the need for applications to use them. A single ontology is no longer enough to support the tasks envisaged by a distributed environment like the Semantic Web. Multiple ontologies need to be accessed from several applications. Mapping could provide a common layer from which several ontologies could be accessed and hence could exchange information in semantically sound manners. Developing such mappings has been the focus of a variety of works originating from diverse communities over a number of years. In this article we comprehensively review and present these works. We also provide insights into the pragmatics of ontology mapping and elaborate on a theoretical approach for defining ontology mapping.
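The simplest family of mapping techniques in this space is lexical matching of concept labels. A minimal sketch in Python using edit-based string similarity; the ontology labels and the 0.8 threshold are illustrative, and production matchers combine lexical, structural, and semantic evidence:

```python
from difflib import SequenceMatcher

def lexical_matches(onto_a, onto_b, threshold=0.8):
    """Pair up concept labels from two ontologies whenever their
    normalized string similarity reaches the threshold."""
    def norm(label):
        # Normalize case and common separators before comparing
        return label.lower().replace("_", " ").replace("-", " ")
    pairs = []
    for a in onto_a:
        for b in onto_b:
            score = SequenceMatcher(None, norm(a), norm(b)).ratio()
            if score >= threshold:
                pairs.append((a, b, round(score, 2)))
    return pairs

# Hypothetical concept labels from two ontologies
onto_a = ["Person", "Postal_Address", "PhoneNumber"]
onto_b = ["person", "postal-address", "email"]
matches = lexical_matches(onto_a, onto_b)
```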

    Curbing domestic violence: instantiating C-K theory with formal concept analysis and emergent self organizing maps.

    In this paper we propose a human-centered process for knowledge discovery from unstructured text that makes use of Formal Concept Analysis and Emergent Self Organizing Maps. The knowledge discovery process is conceptualized and interpreted as successive iterations through the Concept-Knowledge (C-K) theory design square. To illustrate its effectiveness, we report on a real-life case study of using the process at the Amsterdam-Amstelland police in the Netherlands, aimed at distilling concepts for identifying domestic violence from the unstructured text of actual police reports. The case study allows us to show how the process was not only able to uncover the nature of a phenomenon such as domestic violence, but also enabled analysts to identify many types of anomalies in the practice of policing. We also illustrate how the insights obtained from this exercise resulted in major improvements in the management of domestic violence cases.

    Keywords: Formal concept analysis; Emergent self organizing map; C-K theory; Text mining; Actionable knowledge discovery; Domestic violence
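Before FCA can be applied to police reports, the unstructured text must be turned into a binary object-attribute context (report x indicator term). A minimal sketch of that preprocessing step in Python; the report snippets and vocabulary are invented for illustration, not taken from the case study:

```python
import re

def term_context(reports, vocabulary):
    """Build the binary object-attribute context that FCA consumes:
    each report becomes an object whose attributes are the vocabulary
    terms appearing in its text."""
    context = {}
    for rid, text in reports.items():
        tokens = set(re.findall(r"[a-z]+", text.lower()))
        context[rid] = {t for t in vocabulary if t in tokens}
    return context

# Hypothetical report snippets and indicator terms (illustrative only)
reports = {
    "r1": "Suspect pushed the victim; they share a household.",
    "r2": "Noise complaint from neighbours, no injuries reported.",
}
vocabulary = {"victim", "household", "injuries", "neighbours"}
ctx = term_context(reports, vocabulary)
```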

    High-precision α_s measurements from LHC to FCC-ee

    This document provides a writeup of all contributions to the workshop on "High precision measurements of α_s: From LHC to FCC-ee" held at CERN, Oct. 12-13, 2015. The workshop explored in depth the latest developments on the determination of the QCD coupling α_s from 15 methods where high-precision measurements are (or will be) available. Those include low-energy observables: (i) lattice QCD, (ii) pion decay factor, (iii) quarkonia and (iv) τ decays, (v) soft parton-to-hadron fragmentation functions, as well as high-energy observables: (vi) global fits of parton distribution functions, (vii) hard parton-to-hadron fragmentation functions, (viii) jets in e±p DIS and γ-p photoproduction, (ix) the photon structure function in γ-γ collisions, (x) event shapes and (xi) jet cross sections in e+e− collisions, (xii) W boson and (xiii) Z boson decays, and (xiv) jets and (xv) top-quark cross sections in proton-(anti)proton collisions. The current status of the theoretical and experimental uncertainties associated with each extraction method, the improvements expected from LHC data in the coming years, and the future prospects achievable in e+e− collisions at the Future Circular Collider (FCC-ee), with O(1-100 ab⁻¹) integrated luminosities yielding 10^12 Z bosons and jets and 10^8 W bosons and τ leptons, are thoroughly reviewed. The current uncertainty of the (preliminary) 2015 strong coupling world-average value, α_s(m_Z) = 0.1177 ± 0.0013, is about 1%. Some participants believed this may be reduced by a factor of three in the near future by including novel high-precision observables, although this opinion was not universally shared. At the FCC-ee facility, a factor of ten reduction in the α_s uncertainty should be possible, mostly thanks to the huge Z and W data samples available.

    Comment: 135 pages, 56 figures. CERN-PH-TH-2015-299, CoEPP-MN-15-13. This document is dedicated to the memory of Guido Altarelli.
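A world-average value such as the one quoted above is, at its core, a combination of independent determinations; the textbook combination rule is the inverse-variance weighted mean. A sketch in Python with made-up inputs (these are not actual α_s extractions):

```python
def weighted_average(values, errors):
    """Inverse-variance weighted mean and its uncertainty: each input is
    weighted by 1/sigma^2, and the combined variance is 1/sum(weights)."""
    weights = [1.0 / e**2 for e in errors]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    err = (1.0 / sum(weights)) ** 0.5
    return mean, err

# Illustrative (made-up) determinations of a coupling and their errors
vals = [0.1180, 0.1175, 0.1183]
errs = [0.0010, 0.0020, 0.0015]
mean, err = weighted_average(vals, errs)
```

Real averages must also account for correlated systematic uncertainties between methods, which simple inverse-variance weighting ignores.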