
    A General Framework for Representing, Reasoning and Querying with Annotated Semantic Web Data

    We describe a generic framework for representing and reasoning with annotated Semantic Web data, a task becoming more important with the recent increase in inconsistent and unreliable meta-data on the web. We formalise the annotated language and the corresponding deductive system, and address the query answering problem. Previous contributions on specific RDF annotation domains are encompassed by our unified reasoning formalism, as we show by instantiating it on (i) temporal, (ii) fuzzy, and (iii) provenance annotations. Moreover, we provide a generic method for combining multiple annotation domains, allowing one to represent, e.g., temporally-annotated fuzzy RDF. Furthermore, we address the development of a query language -- AnQL -- that is inspired by SPARQL, including several features of SPARQL 1.1 (subqueries, aggregates, assignment, solution modifiers) along with the formal definitions of their semantics.
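    The abstract mentions a generic method for combining annotation domains (e.g. temporal and fuzzy). The Python sketch below only illustrates that idea under loose assumptions: a domain is modelled as a pair of join/meet operations and two domains are combined component-wise. The names and operations are illustrative and do not reproduce the paper's formal semantics or AnQL.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass(frozen=True)
class Domain:
    """An annotation domain: values with a join (alternative derivations of a
    triple) and a meet (conjunction of annotations along an inference chain)."""
    join: Callable[[Any, Any], Any]
    meet: Callable[[Any, Any], Any]

# Fuzzy domain: truth degrees in [0, 1].
fuzzy = Domain(join=max, meet=min)

# Simplified temporal domain: single (start, end) intervals, None = empty.
def interval_meet(a, b):
    if a is None or b is None:
        return None
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

def interval_join(a, b):
    if a is None:
        return b
    if b is None:
        return a
    return (min(a[0], b[0]), max(a[1], b[1]))  # coarse over-approximation

temporal = Domain(join=interval_join, meet=interval_meet)

def product(d1: Domain, d2: Domain) -> Domain:
    """Combine two domains component-wise, e.g. temporally-annotated fuzzy RDF."""
    return Domain(
        join=lambda x, y: (d1.join(x[0], y[0]), d2.join(x[1], y[1])),
        meet=lambda x, y: (d1.meet(x[0], y[0]), d2.meet(x[1], y[1])),
    )

temporal_fuzzy = product(temporal, fuzzy)
# A triple annotated with (valid 2000-2005, degree 0.8) conjoined with one
# annotated with (valid 2003-2010, degree 0.6):
print(temporal_fuzzy.meet(((2000, 2005), 0.8), ((2003, 2010), 0.6)))
# -> ((2003, 2005), 0.6)
```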

    Normalisation of imprecise temporal expressions extracted from text

    Information extraction systems and techniques have been largely used to deal with the increasing amount of unstructured data available nowadays. Time is among the different kinds of information that may be extracted from such unstructured data sources, including text documents. However, the inability to correctly identify and extract temporal information from text makes it difficult to understand how the extracted events are organised in chronological order. Furthermore, in many situations the meaning of temporal expressions (timexes) is imprecise, such as in “less than 2 years” and “several weeks”, and cannot be accurately normalised, leading to interpretation errors. Although there are some approaches that enable representing imprecise timexes, they are not designed to be applied to specific scenarios and are difficult to generalise. This paper presents a novel methodology to analyse and normalise imprecise temporal expressions by representing temporal imprecision in the form of membership functions, based on human interpretation of time in two different languages (Portuguese and English). Each resulting model is a generalisation of probability distributions in the form of trapezoidal and hexagonal fuzzy membership functions. We use an adapted F1-score to guide the choice of the best models for each kind of imprecise timex, and a weighted F1-score (F1^3D) as a complementary metric to identify relevant differences when comparing two normalisation models. We apply the proposed methodology to three distinct classes of imprecise timexes, and the resulting models give distinct insights into the way each kind of temporal expression is interpreted.
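    As a rough illustration of the trapezoidal membership functions mentioned above, the sketch below encodes a hypothetical model for the timex "less than 2 years" measured in months. The breakpoints are invented for illustration; they are not the fitted values reported by the authors, who also use hexagonal functions not shown here.

```python
# A trapezoidal fuzzy membership function: 0 below a, rising to 1 on [b, c],
# falling back to 0 above d.
def trapezoid(a: float, b: float, c: float, d: float):
    def mu(x: float) -> float:
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        if a < x < b:
            return (x - a) / (b - a)
        return (d - x) / (d - c)
    return mu

# Hypothetical model for "less than 2 years", in months (illustrative breakpoints).
less_than_2_years = trapezoid(a=0.0, b=1.0, c=20.0, d=26.0)
for months in (6, 20, 24, 30):
    print(months, round(less_than_2_years(months), 2))
```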

    Fast approximation of visibility dominance using topographic features as targets and the associated uncertainty

    An approach to reduce visibility index computation time and measure the associated uncertainty in terrain visibility analyses is presented. It is demonstrated that the visibility index computation time in mountainous terrain can be reduced substantially, without any significant information loss, if the line of sight from each observer on the terrain is drawn only to the fundamental topographic features, i.e., peaks, pits, passes, ridges, and channels. However, the selected sampling of targets results in an underestimation of the visibility index of each observer. Two simple methods based on iterative comparisons between the real visibility indices and the estimated visibility indices have been proposed for a preliminary assessment of this uncertainty. The method has been demonstrated for gridded digital elevation models.
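    A minimal sketch of the target-sampling idea described above: the visibility index of an observer is estimated by running line-of-sight tests only against a set of topographic feature cells rather than every cell of the DEM. The feature extraction itself is assumed to be done elsewhere, and the toy terrain and targets below are placeholders, not data from the paper.

```python
import numpy as np

def line_of_sight(dem: np.ndarray, obs: tuple, tgt: tuple, eye_height: float = 1.7) -> bool:
    """True if the target cell is visible from the observer cell on a gridded DEM."""
    (r0, c0), (r1, c1) = obs, tgt
    z0 = dem[r0, c0] + eye_height
    z1 = dem[r1, c1]
    n = max(abs(r1 - r0), abs(c1 - c0))
    if n == 0:
        return True
    for i in range(1, n):
        t = i / n
        r, c = round(r0 + t * (r1 - r0)), round(c0 + t * (c1 - c0))
        # Compare the terrain with the elevation of the sight line at this point.
        if dem[r, c] > z0 + t * (z1 - z0):
            return False
    return True

def estimated_visibility_index(dem: np.ndarray, obs: tuple, feature_targets: list) -> float:
    """Fraction of feature targets visible from the observer; as the paper notes,
    sampling only feature targets underestimates the true visibility index."""
    visible = sum(line_of_sight(dem, obs, t) for t in feature_targets)
    return visible / len(feature_targets)

# Toy 5x5 terrain with two hypothetical "peak" targets.
dem = np.array([[10, 10, 10, 10, 10],
                [10, 12, 14, 12, 10],
                [10, 14, 20, 14, 10],
                [10, 12, 14, 12, 10],
                [10, 10, 10, 10, 10]], dtype=float)
print(estimated_visibility_index(dem, obs=(0, 0), feature_targets=[(2, 2), (4, 4)]))
```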

    Artificial neural networks: A comparative study of implementations for human chromosome classification

    Artificial neural networks are a popular field of artificial intelligence and have commonly been applied to solve many prediction, classification and diagnostic tasks. One such task is the analysis of human chromosomes. This thesis investigates the use of artificial neural networks (ANNs) as automated chromosome classifiers. The investigation involves the thorough analysis of seven different implementation techniques: three using ANNs alone, two using ANNs supported by another method, and two not using ANNs. These seven implementations are evaluated according to the classification accuracy achieved and according to their support of important system measures, such as robustness and validity. The results collected show that ANNs perform relatively well in terms of classification accuracy, though other implementations achieved higher results. However, ANNs provide excellent support of essential system measures. This leads to a well-rounded implementation, consisting of a good balance between accuracy and system features, and thus an effective technique for automated human chromosome classification.
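    For readers unfamiliar with the setup, the sketch below shows how an ANN chromosome classifier of the kind compared above might be trained and scored on pre-extracted feature vectors. The synthetic data, feature count and network size are placeholders, not the thesis's actual datasets or architectures.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_classes = 24                     # 22 autosomes plus X and Y
X = rng.normal(size=(2400, 30))    # placeholder feature vectors (e.g. length, banding profile)
y = rng.integers(0, n_classes, size=2400)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# A small feed-forward network; accuracy on this random data is meaningless,
# it only demonstrates the train/score workflow.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("classification accuracy:", clf.score(X_test, y_test))
```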

    Development of a quantification model for the cost of loss of image with customer complaints

    Despite the difficulty in measuring hidden quality costs, we must be aware not only of their existence, but also of their importance. Not surprisingly, they have been the causative factor in the closure of many companies because they are doubly dangerous. On the one hand, they represent very significant amounts of money and, on the other, they remain hidden, like the submerged portion of an iceberg [Campanella, J. (1999). Principles of Quality Costs: Principles, Implementation and Use. Milwaukee, WI: ASQ Quality Press]. Possibly one of the most harmful hidden quality costs, and one of the most difficult to quantify, is the cost of loss of image (CLI) a company suffers because of faults detected by its customers. This paper develops an original tool that, using fuzzy logic as an alternative to probabilistic theory, is capable of facilitating the quantification of the CLI in any company from the observation of its customer complaints. Once the theoretical model is presented, we proceed with its experimentation, using a case study as the research methodology.
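    To make the fuzzy-logic angle concrete, the sketch below turns an observed complaint rate into a cost estimate via fuzzification and weighted-average defuzzification. The linguistic terms, rule consequents and cost scale are invented for illustration; they do not reproduce the paper's actual CLI model.

```python
def tri(a, b, c):
    """Triangular membership function peaking at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# Fuzzify the complaint rate (complaints per 1000 orders) into linguistic terms.
complaints_low  = tri(0, 0, 5)
complaints_mid  = tri(2, 6, 10)
complaints_high = tri(8, 15, 15)

# Representative image-loss cost (in currency units) per term -- purely illustrative.
COST = {"low": 1_000.0, "mid": 10_000.0, "high": 50_000.0}

def estimate_cli(rate: float) -> float:
    """Weighted-average defuzzification of the activated terms."""
    weights = {
        "low": complaints_low(rate),
        "mid": complaints_mid(rate),
        "high": complaints_high(rate),
    }
    total = sum(weights.values())
    if total == 0:
        return 0.0
    return sum(w * COST[term] for term, w in weights.items()) / total

print(estimate_cli(4.0))   # mostly "low"/"mid" -> modest estimated cost
print(estimate_cli(12.0))  # mostly "high" -> large estimated cost
```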

    Higher Order Fuzzy Rule Interpolation


    Computational Modelling for Bankruptcy Prediction: Semantic Data Analysis Integrating Graph Database and Financial Ontology

    In this paper, we propose a novel intelligent methodology to construct a Bankruptcy Prediction Computation Model, which aims to analyse a company's financial status accurately. Based on semantic data analysis and management, our methodology places a Semantic Database System at the core of the system. It comprises three layers: an Ontology of Bankruptcy Prediction, a Semantic Search Engine, and a Semantic Analysis Graph Database system. The Ontological layer defines the basic concepts of financial risk management as well as the objects that serve as sources of knowledge for predicting a company's bankruptcy. The Graph Database layer utilises a powerful semantic data technology, which serves as a semantic data repository for our model. The article provides a detailed description of the construction of the Ontology and its informal conceptual representation. We also present a working prototype of the Graph Database system, constructed using the Neo4j application, and show the connections between well-known financial ratios. We argue that this methodology, which utilises state-of-the-art semantic data management mechanisms, enables data processing and the relevant computations in a more efficient way than approaches using traditional relational databases. This gives us solid grounds to build a system that is capable of tackling data of any complexity level.
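    A minimal sketch of what a Neo4j-backed ratio store in the spirit of the graph-database layer could look like, using the neo4j Python driver (5.x) against a local instance at bolt://localhost:7687. The node labels, relationship types and the example linking a current ratio to a liquidity-risk indicator are assumptions for illustration, not the authors' actual schema.

```python
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def add_ratio_observation(tx, company: str, ratio: str, value: float, year: int):
    # Record one observed value of a financial ratio for a company.
    tx.run(
        """
        MERGE (c:Company {name: $company})
        MERGE (r:FinancialRatio {name: $ratio})
        CREATE (c)-[:REPORTS {value: $value, year: $year}]->(r)
        """,
        company=company, ratio=ratio, value=value, year=year,
    )

def link_ratio_to_risk(tx, ratio: str, indicator: str):
    # Connect a ratio to a risk indicator, mirroring an ontology-level relation.
    tx.run(
        """
        MERGE (r:FinancialRatio {name: $ratio})
        MERGE (i:RiskIndicator {name: $indicator})
        MERGE (r)-[:SIGNALS]->(i)
        """,
        ratio=ratio, indicator=indicator,
    )

with driver.session() as session:
    session.execute_write(add_ratio_observation, "ACME Ltd", "current_ratio", 0.8, 2023)
    session.execute_write(link_ratio_to_risk, "current_ratio", "liquidity_risk")
driver.close()
```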

    Towards autonomy, self-organisation and learning in holonic manufacturing

    This paper discusses self-organisation and learning capabilities in autonomous and cooperative holons that are part of a holonic manufacturing control system. These capabilities will support the dynamic adaptation of the manufacturing control to manufacturing evolution and emergency, especially the agile reaction to unexpected disturbances.