2,214 research outputs found

    Semantic Modeling of Analytic-based Relationships with Direct Qualification

    Successfully modeling state- and analytics-based semantic relationships of documents enhances representation, importance, relevancy, provenance, and priority of the document. These attributes are the core elements that form the machine-based knowledge representation for documents. However, modeling document relationships that can change over time can be inelegant, limited, complex, or overly burdensome for semantic technologies. In this paper, we present Direct Qualification (DQ), an approach for modeling any semantically referenced document, concept, or named graph with results from associated applied analytics. The proposed approach supplements the traditional subject-object relationship by providing a third leg to the relationship: the qualification of how and why the relationship exists. To illustrate, we show a prototype of an event-based system with a realistic use case for applying DQ to relevancy analytics of PageRank and Hyperlink-Induced Topic Search (HITS).
    Comment: Proceedings of the 2015 IEEE 9th International Conference on Semantic Computing (IEEE ICSC 2015)
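    The sketch below is one minimal way to picture the idea, not the authors' implementation: an rdflib graph in which a plain subject-object link between two documents is supplemented by a qualification node recording which analytic (here a PageRank-style relevancy score) explains how and why the link matters. The namespace, property names, and score value are invented for illustration.

```python
# Minimal sketch of a qualified relationship, assuming rdflib is installed.
# Everything in the ex# namespace and the score value are hypothetical.
from rdflib import Graph, Namespace, BNode, Literal
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/dq#")
g = Graph()

doc_a, doc_b = EX.documentA, EX.documentB

# Traditional two-legged relationship: subject -> predicate -> object.
g.add((doc_a, EX.linksTo, doc_b))

# Third leg: a qualification describing how and why the relationship exists,
# expressed here with RDF reification plus analytic-specific properties.
q = BNode()
g.add((q, RDF.type, RDF.Statement))
g.add((q, RDF.subject, doc_a))
g.add((q, RDF.predicate, EX.linksTo))
g.add((q, RDF.object, doc_b))
g.add((q, EX.qualifiedBy, EX.PageRankAnalytic))
g.add((q, EX.relevancyScore, Literal(0.42, datatype=XSD.double)))

print(g.serialize(format="turtle"))
```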

    Towards new information resources for public health: From WordNet to MedicalWordNet

    In the last two decades, WORDNET has evolved into the most comprehensive computational lexicon of general English. In this article, we discuss its potential for supporting the creation of an entirely new kind of information resource for public health, viz. MEDICAL WORDNET. This resource is not to be conceived merely as a lexical extension of the original WORDNET to medical terminology; indeed, there is already a considerable degree of overlap between WORDNET and the vocabulary of medicine. Instead, we propose a new type of repository, consisting of three large collections of (1) medically relevant word forms, structured along the lines of the existing Princeton WORDNET; (2) medically validated propositions, referred to here as medical facts, which will constitute what we shall call MEDICAL FACTNET; and (3) propositions reflecting laypersons’ medical beliefs, which will constitute what we shall call the MEDICAL BELIEFNET. We introduce a methodology for setting up the MEDICAL WORDNET. We then turn to the discussion of research challenges that have to be met in order to build this new type of information resource.
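    As a hint of the overlap the authors note between WORDNET and medical vocabulary, the short sketch below queries the Princeton WordNet through NLTK for a few clinical terms; it assumes NLTK is installed with the wordnet corpus downloaded, and the terms are chosen only for illustration.

```python
# Requires: pip install nltk, then nltk.download('wordnet') once.
from nltk.corpus import wordnet as wn

for term in ["hypertension", "aspirin", "myocardial_infarction"]:
    for synset in wn.synsets(term):
        print(synset.name(), "-", synset.definition())
        # Hypernyms show where the term already sits in WordNet's is-a hierarchy.
        print("  hypernyms:", [h.name() for h in synset.hypernyms()])
```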

    Ontology based Clinical Practice Justification in Natural Language

    One of the most important contributions that any decision support system can make toward achieving wide acceptance in any community is the ability to justify its own suggestions. When dealing with highly technical and scientifically advanced practitioners, such as medical doctors or other clinical workers, the ability to justify itself using the domain specialists' usual terminology and technicalities is imperative. In this article we demonstrate the use of an ontological framework as the inferencing basis for automatically providing sound clinical suggestions. Our work has two main contributions: consolidating the use of OGCP (Ontology for General Clinical Practice) as a foundation, and providing controlled-English justifications of the extracted suggestions. We found that clinical practitioners consider the Attempto Controlled English justifications generated from the knowledge base acceptable.
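    The toy sketch below illustrates the flavour of the approach rather than the OGCP/Attempto pipeline itself: an inferred suggestion is rendered, together with the ontology facts that support it, as a single controlled-English sentence. The facts, predicates, and wording are invented for illustration.

```python
# Toy template-based justification generator; not the authors' system.
def justify(suggestion, facts):
    """Render a suggestion plus its supporting (subject, predicate, object)
    facts as one controlled-English sentence."""
    premises = " and ".join(f"{s} {p} {o}" for s, p, o in facts)
    return f"{suggestion}, because {premises}."

facts = [
    ("the patient", "has the condition", "type-2 diabetes"),
    ("metformin", "is a first-line therapy for", "type-2 diabetes"),
]
print(justify("Metformin is suggested", facts))
```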

    The nature and evaluation of commercial expert system building tools, revision 1

    This memorandum reviews the factors that constitute an Expert System Building Tool (ESBT) and evaluates current tools in terms of these factors. Evaluation of these tools is based on their structure and their alternative forms of knowledge representation, inference mechanisms, and developer/end-user interfaces. Next, functional capabilities, such as diagnosis and design, are related to alternative forms of mechanization. The characteristics and capabilities of existing commercial tools are then reviewed in terms of these criteria.

    A Toolkit for uncertainty reasoning and representation using fuzzy set theory in PROLOG expert systems

    This thesis examines the issue of uncertainty reasoning and representation in expert systems. Uncertainty and expert systems are defined. The value of uncertainty in expert systems as an approximation of human reasoning is stressed. Five alternative methods of dealing with uncertainty are explored. These include Bayesian probabilities, MYCIN confirmation theory, fuzzy set theory, Dempster-Shafer's theory of evidence, and a theory of endorsements. A toolkit to apply uncertainty processing in PROLOG expert systems is developed using fuzzy set theory as the basis for uncertainty reasoning and representation. The concepts of fuzzy logic and approximate reasoning are utilized in the implementation. The toolkit is written in C-PROLOG for the PYRAMID UNIX system at the Rochester Institute of Technology.
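    A minimal sketch of the fuzzy-set machinery such a toolkit builds on is shown below: membership grades in [0, 1] combined with min for conjunction, max for disjunction, and complement for negation. The symptom names and grades are illustrative only, and the sketch is Python rather than the thesis's C-PROLOG.

```python
# Basic fuzzy-logic connectives over membership grades in [0, 1].
def fuzzy_and(a, b):
    return min(a, b)

def fuzzy_or(a, b):
    return max(a, b)

def fuzzy_not(a):
    return 1.0 - a

# Illustrative grades: "fever is high" holds to degree 0.7, "pulse is rapid" to 0.4.
fever_high, pulse_rapid = 0.7, 0.4

print("high fever AND rapid pulse:", fuzzy_and(fever_high, pulse_rapid))  # 0.4
print("high fever OR rapid pulse: ", fuzzy_or(fever_high, pulse_rapid))   # 0.7
print("fever NOT high:            ", fuzzy_not(fever_high))               # ~0.3
```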

    Expert system technology

    The expert system is a computer program which attempts to reproduce the problem-solving behavior of an expert, who is able to view problems from a broad perspective and arrive at conclusions rapidly, using intuition, shortcuts, and analogies to previous situations. Expert systems are a departure from the usual artificial intelligence approach to problem solving. Researchers have traditionally tried to develop general models of human intelligence that could be applied to many different situations. Expert systems, on the other hand, tend to rely on large quantities of domain specific knowledge, much of it heuristic. The reasoning component of the system is relatively simple and straightforward. For this reason, expert systems are often called knowledge based systems. The report expands on the foregoing. Section 1 discusses the architecture of a typical expert system. Section 2 deals with the characteristics that make a problem a suitable candidate for expert system solution. Section 3 surveys current technology, describing some of the software aids available for expert system development. Section 4 discusses the limitations of the latter. The concluding section makes predictions of future trends.
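    The toy sketch below shows that architecture in miniature: a knowledge base of domain-specific if-then rules driven by a deliberately simple reasoning component that forward-chains until no new conclusions appear. The rules and facts are invented for illustration.

```python
# Toy knowledge base: (set of premises, conclusion) pairs.
rules = [
    ({"battery_dead"}, "car_wont_start"),
    ({"car_wont_start", "lights_dim"}, "suspect_flat_battery"),
]

def forward_chain(facts, rules):
    """Fire every rule whose premises are satisfied until no new facts are added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"battery_dead", "lights_dim"}, rules))
```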

    Summarisation and visualisation of e-Health data repositories

    At the centre of the Clinical e-Science Framework (CLEF) project is a repository of well-organised, detailed clinical histories, encoded as data that will be available for use in clinical care and in-silico medical experiments. We describe a system that we have developed as part of the CLEF project, to perform the task of generating a diverse range of textual and graphical summaries of a patient’s clinical history from a data-encoded model, a chronicle, representing the record of the patient’s medical history. Although the focus of our current work is on cancer patients, the approach we describe is generalisable to a wide range of medical areas.
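    As a toy illustration of the chronicle-to-text idea (the event schema and wording below are invented, not the CLEF data model), a structured patient history can be rendered into a short textual summary like this:

```python
from datetime import date

# Invented chronicle: a list of dated clinical events.
chronicle = [
    {"date": date(2004, 3, 2), "kind": "diagnosis", "detail": "breast cancer, stage II"},
    {"date": date(2004, 4, 10), "kind": "treatment", "detail": "chemotherapy started"},
    {"date": date(2004, 9, 1), "kind": "investigation", "detail": "CT scan showed partial response"},
]

def summarise(events):
    """Render each chronicle event as one sentence of the textual summary."""
    return " ".join(f"On {e['date']:%d %b %Y}, {e['kind']}: {e['detail']}." for e in events)

print(summarise(chronicle))
```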

    StuA: An Intelligent Student Assistant

    With advancing innovation in digital technology, demand is rising for virtual assistants that can assist a person while minimizing the need for interaction with humans. Acknowledging this requirement, we propose an interactive and intelligent student assistant, StuA, which can help newcomers at a college who are hesitant to interact with seniors for fear of being ragged. StuA is capable of answering all types of newcomer queries related to academics, examinations, the library, hostels, and extracurricular activities. The model is designed using CLIPS, which supports inference by forward chaining; a generalized backward-chaining algorithm for CLIPS is also implemented. Validation of the proposed model is presented in five steps, which show that the model is complete and consistent, with 99.16% accuracy of the knowledge model. Moreover, the backward chaining algorithm is found to be 100% accurate.
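    The sketch below renders the backward-chaining idea in Python for illustration; it is not the authors' CLIPS implementation, and the rules and facts are invented. A goal is proven either because it is a known fact or because all premises of some rule concluding it can themselves be proven.

```python
# Toy rule base: goal -> list of alternative premise lists.
rules = {
    "eligible_for_library_card": [["is_enrolled", "has_id_card"]],
    "has_id_card": [["completed_registration"]],
}

def prove(goal, facts, rules):
    """Backward chaining: return True if the goal follows from the facts via the rules."""
    if goal in facts:
        return True
    for premises in rules.get(goal, []):
        if all(prove(p, facts, rules) for p in premises):
            return True
    return False

print(prove("eligible_for_library_card",
            {"is_enrolled", "completed_registration"}, rules))  # True
```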