    A unified framework for building ontological theories with application and testing in the field of clinical trials

    The objective of this research programme is to contribute to the establishment of the emerging science of Formal Ontology in Information Systems via a collaborative project involving researchers from a range of disciplines, including philosophy, logic, computer science, linguistics, and the medical sciences. The researchers will work together on the construction of a unified formal ontology, that is, a general framework for the construction of ontological theories in specific domains. The framework will be constructed using the axiomatic-deductive method of modern formal ontology. It will be tested via a series of applications relating to ongoing work in Leipzig on medical taxonomies and data dictionaries in the context of clinical trials. This will lead to the production of a domain-specific ontology designed to serve as a basis for applications in the medical field.
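
    As a hypothetical illustration of the axiomatic-deductive method such a framework would employ (a standard mereological fragment, not an axiom set taken from the project itself), a parthood relation P could be axiomatized and further notions derived from it:

```latex
% Hypothetical mereological fragment: the parthood relation P is axiomatized
% and further notions (here: proper parthood PP) are derived deductively.
\begin{align*}
&\forall x\, P(x,x)                                                      && \text{(reflexivity)}\\
&\forall x \forall y\, \bigl(P(x,y) \land P(y,x) \rightarrow x = y\bigr) && \text{(antisymmetry)}\\
&\forall x \forall y \forall z\, \bigl(P(x,y) \land P(y,z) \rightarrow P(x,z)\bigr) && \text{(transitivity)}\\
&PP(x,y) :\Leftrightarrow P(x,y) \land x \neq y                          && \text{(definition of proper part)}
\end{align*}
```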

    Profiling a decade of Information Systems Frontiers’ research

    This article analyses the first ten years of research published in Information Systems Frontiers (ISF), from 1999 to 2008. The analysis of the published material examines variables such as the most productive authors, citation patterns, the universities associated with the most publications, geographic diversity, authors’ backgrounds, and research methods. The keyword analysis suggests that ISF research has evolved from establishing the concepts and domain of information systems (IS), technology, and management to contemporary issues such as outsourcing, web services, and security. The analysis presented in this paper identifies intellectually significant studies that have contributed to the development and accumulation of ISF’s intellectual wealth. It also identifies authors published in other journals whose work has largely shaped and guided the research published in ISF. This research has implications for researchers, journal editors, and research institutions.
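
    A minimal sketch of the kind of keyword and author profiling described above, assuming article metadata is available as simple records; the field names (year, authors, keywords) are illustrative assumptions, not the actual ISF dataset schema.

```python
# Sketch of bibliometric profiling over assumed article metadata records.
from collections import Counter
from typing import Dict, List, Tuple

articles: List[Dict] = [
    {"year": 1999, "authors": ["A. Author"], "keywords": ["information systems", "management"]},
    {"year": 2008, "authors": ["B. Author"], "keywords": ["web services", "security"]},
]

def keyword_frequencies(records: List[Dict]) -> Counter:
    """Count how often each keyword appears across all articles."""
    counts: Counter = Counter()
    for record in records:
        counts.update(k.lower() for k in record["keywords"])
    return counts

def most_productive_authors(records: List[Dict], top_n: int = 10) -> List[Tuple[str, int]]:
    """Rank authors by number of published articles."""
    counts: Counter = Counter()
    for record in records:
        counts.update(record["authors"])
    return counts.most_common(top_n)

print(keyword_frequencies(articles).most_common(5))
print(most_productive_authors(articles))
```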

    Conceptual graph-based knowledge representation for supporting reasoning in African traditional medicine

    Although African patients use conventional (modern) and traditional healthcare simultaneously, it is estimated that 80% of people rely on African traditional medicine (ATM). ATM comprises medical activities stemming from practices, customs, and traditions that are integral to the distinctive African cultures. It is based mainly on the oral transfer of knowledge, with the attendant risk of losing critical knowledge. Moreover, practices differ between regions and with the availability of medicinal plants. It is therefore necessary to compile the tacit, dispersed, and complex knowledge held by various Tradi-Practitioners (TP) in order to determine interesting patterns for treating a given disease. Knowledge engineering methods for traditional medicine are useful for modelling suitably complex information needs, formalizing the knowledge of domain experts, and highlighting effective practices for their integration into conventional medicine. The work described in this paper presents an approach that addresses two issues. First, it proposes a formal representation model of ATM knowledge and practices to facilitate their sharing and reuse. Second, it provides a visual reasoning mechanism for selecting the best available procedures and medicinal plants to treat diseases. The approach is based on the Delphi method for capturing knowledge from various experts, which necessitates reaching a consensus. Conceptual graph formalism is used to model ATM knowledge with visual reasoning capabilities and processes. Nested conceptual graphs are used to visually express the semantic meaning of Computational Tree Logic (CTL) constructs, which are useful for the formal specification of temporal properties of ATM domain knowledge. Our approach has the advantage of mitigating knowledge loss, with conceptual development assistance that improves both the quality of ATM care (medical diagnosis and therapeutics) and patient safety (drug monitoring).
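
    A minimal sketch of a conceptual-graph-style representation, assuming a simple structure of concept nodes linked by labelled relations; the concepts, relations, and CTL formula below are illustrative examples, not content from the paper's ATM knowledge base.

```python
# Sketch of a conceptual-graph-like structure with concept nodes and
# labelled relations, plus a textual CTL-style temporal property.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Concept:
    type_label: str          # e.g. "Disease", "MedicinalPlant"
    referent: str            # e.g. "malaria", "Artemisia annua"

@dataclass
class ConceptualGraph:
    concepts: List[Concept] = field(default_factory=list)
    # A relation links two concepts under a relation label, e.g. ("treats", plant, disease).
    relations: List[Tuple[str, Concept, Concept]] = field(default_factory=list)

    def neighbours(self, c: Concept, relation: str) -> List[Concept]:
        """Concepts reachable from c through a given relation label."""
        return [dst for (rel, src, dst) in self.relations if rel == relation and src is c]

malaria = Concept("Disease", "malaria")
plant = Concept("MedicinalPlant", "Artemisia annua")
g = ConceptualGraph(concepts=[malaria, plant],
                    relations=[("treats", plant, malaria)])

# A CTL-style temporal property over a treatment protocol, e.g. "on every path,
# the remedy is eventually administered after the diagnosis":
ctl_property = "AG(diagnosed -> AF(administered))"

print(g.neighbours(plant, "treats"))
```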

    Neurocognitive Informatics Manifesto.

    Informatics studies all aspects of the structure of natural and artificial information systems. Theoretical and abstract approaches to information have made great advances, but human information processing is still unmatched in many areas, including information management, representation, and understanding. Neurocognitive informatics is a new, emerging field that should help to improve the match between artificial and natural systems and inspire better computational algorithms for solving problems that are still beyond the reach of machines. This position paper gives examples of neurocognitive inspirations and promising directions in this area.

    Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach.

    Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain-computer interface models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty of moving data across processing platforms, and the lack of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a "containerized" approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining, and (meta-)analysis. The EEG Study Schema (ESS) comprises three data "Levels," each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for the application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data through a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. The ESS schema and tools are freely available at www.eegstudy.org, and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org).
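
    A minimal sketch of how a pipeline might consume such a containerized study, assuming an ESS-like layout in which a study folder carries an XML manifest alongside its data files; the manifest file name and element names used here are assumptions for illustration, not the actual ESS schema published at www.eegstudy.org.

```python
# Sketch: read an assumed XML manifest from a study folder and list its recordings.
import xml.etree.ElementTree as ET
from pathlib import Path

def list_recordings(study_dir: str):
    """Return (session_id, data_file) pairs described in the study manifest."""
    manifest = Path(study_dir) / "study_description.xml"   # assumed file name
    root = ET.parse(manifest).getroot()
    recordings = []
    for session in root.findall(".//session"):              # hypothetical element
        session_id = session.get("id", "")
        for rec in session.findall("dataRecording"):        # hypothetical element
            recordings.append((session_id, rec.get("filename", "")))
    return recordings

# A study shipped as a single folder can then be handed to any pipeline, e.g.:
# for sid, fname in list_recordings("/data/ess_study_01"):
#     run_prep_pipeline(Path("/data/ess_study_01") / fname)  # hypothetical function
```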

    Speech acts and medical records: The ontological nexus

    Despite recent advances in information and communication technology that have increased our ability to store and circulate information, the task of ensuring that the right sorts of information get to the right sorts of people remains. We argue that the many efforts underway to develop efficient means for sharing information across healthcare systems and organizations would benefit from a careful analysis of human action in healthcare organizations. This in turn requires that the management of information and knowledge within healthcare organizations be combined with models of the resources and processes of patient care that are based on a general ontology of social interaction. Health Level 7 (HL7) is one of several ANSI-accredited Standards Developing Organizations operating in the healthcare arena. HL7 has advanced a widely used messaging standard that enables healthcare applications to exchange clinical and administrative data in digital form. HL7 focuses on the interface requirements of the entire healthcare system and not exclusively on the requirements of one area of healthcare such as pharmacy, medical devices, imaging, or insurance transactions. This has inspired the development of a powerful abstract model of patient care called the Reference Information Model (RIM). The present paper begins with an overview of the core classes of the HL7 (Version 3) RIM and a brief discussion of its “act-centered” view of healthcare. Central to this account is what is called the life cycle of events: a clinical action may progress from defined, through planned and ordered, to executed, and these modalities of an action are represented as the mood of the act. We then outline the basis of an ontology of organizations, starting from the theory of speech acts, and apply this ontology to the HL7 RIM. Special attention is given to the sorts of preconditions that must be satisfied for the successful performance of a speech act and to the sorts of entities to which speech acts give rise (e.g. obligations, claims, commitments). Finally, we draw conclusions for the efficient communication and management of medical information and knowledge within and between healthcare organizations, paying special attention to the role that medical documents play in such organizations.
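
    A minimal sketch of the act-centered view and its life cycle, assuming a single act class whose mood records whether the act is defined, planned, ordered, or executed; this is a simplified illustration, not the actual HL7 RIM class model or its official moodCode vocabulary.

```python
# Sketch: a clinical act whose mood tracks its position in the life cycle
# named in the text (defined -> planned -> ordered -> executed).
from dataclasses import dataclass
from enum import Enum

class Mood(Enum):
    DEFINED = "defined"    # the act exists only as a definition (e.g. in a protocol)
    PLANNED = "planned"    # the act is intended for a particular patient
    ORDERED = "ordered"    # the act has been requested (a speech act creating obligations)
    EXECUTED = "executed"  # the act has actually been performed

@dataclass
class ClinicalAct:
    code: str              # what kind of act, e.g. "blood glucose measurement"
    mood: Mood

    def progress(self, new_mood: Mood) -> "ClinicalAct":
        """Advance the act along its life cycle, returning a new record."""
        order = list(Mood)
        if order.index(new_mood) <= order.index(self.mood):
            raise ValueError("an act can only move forward in its life cycle")
        return ClinicalAct(self.code, new_mood)

act = ClinicalAct("blood glucose measurement", Mood.DEFINED)
act = act.progress(Mood.PLANNED)
act = act.progress(Mood.ORDERED)    # ordering itself gives rise to obligations and claims
act = act.progress(Mood.EXECUTED)
print(act)
```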