
    Best practice in undertaking and reporting health technology assessments : Working Group 4 report

    [Executive Summary] The aim of Working Group 4 has been to develop and disseminate best practice in undertaking and reporting assessments, and to identify needs for methodologic development. Health technology assessment (HTA) is a multidisciplinary activity that systematically examines the technical performance, safety, clinical efficacy and effectiveness, cost, cost-effectiveness, organizational implications, social consequences, and legal and ethical considerations of the application of a health technology (18). HTA activity has been increasing continuously over the last few years. Numerous HTA agencies and other institutions (termed in this report "HTA doers") across Europe are producing an important and growing amount of HTA information. The objectives of HTA vary considerably between HTA agencies and other actors, from a strictly political decision making–oriented approach regarding advice on market licensure, coverage in the benefits catalogue, or investment planning, to information directed at providers or the public. Although there seems to be broad agreement on the general elements that belong to the HTA process, and although HTA doers in Europe use similar principles (41), this is often difficult to see because of differences in language and terminology. In addition, the reporting of findings from the assessments differs considerably. This reduces comparability and makes it difficult for those undertaking HTAs to integrate previous findings from other HTA doers in a subsequent evaluation of the same technology. Transparent and clear reporting is an important step toward disseminating the findings of an HTA; thus, standards that ensure high-quality reporting may contribute to a wider dissemination of results. The EUR-ASSESS methodologic subgroup already proposed a framework for conducting and reporting HTA (18), which served as the basis for the current working group. New developments in the last 5 years necessitate revisiting that framework and providing a solid structure for future updates. Giving due attention to these methodologic developments, this report describes the current "best practice" in both undertaking and reporting HTA and identifies the needs for methodologic development. It concludes with specific recommendations and tools for implementing them, e.g., by providing the structure for English-language scientific summary reports and a checklist to assess the methodologic and reporting quality of HTA reports.

    The use of Web Ontology Language (OWL) to Combine Extant Controlled Vocabularies in Biodiversity Informatics Appears Redundant

    Implementation of PESI requires data to be combined from multiple source databases. Some of the shared fields in the source databases used different controlled vocabularies of terms. OWL DL was investigated as a mechanism to build an extensible, shared ontology of species occurrence terms that permitted the source databases to continue using and extending their own vocabularies whilst formally mapping to a more generic shared vocabulary. The merits of this approach were explored, and it was concluded that building such a complex mapping ontology was probably not worthwhile: the level of semantic complexity involved outweighed the cost of simply imposing a flat list of well-defined terms on data suppliers. The main problem with existing vocabularies appears to be the overloading of terms. A candidate list of terms was proposed.
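    A minimal sketch of the mapping idea the abstract evaluates is shown below, assuming Python with rdflib; the namespaces and term names (a hypothetical source-database term RecordedAsPresent mapped to a shared Present class) are illustrative assumptions, not PESI's actual vocabularies.

        # Illustrative sketch only: each source database keeps its own occurrence
        # term, and an OWL/RDFS axiom relates it to a broader shared term.
        # Namespaces, term names, and the choice of rdflib are assumptions.
        from rdflib import Graph, Namespace
        from rdflib.namespace import OWL, RDF, RDFS

        SHARED = Namespace("http://example.org/shared-occurrence#")
        SRC_A = Namespace("http://example.org/source-db-a#")

        g = Graph()
        g.bind("shared", SHARED)
        g.bind("srca", SRC_A)

        # Declare both terms as OWL classes.
        for term in (SHARED.Present, SRC_A.RecordedAsPresent):
            g.add((term, RDF.type, OWL.Class))

        # Map the source database's specific term to the generic shared term.
        g.add((SRC_A.RecordedAsPresent, RDFS.subClassOf, SHARED.Present))

        print(g.serialize(format="turtle"))

    The paper's conclusion suggests that, in practice, a flat list of well-defined terms imposed on data suppliers would be simpler than maintaining many such mapping axioms.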

    1st INCF Workshop on Global Portal Services for Neuroscience

    The goal of this meeting was to map out existing portal services for neuroscience, identify their features and future plans, and outline opportunities for synergistic developments. The workshop discussed alternative formats for future global and integrated portal services.

    1st INCF Workshop on Sustainability of Neuroscience Databases

    The goal of the workshop was to discuss issues related to the sustainability of neuroscience databases, identify problems and propose solutions, and formulate recommendations to the INCF. The report summarizes the discussions of invited participants from the neuroinformatics community as well as from other disciplines where sustainability issues have already been addressed. The recommendations to the INCF involve rating, ranking, and supporting database sustainability.

    Performance prediction tools for low impact building design

    IT systems are emerging that may be used to support decisions relating to the design of a built environment that has low impact in terms of energy use and environmental emissions. This paper summarises this prospect in relation to four complementary application areas: digital cities, rational planning, virtual design, and Internet energy services.

    New Methods, Current Trends and Software Infrastructure for NLP

    The increasing use of 'new methods' in NLP, which the NeMLaP conference series exemplifies, occurs in the context of a wider shift in the nature and concerns of the discipline. This paper begins with a short review of this context and of significant trends in the field. The review motivates and leads to a set of requirements for support software of general utility for NLP research and development workers. A freely available system designed to meet these requirements, called GATE (a General Architecture for Text Engineering), is described. Information Extraction (IE), in the sense defined by the Message Understanding Conferences (ARPA \cite{Arp95}), is an NLP application in which many of the new methods have found a home (Hobbs \cite{Hob93}; Jacobs ed. \cite{Jac92}). An IE system based on GATE is also available for research purposes, and this is described. Lastly, we review related work.
    Comment: 12 pages, LaTeX, uses nemlap.sty (included)
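    GATE itself is a Java framework, and the abstract does not describe its API; the sketch below is only an illustration, in Python, of the general architectural idea referred to: interchangeable processing components that add annotations to a shared document, as in typical IE pipelines. Component names and data structures are assumptions for illustration.

        # Illustrative sketch only (not GATE's actual API): a pipeline of
        # components that each read a document and append annotations to it.
        from dataclasses import dataclass, field

        @dataclass
        class Document:
            text: str
            annotations: list = field(default_factory=list)

        def tokenizer(doc):
            doc.annotations.append(("tokens", doc.text.split()))
            return doc

        def gazetteer(doc):
            # A toy lookup component; real IE systems use much richer resources.
            known_orgs = {"ARPA"}
            hits = [t for t in doc.text.split() if t.strip(".,") in known_orgs]
            doc.annotations.append(("organizations", hits))
            return doc

        def run_pipeline(doc, components):
            for component in components:
                doc = component(doc)
            return doc

        doc = run_pipeline(Document("ARPA funded the MUC evaluations."),
                           [tokenizer, gazetteer])
        print(doc.annotations)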

    An MPEG-7 scheme for semantic content modelling and filtering of digital video

    Part 5 of the MPEG-7 standard specifies Multimedia Description Schemes (MDS), that is, the format to which multimedia content models should conform in order to ensure interoperability across multiple platforms and applications. However, the standard does not specify how the content or the associated model may be filtered. This paper proposes an MPEG-7 scheme which can be deployed for digital video content modelling and filtering. The proposed scheme, COSMOS-7, produces rich and multi-faceted semantic content models and supports a content-based filtering approach that analyses only content relating directly to the user's preferred content requirements. We present details of the scheme, the front-end systems used for content modelling and filtering, and experiences with a number of users.
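    The abstract does not give the COSMOS-7 scheme itself; the following is a minimal sketch, assuming an MPEG-7-style XML content model with hypothetical element names, of how segment-level semantic annotations could be matched against a user's preferred terms.

        # Illustrative sketch only, not the actual COSMOS-7 scheme: keep only the
        # video segments whose semantic annotations match the user's preferences.
        # Element and attribute names here are assumptions.
        import xml.etree.ElementTree as ET

        MODEL = """
        <VideoModel>
          <Segment start="00:00" end="01:30"><Semantic>news weather</Semantic></Segment>
          <Segment start="01:30" end="05:00"><Semantic>news politics</Semantic></Segment>
        </VideoModel>
        """

        def filter_segments(xml_text, preferred_terms):
            root = ET.fromstring(xml_text)
            matches = []
            for segment in root.findall("Segment"):
                semantics = segment.findtext("Semantic", default="").split()
                if any(term in semantics for term in preferred_terms):
                    matches.append((segment.get("start"), segment.get("end")))
            return matches

        print(filter_segments(MODEL, {"politics"}))  # -> [('01:30', '05:00')]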