5 research outputs found

    A Metadirectory of Web Components for Mashup Composition

    Because of the growing availability of third-party APIs, services, widgets and other reusable web components, mashup developers now face a vast number of candidate components for their developments. Moreover, these components are often scattered across many different repositories and web sites, which makes their discovery and selection difficult. In this paper, we discuss the problem of component selection in Service-Oriented Architectures (SOA) and mashup-driven development, and introduce the Linked Mashups Ontology (LiMOn), a model for describing mashups and their components so that mashup information such as categorization or dependencies can be integrated and shared. The model has enabled the construction of an integrated, centralized metadirectory of web components for query and selection, which has served to evaluate the model. The metadirectory gives access to various heterogeneous repositories of mashups and web components while drawing on external information from the Linked Data cloud, thereby supporting mashup development.
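    As a rough illustration of how an ontology like LiMOn could describe a reusable web component for such a metadirectory, the sketch below uses Python's rdflib; the limon: namespace URI and the class and property names are placeholders chosen for this example, not the published LiMOn vocabulary.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

# Hypothetical namespace and terms standing in for the LiMOn vocabulary.
LIMON = Namespace("http://example.org/limon#")

g = Graph()
g.bind("limon", LIMON)
g.bind("dcterms", DCTERMS)

widget = URIRef("http://example.org/components/map-widget")

# Describe one reusable web component: its type, category, repository of
# origin and a dependency on another component.
g.add((widget, RDF.type, LIMON.WebComponent))
g.add((widget, DCTERMS.title, Literal("Interactive map widget")))
g.add((widget, LIMON.category, Literal("geolocation")))
g.add((widget, LIMON.repository, URIRef("http://example.org/repos/widget-hub")))
g.add((widget, LIMON.dependsOn, URIRef("http://example.org/components/geocoding-api")))

# Emit the description as Turtle, ready to be loaded into a triple store.
print(g.serialize(format="turtle"))
```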

    Ranking web services using centralities and social indicators

    Nowadays, developers of web application mashups face an overwhelming variety of web services, so choosing appropriate services to achieve specific goals requires considerable knowledge and expertise. To support users in this choice, it is important not only to match their search criteria against a dataset of possible candidates, but also to rank the results by relevance, thereby minimizing the time needed to make a choice. We therefore investigated six ranking approaches empirically and compared them to each other. Moreover, we examined how these ranking algorithms can be combined linearly to maximize the quality of their output.
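    The linear combination mentioned above can be sketched as a weighted sum of normalised ranker scores. The ranker names, example scores and weights below are illustrative assumptions, not the six approaches or weightings evaluated in the paper.

```python
from typing import Dict

def normalize(scores: Dict[str, float]) -> Dict[str, float]:
    """Scale scores into [0, 1] so rankers with different ranges are comparable."""
    lo, hi = min(scores.values()), max(scores.values())
    if hi == lo:
        return {s: 0.0 for s in scores}
    return {s: (v - lo) / (hi - lo) for s, v in scores.items()}

def combine(rankings: Dict[str, Dict[str, float]],
            weights: Dict[str, float]) -> Dict[str, float]:
    """Weighted linear combination of per-ranker scores for each service."""
    combined: Dict[str, float] = {}
    for ranker, scores in rankings.items():
        for service, score in normalize(scores).items():
            combined[service] = combined.get(service, 0.0) + weights[ranker] * score
    return combined

# Hypothetical scores from two rankers (e.g. a centrality measure and a
# social indicator such as bookmark counts).
rankings = {
    "pagerank":  {"svcA": 0.9, "svcB": 0.4, "svcC": 0.7},
    "bookmarks": {"svcA": 120, "svcB": 300, "svcC": 80},
}
weights = {"pagerank": 0.6, "bookmarks": 0.4}

for service, score in sorted(combine(rankings, weights).items(),
                             key=lambda kv: kv[1], reverse=True):
    print(service, round(score, 3))
```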

    Clinical foundations and information architecture for the implementation of a federated health record service

    Clinical care increasingly requires healthcare professionals to access patient record information that may be distributed across multiple sites, held in a variety of paper and electronic formats, and represented as mixtures of narrative, structured, coded and multi-media entries. A longitudinal person-centred electronic health record (EHR) is a much-anticipated solution to this problem, but its realisation is proving to be a long and complex journey. This thesis explores the history and evolution of clinical information systems, and establishes a set of clinical and ethico-legal requirements for a generic EHR server. A federated health record (FHR) approach to harmonising distributed heterogeneous electronic clinical databases is advocated as the basis for meeting these requirements. A set of information models and middleware services, needed to implement a Federated Health Record server, is then described, thereby supporting access by clinical applications to a distributed set of feeder systems holding patient record information. The overall information architecture thus defined provides a generic means of combining such feeder system data to create a virtual electronic health record. Active collaboration in a wide range of clinical contexts, across the whole of Europe, has been central to the evolution of the approach taken. A federated health record server based on this architecture has been implemented by the author and colleagues and deployed in a live clinical environment in the Department of Cardiovascular Medicine at the Whittington Hospital in North London. This implementation experience has fed back into the conceptual development of the approach and has provided "proof-of-concept" verification of its completeness and practical utility. This research has benefited from collaboration with a wide range of healthcare sites, informatics organisations and industry across Europe through several EU Health Telematics projects: GEHR, Synapses, EHCR-SupA, SynEx, Medicate and 6WINIT. The information models published here have been placed in the public domain and have substantially contributed to two generations of CEN health informatics standards, including CEN TC/251 ENV 13606.
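    The federation idea can be sketched as a server that gathers a patient's entries from several feeder-system adapters and merges them into one time-ordered virtual record. The interface and field names below are illustrative assumptions, not the information models defined in the thesis.

```python
from dataclasses import dataclass
from typing import List, Protocol

@dataclass
class RecordEntry:
    """One clinical entry as returned by a feeder system (illustrative fields)."""
    patient_id: str
    source: str
    timestamp: str   # ISO 8601 for simplicity
    content: str

class FeederSystem(Protocol):
    """Minimal interface a feeder-system adapter exposes to the federation layer."""
    def entries_for(self, patient_id: str) -> List[RecordEntry]: ...

class FederatedRecordServer:
    """Combines entries from heterogeneous feeder systems into a virtual EHR."""
    def __init__(self, feeders: List[FeederSystem]) -> None:
        self.feeders = feeders

    def virtual_record(self, patient_id: str) -> List[RecordEntry]:
        entries: List[RecordEntry] = []
        for feeder in self.feeders:
            entries.extend(feeder.entries_for(patient_id))
        # Present one longitudinal, person-centred view ordered in time.
        return sorted(entries, key=lambda e: e.timestamp)
```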

    Forgetting to remember: organisational memory

    Organisations need to learn from their current and past experiences to optimise their activities, decisions and future strategies. Non-governmental organisations are similar to public or governmental departments in that learning is crucial for their existence. One of the key factors influencing learning is the development and maintenance of a functional organisational memory. The organisational memory is a dynamic entity encompassing more than the storage facilities provided by an information technology system. It also resides in human form, with people acting as reservoirs and interpretation centres and feeding the organisational memory as a whole. Previous research in organisational memory focussed mostly on describing the structure of the storage systems, with the current focus on developing management information systems to enhance organisational memory storage and retrieval. Some work has been undertaken to describe the processes involved, which include accessing, storing and retrieving the memory. Other functions that need special attention are the development of data into information, and especially creating and using knowledge. Previous studies mostly examined existing organisational memory as it was represented at a specific point in the organisations' development. This study looks at all the different developmental phases of a regional NGO, which include start-up, expansion in target territory, expansion in activities, consolidation and close-out. To investigate the temporal changes of organisational memory in a regional intermediary NGO, a retrospective case study methodology was used. The NGO was closing down, providing an opportunity to investigate all the stages of development. The data collection, analysis and interpretation involved various in-depth interviews with current and past staff members and other key stakeholders, such as beneficiary organisations and consultants. In addition, a complex set of documents was studied, including proposals, strategic documents, minutes of meetings, and audiovisual material. The main themes and factors, including individuals, leadership, electronic and other management of the organisational memory, culture (with the importance of a vision and a theory of change), policies and global developments, are discussed using a temporal ecological framework. The key findings of this study illustrate the importance of directories as part of the metamemory in accessing seemingly dormant organisational memories. The conclusion is that organisational memory survives after the demise of the organisation and that it is accessible through directories.
    Psychology, Ph. D. (Consulting Psychology)

    Architektur- und Werkzeugkonzepte für föderiertes Identitäts-Management (Architecture and Tool Concepts for Federated Identity Management)

    As an essential component of IT security management, Identity & Access Management (I&AM) comprises all organisational and technical processes for administering an institution's service users and their authorisations; data from a wide range of authoritative sources, such as human-resources and customer-management systems, are aggregated, correlated and made available to IT services in consolidated form. Federated Identity Management (FIM) aims to make these integrated data sets usable across organisational boundaries as well; this functionality is needed with increasing urgency, for example in business-to-business cooperations, outsourcing scenarios and grid computing. Avoiding redundancy and inconsistencies, but also guaranteeing data availability and compliance with data protection regulations, are particularly critical success factors. With the Security Assertion Markup Language (SAML), the Liberty Alliance specifications and WS-Federation as an integral part of the Web Services WS-* protocol stack, industrial and partially standardised technical approaches to FIM have emerged; their practical adoption, however, still frequently fails because of the insufficiently clarified, complex organisational embedding and technical shortcomings in the integration with existing IT infrastructures. This work first carries out an in-depth requirements analysis, new in this scope, which considers not only I&AM and FIM but also the user perspective known as User-Centric Identity Management (UCIM); the more than 60 structured and weighted requirements focus on the integration of I&AM and FIM systems both on the side of the organisation to which the users belong (identity provider) and at the respective service provider, and on the inclusion of organisational constraints as well as selected security and data protection aspects. Within a comprehensive, holistic architectural concept, a methodology for systematically integrating FIM components into existing I&AM systems is then developed. Beyond the precise specification of the technical system interfaces, which existing approaches lack, this work focuses on the organisational integration from the perspective of IT service management, with particular attention to security management and change management according to ITIL. To compensate for further fundamental deficits of previous FIM approaches, five new FIM components are specified as part of a tool concept, aimed at improving the interoperability of the FIM systems of the organisations participating in a so-called identity federation. In addition, a policy-based privacy management architecture based on the eXtensible Access Control Markup Language (XACML) is specified and integrated, enabling decentralised control of data releases by administrators and users and thus contributing substantially to compliance with data protection requirements. A description of the prototypical implementation of the tool concepts, together with a discussion of their performance, and the methodical application of the architectural concept to a complex, realistic scenario round off the work.
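    The policy-based privacy management component can be sketched as attribute-based release rules evaluated per request with a deny-overrides combination, loosely in the spirit of XACML; the attribute names and rules below are illustrative assumptions, not the architecture's actual policy schema.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

Attributes = Dict[str, str]

@dataclass
class ReleaseRule:
    """One rule: if the condition matches the request attributes, its effect applies."""
    description: str
    condition: Callable[[Attributes], bool]
    effect: str  # "Permit" or "Deny"

@dataclass
class PrivacyPolicy:
    """Deny-overrides combination of release rules, loosely modelled on XACML."""
    rules: List[ReleaseRule] = field(default_factory=list)

    def decide(self, request: Attributes) -> str:
        applicable = [r.effect for r in self.rules if r.condition(request)]
        if "Deny" in applicable:
            return "Deny"
        if "Permit" in applicable:
            return "Permit"
        return "NotApplicable"

# Hypothetical rules: the user permits release of their e-mail address to
# contracted service providers but denies any release of payment data.
policy = PrivacyPolicy(rules=[
    ReleaseRule("release e-mail to contracted SPs",
                lambda req: req.get("attribute") == "email"
                and req.get("sp-contract") == "signed",
                "Permit"),
    ReleaseRule("never release payment data",
                lambda req: req.get("attribute") == "payment-data",
                "Deny"),
])

print(policy.decide({"attribute": "email", "sp-contract": "signed"}))  # Permit
print(policy.decide({"attribute": "payment-data"}))                    # Deny
```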