
    BlogForever D3.2: Interoperability Prospects

    This report evaluates the interoperability prospects of the BlogForever platform. To this end, existing interoperability models are reviewed; a Delphi study is conducted to identify crucial aspects of interoperability for web archives and digital libraries; technical interoperability standards and protocols are reviewed for their relevance to BlogForever; a simple approach to considering interoperability in specific usage scenarios is proposed; and a tangible approach to developing a succession plan, allowing reliable transfer of content from the current digital archive to other digital repositories, is presented.

    SoK: Distributed Computing in ICN

    Information-Centric Networking (ICN), with its data-oriented operation and generally more powerful forwarding layer, provides an attractive platform for distributed computing. This paper provides a systematic overview and categorization of different distributed computing approaches in ICN, encompassing fundamental design principles, frameworks and orchestration, protocols, enablers, and applications. We discuss current pain points in legacy distributed computing, attractive ICN features, and how different systems use them. This paper also provides a discussion of potential future work for distributed computing in ICN.
    Comment: 10 pages, 3 figures, 1 table. Accepted by ACM ICN 202

    Internet of things for medication control: service implementation and testing

    RFID (Radio Frequency IDentification) technology enables attaching an identification label (i.e. a tag) to a certain object and, by means of a reader, retrieving the information related to it without any physical contact. The use of these tags in a medical context enables rapid and precise identification of each patient and, by means of the Internet of Things (IoT), ubiquitous and quick access to medical records. These technologies, RFID and IoT, integrated within a suitable system, promote better physician-patient interaction. A simple IoT-enabled system can send warnings to any physician, nurse or other health caregiver; therefore, with a simple IoT architecture one may remotely monitor and control the patient's well-being. This paper presents an IoT architecture, using RFID tags, able to easily and remotely establish a medication control system, from the physician's prescription to pharmaceutical drug administration. It presents the implementation and analysis of a prototype service, with a web interface, allowing for a first evaluation of the proposed service. The prototype service - based on RFID, EPC (Electronic Product Code) and ONS (Object Name Service) - and its web interface are presented and evaluated, together with some use cases, under the title "RFID-based IoT for medication control".
    Fundação para a Ciência e a Tecnologia (FCT
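    The EPC-to-ONS step this abstract relies on can be illustrated with a small sketch. Following the general scheme of the ONS 1.0 specification, a pure-identity EPC URI is turned into a DNS domain by dropping the serial number, reversing the remaining identifier fields, and appending the ONS root; the resolver would then issue a NAPTR query against that domain. The function below shows only this name-construction step (no network access); the function name is illustrative, not from the paper.

    ```python
    def epc_to_ons_domain(epc_uri: str, root: str = "onsepc.com") -> str:
        """Convert a pure-identity EPC URI into an ONS query domain.

        Sketch of the ONS 1.0 naming scheme: drop the serial component,
        reverse the remaining fields, append the scheme and the ONS root.
        """
        prefix = "urn:epc:id:"
        if not epc_uri.startswith(prefix):
            raise ValueError("not a pure-identity EPC URI")
        scheme, _, fields = epc_uri[len(prefix):].partition(":")
        parts = fields.split(".")[:-1]  # drop the serial number
        return ".".join(list(reversed(parts)) + [scheme, "id", root])
    ```

    For example, the SGTIN `urn:epc:id:sgtin:0614141.112345.400` maps to the query domain `112345.0614141.sgtin.id.onsepc.com`.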

    Clinical foundations and information architecture for the implementation of a federated health record service

    Clinical care increasingly requires healthcare professionals to access patient record information that may be distributed across multiple sites, held in a variety of paper and electronic formats, and represented as mixtures of narrative, structured, coded and multi-media entries. A longitudinal person-centred electronic health record (EHR) is a much-anticipated solution to this problem, but its realisation is proving to be a long and complex journey. This Thesis explores the history and evolution of clinical information systems, and establishes a set of clinical and ethico-legal requirements for a generic EHR server. A federation approach (FHR) to harmonising distributed heterogeneous electronic clinical databases is advocated as the basis for meeting these requirements. A set of information models and middleware services, needed to implement a Federated Health Record server, are then described, thereby supporting access by clinical applications to a distributed set of feeder systems holding patient record information. The overall information architecture thus defined provides a generic means of combining such feeder system data to create a virtual electronic health record. Active collaboration in a wide range of clinical contexts, across the whole of Europe, has been central to the evolution of the approach taken. A federated health record server based on this architecture has been implemented by the author and colleagues and deployed in a live clinical environment in the Department of Cardiovascular Medicine at the Whittington Hospital in North London. This implementation experience has fed back into the conceptual development of the approach and has provided "proof-of-concept" verification of its completeness and practical utility. 
This research has benefited from collaboration with a wide range of healthcare sites, informatics organisations and industry across Europe through several EU Health Telematics projects: GEHR, Synapses, EHCR-SupA, SynEx, Medicate and 6WINIT. The information models published here have been placed in the public domain and have substantially contributed to two generations of CEN health informatics standards, including CEN TC/251 ENV 13606.
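The federation idea at the heart of this thesis can be sketched in a few lines: each feeder system remains authoritative for its own data, and the server assembles a virtual, chronological record on demand rather than copying data into one store. The classes and field names below are illustrative assumptions, not the thesis's actual information models.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class Entry:
    """One clinical record entry as held by some feeder system."""
    patient_id: str
    recorded: date
    source: str    # which feeder system produced this entry
    content: str


class FederatedRecordServer:
    """Minimal sketch of a federated health record server: feeders are
    callables (patient_id -> list[Entry]) wrapping heterogeneous systems,
    and the virtual EHR is assembled per request, in time order."""

    def __init__(self, feeders):
        self.feeders = feeders

    def virtual_record(self, patient_id: str):
        entries = []
        for feeder in self.feeders:
            entries.extend(feeder(patient_id))
        return sorted(entries, key=lambda e: e.recorded)
```

The design choice this illustrates is that the "virtual" record never exists at rest; it is a merge view, so feeder systems keep their own storage, governance and update paths.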

    Internet of things for medication control: e-health architecture and service implementation

    The use of Radio Frequency Identification (RFID) technology in a medical context enables not only drug identification but also rapid and precise identification of patients, physicians, nurses or any other health caregiver. Combining RFID tag identification with structured and secure Internet of Things (IoT) solutions, one can establish ubiquitous and quick access to any type of medical record, as long as all the Internet-mediated interactions are controlled and adequately secured. This paper presents an e-Health service architecture, along with the corresponding Internet of Things prototype implementation, that makes use of RFID tags and Electronic Product Code (EPC) standards to easily establish, in a ubiquitous manner, a medication control system. The system, presented and tested, has a web interface and allowed for a first evaluation of the proposed e-health service. As the service is mainly focused on elderly Ambient Assisted Living (AAL) solutions, all these technologies – RFID, EPC, Object Naming Service (ONS) and IoT – have been integrated into a suitable system, able to promote better patient/physician, patient/nurse and, generally, any patient/health caregiver interactions. The whole prototype service, entitled "RFID-based IoT for Medication Control", and its web interface are presented and evaluated.
    FEDER Funds through the Programa Operacional Fatores de Competitividade – COMPETE and National Funds through the FCT - Fundação para a Ciência e a Tecnologia (Portuguese Foundation for Science and Technology) within project FCOMP-01-0124-FEDER-02267

    Biomedical data integration in computational drug design and bioinformatics

    In recent years, in the post-genomic era, more and more data is being generated by biological high-throughput technologies, such as proteomics and transcriptomics. This omics data can be very useful, but the real challenge is to analyze all this data as a whole, after integrating it. Biomedical data integration enables making queries to different, heterogeneous and distributed biomedical data sources. Data integration solutions can be very useful not only in the context of drug design, but also in biomedical information retrieval, clinical diagnosis, systems biology, etc. In this review, we analyze the most common approaches to biomedical data integration, such as federated databases, data warehousing, multi-agent systems and semantic technology, as well as the solutions developed using these approaches in the past few years.
    Red Gallega de Investigación sobre Cáncer Colorrectal; Ref. 2009/58. Programa Iberoamericano de Ciencia y Tecnología para el Desarrollo; 209RT-0366. Instituto de Salud Carlos III; PIO52048. Instituto de Salud Carlos III; RD07/0067/0005. Ministerio de Industria, Turismo y Comercio; TSI-020110-2009-
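The federated-database approach this review surveys is usually realised with the mediator/wrapper pattern: each source keeps its native schema, a wrapper translates a shared query vocabulary into source-specific terms, and a mediator fans one query out to all sources and merges the answers. A minimal sketch, with entirely hypothetical source and field names:

```python
class SourceWrapper:
    """Adapts one heterogeneous source to a shared query interface."""

    def __init__(self, name, records, field_map):
        self.name = name
        self.records = records      # source-native dicts
        self.field_map = field_map  # common field -> native field name

    def query(self, common_field, value):
        native = self.field_map[common_field]
        return [{"source": self.name, **r}
                for r in self.records if r.get(native) == value]


def federated_query(wrappers, field, value):
    """Mediator: fan one query out to every source and merge the hits."""
    hits = []
    for wrapper in wrappers:
        hits.extend(wrapper.query(field, value))
    return hits
```

Here the shared field `gene` might map to `gene_symbol` in one database and `gene` in another; the caller never sees the difference.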

    Cloud Computing cost and energy optimization through Federated Cloud SoS

    2017 Fall. Includes bibliographical references.
    The two most significant differentiators amongst contemporary Cloud Computing service providers are green energy use and datacenter resource utilization. This work addresses these two issues from a systems architectural optimization viewpoint. The approach proposed herein allows multiple cloud providers to utilize their individual computing resources in three ways: (1) cutting the number of datacenters needed, (2) scheduling available datacenter grid energy via aggregators to reduce costs and power outages, and (3) utilizing, where appropriate, more renewable and carbon-free energy sources. Altogether, our proposed approach creates an alternative paradigm for a Federated Cloud SoS. The proposed paradigm employs a novel control methodology that is tuned to obtain both financial and environmental advantages. It also supports dynamic expansion and contraction of computing capabilities for handling sudden variations in service demand, as well as for maximizing usage of time-varying green energy supplies. Herein we analyze the core SoS requirements, concept synthesis, and functional architecture with an eye on avoiding inadvertent cascading conditions, and we suggest a physical architecture that simulates the primary SoS emergent behavior to diminish unwanted outcomes while encouraging desirable results. Finally, in our approach, the constituent cloud services retain their independent ownership, objectives, funding, and sustainability means.
The report analyzes optimal computing generation methods and optimal energy utilization for computing generation, as well as a procedure for building optimal datacenters using a unique hardware computing system design based on the Open Compute community as an illustrative collaboration platform. Finally, the research concludes with the security features a cloud federation must support to protect its constituents, its constituents' tenants and itself from security risks.
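The cost/energy trade-off the thesis studies can be illustrated with a toy placement rule: prefer any datacenter whose green supply can cover the workload, and among those break ties on grid price. This is an assumption-laden sketch of the general idea, not the author's control methodology, and the dictionary fields are invented for the example.

```python
def choose_datacenter(datacenters, load_kw):
    """Pick a datacenter for a workload of `load_kw`.

    Toy policy: datacenters with enough green headroom rank ahead of
    those without; within each group, cheaper grid power wins.
    """
    def rank(dc):
        lacks_green = dc["green_kw_available"] < load_kw
        return (lacks_green, dc["grid_price_per_kwh"])
    return min(datacenters, key=rank)
```

A real federated scheduler would also weigh latency, data locality and contractual constraints; the point here is only that green-energy availability enters the placement decision ahead of raw price.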

    A Two-Level Information Modelling Translation Methodology and Framework to Achieve Semantic Interoperability in Constrained GeoObservational Sensor Systems

    As geographical observational data capture, storage and sharing technologies such as in situ remote monitoring systems and spatial data infrastructures evolve, the vision of a Digital Earth, first articulated by Al Gore in 1998, is getting ever closer. However, there are still many challenges and open research questions. For example, data quality, provenance and heterogeneity remain an issue due to the complexity of geo-spatial data and information representation. Observational data are often inadequately semantically enriched by geo-observational information systems or spatial data infrastructures, and so they often do not fully capture the true meaning of the associated datasets. Furthermore, the data models underpinning these information systems are typically too rigid in their data representation to allow for the ever-changing and evolving nature of geo-spatial domain concepts. This impoverished approach to observational data representation reduces the ability of multi-disciplinary practitioners to share information in an interoperable and computable way. The health domain experiences similar challenges in representing complex and evolving domain information concepts. Within any complex domain (such as Earth system science or health) two categories or levels of domain concepts exist: those that remain stable over a long period of time, and those that are prone to change as the domain knowledge evolves and new discoveries are made. Health informaticians have developed a sophisticated two-level modelling systems design approach for electronic health documentation over many years and, with the use of archetypes, have shown how data, information, and knowledge interoperability among heterogeneous systems can be achieved.
This research investigates whether two-level modelling can be translated from the health domain to the geo-spatial domain and applied to observing scenarios to achieve semantic interoperability within and between spatial data infrastructures, beyond what is possible with current state-of-the-art approaches. A detailed review of state-of-the-art SDIs, geo-spatial standards and the two-level modelling methodology was performed. A cross-domain translation methodology was developed, and a proof-of-concept geo-spatial two-level modelling framework was defined and implemented. The Open Geospatial Consortium's (OGC) Observations & Measurements (O&M) standard was re-profiled to aid investigation of the two-level information modelling approach. An evaluation of the method was undertaken using two specific use-case scenarios. Information modelling was performed using the two-level modelling method to show how existing historical ocean observing datasets can be expressed semantically and harmonized using two-level modelling. The flexibility of the approach was also investigated by applying the method to an air quality monitoring scenario using a technologically constrained monitoring sensor system. This work has demonstrated that two-level modelling can be translated to the geospatial domain and then further developed for use within a constrained technological sensor system, using traditional wireless sensor networks, semantic web technologies and Internet of Things based technologies. Domain-specific evaluation results show that two-level modelling presents a viable approach to achieving semantic interoperability between constrained geo-observational sensor systems and spatial data infrastructures for ocean observing and city-based air quality observing scenarios. This has been demonstrated through the re-purposing of selected, existing geospatial data models and standards.
However, it was found that re-using existing standards requires careful ontological analysis per domain concept, and so caution is recommended in assuming the wider applicability of the approach. While the benefits of adopting a two-level information modelling approach to geospatial information modelling are potentially great, it was found that translation to a new domain is complex. The complexity of the approach was found to be a barrier to adoption, especially in commercial projects where standards implementation is low on implementation road maps and the perceived benefits of standards adherence are low. Arising from this work, a novel set of base software components, methods and fundamental geo-archetypes have been developed. However, during this work it was not possible to form the required rich community of supporters to fully validate the geo-archetypes. Therefore, the findings of this work are not exhaustive, and the archetype models produced are only indicative. The findings of this work can be used as the basis to encourage further investigation and uptake of two-level modelling within the Earth system science and geo-spatial domains. Ultimately, the outcomes of this work are to recommend further development and evaluation of the approach, building on the positive results thus far and the base software artefacts developed to support the approach.
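The two-level idea described above separates a small, stable reference model from domain constraints (archetypes) that can evolve independently. A minimal sketch of that separation follows; the class shapes and the sea-surface-temperature constraint values are illustrative assumptions, not taken from openEHR or the thesis's geo-archetypes.

```python
from dataclasses import dataclass


@dataclass
class Observation:
    """Level 1: a small, stable reference model, generic for any domain."""
    name: str
    value: float
    units: str


@dataclass
class Archetype:
    """Level 2: domain constraints on the reference model. These can be
    revised as domain knowledge evolves, without touching Observation."""
    name: str
    units: str
    min_value: float
    max_value: float

    def validate(self, obs: Observation) -> bool:
        return (obs.name == self.name
                and obs.units == self.units
                and self.min_value <= obs.value <= self.max_value)


# Hypothetical geo-archetype for an ocean observing scenario.
sea_temp = Archetype("sea_surface_temperature", "degC", -2.0, 40.0)
```

Two systems that share the reference model can exchange any `Observation`; semantic agreement about what a valid sea-surface temperature *is* lives entirely in the shared archetype.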

    Managing cultural heritage: information systems architecture

    This chapter is about the architecture of systems that store, preserve and provide access to digital cultural heritage objects. It presents some major design considerations for implementing cultural heritage system architectures, together with some existing architectural patterns currently in use. A simpler architectural design is then proposed; this new architecture could potentially have a positive impact on digital preservation.

    Benefits through Utilising EPC Network Components in Service‐Oriented Environments – an Analysis Using the Example of the Food Industry

    Improvements in the food sector imply enhancements in delivering food that is safe, affordable, readily available, and of the quality and diversity consumers expect. However, the prevalent information systems (IS) of companies in the food industry are not ready to support further significant improvements; they especially lack the capability to exchange relevant information in an efficient manner. Recently, two major developments can be observed from an IS perspective: the spread of service-oriented architectures (SOA) and an increase in mass serialization (due, for example, to public and private traceability requirements). Although the food sector is highly important owing to food safety, a growing need for efficiency, and an increasing information demand from consumers, it has so far attracted little attention in the literature concerning the potential of both service-orientation and the Electronic Product Code (EPC) Network. This paper therefore investigates the extent to which these two developments can facilitate food companies' IS, helping them to maintain their competitiveness. As a starting point, the paper depicts the state of the art, including SOA and the EPC Network. After describing the research approach, it proceeds with a characterisation of the food sector, including an examination of why there is need for action. Based on current research findings as well as experience gathered in recent projects, the paper investigates the application of the EPC Network with its three major components, i.e. EPCIS (EPC Information Services), ONS (Object Name Service) and the EPC Discovery Services, as part of future IS architectures in this sector. The paper closes with a discussion of whether the envisioned IS architecture is appropriate to address the previously identified challenges and requirements in the food sector in a more agile, efficient and effective way.
In addition, it highlights the most pressing challenges and provides an outlook on the next steps of the research.
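The traceability use case behind mass serialization can be illustrated with a small sketch: given a list of EPCIS-style object events collected along the supply chain, follow one serialized item through them in time order. The event dictionaries below only mimic EPCIS ObjectEvents in spirit; the simplified field names are assumptions for the example.

```python
def trace_epc(events, epc):
    """Return, in time order, the events in which one EPC appears.

    `events` is a list of dicts with (assumed) keys:
    "time" (sortable), "location" (read point), "epcs" (list of EPC URIs).
    """
    hops = [e for e in events if epc in e["epcs"]]
    return sorted(hops, key=lambda e: e["time"])
```

In a full EPC Network deployment, the events would come from the EPCIS repositories of different companies, discovered via the ONS and the EPC Discovery Services rather than from one local list.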