
    DACTyL: towards providing the missing link between clinical and telehealth data

    This document conveys the findings of the Data Analytics, Clinical, Telehealth, Link (DACTyL) project. This nine-month project started in January 2013 and was conducted at Philips Research in the Care Management Solution group, as part of the Data Analysis for Home Healthcare (DA4HH) project. The DA4HH charter is to perform and support retrospective analyses of data from home healthcare products, such as Motiva telehealth. These studies provide valid insights into the actual clinical aspects, usage, and behavior of installed products and services. The insights help to improve service offerings, create clinical algorithms for better outcomes, and validate and substantiate claims of efficacy and cost-effectiveness. The DACTyL project aims to develop and implement an architecture and infrastructure that address the most pressing need of Motiva telehealth customers: insight into return on investment (ROI). These customers are hospitals that offer Motiva telehealth to their patients. To provide the Motiva service cost-effectively, they need insight into the actual cost, benefit, and resource utilization of Motiva deployment compared to their usual routine care. Additional stakeholders for these ROI-related data are Motiva customer consultants and Philips research scientists, who can use them to strengthen their messaging and service delivery and so arrive at better patient care.
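
    To make the ROI question concrete, here is a minimal Python sketch of the cost-benefit comparison the hospitals face; all figures and names are hypothetical illustrations, not numbers from the project:

        # Minimal ROI sketch for comparing telehealth deployment against usual care.
        # All figures and names are hypothetical illustrations.

        def roi(total_benefit: float, total_cost: float) -> float:
            """Classic ROI: net benefit relative to cost."""
            return (total_benefit - total_cost) / total_cost

        # Hypothetical per-patient annual figures (currency units).
        usual_care_cost = 12_000.0   # utilization under routine care
        telehealth_care_cost = 9_500.0   # utilization under telehealth, excluding the fee
        service_fee = 1_200.0        # annual telehealth subscription per patient
        patients = 250

        benefit = (usual_care_cost - telehealth_care_cost) * patients  # avoided cost
        cost = service_fee * patients
        print(f"ROI: {roi(benefit, cost):.2f}")  # > 0 means the service pays for itself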

    Database migration processes and optimization using BSMS (bank staff management system)

    Databases are, at their core, a storage technology designed to carry out tasks that depend on complex data, and data integrity is essential to them. For many companies, the database is quite literally an electronic representation of the company's business, and losing any piece of data during migration is unacceptable. There are various business reasons for migrating data, among them archiving, data warehousing, and moving to a new environment, platform, or technology. Database migration is a complex, multi-phase process that typically includes assessment, database schema conversion, data migration, and functional testing. Online Transaction Processing (OLTP) databases are usually highly normalized for efficiency, which ensures data integrity, eliminates data redundancy, and reduces record locking. However, this design approach yields a large number of tables, and each of these tables and their foreign-key constraints must be taken into account at the point of data migration. Moreover, unlike conventional tasks, the acceptance criterion for a data migration job is a full 100%, because errors are not tolerated in databases and quality is paramount. This thesis presents the challenges and concerns that arose while migrating data from a slow, inefficient, and outdated database platform called Paradox to a far more advanced database called Oracle. An indexing technique was used to improve query performance, retrieving data quickly without any inconsistency or data loss.
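
    To illustrate the migrate-and-verify discipline described above (a 100% acceptance criterion, no tolerated loss), here is a minimal Python sketch; sqlite3 stands in for both the Paradox source and the Oracle target so the example stays self-contained, and the table is hypothetical:

        import sqlite3

        # Stand-in databases: in practice the source would be Paradox and the
        # target Oracle; sqlite3 keeps the sketch self-contained and runnable.
        src = sqlite3.connect(":memory:")
        dst = sqlite3.connect(":memory:")

        src.execute("CREATE TABLE staff (id INTEGER PRIMARY KEY, name TEXT, branch TEXT)")
        src.executemany("INSERT INTO staff VALUES (?, ?, ?)",
                        [(1, "Ada", "North"), (2, "Linus", "South")])

        # Phase 1: schema conversion (trivial here; real migrations map types too).
        dst.execute("CREATE TABLE staff (id INTEGER PRIMARY KEY, name TEXT, branch TEXT)")

        # Phase 2: data migration.
        rows = src.execute("SELECT id, name, branch FROM staff").fetchall()
        dst.executemany("INSERT INTO staff VALUES (?, ?, ?)", rows)

        # Phase 3: verification -- the acceptance criterion is 100%, so row counts
        # (and, in practice, per-column checksums) must match exactly.
        n_src = src.execute("SELECT COUNT(*) FROM staff").fetchone()[0]
        n_dst = dst.execute("SELECT COUNT(*) FROM staff").fetchone()[0]
        assert n_src == n_dst, "row count mismatch: migration must not lose data"

        # Phase 4: index the migrated table so frequent lookups stay fast.
        dst.execute("CREATE INDEX idx_staff_branch ON staff (branch)")
        dst.commit()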

    Data Integration: a Case Study in the Financial Services Industry

    Current economic conditions result in banks participating in multiple mergers and acquisitions, leaving them with siloed, heterogeneous inherited systems. To remain competitive, banks must create a strategy to integrate data from these acquired systems in a dynamic, efficient, and consumable manner. This research considers a case study of a large financial services company that has successfully integrated data from different sources. In addition, it investigates the work that experts in the field have undertaken to develop architectures that address the problems of data integration, along with appropriate solutions.
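
    The case study's actual implementation is not described in the abstract; purely as a generic illustration of consolidating heterogeneous acquired systems into one canonical schema, here is a minimal Python sketch with hypothetical systems and field names:

        # Map customer records from two acquired systems, each with its own field
        # names, into one canonical schema. All names here are hypothetical.
        CANONICAL_FIELDS = ("customer_id", "full_name", "balance")

        def from_system_a(rec: dict) -> dict:
            return {"customer_id": rec["cust_no"],
                    "full_name": rec["name"],
                    "balance": float(rec["bal"])}

        def from_system_b(rec: dict) -> dict:
            return {"customer_id": rec["id"],
                    "full_name": f'{rec["first"]} {rec["last"]}',
                    "balance": float(rec["balance_cents"]) / 100}

        records = ([from_system_a(r) for r in [{"cust_no": "A1", "name": "J. Doe", "bal": "120.50"}]]
                   + [from_system_b(r) for r in [{"id": "B7", "first": "Ann", "last": "Lee",
                                                  "balance_cents": 99900}]])
        for r in records:
            assert set(r) == set(CANONICAL_FIELDS)  # every source conforms to the canon
        print(records)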

    Interoperability Gap Challenges for Learning Object Repositories & Learning Management Systems

    An interoperability gap exists between Learning Management Systems (LMSs) and Learning Object Repositories (LORs). Learning Objects (LOs) and the associated Learning Object Metadata (LOM) stored within LORs adhere to a variety of LOM standards. A common LOM standard found in LORs is the Sharable Content Object Reference Model (SCORM) Content Aggregation Model (CAM). In contrast, LMSs are independent computer systems that manage and deliver course content to students via a web interface. This research addressed three important issues related to the interoperability gap: (a) the lack of a metadata standard defining the format in which student assessment data should be communicated from LMSs to LORs, (b) the lack of an architectural standard for the movement of data from LMSs to LORs, and (c) the lack of middleware to facilitate the movement of student assessment data from LMSs to LORs. This research achieved the following objectives: (a) the SCORM CAM LOM standard was extended to facilitate the storage of student assessment data, (b) Service Oriented Architecture (SOA) was identified as the best architecture to resolve the interoperability gap between LMSs and LORs, (c) a panel of Computer Information Systems (CIS) experts participated in a five-stage, web-based, anonymous Delphi process that approved and ranked 28 functional requirements for a proposed middleware application, and (d) the functional requirements were verified via the development of a prototype that transferred student assessment data from an LMS into the LOM of LOs stored within a LOR. In conclusion, the research demonstrated that there are three acceptable approaches to extending the SCORM LOM standard: (a) new metadata elements, (b) new vocabulary values, and (c) referencing an internal or external XML file using a location element. The main accomplishments of the research were the gathering of SOA functional requirements and the development of a prototype that provides an approach to resolving the interoperability gap between LMSs and LORs.
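
    Of the three extension approaches, the third (referencing an external XML file through a location element) is the easiest to sketch. A minimal Python illustration follows; the element names are simplified stand-ins, not the normative LOM bindings:

        import xml.etree.ElementTree as ET

        # Sketch of approach (c): extend a LO's metadata with a relation/location
        # entry that points at an external XML file holding student assessment data.
        # Element names are simplified stand-ins, not the normative LOM schema.
        lom = ET.Element("lom")
        relation = ET.SubElement(lom, "relation")
        resource = ET.SubElement(relation, "resource")
        location = ET.SubElement(resource, "location")
        location.text = "assessments/course42_results.xml"  # hypothetical external file

        # The external file itself: one record per student attempt.
        results = ET.Element("assessmentResults")
        ET.SubElement(results, "attempt",
                      learnerId="stu-001", score="87", completed="true")

        print(ET.tostring(lom, encoding="unicode"))
        print(ET.tostring(results, encoding="unicode"))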

    A survey of RDB to RDF translation approaches and tools

    ISRN I3S/RR 2013-04-FR, 24 pages. Relational databases scattered over the web are generally opaque to regular web crawling tools. To address this concern, many RDB-to-RDF approaches have been proposed in recent years. In this paper, we propose a detailed review of seventeen RDB-to-RDF initiatives, considering end-to-end projects that delivered operational tools. The tools are classified along three major axes: mapping description language, mapping implementation, and data retrieval method. We analyse the motivations, commonalities, and differences between existing approaches. The expressiveness of existing mapping languages is not always sufficient to produce semantically rich data and make it usable, interoperable, and linkable; we therefore briefly present various strategies investigated in the literature to produce additional knowledge. Finally, we show that R2RML, the W3C recommendation for describing RDB-to-RDF mappings, may not meet all needs in the wide scope of RDB-to-RDF translation applications, leaving room for future extensions.
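
    To make the translation task concrete, here is a minimal Python sketch in the spirit of the W3C Direct Mapping (deliberately simpler than R2RML), using sqlite3 and the rdflib library (assumed available); the base IRI, table, and data are hypothetical:

        import sqlite3
        from rdflib import Graph, Literal, URIRef

        # Direct-mapping sketch: each row becomes a subject IRI built from the
        # table name and primary key; each column becomes a predicate. This
        # mirrors the W3C Direct Mapping idea, not a full R2RML processor.
        BASE = "http://example.org/db/"  # hypothetical base IRI

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
        db.execute("INSERT INTO person VALUES (1, 'Mai', 'Nice')")

        g = Graph()
        cols = [c[1] for c in db.execute("PRAGMA table_info(person)")]
        for row in db.execute("SELECT * FROM person"):
            subject = URIRef(f"{BASE}person/{row[0]}")
            for col, value in zip(cols, row):
                g.add((subject, URIRef(f"{BASE}person#{col}"), Literal(value)))

        print(g.serialize(format="turtle"))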

    An Examination of Processes based on Open Standards in Support of Service Location

    A private telecom carrier partnered with the University of Waterloo to examine opportunities to improve its asset management processes. A reliance on traditional CAD technology made it difficult to generate an enterprise view of operational assets, such as poles and cables, since CAD documents were limited to neighbourhood-scale coverage. The CAD documents had to communicate both the logical and the locational properties of these assets. These requirements were often at odds, since elements in the CAD documents were occasionally moved to clarify logical aspects, the most common being connectivity with other telecommunications hardware. Elements within the drawings were also restricted to two dimensions, a legacy of the carrier's early adoption of CAD technology. Developments in GIS and architectural technology since the introduction of CAD offer opportunities to manage assets using enterprise geospatial systems with three-dimensional content. Prominent technologies and standards, such as CityGML and Oracle, will be examined to develop a model that supports requirements related to service location. A service location, for the purposes of this paper, is a site that requires the deployment of specific resources to meet the needs of a service request. Additionally, as location displacement is an issue that needs to be addressed, an evaluation of data-quality processes related to location will be presented. The results of this evaluation will then be used to construct a final standards-based 3D geospatial service location model, one that should address the needs of the partner carrier.
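
    As one concrete illustration of the location-displacement issue, a data-quality gate can flag assets whose drawn position strays too far from a surveyed position; a minimal Python sketch with hypothetical coordinates and tolerance:

        import math

        # Data-quality gate: flag assets whose drawn (CAD) position is displaced
        # from the surveyed position by more than a tolerance. Coordinates are in
        # a projected system (metres); all values here are hypothetical.
        TOLERANCE_M = 1.5

        assets = [
            # (asset_id, drawn_xy, surveyed_xy)
            ("pole-0017", (482_100.2, 5_455_320.8), (482_100.6, 5_455_321.1)),
            ("pole-0018", (482_140.0, 5_455_290.0), (482_147.5, 5_455_288.0)),
        ]

        for asset_id, (dx, dy), (sx, sy) in assets:
            displacement = math.hypot(dx - sx, dy - sy)
            status = "OK" if displacement <= TOLERANCE_M else "REVIEW"
            print(f"{asset_id}: {displacement:.2f} m -> {status}")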

    Data Warehouse and Business Intelligence: Comparative Analysis of Olap tools

    Data warehouse applications are designed primarily to provide business communities with accurate and consolidated information. The objective of data warehousing is not just to collect data and report on it, but to analyze it, which requires the right technical and business expertise and tools. Achieving business intelligence therefore depends on selecting the proper tools. The most commonly used business intelligence (BI) technologies are Online Analytical Processing (OLAP) and reporting tools, used to analyze data, to make tactical decisions that improve the organization's performance, and moreover to provide quick access to end users' requests. This study reviews the data warehouse environment and architecture, business intelligence concepts, OLAP, and the related theory. Alongside the concepts of data warehousing and OLAP, the study also presents a comparative analysis of the OLAP tools commonly used in organizations.
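
    The OLAP operations such a comparison exercises, roll-up and drill-down among them, can be shown on a toy cube; a minimal Python sketch with hypothetical fact rows:

        from collections import defaultdict

        # Tiny OLAP-style roll-up: aggregate a fact table along chosen dimensions.
        # Fact rows and dimension names are hypothetical.
        facts = [
            {"region": "East", "quarter": "Q1", "product": "A", "sales": 100},
            {"region": "East", "quarter": "Q1", "product": "B", "sales": 150},
            {"region": "West", "quarter": "Q1", "product": "A", "sales": 90},
            {"region": "East", "quarter": "Q2", "product": "A", "sales": 120},
        ]

        def rollup(rows, dims):
            """Sum the sales measure over every combination of the kept dimensions."""
            cube = defaultdict(int)
            for row in rows:
                cube[tuple(row[d] for d in dims)] += row["sales"]
            return dict(cube)

        print(rollup(facts, ("region",)))            # roll up to region totals
        print(rollup(facts, ("region", "quarter")))  # drill down one level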

    Pragmatic development of service based real-time change data capture

    This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. Little data exists on the performance of CDC architectures in a real-time environment, yet such data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results of our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
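
    A minimal Python sketch of the capture-latency measurement around a push-style hook follows; it shows the definition in code but is not the thesis's TAAR implementation, and all names are hypothetical:

        import queue
        import sqlite3
        import time

        # Push-style CDC sketch: the data-access layer applies a transaction and
        # immediately publishes the change event, timestamped so capture latency
        # (capture time minus commit time) can be measured. Hypothetical names;
        # this is not the thesis's TAAR implementation.
        change_log = queue.Queue()  # holds change events as dicts

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER)")

        def das_insert_order(order_id: int, qty: int) -> None:
            db.execute("INSERT INTO orders VALUES (?, ?)", (order_id, qty))
            db.commit()
            # Push the change event as part of the data-access call itself.
            change_log.put({"op": "INSERT", "table": "orders",
                            "key": order_id, "committed_at": time.monotonic()})

        das_insert_order(1, 5)
        event = change_log.get()  # the "capture" step
        latency = time.monotonic() - event["committed_at"]
        print(f"capture latency: {latency * 1e6:.1f} microseconds")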