
    Data virtualization design model for near real time decision making in business intelligence environment

The main purpose of Business Intelligence (BI) is to support an organization's strategic, tactical, and operational decisions by providing comprehensive, accurate, and timely data to decision makers. A data warehouse (DW), which serves as the input for decision-making activities, is created through a complex process known as Extract, Transform and Load (ETL). ETL runs at pre-defined times and needs time to process and transfer data, so providing near real-time information to support data integration for decision making remains a known problem. Inaccessibility of near real-time information can be overcome with Data Virtualization (DV), which provides a unified, abstracted, near real-time, and encapsulated view of information for querying. Nevertheless, there is currently a lack of studies on BI models for developing and managing data virtually in a way that fulfils organizational needs. The main aim of this study is therefore to propose a DV model for near real-time decision making in a BI environment. Design science research methodology was adopted to accomplish the research objectives. As a result, a model called the Data Virtualization Development Model (DVDeM) is proposed that addresses the phases and components affecting the BI environment. To validate the model, expert reviews and focus group discussions were conducted. A prototype based on the proposed model was also developed and implemented in two case studies, and an instrument was developed to measure the usability of the prototype in providing near real-time data. In total, 60 participants were involved, and 93% of them agreed that the DVDeM-based prototype was able to provide near real-time data for supporting the decision-making process. The findings also showed that the majority of participants (more than 90%) in both the education and business sectors affirmed the workability of DVDeM and the usability of the prototype, in particular its ability to deliver near real-time decision-making data. The findings further indicate theoretical and practical contributions for developers building efficient BI applications using the DV technique. The mean values for each measurement item were greater than 4, indicating that respondents agreed with each statement, and the mean scores for the overall usability attributes of the DVDeM design model fell under "High" or "Fairly High". The results therefore give sufficient indication that a system developed by adopting the DVDeM model is perceived by the majority of respondents as highly usable and able to support near real-time decision making.
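
The core idea of DV described above, querying live sources on demand instead of staging them through scheduled ETL, can be illustrated with a minimal sketch. This is not the DVDeM model itself; the source names, schemas, and merge key are hypothetical illustrations of query-time integration.

```python
# Minimal sketch of the data-virtualization idea: a virtual view pulls from
# each source at query time, so consumers see near real-time data instead of
# the state captured by the last scheduled ETL run.

from datetime import datetime, timezone

def query_crm():
    # Stand-in for a live operational source (hypothetical CRM database).
    return [{"customer": "ACME", "orders_today": 12}]

def query_erp():
    # Stand-in for a second heterogeneous source (hypothetical ERP system).
    return [{"customer": "ACME", "open_invoices": 3}]

class VirtualView:
    """Unified, encapsulated view over several sources, resolved on demand."""

    def __init__(self, sources):
        self.sources = sources

    def fetch(self):
        # Query every source *now* rather than at a pre-defined ETL time.
        merged = {}
        for source in self.sources:
            for row in source():
                merged.setdefault(row["customer"], {}).update(row)
        for row in merged.values():
            row["as_of"] = datetime.now(timezone.utc).isoformat()
        return list(merged.values())

view = VirtualView([query_crm, query_erp])
print(view.fetch())  # near real-time snapshot, no staging area involved
```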

    DATA WAREHOUSE AND BUSINESS INTELLIGENCE STRATEGIES AND TRENDS

In recent decades, following the evolution of information technology, decision support systems have played an important role by presenting the information produced by operational systems. Through continuing improvement of methods, together with technological advances, the applicability of decision support systems has become generalized and has reached the status of complex business intelligence systems. Business Intelligence is about creating intelligence about a business based on a cyclic flow of capturing, analyzing, planning, and implementation, resulting in a streamlined organization.
Keywords: Decisions, DSS, Data Driven, Business Intelligence, Data Warehouse

    Collaboration and Virtualization in Large Information Systems Projects

A project evolves through different phases, from idea and conception to experimentation, implementation, and maintenance. Globalization, the Internet, the Web, and mobile computing have changed many human activities, including the realization of Information System (IS) projects. Projects are growing, teams are geographically distributed, and users are heterogeneous. Realizing large Information Technology (IT) projects therefore requires collaborative technologies. The distribution of the team, the heterogeneity of the users, and the complexity of the project drive virtualization. This paper is an overview of these aspects for large IT projects. It briefly presents a general framework developed by the authors for collaborative systems in general, adapted to collaborative project management. The general considerations are illustrated by the case of a large IT project in which the authors were involved.
Keywords: large IT projects, collaborative systems, virtualization, framework for collaborative virtual systems

    Business Intelligence in the Cloud?

Business Intelligence (BI) deals with integrated approaches to management support. In many cases, the integrated infrastructures underlying BI have become complex, costly, and inflexible. A possible remedy for these issues may arise with "Cloud Computing" concepts, which promise new options for net-based sourcing of hardware and software. Currently, there is still a dearth of concepts for defining, designing, and structuring a possible adaptation of Cloud Computing to the domain of BI. This contribution combines results from the outsourcing and BI literature and derives a framework for delineating "Cloud BI" approaches. This is the basis for a discussion of six possible scenarios, some of which are within immediate reach today.

    Implementation of a data virtualization layer applied to insurance data

This work focuses on the introduction of a data virtualization layer that reads and consolidates data from heterogeneous sources (a Hadoop system, a data mart, and a data warehouse) and provides a single point of data access for all data consumers.
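
A single point of access over heterogeneous backends, as in this insurance use case, amounts to a routing layer that maps logical entities to physical sources and consolidates the results. A minimal sketch follows; the adapter classes and the "claims"/"policies" entities are hypothetical, not the paper's actual catalog.

```python
# Sketch of a data virtualization layer: consumers ask for logical entities
# and never see which physical backend (Hadoop, data mart, warehouse) serves
# them. All backends here are stubs for illustration.

class HadoopAdapter:
    def read(self, entity):
        return [f"{entity} row from Hadoop"]

class DataMartAdapter:
    def read(self, entity):
        return [f"{entity} row from the data mart"]

class WarehouseAdapter:
    def read(self, entity):
        return [f"{entity} row from the data warehouse"]

class VirtualizationLayer:
    """Routes logical entities to physical sources and consolidates rows."""

    def __init__(self):
        # Logical-to-physical mapping kept in one place (the layer's catalog).
        self.catalog = {
            "claims": [HadoopAdapter(), WarehouseAdapter()],
            "policies": [DataMartAdapter()],
        }

    def read(self, entity):
        rows = []
        for backend in self.catalog[entity]:
            rows.extend(backend.read(entity))
        return rows

layer = VirtualizationLayer()
print(layer.read("claims"))  # consolidated view, one access point
```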

    Ubiquitous Computing – an Application Domain for Business Intelligence in the Cloud?

A number of IT providers have introduced web-based services for management support that are discussed under the label "Business Intelligence (BI) in the Cloud". It has been argued that these Cloud products might become valuable complements to on-premise enterprise BI infrastructures by allowing a flexible addition of sizeable components, tools, or, in selected areas, complete solutions. This publication discusses to what extent a Ubiquitous Computing setting based on technologies like radio frequency identification (RFID) or sensor technology could become a relevant application domain for "Cloud BI". The main insights come from a literature review, a series of expert interviews on BI and Cloud Computing, and a case on spare parts logistics. The results indicate that the addressed domain indeed comes with business potential and highlight the need for further design-oriented research.

    A Proposed Architecture for Big Data Driven Supply Chain Analytics

Advances in information and communication technology (ICT) have given rise to an explosion of data in every field of operations. Working with this enormous volume of data (popularly known as Big Data) to extract useful information for decision making is one of the sources of competitive advantage for organizations today. Enterprises are leveraging the power of analytics in formulating business strategy in every facet of their operations to mitigate business risk. The volatile global market scenario has compelled organizations to redefine their supply chain management (SCM). In this paper, we delineate the relevance of Big Data and its importance in managing end-to-end supply chains for achieving business excellence. A Big Data-centric architecture for SCM is proposed that exploits the current state-of-the-art technology of data management, analytics, and visualization. The security and privacy requirements of a Big Data system are also highlighted, and several mechanisms are discussed for implementing these features in a real-world Big Data system deployment in the context of SCM. Some future scope of work is also pointed out.
Keywords: Big Data, Analytics, Cloud, Architecture, Protocols, Supply Chain Management, Security, Privacy
Comment: 24 pages, 4 figures, 3 tables
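
One common privacy building block in such a deployment is pseudonymizing identifying fields before supply-chain events enter the analytics store. The sketch below is a generic illustration of that idea, not the specific mechanism proposed in the paper; the key, field names, and event schema are assumptions.

```python
# Generic field-level pseudonymization for supply-chain events: a keyed hash
# keeps values joinable across datasets while hiding the raw identity.
# SECRET_KEY and the "sensitive" field set are illustrative assumptions.

import hashlib
import hmac

SECRET_KEY = b"rotate-me-in-production"

def pseudonymize(value: str) -> str:
    # Keyed hash: stable for joins, irreversible without the key.
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def sanitize_event(event: dict) -> dict:
    sensitive = {"supplier_id", "driver_name"}
    return {
        k: (pseudonymize(str(v)) if k in sensitive else v)
        for k, v in event.items()
    }

event = {"supplier_id": "SUP-001", "driver_name": "J. Doe",
         "shipment": "SHP-42", "delay_hours": 6}
print(sanitize_event(event))  # analytics keeps joinability, not identity
```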

On-site customer analytics and reporting (OSCAR): a portable clinical data warehouse for the in-house linking of hospital and telehealth data

This document conveys the results of the On-Site Customer Analytics and Reporting (OSCAR) project. This nine-month project started in January 2014 and was conducted at Philips Research in the Chronic Disease Management group as part of the H2H Analytics project. Philips has access to telehealth data from its Philips Motiva tele-monitoring and other services. Previous projects within Philips Research provided a data warehouse for Motiva data and a proof-of-concept (DACTyL) solution that demonstrated the linking of hospital and Motiva data and subsequent reporting. Severe limitations of the DACTyL solution led to the initiation of OSCAR. A very important one was the unwillingness of hospitals to share personal patient data outside their premises due to stringent privacy policies, even though personal patient data is required to link hospital data with Motiva data. Equally important is the fact that DACTyL considered only Motiva as a telehealth source and only a single input interface for the hospitals. OSCAR was initiated to propose a suitable architecture and develop a prototype solution, in contrast to the proof-of-concept DACTyL, with the twofold aim of overcoming the limitations of DACTyL so that the solution could be deployed in a real-life hospital environment, and of expanding the scope to an extensible solution that can serve multiple telehealth services and multiple hospital environments in the future. In the course of the project, a software solution was designed and deployed in the form of a virtual machine. The solution implements a data warehouse that links and hosts the collected hospital and telehealth data. Hospital data are collected through a modular, service-oriented data collection component that exposes web services described in WSDL and accepts configurable XML data messages. ETL processes propagate, link, and load the data into the OSCAR data warehouse. Automated reporting is achieved using dashboards that provide insight into the data stored in the data warehouse. Furthermore, the linked data are available for export to Philips Research in de-identified format.
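
The ingest-link-de-identify flow described above can be sketched compactly: parse an incoming XML message, link it to a telehealth record on a shared patient identifier, and replace that identifier with a salted hash before export. The message schema, linking key, and salt below are illustrative assumptions, not OSCAR's actual format.

```python
# Minimal sketch of OSCAR-style processing: XML intake, record linkage, then
# de-identification of the linking key before the record leaves the hospital.

import hashlib
import xml.etree.ElementTree as ET

SALT = b"site-specific-salt"  # assumption: per-site salt for pseudonyms

def parse_hospital_message(xml_text: str) -> dict:
    # Flatten a simple XML record into a field dictionary.
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

def link_and_deidentify(hospital: dict, telehealth: dict) -> dict:
    # Link on the shared patient identifier, then replace it with a salted
    # hash so the exported record is de-identified but still linkable.
    assert hospital["patient_id"] == telehealth["patient_id"]
    pseudo = hashlib.sha256(SALT + hospital["patient_id"].encode()).hexdigest()
    return {**hospital, **telehealth, "patient_id": pseudo[:12]}

msg = "<record><patient_id>12345</patient_id><bp>130/85</bp></record>"
tele = {"patient_id": "12345", "weight_kg": "82.4"}
print(link_and_deidentify(parse_hospital_message(msg), tele))
```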

    Semantic processing of EHR data for clinical research

There is a growing need to semantically process and integrate clinical data from different sources for clinical research. This paper presents an approach to integrate EHRs from heterogeneous resources and generate integrated data in different data formats or semantics to support various clinical research applications. The proposed approach builds semantic data virtualization layers on top of data sources, which generate data in the requested semantics or formats on demand. This avoids upfront dumping and synchronization of the data in various representations. Data from different EHR systems are first mapped to RDF data with source semantics, and then converted to representations with harmonized domain semantics, where domain ontologies and terminologies are used to improve reusability. It is also possible to further convert data to application semantics and store the converted results in clinical research databases, e.g. i2b2 or OMOP, to support different clinical research settings. Semantic conversions between the different representations are explicitly expressed using N3 rules and executed by an N3 reasoner (EYE), which can also generate proofs of the conversion processes. The solution presented in this paper has been applied to real-world applications that process large-scale EHR data.
Comment: Accepted for publication in the Journal of Biomedical Informatics, 2015; preprint version
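
The first step of the approach, mapping an EHR record to RDF with source semantics, can be sketched with rdflib (assumed installed). The namespace and field names are illustrative, and the paper's N3 conversion rules and the EYE reasoner are not reproduced here; this only shows the shape of the source-semantics triples.

```python
# Map a hypothetical EHR record to RDF triples under a source vocabulary,
# ready for later rule-based conversion to harmonized domain semantics.

from rdflib import RDF, Graph, Literal, Namespace, URIRef

SRC = Namespace("http://example.org/ehr-source#")  # hypothetical source vocab

record = {"id": "pat-001", "systolic_bp": 130, "loinc": "8480-6"}

g = Graph()
obs = URIRef(f"http://example.org/obs/{record['id']}-bp")
g.add((obs, RDF.type, SRC.Observation))
g.add((obs, SRC.patient, URIRef(f"http://example.org/patient/{record['id']}")))
g.add((obs, SRC.code, Literal(record["loinc"])))
g.add((obs, SRC.value, Literal(record["systolic_bp"])))

# Source-semantics triples; a later step (e.g. N3 rules) would rewrite them
# into harmonized domain semantics.
print(g.serialize(format="turtle"))
```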