
    In-memory business intelligence: a Wits context

    The organisational demand for real-time, flexible and cheaper approaches to Business Intelligence is reshaping the Business Intelligence ecosystem. In-memory databases, in-memory analytics, the availability of 64-bit computing power, and the falling cost of memory are the technologies enabling this demand to be met. This research report examines whether these technologies will have an evolutionary or a revolutionary impact on traditional Business Intelligence implementations. An in-memory analytic solution was developed for the University of the Witwatersrand Procurement Office to evaluate the benefits claimed for the in-memory approach to Business Intelligence in the development, reporting and analysis processes. A survey was used to collect data on users' experience with the in-memory solution. The results indicate that the in-memory solution offers a fast, flexible and visually rich user experience. However, certain key steps of the traditional BI approach cannot be omitted. The conclusion reached is that the in-memory approach to Business Intelligence can co-exist with the traditional approach, so that the merits of both can be leveraged to enhance value for an organisation.

    DACTyL: towards providing the missing link between clinical and telehealth data

    This document conveys the findings of the Data Analytics, Clinical, Telehealth, Link (DACTyL) project. This nine-month project started in January 2013 and was conducted at Philips Research in the Care Management Solution group as part of the Data Analysis for Home Healthcare (DA4HH) project. The DA4HH charter is to perform and support retrospective analyses of data from Home Healthcare products, such as Motiva telehealth. These studies provide valid insights into the actual clinical aspects, usage and behavior of installed products and services. The insights will help to improve service offerings, create clinical algorithms for better outcomes, and validate and substantiate claims of efficacy and cost-effectiveness. The DACTyL project aims at developing and implementing an architecture and infrastructure to meet the most pressing need of Motiva telehealth customers: insight into return on investment (ROI). These customers are hospitals that offer Motiva telehealth to their patients. In order to provide the Motiva service cost-effectively, they need insight into the actual cost, benefit and resource utilization of a Motiva deployment compared to their usual routine care. Additional stakeholders for these ROI-related data are Motiva customer consultants and Philips research scientists, who use them to strengthen their messaging and service delivery and so arrive at better patient care.

    A case study of business intelligence applications for business users

    This research is conducted in two parts, with the first part reviewing the standard industry approach to providing organisations with business intelligence (BI) architecture. The discussion begins with a brief history of the evolution of data warehouse and business intelligence (DW/BI) systems. The generic approach to developing a DW/BI system is described, and the interfaces and features of BI applications are explored in terms of how they support the various user roles within an organisation, e.g. executive, business user and business analyst. The discussion is presented with reference to the Zachman Framework. The second part of the research focuses on a case study examining an organisation's implementation of a bespoke BI solution which supports its business managers with decision support, reporting and analysis. Where today's business intelligence is about giving business users the tools to get the information they need out of the data warehouse, thus reducing the reliance on IT departments, the bespoke solution studied places the reliance on IT staff to support the business intelligence requirements. The BI requirements are compared and contrasted against the features of third-party BI tools to reach a conclusion as to whether those tools support the reporting needs of the planning group in the case study, or whether the group's needs are so specific that a bespoke solution is the best option, in which case reliance on IT departments remains necessary to support the delivery of business intelligence. The first part of the research finds that for the successful development of BI applications, the BI user's needs should be addressed from the requirements stage, and the development of BI applications should run as a parallel activity alongside the data warehouse development activities. The BI applications should be developed by BI developers who have knowledge of the business, rather than by technical IT staff.
    This view is supported by leading DW/BI authors such as Kimball et al. (2008). The research also found that the needs of BI application users can be analysed by grouping them into one of five classifications of user - Tourists, Farmers, Explorers, Miners and Operators - and that different user interfaces are needed to support their needs. The case study in the second part of the research found that the implementation of the DW/BI system in SAP using SAP BEx software fails to provide planning staff with BI applications that meet all their reporting and analysis needs, and has therefore led to the development of bespoke applications. The findings suggest that this may be because the planning staff were not involved at the scoping and planning stage of developing the DW/BI system. The investigation found that most of the features in the bespoke BI system could be developed using a third-party solution and that they are available in the SAP family of products. The level of expertise needed to develop the features ranged from easy to technical. A third-party tool could be adopted to develop the reports by the BI application developers identified by Kimball et al. (2008), providing the planning managers with an intuitive and flexible user interface that can be easily customised and maintained. It was also found that SAP BusinessObjects Crystal Reports provides a rich, easy-to-use interface that supports most of the BI features.

    Extraction transformation load (ETL) solution for data integration: a case study of rubber import and export information

    Data integration is important in consolidating all the data inside and outside the organization to provide a unified view of the organization's information. An Extraction Transformation Load (ETL) solution is the back-end process of data integration, which involves collecting data from various data sources, preparing and transforming the data according to business requirements, and loading it into a Data Warehouse (DW). This paper explains the integration of rubber import and export data between the Malaysian Rubber Board (MRB) and the Royal Malaysian Customs Department (Customs) using an ETL solution. Microsoft SQL Server Integration Services (SSIS) and Microsoft SQL Server Agent Jobs were used as the ETL tool and the ETL scheduler, respectively.
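    The abstract does not show the SSIS packages themselves; as a rough illustration of the extract-transform-load pattern it describes, the sketch below consolidates rows from two hypothetical feeds into one warehouse table. All field names, grades and figures are invented, and Python with sqlite3 stands in for SSIS and the real warehouse:

```python
import sqlite3

# Hypothetical source rows as they might arrive from the two agencies
# (field names and values invented for illustration).
mrb_rows = [
    {"grade": "SMR 20", "tonnes": "120.5", "flow": "export"},
    {"grade": "SMR 10", "tonnes": "80.0", "flow": "import"},
]
customs_rows = [
    {"GRADE": "smr 20", "WEIGHT_KG": "95000", "DIRECTION": "E"},
]

def transform_mrb(r):
    # MRB already reports in tonnes; normalise the text fields.
    return (r["grade"].upper(), float(r["tonnes"]), r["flow"].upper()[0])

def transform_customs(r):
    # Customs reports kilograms; convert to tonnes for a unified view.
    return (r["GRADE"].upper(), float(r["WEIGHT_KG"]) / 1000.0, r["DIRECTION"])

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE rubber_trade (grade TEXT, tonnes REAL, flow TEXT)")

# Load: both sources land in one consolidated warehouse table.
con.executemany(
    "INSERT INTO rubber_trade VALUES (?, ?, ?)",
    [transform_mrb(r) for r in mrb_rows]
    + [transform_customs(r) for r in customs_rows],
)

total = con.execute(
    "SELECT SUM(tonnes) FROM rubber_trade WHERE grade = 'SMR 20' AND flow = 'E'"
).fetchone()[0]
print(total)  # 120.5 + 95.0 = 215.5
```

    In the actual solution this unification would be an SSIS data flow run on a SQL Server Agent Jobs schedule; the sketch mirrors only the unit-conversion and consolidation idea.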

    Data Warehouse and Business Intelligence: Comparative Analysis of OLAP Tools

    Data Warehouse applications are designed primarily to provide business communities with accurate and consolidated information. The objective of data warehousing applications is not just collecting data and reporting but analysing it, which requires both technical and business expertise. Achieving business intelligence requires the proper tools to be selected. The most commonly used Business Intelligence (BI) technologies are Online Analytical Processing (OLAP) and reporting tools, which are used to analyse data, to make tactical decisions for the better performance of the organization, and moreover to provide quick and fast access to end-user requests. This study reviews the data warehouse environment and architecture, business intelligence concepts, OLAP and the related theories involved. In addition, it presents a comparative analysis of the OLAP tools commonly used in organizations.
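    The OLAP tools such studies compare revolve around aggregating a measure along dimensions and then rolling up. A minimal sketch of that idea, with invented regions and figures and no particular OLAP tool assumed:

```python
from collections import defaultdict

# Hypothetical fact rows: (region, quarter, amount) — all values invented.
facts = [
    ("North", "Q1", 100.0), ("North", "Q2", 150.0),
    ("South", "Q1", 80.0), ("South", "Q2", 120.0),
]

# Detail level of the cube: the measure aggregated by both dimensions.
by_region_quarter = defaultdict(float)
for region, quarter, amount in facts:
    by_region_quarter[(region, quarter)] += amount

# OLAP roll-up: collapse the quarter dimension to get totals per region.
by_region = defaultdict(float)
for (region, _quarter), amount in by_region_quarter.items():
    by_region[region] += amount

print(by_region_quarter[("North", "Q1")])  # 100.0
print(by_region["South"])                  # 200.0
```

    Real OLAP servers precompute and index such aggregates along many dimensions at once; the sketch shows only the drill-down/roll-up relationship between the two levels.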

    Design and Implementation of an Enterprise Data Warehouse

    The reporting and sharing of information has been synonymous with databases as long as there have been systems to host them. Now more than ever, users expect information to be shared in an immediate, efficient, and secure manner. However, due to the sheer number of databases within the enterprise, getting the data in an effective fashion requires a coordinated effort between the existing systems. There is a very real need today for a single location for the storage and sharing of data that users can easily utilize to make improved business decisions, rather than trying to traverse the multiple databases that exist today; an enterprise data warehouse provides exactly that. The thesis describes data warehousing techniques, design, expectations, and challenges regarding cleansing and transforming existing data, as well as other challenges associated with extracting from transactional databases. It also includes a technical piece discussing database requirements and the technologies used to create and refresh the data warehouse, and discusses how data from databases and other data warehouses could be integrated, including the system architecture by which data from the databases and warehouses of different departments could be combined. In addition, there is discussion of specific data marts within the warehouse that satisfy specific needs. Finally, there are explanations of how users will consume the data in the enterprise data warehouse, such as through reporting and other business intelligence. An Enterprise Data Warehouse prototype was developed to show how a pair of different databases undergoes the Extract, Transform and Load (ETL) process and is loaded into an actual set of star schemas, which then makes reporting easier.
    Separately, an important piece of this thesis takes an actual example of data and compares performance by running the same queries against two separate databases, one transactional and one a data warehouse. As the queries grow in difficulty, the gap between the recorded execution times in the two environments widens.
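    The kind of comparison described can be sketched in miniature: the same business question answered once against a normalised transactional schema and once against a star schema. Table names and data below are invented, and sqlite3 stands in for the two real database environments, so this illustrates only the query shapes, not the measured performance gap:

```python
import sqlite3

con = sqlite3.connect(":memory:")

con.executescript("""
-- Transactional (normalised) side: order headers and lines.
CREATE TABLE orders (order_id INTEGER, customer TEXT, order_date TEXT);
CREATE TABLE order_lines (order_id INTEGER, product TEXT, amount REAL);

-- Warehouse side: a fact table in a star schema, joined at query
-- time only to small dimension tables.
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product TEXT);
CREATE TABLE fact_sales (product_key INTEGER, order_date TEXT, amount REAL);
""")

con.executemany("INSERT INTO orders VALUES (?,?,?)",
                [(1, "ACME", "2024-01-05"), (2, "Globex", "2024-01-09")])
con.executemany("INSERT INTO order_lines VALUES (?,?,?)",
                [(1, "Widget", 10.0), (1, "Gadget", 5.0), (2, "Widget", 7.5)])
con.executemany("INSERT INTO dim_product VALUES (?,?)",
                [(1, "Widget"), (2, "Gadget")])
con.executemany("INSERT INTO fact_sales VALUES (?,?,?)",
                [(1, "2024-01-05", 10.0), (2, "2024-01-05", 5.0),
                 (1, "2024-01-09", 7.5)])

# The same business question against both schemas: revenue per product.
oltp = con.execute("""
    SELECT l.product, SUM(l.amount)
    FROM orders o JOIN order_lines l ON o.order_id = l.order_id
    GROUP BY l.product ORDER BY l.product
""").fetchall()

star = con.execute("""
    SELECT d.product, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d ON f.product_key = d.product_key
    GROUP BY d.product ORDER BY d.product
""").fetchall()

print(oltp)  # [('Gadget', 5.0), ('Widget', 17.5)]
print(star)  # the star schema yields the same totals
```

    On realistic volumes the star-schema query touches a wide but simple fact table instead of chaining joins across normalised tables, which is where the widening gap the thesis measures comes from.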

    Business intelligence-centered software as the main driver to migrate from spreadsheet-based analytics

    Internship report presented as a partial requirement for obtaining a Master's degree in Information Management, specialization in Knowledge Management and Business Intelligence. Nowadays, companies are handling and managing data in a way that they weren't ten years ago. The data deluge is, as a consequence, their constant day-to-day challenge: having to create agile and scalable data solutions to tackle this reality. The main trigger of this project was to support the decision-making process of a customer-centered marketing team (called Customer Voice) in Company X by developing a complete, holistic Business Intelligence solution that goes all the way from ETL processes to data visualizations based on that team's business needs. With this context in mind, the focus of the internship was to use BI and ETL techniques to migrate the team's data out of the spreadsheets where they performed their analysis, and to shift the way they see the data towards a more dynamic, sophisticated, and suitable approach that helps them make data-driven strategic decisions. To ensure credibility throughout the development of this project and its subsequent solution, it was necessary to conduct an exhaustive literature review to help me frame the project in a more realistic and logical way. Accordingly, this report draws on scientific literature that explains the evolution of ETL workflows, tools, and limitations across different time periods and generations, how ETL was transformed from manual tasks to real-time data tasks together with data warehouses, the importance of data quality and, finally, the relevance of optimizing ETL processes and of new ways of approaching data integration using modern cloud architectures.

    Graduate entrepreneur analytical reports (GEAR) using data warehouse model: A case study at CEDI, Universiti Utara Malaysia (UUM)

    A Business Intelligence (BI) system using Data Warehouse (DW) technology is one of the important strategic management approaches in organizations today. BI combines architectures, databases, analytical tools, and methodologies to enable interactive information access focused on analytical reports. Analytical reports, which affect the long-term direction of the entire company, are typically used by top managers. Decision making in an organization is very difficult, especially if the organization has poor-quality data and limited information. Management has traditionally depended on past experience and instinct when making decisions, without support from factual information. DW is a technology that enables enterprise data to be integrated and transformed for strategic decision making. An organization responsible for managing entrepreneur activities needs analytical reports for strategic decision making. This paper focuses on how to design and develop the Graduate Entrepreneur Analytical Reports (GEAR) system using a DW model, with the Cooperative and Entrepreneur Development Institute (CEDI), Universiti Utara Malaysia (UUM) as a case study. The system was tested through user feedback collected with the Computer System Usability Questionnaire (CSUQ), which measures satisfaction and usability.

    On-site customer analytics and reporting (OSCAR): a portable clinical data warehouse for the in-house linking of hospital and telehealth data

    This document conveys the results of the On-Site Customer Analytics and Reporting (OSCAR) project. This nine-month project started in January 2014 and was conducted at Philips Research in the Chronic Disease Management group as part of the H2H Analytics project. Philips has access to telehealth data from its Philips Motiva tele-monitoring and other services. Previous projects within Philips Research provided a data warehouse for Motiva data and a proof-of-concept (DACTyL) solution that demonstrated the linking of hospital and Motiva data and subsequent reporting. Severe limitations of the DACTyL solution resulted in the initiation of OSCAR. A very important one was the unwillingness of hospitals to share personal patient data outside their premises due to stringent privacy policies, while at the same time personal patient data is required in order to link the hospital data with the Motiva data. Equally important is the fact that DACTyL considered only Motiva as a telehealth source and only a single input interface for the hospitals. OSCAR was initiated to propose a suitable architecture and to develop a prototype solution, in contrast to the proof-of-concept DACTyL, with the twofold aim of overcoming the limitations of DACTyL so that the solution can be deployed in a real-life hospital environment, and of expanding the scope to an extensible solution that can serve multiple telehealth services and multiple hospital environments in the future. In the course of the project, a software solution was designed and deployed in the form of a virtual machine. The solution implements a data warehouse that links and hosts the collected hospital and telehealth data. Hospital data are collected by a modular, service-oriented data collection component that exposes web services described in WSDL and accepts configurable XML data messages. ETL processes propagate the data, link it, and load it into the OSCAR data warehouse.
    Automated reporting is achieved using dashboards that provide insight into the data stored in the data warehouse. Furthermore, the linked data is available for export to Philips Research in de-identified format.
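    The abstract does not specify how the de-identification is performed; one common approach is keyed pseudonymisation, sketched below with an invented key and field names. The hospital-side key never leaves the premises, yet records for the same patient still link up after export:

```python
import hashlib
import hmac

# A secret that would stay inside the hospital premises (illustrative only).
SITE_KEY = b"hospital-local-secret"

def pseudonymise(patient_id: str) -> str:
    # Keyed hash: a stable pseudonym for linking records across sources,
    # not reversible to the original identifier without the key.
    return hmac.new(SITE_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

# Hypothetical linked record leaving the hospital for research use.
record = {"patient_id": "NL-000123", "systolic_bp": 142, "motiva_sessions": 7}
export = {**record, "patient_id": pseudonymise(record["patient_id"])}

# The same patient always maps to the same pseudonym, so linkage survives.
assert export["patient_id"] == pseudonymise("NL-000123")
print(export["patient_id"])
```

    A production scheme would add key management and handling of quasi-identifiers (dates, postcodes); the sketch covers only the identifier-replacement step.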

    Implementation of business intelligence tools using open source approach

    Discovering business intelligence is the modern organization's way of gaining competitive advantage in its market, supported by Decision Support Systems or Business Intelligence systems. The first step in any decision support system is to create the repository of data from which the system will collect and display any information requested. This repository is the source of all business intelligence, and implementing it requires the right software tools, essential for the data warehouse. Therefore, when choosing the software tool, the project size, budget constraints and risks should be kept in mind; overall, the right choice depends on the organization's needs and ambitions. The essential work done here is to demonstrate that open source software can be an accurate and reliable tool for implementing data warehouse projects. The two ETL solutions used were: • Pentaho Kettle Data Integration Community Edition (open source software) • SQL Server 2005 Integration Services (SSIS) Enterprise Edition (proprietary software). The proprietary, commercial software in question (as well as others) is widely used. However, an open source solution has key features recognized by organizations worldwide, and this work shows the different functionalities and benefits of the open source approach.