20 research outputs found

    Business Intelligence Technology, Applications, and Trends

    Enterprises are considering substantial investments in Business Intelligence (BI) theories and technologies to maintain their competitive advantage. BI allows massive, diverse data collected from various sources to be transformed into useful information, enabling more effective and efficient production. This paper briefly and broadly explores business intelligence technology, applications, and trends, while offering a few stimulating and innovative theories and practices. The authors also explore several contemporary studies related to the future of BI and surrounding fields.

    Impact of service-oriented architectures (SOA) on business process standardization - Proposing a research model

    Originally, Data Warehouses (DWH) were conceived as components for the data support of controlling and management. From early on, this brought along the need to cope with extensive data preparation, integration, and distribution requirements. In the growing infrastructures for managerial support ("Business Intelligence"), the DWH turned into a central data hub for decision support. As the business environment and the underlying technical infrastructures foster an ever-increasing degree of systems integration, the DWH has been recognized as a pivotal component for all sorts of data transformation and data integration operations. Nowadays, the DWH is supposed to process both managerial and operational data: it becomes a transformation hub (TH). This article delineates the relevant motives that drive the trend towards THs and the resulting requirements. The logical composition of a TH is developed based on data transformation steps. Two case studies exemplify the application of the resulting architecture.
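    As an illustration of the transformation-hub idea, the following sketch chains data transformation steps and distributes the result to both managerial and operational consumers. All names (such as TransformationHub) are hypothetical and not taken from the article; this is a minimal sketch of the concept, not the architecture it develops.

```python
# Hypothetical sketch of a transformation hub (TH): ordered
# transformation steps applied once, results fanned out to
# both analytical and operational consumers.
from typing import Callable, Dict, List

Record = Dict[str, object]
Step = Callable[[Record], Record]

class TransformationHub:
    def __init__(self, steps: List[Step]):
        self.steps = steps            # ordered data transformation steps
        self.consumers: List[Callable[[Record], None]] = []

    def subscribe(self, consumer: Callable[[Record], None]) -> None:
        self.consumers.append(consumer)   # warehouse loads, operational apps, ...

    def process(self, record: Record) -> None:
        for step in self.steps:           # prepare, integrate, cleanse, ...
            record = step(record)
        for consumer in self.consumers:   # distribute to every target
            consumer(record)

hub = TransformationHub(steps=[
    lambda r: {**r, "amount": float(r["amount"])},         # type harmonization
    lambda r: {**r, "currency": r.get("currency", "EUR")}  # default enrichment
])
hub.subscribe(lambda r: print("to warehouse:", r))
hub.subscribe(lambda r: print("to operational app:", r))
hub.process({"amount": "12.5"})
```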

    Pragmatic development of service based real-time change data capture

    This thesis contributes to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. There is little data on the performance of CDC architectures in a real-time environment, yet such data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results of our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
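    A minimal sketch of the push-CDC idea described above, using hypothetical names (DataAccessService, ChangeEvent) rather than the thesis's actual TAAR implementation: the service that executes an OLTP transaction also publishes the change event, so capture latency can be measured as the gap between commit time and capture time. In a pull design, by contrast, the warehouse side would poll the OLTP database for changes, which is where the concurrency cost observed in the evaluation arises.

```python
# Hypothetical push-CDC sketch: the data access service both applies
# a transaction and pushes the change, and capture latency is measured
# as capture time minus commit time.
import time
import queue
from dataclasses import dataclass

@dataclass
class ChangeEvent:
    table: str
    row: dict
    commit_time: float  # when the OLTP transaction committed

class DataAccessService:
    """Decouples CDC from the application: callers submit transactions
    here, and the service both applies them and pushes change events."""
    def __init__(self):
        self.oltp_store = {}              # stand-in for the OLTP database
        self.change_feed = queue.Queue()  # stand-in for the warehouse feed

    def execute(self, table, key, row):
        self.oltp_store[(table, key)] = row  # "commit" the transaction
        commit_time = time.monotonic()
        # Push CDC: the change is published as part of the same call,
        # so no polling of the OLTP database is needed.
        self.change_feed.put(ChangeEvent(table, row, commit_time))

das = DataAccessService()
das.execute("orders", 42, {"qty": 3})
event = das.change_feed.get()
latency = time.monotonic() - event.commit_time
print(f"capture latency: {latency * 1e6:.1f} microseconds")
```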

    Making the Case for a Business Intelligence Framework

    This research develops evidence on whether large organizations should invest substantial time and resources in building Business Intelligence frameworks by examining Project Managers' perceptions of complex information systems. Project Managers in a large organization fill a cross-functional reporting role that requires them to delve into information technology systems in complex ways when querying for simple metrics related to the projects they manage. Using an online survey, this study found that project managers' perceptions of IT systems, automatic queries, web-based queries, and business intelligence dashboards became more positive if a business intelligence framework was not already in place and if the managers were less experienced. More experienced project managers had lower perceptions of current IT systems, automatic queries, web-based queries, and dashboards. There is evidence to suggest that business intelligence frameworks will be positively perceived by project managers with less experience and in organizations where such systems have not already been introduced.

    Information technology architecture and related strategic factors supporting business advantage

    Information Technology (IT) architecture is not restricted to technology; it may also address the views of business activities, their processes, data sets and information flows, applications and software, and technology. The objective of this study is to understand the role of IT architecture and the related factors that support competitive business advantage. The study investigates the hypothesis that IT architecture enhances the competitive advantage of business, and sets out to explore the IT architecture and strategic factors that support business advantage. The findings indicate that business advantage is supported by a sound architecture, by IT and business alignment, and by the enablers of organisations.

    Investigating the application of real-time business intelligence and facilitating its justification through a proposed conceptual model

    Although real-time Business Intelligence (BI) environments overcome the setbacks of traditional BI and offer a host of value-adding benefits to organizations, their implementation is said to have been hampered by their technological complexity, by the changes they require to the business environment, and by the high costs of putting them in place. In addition, the justification of IT investments remains a common problem, as they provide many intangible benefits that are incompatible with traditional (financial) IT benefit measurement models. For this reason, this research set out to investigate and understand the technological components and organizational changes surrounding real-time BI in order to shed light on these issues. The study also aimed to further the understanding of how real-time BI can be justified as a prudent investment.

    Scaling up Mixed Workloads: a Battle of Data Freshness, Flexibility, and Scheduling

    The common "one size does not fit all" paradigm isolates transactional and analytical workloads in separate, specialized database systems, with operational data periodically replicated to a data warehouse for analytics. The competitiveness of enterprises today, however, depends on real-time reporting over operational data, necessitating the integration of transactional and analytical processing in a single database system. The mixed workload should be able to query and modify common data in a shared schema, and the database must provide performance guarantees for transactional workloads while efficiently evaluating complex analytical queries. In this paper, we share our analysis of the performance of two main-memory databases that support mixed workloads, SAP HANA and HyPer, evaluated with the mixed-workload CH-benCHmark. By examining their similarities and differences, we identify the factors that affect performance as the number of concurrent transactional and analytical clients scales: (a) data freshness, i.e., how recent the data processed by analytical queries is; (b) flexibility, i.e., restricting transactional features in order to increase optimization choices and enhance performance; and (c) scheduling, i.e., how the mixed workload utilizes resources. Specifically for scheduling, we show that the absence of workload management under high concurrency lets analytical workloads overwhelm the system and severely hurt the performance of transactional workloads.
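    To illustrate the scheduling factor, here is a minimal, hypothetical sketch of workload management via admission control: analytical queries must acquire one of a bounded number of slots, so under high concurrency they queue instead of starving transactional work of resources. This is not SAP HANA's or HyPer's actual mechanism, merely one simple way to cap analytical concurrency.

```python
# Hypothetical admission-control sketch: cap concurrent analytical
# queries so a flood of analytics cannot overwhelm transactional work.
import threading

class MixedWorkloadScheduler:
    def __init__(self, max_analytical=4):
        # Analytical queries must acquire a slot; transactions never wait.
        self._analytical_slots = threading.BoundedSemaphore(max_analytical)

    def run_transaction(self, txn):
        return txn()  # short OLTP work is always admitted immediately

    def run_analytical(self, query):
        with self._analytical_slots:  # blocks when the cap is reached
            return query()

scheduler = MixedWorkloadScheduler(max_analytical=2)
print(scheduler.run_transaction(lambda: "order committed"))
print(scheduler.run_analytical(lambda: "report computed"))
```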