659 research outputs found

    On Deriving Net Change Information From Change Logs - The DELTALAYER-Algorithm

    The management of change logs is crucial in many areas of information systems, such as data replication, data warehousing, and process management. One barrier that hampers the (intelligent) use of such change logs is the potentially large amount of unnecessary and redundant data they contain. In particular, change logs often contain information about changes which actually had no effect on the original data source (e.g., due to subsequently applied, overriding change operations). Typically, such inflated logs lead to difficulties with respect to system performance, data quality, or change comparability. To deal with this, we introduce the DeltaLayer algorithm. It takes arbitrary change log information as input and produces a cleaned output that contains only the net change effects; i.e., the produced log contains information only about those changes which actually had an effect on the original source. We formally prove the minimality of our algorithm, and we show how it can be applied in different domains, e.g., the post-processing of differential snapshots in data warehouses or the analysis of conflicting changes in process management systems. Altogether, the ability to purge unnecessary information from change logs provides the basis for more intelligent handling of these logs.
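
    A minimal sketch of the net-change idea follows. This is not the authors' DELTALAYER algorithm itself, only an illustration of cancelling overridden operations in a keyed change log; the Change type and the operation names are assumptions for the example.

        # A sketch of net-change reduction over a keyed change log. Not the
        # DELTALAYER algorithm itself, only the underlying idea of cancelling
        # operations that have no net effect on the source.
        from typing import NamedTuple

        class Change(NamedTuple):
            op: str            # "insert", "update", or "delete"
            key: str           # identifier of the affected record
            value: object = None

        def net_changes(log: list[Change]) -> list[Change]:
            """Collapse a change log so each key keeps only its net effect."""
            state: dict[str, Change] = {}
            for c in log:
                prev = state.get(c.key)
                if prev is None:
                    state[c.key] = c
                elif prev.op == "insert" and c.op == "delete":
                    del state[c.key]   # insert then delete: no net effect
                elif prev.op == "insert" and c.op == "update":
                    state[c.key] = Change("insert", c.key, c.value)
                elif prev.op == "delete" and c.op == "insert":
                    state[c.key] = Change("update", c.key, c.value)
                else:
                    state[c.key] = c   # a later change overrides an earlier one
            return list(state.values())

        log = [
            Change("insert", "a", 1),
            Change("update", "a", 2),  # folds into the insert
            Change("insert", "b", 7),
            Change("delete", "b"),     # cancels the insert entirely
        ]
        print(net_changes(log))        # [Change(op='insert', key='a', value=2)]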

    Pragmatic development of service based real-time change data capture

    This thesis contributes to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. Little data exists on the performance of CDC architectures in a real-time environment, yet such data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition for capture latency, and how to measure it, does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
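
    The thesis defines capture latency as the time between a change being applied to the OLTP database and the CDC mechanism capturing it. The sketch below illustrates that measurement in isolation; the thesis itself embeds the measurement in an extended TPC-C benchmark, and all names here are hypothetical.

        # A sketch of the capture latency measurement: the interval between a
        # change committing in the OLTP database and the CDC mechanism
        # observing it. Both events are simulated here.
        import time

        commit_times: dict[str, float] = {}    # change id -> commit timestamp

        def on_commit(change_id: str) -> None:
            """Record the moment a transaction commits in the OLTP database."""
            commit_times[change_id] = time.monotonic()

        def on_capture(change_id: str) -> float:
            """Return the latency when the CDC mechanism captures the change."""
            return time.monotonic() - commit_times.pop(change_id)

        on_commit("txn-42")
        time.sleep(0.05)                       # stand-in for the CDC pipeline delay
        print(f"capture latency: {on_capture('txn-42') * 1000:.1f} ms")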

    The challenges of extract, transform and load (ETL) for data integration in near real-time environment

    For organizations with considerable investment in data warehousing, the influx of various data types and forms requires suitable ways of preparing data and a staging platform that supports fast, efficient delivery of volatile data to its targeted audiences or users with different business needs. The Extract, Transform and Load (ETL) system has proved to be the standard of choice for managing and sustaining the movement and transactional processing of valued big data assets. However, traditional ETL systems can no longer accommodate or effectively handle streaming or near real-time data in an environment which demands high availability, low latency, and horizontal scalability. This paper identifies the challenges of implementing an ETL system for streaming or near real-time data, where the system must evolve and streamline itself to meet these requirements. Current efforts and solution approaches to address the challenges are presented. A classification of ETL system challenges is prepared based on near real-time environment features and ETL stages, to encourage different perspectives for future research.
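
    One common route toward the near real-time requirements described above is micro-batching: running the extract, transform and load cycle on a short, fixed interval instead of nightly. The sketch below illustrates this pattern only; it is not taken from the paper, and every helper function is a placeholder.

        # A micro-batch loop as one route to near real-time ETL: the extract,
        # transform and load cycle runs on a short, fixed interval.
        import time

        def extract_since(watermark: float) -> list[dict]:
            """Pull source rows changed after the watermark (placeholder)."""
            return []

        def transform(rows: list[dict]) -> list[dict]:
            """Cleanse and conform rows to the warehouse schema (placeholder)."""
            return rows

        def load(rows: list[dict]) -> None:
            """Append the batch to the warehouse target (placeholder)."""

        def run_micro_batches(interval_s: float = 5.0) -> None:
            watermark = time.time()
            while True:                        # loops until interrupted
                batch_start = time.time()
                rows = extract_since(watermark)
                if rows:
                    load(transform(rows))
                watermark = batch_start        # next cycle picks up later changes
                time.sleep(max(0.0, interval_s - (time.time() - batch_start)))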

    Towards a big data reference architecture

    The Modernization Process of a Data Pipeline

    Data plays an integral part in a company’s decision-making. Therefore, decision-makers must have the right data available at the right time. Data volumes grow constantly, and new data is continuously needed for analytical purposes. Many companies use data warehouses to store data in an easy-to-use format for reporting and analytics. The challenge with data warehousing is displaying data using one unified structure. The source data is often gathered from many systems that are structured in various ways. A process called extract, transform, and load (ETL) or extract, load, and transform (ELT) is used to load data into the data warehouse. This thesis describes the modernization process of one such pipeline. The previous solution, which used an on-premises Teradata platform for computation and SQL stored procedures for the transformation logic, is replaced by a new solution. The goal of the new solution is a process that uses modern tools, is scalable, and follows programming best practices. The cloud-based Databricks platform is used for computation, and dbt is used as the transformation tool. Lastly, a comparison is made between the new and old solutions, and their benefits and drawbacks are discussed.
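
    The ELT pattern adopted in the thesis, landing raw data first and then transforming it inside the compute platform, can be sketched generically. The example below uses SQLite purely as a stand-in for Databricks, and the table and column names are invented.

        # An ELT sketch: raw data is landed untransformed, then the
        # transformation (the step dbt manages in the new pipeline) runs
        # inside the platform as SQL. SQLite stands in for Databricks.
        import sqlite3

        conn = sqlite3.connect(":memory:")

        # Load: land the source rows as-is in a raw/staging table.
        conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
        conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                         [(1, 1250), (2, 399)])

        # Transform: a dbt-style model is essentially a SELECT that is
        # materialized as a table or view in the warehouse.
        conn.execute("""
            CREATE TABLE orders AS
            SELECT id, amount_cents / 100.0 AS amount_eur
            FROM raw_orders
        """)

        print(conn.execute("SELECT * FROM orders").fetchall())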

    Assessment of the Triggers of Inefficient Materials Management Practices by Construction SMEs in Nigeria

    Inefficient material management throughout the construction project value chain has resulted in poor performance, especially in terms of time, cost, quality, and productivity. Even well-organised large construction organisations still fall prey to this project performance killer, as adequate attention is not given to material management as a key project management function. Thus, this study assessed the factors that trigger inefficient material management practices by construction SMEs in Port Harcourt, Nigeria. The study utilised a quantitative survey method and a convenience sampling technique in the distribution of the structured questionnaire used to gather data from project managers, procurement officers, and construction professionals working with the construction SMEs. With a 93.33% response rate, the gathered data were analysed using percentages, frequencies, and factor analysis with principal component analysis. It was found that the major triggers of inefficient materials management among construction SMEs are: traditional approach and maintenance issues, manufacturer error and poor planning, inventory management issues, poor handling of procurement, materials estimating problems, storage problems and insecurity, and communication issues. It was concluded that the predominance of these triggers in the management of materials among construction SMEs would result in continued poor performance of construction projects, especially with regard to project time, cost, quality, and productivity. The study recommends a move away from traditional methods of managing materials and the adoption of a technology-based material management system.