4 research outputs found

    AcDWH - A patented method for active data warehousing

    The traditional needs of data warehousing, met with monthly, weekly, or nightly batch processing, have evolved toward near real-time refresh cycles of the data, called active data warehousing. While traditional data warehousing methods have been used to batch-load large sets of data, the business need for extremely fresh data in the data warehouse has increased. Previous studies have reviewed different aspects of the process along with different methods for processing data in data warehouses in near real-time fashion. To date, there has been little research on using partitioned staging tables within relational databases, combined with a crafted metadata-driven system and parallelized loading processes, for active data warehousing. This study provides a thorough description and suitability assessment of the patented AcDWH method for active data warehousing. In addition, it provides a review and summary of existing research on data warehousing from the start of the field in the 1990s to the year 2020. The review focuses on the different parts of the data warehousing process and highlights the differences compared to the AcDWH method. For the AcDWH itself, the use of partitioned staging tables within a relational database, in combination with the metadata structures used to manage the system, is discussed in detail. Two real-life applications are also disclosed and discussed at a high level, and potential future extensions to the methodology are briefly summarized. The results indicate that the AcDWH method, using parallelized loading pipelines and partitioned staging tables, can provide enhanced throughput in data warehouse loading processes. This is a clear improvement in the field: previous studies have not considered partitioned staging tables in conjunction with loading processes and pipeline parallelization. A review of the existing literature against the AcDWH method, together with a trial-and-error approach, supports the validity of the study's results and conclusions. The results confirm that technical-level inventions within data warehousing processes also contribute significantly to the advancement of methodologies. Compared to previous studies in the field, this study suggests a simple yet novel method for achieving near real-time capabilities in active data warehousing.
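    The abstract does not disclose the patented implementation, but the core idea it names, parallel loading pipelines writing into partitioned staging tables under the control of a metadata table, can be illustrated. Below is a minimal sketch under stated assumptions: all names (stage_orders_p*, etl_metadata, load_partition) are hypothetical, and SQLite stands in for a partition-capable RDBMS, so separate staging tables emulate partitions and true write parallelism is limited by SQLite's single-writer model.

```python
# Minimal sketch of metadata-driven parallel loading into partitioned
# staging tables, in the spirit of the AcDWH abstract. All names are
# hypothetical; SQLite stands in for a partition-capable RDBMS, and
# its single-writer locking makes the parallelism here illustrative.
import sqlite3
from concurrent.futures import ThreadPoolExecutor

PARTITIONS = 4  # one staging partition per parallel loading pipeline

def setup(db_path="acdwh_demo.db"):
    con = sqlite3.connect(db_path)
    cur = con.cursor()
    # The metadata table drives the system: one row per partition.
    cur.execute("""CREATE TABLE IF NOT EXISTS etl_metadata (
                       partition_id INTEGER PRIMARY KEY,
                       status TEXT,          -- 'loading' | 'loaded' | 'merged'
                       rows_loaded INTEGER)""")
    for p in range(PARTITIONS):
        # Separate tables emulate hash/range partitions of one staging table.
        cur.execute(f"CREATE TABLE IF NOT EXISTS stage_orders_p{p} "
                    "(order_id INTEGER, amount REAL)")
    cur.execute("CREATE TABLE IF NOT EXISTS dw_orders "
                "(order_id INTEGER, amount REAL)")
    con.commit()
    con.close()

def load_partition(p, batch, db_path="acdwh_demo.db"):
    # Each pipeline owns one partition, so loads never contend on rows.
    con = sqlite3.connect(db_path)
    cur = con.cursor()
    cur.execute("INSERT OR REPLACE INTO etl_metadata VALUES (?, 'loading', 0)",
                (p,))
    cur.executemany(f"INSERT INTO stage_orders_p{p} VALUES (?, ?)", batch)
    cur.execute("UPDATE etl_metadata SET status='loaded', rows_loaded=? "
                "WHERE partition_id=?", (len(batch), p))
    con.commit()
    con.close()

def merge_loaded(db_path="acdwh_demo.db"):
    # Merge every 'loaded' partition into the warehouse table, then reset it.
    con = sqlite3.connect(db_path)
    cur = con.cursor()
    loaded = [r[0] for r in cur.execute(
        "SELECT partition_id FROM etl_metadata WHERE status='loaded'")]
    for p in loaded:
        cur.execute(f"INSERT INTO dw_orders SELECT * FROM stage_orders_p{p}")
        cur.execute(f"DELETE FROM stage_orders_p{p}")
        cur.execute("UPDATE etl_metadata SET status='merged' "
                    "WHERE partition_id=?", (p,))
    con.commit()
    con.close()

if __name__ == "__main__":
    setup()
    batches = [[(p * 100 + i, float(i)) for i in range(100)]
               for p in range(PARTITIONS)]
    with ThreadPoolExecutor(max_workers=PARTITIONS) as pool:
        futures = [pool.submit(load_partition, p, batch)
                   for p, batch in enumerate(batches)]
        for f in futures:
            f.result()  # surface any loading errors
    merge_loaded()
```

    In a production RDBMS the merge step could be a partition exchange rather than an INSERT ... SELECT, which would keep the warehouse available to queries while fresh data is attached.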

    Data virtualization design model for near real time decision making in business intelligence environment

    The main purpose of Business Intelligence (BI) is to support an organization's strategic, operational, and tactical decisions by providing comprehensive, accurate, and vivid data to decision makers. A data warehouse (DW), which serves as the input to decision-making activities, is created through a complex process known as Extract, Transform and Load (ETL). Because ETL operates at pre-defined times and requires time to process and transfer data, providing near real-time information to support the decision-making process is a known issue. Inaccessibility to near real-time information can be overcome with Data Virtualization (DV), as it provides a unified, abstracted, near real-time, and encapsulated view of information for querying. Nevertheless, there is currently a lack of studies on BI models for developing and managing data virtually in a way that fulfils organizational needs. Therefore, the main aim of this study is to propose a DV model for near real-time decision making in a BI environment. Design science research methodology was adopted to accomplish the research objectives. As a result, a model called the Data Virtualization Development Model (DVDeM) is proposed, addressing the phases and components that affect the BI environment. To validate the model, expert reviews and focus group discussions were conducted. A prototype based on the proposed model was developed and implemented in two case studies, and an instrument was developed to measure the usability of the prototype in providing near real-time data. In total, 60 participants were involved, and 93% of them agreed that the DVDeM-based prototype was able to provide near real-time data for supporting the decision-making process. The findings also showed that the majority of participants (more than 90%) in both the education and business sectors affirmed the workability of DVDeM and the usability of the prototype, in particular its ability to deliver near real-time decision-making data. The findings indicate theoretical and practical contributions for developers building efficient BI applications using the DV technique. The mean values for each measurement item were greater than 4, indicating that respondents agreed with each statement, and the mean scores for the overall usability attributes of the DVDeM design model fell under "High" or "Fairly High". These results give sufficient indication that a system developed by adopting the DVDeM model is perceived as highly usable by the majority of respondents and is able to support near real-time decision making.
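    DVDeM itself is a design model rather than code, but the data virtualization idea at its core, a unified, abstracted, near real-time view over heterogeneous sources queried in place instead of copied through ETL, can be sketched briefly. Everything below (the crm.db and erp.db sources, the unified_sales view) is hypothetical, and SQLite's ATTACH stands in for a federation engine.

```python
# Minimal sketch of the data virtualization idea behind DVDeM: expose a
# unified, near real-time view over separate sources instead of copying
# them into a warehouse via ETL. Source and view names are hypothetical;
# SQLite's ATTACH stands in for a federation engine.
import sqlite3

def make_source(path, rows):
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    con.commit()
    con.close()

make_source("crm.db", [("north", 120.0), ("south", 80.0)])
make_source("erp.db", [("north", 45.0), ("west", 200.0)])

# The virtualization layer: one connection federating both sources.
hub = sqlite3.connect(":memory:")
hub.execute("ATTACH DATABASE 'crm.db' AS crm")
hub.execute("ATTACH DATABASE 'erp.db' AS erp")

# A virtual, unified view: queries always see the sources' current state,
# which is what gives DV its near real-time character.
hub.execute("""CREATE VIEW unified_sales AS
               SELECT region, amount, 'crm' AS origin FROM crm.sales
               UNION ALL
               SELECT region, amount, 'erp' AS origin FROM erp.sales""")

for row in hub.execute(
        "SELECT region, SUM(amount) FROM unified_sales GROUP BY region"):
    print(row)
```

    The design trade-off this illustrates is that no data is materialized: each query pays the cost of reaching the sources, in exchange for never serving stale copies.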

    Intégration de données temps-réel issues de capteurs dans un entrepôt de données géo-décisionnel

    In the last decade, the use of sensors for measuring various phenomena has greatly increased. Sensors can now measure water levels, GPS positions, temperature, and even a person's heartbeat, and this wide diversity makes them excellent tools for data acquisition. Alongside this growth, analysis tools have also advanced beyond transactional databases, leading to a new category of tools, analysis (business intelligence) systems, which respond to the need for global analysis of data. Data warehouses and OLAP (On-Line Analytical Processing) tools, which belong to this category, let decision makers analyze large volumes of data, make comparisons over time, and build statistical graphs with a few mouse clicks. Although the various types of sensors can certainly enrich an analysis, such data requires heavy integration work to be brought into the geo-decisional data warehouse, the centerpiece of the decision-making process. The different sensor models, data types, and means of transferring the data remain significant obstacles to integrating sensor data streams into a geo-decisional data warehouse. Moreover, current geo-decisional data warehouses are not initially designed to receive new data at high frequency: because use of the warehouse is restricted during an update, new data is generally added on a weekly or monthly basis. Real-Time Data Warehouses (RTDW) can be updated several times a day without degrading query performance, but this technology is still uncommon, very costly, and in most cases at a "beta" stage of development. This research therefore aims to develop an approach for publishing and standardizing real-time sensor data streams and integrating them into a conventional geo-decisional data warehouse. An optimized update strategy has also been developed so that new data can be added to analyses without compromising the quality of the warehouse's use by its users.
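    The thesis's publication and standardization approach is not detailed in the abstract, so the sketch below only illustrates the update-strategy idea: heterogeneous sensor readings are normalized to one record shape, buffered, and flushed to the warehouse in small, frequent transactions so that analytical use is not degraded. All names (Reading, normalize, sensor_readings, FLUSH_SIZE) are hypothetical.

```python
# Minimal sketch of a micro-batch update strategy for sensor data:
# normalize heterogeneous readings into one record shape, buffer them,
# and flush in short transactions so analysis queries stay responsive.
# All names are hypothetical; SQLite stands in for the warehouse.
import sqlite3
import time
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    kind: str           # e.g. 'temperature', 'water_level', 'gps'
    value: float
    observed_at: float  # unix timestamp

def normalize(raw: dict) -> Reading:
    # Different sensor models emit different payloads; map them all
    # to one standard record before they reach the warehouse.
    return Reading(str(raw["id"]), raw.get("type", "unknown"),
                   float(raw["val"]), float(raw.get("ts", time.time())))

con = sqlite3.connect("geo_dw.db")
con.execute("""CREATE TABLE IF NOT EXISTS sensor_readings
               (sensor_id TEXT, kind TEXT, value REAL, observed_at REAL)""")

buffer = []
FLUSH_SIZE = 500  # flush threshold; tune to the load/latency trade-off

def ingest(raw: dict):
    buffer.append(normalize(raw))
    if len(buffer) >= FLUSH_SIZE:
        flush()

def flush():
    # One short transaction per batch keeps the warehouse available to
    # OLAP queries between updates, instead of one long weekly load.
    con.executemany("INSERT INTO sensor_readings VALUES (?, ?, ?, ?)",
                    [(r.sensor_id, r.kind, r.value, r.observed_at)
                     for r in buffer])
    con.commit()
    buffer.clear()

# Example: simulate a trickle of heterogeneous sensor messages.
for i in range(1200):
    ingest({"id": f"s{i % 3}", "type": "temperature", "val": 20 + i % 5})
flush()  # flush the remainder
print(con.execute("SELECT COUNT(*) FROM sensor_readings").fetchone())
```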