
    A unified view of data-intensive flows in business intelligence systems : a survey

    Data-intensive flows are central processes in today’s business intelligence (BI) systems, deploying different technologies to deliver data, from a multitude of data sources, in user-preferred and analysis-ready formats. To meet the complex requirements of next-generation BI systems, we often need an effective combination of the traditional batch extract-transform-load (ETL) processes that populate a data warehouse (DW) from integrated data sources, and more real-time, operational data flows that integrate source data at runtime. Both academia and industry thus need a clear understanding of the foundations of data-intensive flows and the challenges of moving towards next-generation BI environments. In this paper we present a survey of today’s research on data-intensive flows and the related fundamental fields of database theory. The study is based on a proposed set of dimensions describing the important challenges of data-intensive flows in the next-generation BI setting. As a result of this survey, we envision an architecture of a system for managing the lifecycle of data-intensive flows. The results further provide a comprehensive understanding of data-intensive flows, recognizing challenges that are still to be addressed, and showing how current solutions can be applied to address them.
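
    The batch-plus-runtime combination described above can be sketched concretely. The following Python fragment is only an illustration under assumed names (extract_batch, fetch_realtime, load_warehouse and the orders/dw_orders tables are hypothetical and not taken from the surveyed systems): a periodic batch extract and an operational event feed pass through the same transformation before loading.

        # Minimal sketch: one transformation shared by a batch ETL run and a
        # near-real-time operational feed (all names are illustrative).
        import sqlite3
        from datetime import datetime, timezone

        def extract_batch(conn):
            # Batch extract: read historical rows from an integrated source table.
            return conn.execute("SELECT id, amount, ts FROM orders").fetchall()

        def fetch_realtime(event_buffer):
            # Operational flow: drain events that arrived since the last batch run.
            events = list(event_buffer)
            event_buffer.clear()
            return events

        def transform(rows):
            # Shared transformation: normalise amounts and tag the load time.
            loaded_at = datetime.now(timezone.utc).isoformat()
            return [(rid, round(float(amount), 2), ts, loaded_at) for rid, amount, ts in rows]

        def load_warehouse(conn, rows):
            conn.executemany(
                "INSERT INTO dw_orders(id, amount, ts, loaded_at) VALUES (?, ?, ?, ?)",
                rows,
            )
            conn.commit()

        if __name__ == "__main__":
            conn = sqlite3.connect(":memory:")
            conn.execute("CREATE TABLE orders(id INTEGER, amount REAL, ts TEXT)")
            conn.execute("CREATE TABLE dw_orders(id INTEGER, amount REAL, ts TEXT, loaded_at TEXT)")
            conn.execute("INSERT INTO orders VALUES (1, 10.456, '2024-01-01')")
            load_warehouse(conn, transform(extract_batch(conn)))

    A scheduler would call extract_batch periodically, while fetch_realtime feeds the same transform between runs; unifying both paths in one managed lifecycle is the kind of architecture the survey envisions.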

    SIT automation tool: failure use case automation and diagnosis

    Study of systems to manage the performance and quality of service of wireless data networks, working with optimization techniques and project management to solve complex network issues. The scope of this thesis is the SIT (System Integration Testing) process, the testing procedure executed in the customer test environment before the software moves to the production environment. The main objective of this thesis is to improve the current process step by step, addressing automation, efficiency, missing checks, and more. The project is an industrial effort to build a powerful testing tool that allows the company to deliver quality adaptor products efficiently, doing more in less time and helping to reduce costs, since adaptors are the most demanded product in the MYCOM OSI portfolio. Note that business is generated not only when an adaptor is first delivered, but also when vendors release new versions with new functionalities and operators need to order an upgrade of the adaptor to monitor the new functionality deployed on their network.
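
    To make the automation goal concrete, a failure use case can be checked automatically against the adaptor's output. The sketch below is a hypothetical example of such a check (parse_adaptor_log, check_collection_run and the log patterns are assumptions; the actual SIT tool and MYCOM OSI adaptor interfaces are not described in this abstract).

        # Minimal sketch of an automated failure-use-case check over an adaptor log
        # (log format and function names are illustrative assumptions).
        import re

        def parse_adaptor_log(log_text):
            # Collect ERROR lines so a failed run can be diagnosed without manual inspection.
            return [line for line in log_text.splitlines() if re.search(r"\bERROR\b", line)]

        def check_collection_run(log_text, expected_counters):
            # Pass only if no errors were logged and enough counters were loaded.
            errors = parse_adaptor_log(log_text)
            loaded = len(re.findall(r"counter loaded", log_text))
            return {
                "passed": not errors and loaded >= expected_counters,
                "errors": errors,
                "loaded_counters": loaded,
            }

        if __name__ == "__main__":
            sample = "counter loaded\nERROR connection to NE lost\ncounter loaded"
            print(check_collection_run(sample, expected_counters=3))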

    IMPROVING THE QUALITY OF THE DECISION MAKING BY USING BUSINESS INTELLIGENCE SOLUTIONS

    Information stands at the basis of decision making, as one of the main elements that determine the evolution of today's society. As a consequence, data analysis tends to become a priority in an organization's decision-making activity. Keywords: Business Intelligence, Data Warehouse, decision making, SQL Server

    Assessing the Flexibility of a Service Oriented Architecture to that of the Classic Data Warehouse

    The flexibility of a service oriented architecture (SOA) is compared to that of the classic data warehouse across three categories: (1) source system access, (2) integration and transformation, and (3) end user access. The findings suggest that an SOA allows better upgrade and migration flexibility if back-end systems expose their source data via adapters. However, the providers of such adapters must deal with the complexity of maintaining consistent interfaces. An SOA also appears to provide more flexibility at the integration tier due to its ability to merge batch with real-time source system data. This has the potential to retain source system data semantics (e.g., code translations and business rules) without having to reproduce such logic in a transformation tier. Additionally, the tight coupling of operational metadata and source system data within XML in an SOA allows more flexibility in downstream analysis and auditing of output. An SOA does lag behind the classic data warehouse at the end-user level, mainly due to the latter's use of mature SQL and relational database technology. Users of all technical levels can easily work with these technologies in the classic data warehouse environment to query data in a number of ways. The SOA end user likely requires developer support for such activities.
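
    The difference at the end-user tier can be illustrated with two access paths. The snippet below is a hedged sketch (the sales_fact table, the XML layout and the element names are assumptions, not taken from the paper): direct SQL against a classic warehouse versus parsing an adapter's XML payload that bundles data with operational metadata.

        # Minimal sketch contrasting classic-DW SQL access with consuming an
        # SOA adapter response (schemas and element names are illustrative).
        import sqlite3
        import xml.etree.ElementTree as ET

        def query_classic_dw(conn):
            # Classic DW: end users issue SQL directly against relational tables.
            return conn.execute(
                "SELECT region, SUM(amount) FROM sales_fact GROUP BY region"
            ).fetchall()

        def consume_soa_adapter(xml_payload):
            # SOA: the adapter returns source data together with operational metadata,
            # which typically needs developer-written parsing before analysis.
            root = ET.fromstring(xml_payload)
            extracted_at = root.findtext("metadata/extracted_at")
            rows = [(r.findtext("region"), float(r.findtext("amount")))
                    for r in root.findall("rows/row")]
            return extracted_at, rows

        if __name__ == "__main__":
            conn = sqlite3.connect(":memory:")
            conn.execute("CREATE TABLE sales_fact(region TEXT, amount REAL)")
            conn.executemany("INSERT INTO sales_fact VALUES (?, ?)",
                             [("north", 10.0), ("north", 5.0), ("south", 7.5)])
            print(query_classic_dw(conn))
            payload = ("<result><metadata><extracted_at>2024-01-01T00:00:00Z</extracted_at>"
                       "</metadata><rows><row><region>north</region><amount>15.0</amount>"
                       "</row></rows></result>")
            print(consume_soa_adapter(payload))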

    Using Semantic Web technologies in the development of data warehouses: A systematic mapping

    The exploration and use of Semantic Web technologies have attracted considerable attention from researchers examining data warehouse (DW) development. However, the impact of this research and the maturity level of its results are still unclear. The objective of this study is to examine recently published research articles that take into account the use of Semantic Web technologies in the DW arena, with the intention of summarizing their results, classifying their contributions to the field according to publication type, evaluating the maturity level of the results, and identifying future research challenges. Three main conclusions were derived from this study: (a) there is a major technological gap that inhibits the wide adoption of Semantic Web technologies in the business domain; (b) there is limited evidence that the results of the analyzed studies are applicable and transferable to industrial use; and (c) interest in researching the relationship between DWs and the Semantic Web has decreased because new paradigms, such as linked open data, have attracted the interest of researchers. This study was supported by the Universidad de La Frontera, Chile (Grant Numbers DI15-0020 and DI17-0043).

    Implementation of a data virtualization layer applied to insurance data

    This work focuses on the introduction of a data virtualization layer that reads and consolidates data from heterogeneous sources (a Hadoop system, a data mart and a data warehouse) and provides a single point of data access for all data consumers.
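
    As a rough illustration of what such a layer does, the sketch below exposes one logical query that fans out to the three kinds of sources named above and consolidates the results (the VirtualizationLayer class, its method and the backend schemas are hypothetical, not the implementation described in this work).

        # Minimal sketch of a data virtualization facade over heterogeneous sources
        # (all class, table and path names are illustrative assumptions).
        class VirtualizationLayer:
            def __init__(self, hadoop_client, data_mart_conn, warehouse_conn):
                self.hadoop = hadoop_client      # e.g. an HDFS/Hive reader
                self.mart = data_mart_conn       # DB-API connection to the data mart
                self.dw = warehouse_conn         # DB-API connection to the warehouse

            def policies_by_year(self, year):
                # Single point of access: each source is queried in its own dialect
                # and the results are consolidated into one logical answer set.
                hdfs_rows = self.hadoop.read(f"/insurance/policies/{year}")
                mart_rows = self.mart.execute(
                    "SELECT policy_id, premium FROM mart_policies WHERE year = ?", (year,)
                ).fetchall()
                dw_rows = self.dw.execute(
                    "SELECT policy_id, premium FROM dw_policies WHERE year = ?", (year,)
                ).fetchall()
                return list(hdfs_rows) + mart_rows + dw_rows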