    A unified view of data-intensive flows in business intelligence systems: a survey

    Data-intensive flows are central processes in today’s business intelligence (BI) systems, deploying different technologies to deliver data, from a multitude of data sources, in user-preferred and analysis-ready formats. To meet the complex requirements of next generation BI systems, we often need an effective combination of the traditionally batched extract-transform-load (ETL) processes that populate a data warehouse (DW) from integrated data sources, and more real-time and operational data flows that integrate source data at runtime. Both academia and industry thus need a clear understanding of the foundations of data-intensive flows and the challenges of moving towards next generation BI environments. In this paper we present a survey of today’s research on data-intensive flows and the related fundamental fields of database theory. The study is based on a proposed set of dimensions describing the important challenges of data-intensive flows in the next generation BI setting. As a result of this survey, we envision an architecture of a system for managing the lifecycle of data-intensive flows. The results further provide a comprehensive understanding of data-intensive flows, recognizing challenges that remain to be addressed and showing how current solutions can be applied to address them.
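
    The contrast between batched ETL and runtime integration can be made concrete with a small sketch. The following is a minimal illustration, not the survey's architecture; all function, table, and field names are invented for this example.

        # Minimal sketch of the two flow styles the survey contrasts: a batched
        # ETL run that populates a warehouse table, and an on-demand flow that
        # integrates source data at query time. All names are illustrative.
        import sqlite3

        def extract(source_rows):
            # Pull raw records from a source (here, an in-memory list).
            return iter(source_rows)

        def transform(rows):
            # Normalise into the analysis-ready shape the warehouse expects.
            for r in rows:
                yield (r["id"], r["amount"] * r.get("fx_rate", 1.0))

        def load(conn, rows):
            conn.executemany("INSERT INTO sales_dw VALUES (?, ?)", rows)
            conn.commit()

        def batch_etl(conn, source_rows):
            # Traditional ETL: run periodically; the warehouse serves queries.
            load(conn, transform(extract(source_rows)))

        def operational_flow(source_rows, predicate):
            # Real-time style: integrate source data at runtime, no staging.
            return [row for row in transform(extract(source_rows)) if predicate(row)]

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE sales_dw (id INTEGER, amount_eur REAL)")
        src = [{"id": 1, "amount": 10.0, "fx_rate": 0.9}, {"id": 2, "amount": 5.0}]
        batch_etl(conn, src)
        print(operational_flow(src, lambda row: row[1] > 5))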

    Delivering ‘Effortless Experience’ Across Borders: Managing Internal Consistency in Professional Service Firms

    This article explores how professional service firms manage across borders. When clients require consistent services delivered across multiple locations, especially across borders, firms need to develop an organization that is sufficiently flexible to support such consistent service delivery. Our discussion is illustrated by the globalization process of law firms. We argue that the globalization of large corporate law firms primarily takes place through investments in the development of protocols, processes and practices that enhance internal consistency, such that clients receive an ‘effortless experience’ of the service across multiple locations worldwide. Over the longer term, the ability to deliver such an effortless experience depends upon meaningful integration within and across the firm. Firms that achieve this build a source of sustainable competitive advantage.

    Heterogeneous biomedical database integration using a hybrid strategy: a p53 cancer research database.

    Complex problems in life science research give rise to multidisciplinary collaboration, and hence to the need for heterogeneous database integration. The tumor suppressor p53 is mutated in close to 50% of human cancers, and finding a small drug-like molecule able to restore native function to cancerous p53 mutants is a long-held goal of cancer treatment. The Cancer Research DataBase (CRDB) was designed in support of a project to find such small molecules. As a cancer informatics project, the CRDB involved small molecule data, computational docking results, functional assays, and protein structure data. As an example of the hybrid strategy for data integration, it combined the mediation and data warehousing approaches. This paper uses the CRDB to illustrate the hybrid strategy as a viable approach to heterogeneous data integration in biomedicine, and provides a design method for those considering similar systems. More efficient data sharing implies increased productivity and, hopefully, improved chances of success in cancer research. (Code and database schemas are freely downloadable at http://www.igb.uci.edu/research/research.html.)
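
    As a rough illustration of the hybrid strategy, the sketch below materialises stable data in a local warehouse and routes misses to live sources through a mediator. The class names, keys, and record shapes are hypothetical and are not taken from the CRDB schemas.

        # Hedged sketch of a hybrid integration strategy: frequently queried,
        # stable data is materialised in a local warehouse, while volatile
        # data is fetched from live sources through a mediator.
        class Warehouse:
            def __init__(self):
                self._store = {}          # e.g. curated protein structure records

            def put(self, key, record):
                self._store[key] = record

            def query(self, key):
                return self._store.get(key)

        class LiveSource:
            def __init__(self, fetch):
                self._fetch = fetch       # e.g. wraps a remote assay database

            def query(self, key):
                return self._fetch(key)

        class Mediator:
            """Routes each query to the warehouse first, then to live sources."""
            def __init__(self, warehouse, sources):
                self.warehouse, self.sources = warehouse, sources

            def query(self, key):
                hit = self.warehouse.query(key)
                if hit is not None:
                    return hit
                for source in self.sources:
                    hit = source.query(key)
                    if hit is not None:
                        return hit
                return None

        wh = Warehouse()
        wh.put("p53:R175H", {"type": "structure", "resolution_A": 2.1})
        assays = LiveSource(lambda k: {"type": "assay"} if k == "p53:R273H" else None)
        m = Mediator(wh, [assays])
        print(m.query("p53:R175H"), m.query("p53:R273H"))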

    Research on a Technological Approach to Modeling the Information and Analytical Support of Enterprise Management

    The article addresses current problems in the innovative development of information and analytical support for enterprise management using the newest technology. It offers a number of recommendations for the technological modernization of such support and substantiates a technological approach to modeling the information provision of business management. The information and analytical support is implemented as a flexible information system organized as an internal network structure, and its technological foundation is modernized using modern information and communication tools. The support is developed along internal and external lines of influence, whose interconnection coordinates the theory, methodology, and organization of information processes with the updating of the model. A model of information and analytical support for enterprise management is developed that reflects the individual characteristics of the corporate culture, the information environment, and the development strategy of the business entity, based on the characteristics of the technological provision of the information process. An information complex is proposed as a developed system with a technological process for forming initial data and for modernizing the processing, transmission, and storage of information in accordance with the distinctive characteristics of the enterprise and the general tendencies of its development.

    Supply chain uncertainty: a review and theoretical foundation for future research

    Supply chain uncertainty is an issue with which every practising manager wrestles, deriving from the increasing complexity of global supply networks. Taking a broad view of supply chain uncertainty (incorporating supply chain risk), this paper reviews the literature in this area and develops a theoretical foundation for future research. The literature review identifies a comprehensive list of 14 sources of uncertainty, including those that have received much research attention, such as the bullwhip effect, and those described more recently, such as parallel interaction. Approaches to managing these sources of uncertainty are classified into 10 approaches that seek to reduce uncertainty at its source and 11 approaches that seek to cope with it, thereby minimising its impact on performance. Manufacturing strategy theory, including the concepts of alignment and contingency, is then used to develop a model of supply chain uncertainty, which is populated using the literature review to show the alignment between uncertainty sources and management strategies. The future research proposed includes more empirical work to investigate: which uncertainties occur in particular industrial contexts; the impact of appropriate source/strategy alignment on performance; and the complex interplay between management strategies and multiple sources of uncertainty (positive or negative).
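
    One of the uncertainty sources named above, the bullwhip effect, lends itself to a toy illustration: when each echelon orders against observed demand with a safety margin, order variability grows upstream. The simulation below is illustrative only; the ordering policy and parameters are assumptions, not drawn from the review.

        # Toy illustration of the bullwhip effect: each echelon scales the
        # orders it observes by a safety factor, so the range of order
        # quantities amplifies moving upstream. Parameters are illustrative.
        import random

        random.seed(1)
        demand = [10 + random.choice([-2, -1, 0, 1, 2]) for _ in range(52)]

        def echelon(orders_in, safety=1.2):
            # Naive policy: order the last observed demand times a safety factor.
            return [round(order * safety, 1) for order in orders_in]

        retailer = echelon(demand)
        wholesaler = echelon(retailer)
        factory = echelon(wholesaler)

        def spread(series):
            return max(series) - min(series)

        for name, s in [("demand", demand), ("retailer", retailer),
                        ("wholesaler", wholesaler), ("factory", factory)]:
            print(f"{name:10s} range = {spread(s):.1f}")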

    Knowledge and Metadata Integration for Warehousing Complex Data

    With the ever-growing availability of so-called complex data, especially on the Web, decision-support systems such as data warehouses must store and process data that are not only numerical or symbolic. Warehousing and analyzing such data requires the joint exploitation of metadata and domain-related knowledge, which must therefore be integrated. In this paper, we survey the types of knowledge and metadata that are needed for managing complex data, discuss the issue of knowledge and metadata integration, and propose a CWM-compliant integration solution that we incorporate into an XML complex data warehousing framework we previously designed. (6th International Conference on Information Systems Technology and its Applications (ISTA 07), Kharkiv, Ukraine, 2007.)
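
    To make the idea of pairing complex data with both technical metadata and domain knowledge concrete, here is a minimal sketch of an annotated XML record; the element and attribute names are invented for illustration and are not CWM classes.

        # Minimal sketch of a complex (XML) document carrying technical
        # metadata alongside domain-related knowledge before warehousing.
        import xml.etree.ElementTree as ET

        doc = ET.Element("complexData", {"id": "d42", "mime": "image/png"})

        meta = ET.SubElement(doc, "metadata")            # technical metadata
        ET.SubElement(meta, "source").text = "web-crawl-2007-03"
        ET.SubElement(meta, "sizeBytes").text = "20480"

        knowledge = ET.SubElement(doc, "knowledge")      # domain-related knowledge
        concept = ET.SubElement(knowledge, "concept")
        concept.set("ontology", "medical-imaging")
        concept.text = "chest-radiograph"

        print(ET.tostring(doc, encoding="unicode"))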

    A community-based approach for managing ontology alignments

    The Semantic Web is rapidly becoming a de facto distributed repository for semantically represented data, thus leveraging the added value of the network effect. Various ontology mapping techniques and tools have been devised to facilitate the bridging and integration of distributed data repositories. Nevertheless, ontology mapping can benefit from human supervision to increase the accuracy of results. The spread of Web 2.0 approaches demonstrates the possibility of using collaborative techniques for reaching consensus. While a number of prototypes for collaborative ontology construction are being developed, collaborative ontology mapping is not yet well investigated. In this paper, we describe a prototype that combines off-the-shelf ontology mapping tools with social software techniques to enable users to collaborate on mapping ontologies.
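
    A minimal sketch of the combination described, assuming a matcher that emits scored correspondences and users who vote on them; the thresholds, entity names, and data structures are assumptions, not taken from the prototype.

        # Hedged sketch: automatic mapping suggestions filtered and rescued
        # by community votes. All names and thresholds are illustrative.
        from collections import defaultdict

        # Candidate correspondences from an off-the-shelf matcher:
        # (entity A, entity B, tool confidence).
        candidates = [
            ("onto1#Author", "onto2#Writer", 0.83),
            ("onto1#Paper",  "onto2#Article", 0.91),
            ("onto1#Venue",  "onto2#Journal", 0.55),
        ]

        votes = defaultdict(int)      # community feedback on each candidate

        def vote(pair, approve):
            votes[pair] += 1 if approve else -1

        def accepted(min_conf=0.6, min_votes=1):
            """Keep mappings the tool trusts unless users vote them down,
            and rescue low-confidence ones that users explicitly endorse."""
            out = []
            for a, b, conf in candidates:
                v = votes[(a, b)]
                if (conf >= min_conf and v >= 0) or v >= min_votes:
                    out.append((a, b))
            return out

        vote(("onto1#Venue", "onto2#Journal"), approve=True)
        vote(("onto1#Author", "onto2#Writer"), approve=False)
        print(accepted())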

    A DevOps approach to integration of software components in an EU research project

    We present a description of the development and deployment infrastructure created to support the integration effort of HARNESS, an EU FP7 project. HARNESS is a multi-partner research project intended to bring the power of heterogeneous resources to the cloud. It consists of a number of different services and technologies that interact with the OpenStack cloud computing platform at various levels. Many of these components are developed independently by different teams at different locations across Europe, and keeping the work fully integrated is a challenge. We use a combination of Vagrant-based virtual machines, Docker containers, and Ansible playbooks to provide a consistent and up-to-date environment to each developer. The same playbooks used to configure local virtual machines are also used to manage a static testbed with heterogeneous compute and storage devices, and to automate ephemeral larger-scale deployments to Grid5000. Access to internal projects is managed by GitLab, and automated testing of services within Docker-based environments and integrated deployments within virtual machines is provided by Buildbot.
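
    Since Buildbot masters are configured in Python, a fragment of a master.cfg in the style of such a setup might look as follows. The repository URL, worker credentials, and test command are placeholders, not the actual HARNESS configuration.

        # Hedged sketch of a Buildbot master.cfg fragment: poll a GitLab
        # repository and run a project's integration tests on commit.
        from buildbot.plugins import changes, schedulers, steps, util, worker

        c = BuildmasterConfig = {}

        c['workers'] = [worker.Worker("docker-worker", "secret")]

        c['change_source'] = [changes.GitPoller(
            repourl="https://gitlab.example.org/project/component.git",
            branches=["master"], pollInterval=300)]

        factory = util.BuildFactory()
        factory.addStep(steps.Git(
            repourl="https://gitlab.example.org/project/component.git",
            mode="incremental"))
        factory.addStep(steps.ShellCommand(
            command=["./run-integration-tests.sh"]))   # placeholder test script

        c['builders'] = [util.BuilderConfig(
            name="integration", workernames=["docker-worker"], factory=factory)]

        c['schedulers'] = [schedulers.SingleBranchScheduler(
            name="on-commit", change_filter=util.ChangeFilter(branch="master"),
            builderNames=["integration"])]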