15,321 research outputs found

    A unified view of data-intensive flows in business intelligence systems: a survey

    Data-intensive flows are central processes in today’s business intelligence (BI) systems, deploying different technologies to deliver data, from a multitude of data sources, in user-preferred and analysis-ready formats. To meet the complex requirements of next-generation BI systems, we often need an effective combination of the traditionally batched extract-transform-load (ETL) processes that populate a data warehouse (DW) from integrated data sources, and more real-time, operational data flows that integrate source data at runtime. Both academia and industry thus need a clear understanding of the foundations of data-intensive flows and of the challenges of moving towards next-generation BI environments. In this paper we present a survey of today’s research on data-intensive flows and the related fundamental fields of database theory. The study is based on a proposed set of dimensions describing the important challenges of data-intensive flows in the next-generation BI setting. As a result of this survey, we envision an architecture of a system for managing the lifecycle of data-intensive flows. The results further provide a comprehensive understanding of data-intensive flows, recognizing the challenges that remain to be addressed and how current solutions can be applied to address them.
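    The batch ETL pattern the abstract refers to can be sketched minimally as follows; all source names, field names, and schemas here are hypothetical, chosen only to illustrate integrating heterogeneous sources into one analysis-ready table:

    ```python
    # Minimal batch ETL sketch (illustrative; names are hypothetical).
    # Extract rows from heterogeneous sources, transform them into an
    # analysis-ready schema, and load them into a "warehouse" table.

    def extract(sources):
        """Pull raw records from each source (here: in-memory lists)."""
        for source in sources:
            yield from source

    def transform(record):
        """Normalize field names and types into the warehouse schema."""
        return {
            "customer": record.get("cust") or record.get("customer_name"),
            "amount": float(record.get("amt", record.get("amount", 0))),
        }

    def load(warehouse, records):
        """Append transformed records to the warehouse table."""
        warehouse.extend(records)

    # Two sources with different schemas, integrated into one table.
    crm = [{"cust": "Acme", "amt": "120.50"}]
    erp = [{"customer_name": "Globex", "amount": 99.0}]

    warehouse = []
    load(warehouse, (transform(r) for r in extract([crm, erp])))
    print(warehouse)
    # -> [{'customer': 'Acme', 'amount': 120.5}, {'customer': 'Globex', 'amount': 99.0}]
    ```

    The real-time operational flows the survey contrasts with this would apply the same transform step per event as data arrives, rather than in periodic batches.
    
    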

    Mapping Big Data into Knowledge Space with Cognitive Cyber-Infrastructure

    Big data research has attracted great attention in science, technology, industry and society. It is developing alongside the evolving scientific paradigm, the fourth industrial revolution, and the transformational innovation of technologies. However, its nature and fundamental challenges have not been recognized, and its own methodology has not yet been formed. This paper explores and answers the following questions: What is big data? What are the basic methods for representing, managing and analyzing big data? What is the relationship between big data and knowledge? Can we find a mapping from big data into knowledge space? What kind of infrastructure is required to support not only big data management and analysis but also knowledge discovery, sharing and management? What is the relationship between big data and the scientific paradigm? What are the nature and fundamental challenge of big data computing? A multi-dimensional perspective is presented toward a methodology of big data computing.

    Educational Warehouse: Modular, Private and Secure Cloudable Architecture System for Educational Data Storage, Analysis and Access

    Data in the educational context are becoming increasingly important in decision-making and teaching-learning processes. As in the industrial context, educational institutions are adopting data-processing technologies at all levels. To achieve representative results, the processes of extracting, transforming and uploading educational data should be ubiquitous because, without useful data, whether internal or external, it is difficult to perform a proper analysis and to obtain unbiased educational results. It should be noted that the sources and types of data are heterogeneous, and that the analytical processes can be so diverse that a practical problem arises in managing and accessing the generated data. At the same time, ensuring the privacy, identity, confidentiality and security of students and their data is a sine qua non condition for complying with the legal issues involved while achieving the required ethical premises. This work proposes a modular and scalable data system architecture that solves the complexity of data management and access. On the one hand, it allows educational institutions to collect any data generated in both the teaching-learning and management processes. On the other hand, it enables external access to these data under appropriate privacy and security conditions.
    Generalitat de Catalunya; 2017 SGR 93
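    One common building block for the privacy condition described above is pseudonymizing student identifiers before records are exposed to external analysis. The sketch below is a hypothetical illustration, not the paper's architecture; the field names and salt are assumptions:

    ```python
    # Hypothetical sketch: replace a direct student identifier with a
    # salted one-way hash before sharing records externally.
    import hashlib

    def pseudonymize(record, secret_salt):
        """Return a copy of the record with student_id replaced by a hash."""
        out = dict(record)
        raw = (secret_salt + out.pop("student_id")).encode("utf-8")
        out["student_ref"] = hashlib.sha256(raw).hexdigest()[:12]
        return out

    grades = [{"student_id": "s001", "course": "Math", "grade": 8.5}]
    public = [pseudonymize(r, secret_salt="institution-secret") for r in grades]
    print(public[0]["course"], "student_id" in public[0])
    # -> Math False
    ```

    Keeping the salt secret inside the institution means external parties can correlate a student's records with each other but cannot recover the original identifier.
    
    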

    Paradigm Innovation through the Strategic Collaboration between TORAY & UNIQLO : Evolution of A New Fast Fashion Business Model

    The key purpose of this study is to examine the remarkable context of the evolution of paradigm innovation in fashion product development, in the case of the Japanese fashion apparel brand UNIQLO, created by Fast Retailing Corp. in 1998. The key theme is innovation, and this perspective necessarily centers on Fast Retailing's strategic collaboration with TORAY, a Japanese powerhouse in new materials and artificial textiles: TORAY's technological provision was an essential source of the dynamic product and process innovation behind UNIQLO's extraordinary growth. Furthermore, this technological superiority also underpinned its innovative positioning in market competition. It is crucial to examine how and why the two companies brought their core competences together through new combinations of concepts. This should open up several promising research perspectives on their innovative model of unchallenged value creation, strong market competitiveness, and sustainable corporate growth.
    Keywords: Paradigm Innovation, Product Development, Business Model, Japanese Apparel Industry, Fashion Apparel, Fast Fashion, Fast Retailing, UNIQLO, TORAY, Alliance, Virtual Vertical Integration

    Towards a Novel Cooperative Logistics Information System Framework

    Supply chains and logistics have a growing importance in the global economy. Supply chain information systems around the world are heterogeneous, and each can both produce and receive massive amounts of structured and unstructured data in real time, usually generated by information systems, connected objects, or manually by humans. This heterogeneity stems from logistics information system components and processes that are developed with different modelling methods and run on many platforms; hence, decision making is difficult in such a multi-actor environment. In this paper we identify current challenges and integration issues between separately designed Logistics Information Systems (LIS), and we propose a Distributed Cooperative Logistics Platform (DCLP) framework based on NoSQL, which facilitates real-time cooperation between stakeholders and improves decision making in a multi-actor environment. We also include a case study of a Hospital Supply Chain (HSC) and a brief discussion of perspectives and the future scope of this work.
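    A minimal sketch of why a schemaless (NoSQL-style) document store suits such heterogeneous, multi-actor data: each actor writes documents in its own shape, and consumers query by the keys they share. A plain dict-of-lists stands in for the document database here; the collection, actors, and fields are all hypothetical:

    ```python
    # Illustrative schemaless document store: differently shaped
    # documents coexist in one collection and are queried by shared keys.
    from collections import defaultdict

    store = defaultdict(list)  # collection name -> list of documents

    def insert(collection, doc):
        """Store a document without enforcing any fixed schema."""
        store[collection].append(doc)

    def find(collection, **criteria):
        """Return documents whose fields match all given criteria."""
        return [d for d in store[collection]
                if all(d.get(k) == v for k, v in criteria.items())]

    # Documents from two supply-chain actors, with different fields
    # but a shared order_id key.
    insert("events", {"actor": "hospital", "order_id": 7, "item": "gloves"})
    insert("events", {"actor": "carrier", "order_id": 7, "gps": (48.8, 2.3)})

    order_trace = find("events", order_id=7)  # both actors' views of order 7
    ```

    A real document database (e.g. MongoDB) adds persistence, indexing, and distribution on top of this same insert/find model, which is what makes real-time cooperation between independently designed systems practical.
    
    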

    Taming Corporate Data

    Today, every company is faced with a large amount of data that reaches it through multiple communication channels. To tame these data, the company (1) needs to carefully consider the data it receives and decide what to do with it; (2) must evaluate which procedures and tools to use; and (3) must invest in human resources to manage data and information. New facilities need to be implemented; new professionals, such as the Chief Data Officer and the Data Scientist, are needed; and end-user knowledge must be increased through data literacy.