
    Towards Design Principles for Data-Driven Decision Making: An Action Design Research Project in the Maritime Industry

    Data-driven decision making (DDD) refers to organizational decision-making practices that emphasize the use of data and statistical analysis rather than relying on human judgment alone. Various empirical studies provide evidence for the value of DDD, both at the level of the individual decision maker and at the organizational level. Yet the path from data to value is not always an easy one, and various organizational and psychological factors mediate and moderate the translation of data-driven insights into better decisions and, subsequently, effective business actions. The current body of academic literature on DDD lacks prescriptive knowledge on how to successfully employ DDD in complex organizational settings. Against this background, this paper reports on an action design research study aimed at designing and implementing IT artifacts for DDD at one of the largest ship engine manufacturers in the world. Our main contribution is a set of design principles highlighting, besides decision quality, the importance of model comprehensibility, domain knowledge, and actionability of results.

    On-line analytical processing

    On-line analytical processing (OLAP) describes an approach to decision support that aims to extract knowledge from a data warehouse or, more specifically, from data marts. Its main idea is to provide non-expert users with interactive navigation through the data, so that they can generate ad hoc queries without the intervention of IT professionals. The name was introduced in contrast to on-line transactional processing (OLTP), to reflect the different requirements and characteristics of these two classes of use. The concept falls in the area of business intelligence.
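    To illustrate the kind of ad hoc, navigational analysis described above, the following is a minimal sketch of OLAP-style roll-up and slice operations on a hypothetical sales data mart, written in Python with pandas; the table, column names, and figures are assumptions for illustration only, not taken from the paper.

    import pandas as pd

    # A tiny fact table for a hypothetical data mart: two dimensions kept as
    # columns (year, region, product) and one measure (revenue). Values are
    # made up purely for illustration.
    sales = pd.DataFrame({
        "year":    [2022, 2022, 2022, 2023, 2023, 2023],
        "region":  ["EU", "EU", "US", "EU", "US", "US"],
        "product": ["A",  "B",  "A",  "A",  "A",  "B"],
        "revenue": [100,  150,  200,  120,  210,  90],
    })

    # Roll-up: aggregate the revenue measure along the region and year
    # dimensions; margins=True adds the grand totals a decision maker expects.
    cube = sales.pivot_table(index="region", columns="year",
                             values="revenue", aggfunc="sum", margins=True)
    print(cube)

    # Slice: restrict the analysis to one member of the product dimension,
    # then drill into revenue by region.
    print(sales[sales["product"] == "A"].groupby("region")["revenue"].sum())

    In a real setting the same operations would typically run as aggregation queries against the data mart itself rather than in memory; the sketch only shows the navigational pattern (roll-up, slice) the abstract refers to.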

    The necessities for building a model to evaluate Business Intelligence projects - Literature Review

    In recent years, Business Intelligence (BI) systems have consistently been rated as one of the highest priorities of Information Systems (IS) and business leaders. BI allows firms to apply information to support their processes and decisions by combining organizational and technical capabilities. Many companies spend a significant portion of their IT budgets on business intelligence and related technology. Evaluating BI readiness is vital because it serves two important goals. First, it shows the gap areas where a company is not ready to proceed with its BI efforts; by identifying BI readiness gaps, we can avoid wasting time and resources. Second, the evaluation indicates what is needed to close those gaps and implement BI with a high probability of success. This paper presents an overview of BI and the necessities for evaluating readiness.
    Keywords: Business intelligence, Evaluation, Success, Readiness
    Comment: International Journal of Computer Science & Engineering Survey (IJCSES) Vol.3, No.2, April 201

    Adding Value to Statistics in the Data Revolution Age

    Like many statistical offices, and in accordance with the European Statistical System commitment to Vision 2020, Istat has since the second half of 2014 implemented its internal standardisation and industrialisation process within the framework of a common Business Architecture. Istat's modernisation programme aims at building services and infrastructures within a plug-and-play framework to foster innovation, promote reuse and move towards full integration and interoperability of the statistical process, consistent with a service-oriented architecture. This is expected to lead to higher effectiveness and productivity by improving the quality of statistical information and reducing the response burden. This paper addresses the strategy adopted by Istat, which focuses on exploiting administrative data and new data sources in order to achieve its key goal of enhancing value for users. The strategy is based on priorities that centre services on users and stakeholders, as well as on Linked Open Data, to allow machine-to-machine data and metadata integration through the definition of common statistical ontologies and semantics.

    CERN openlab Whitepaper on Future IT Challenges in Scientific Research

    This whitepaper describes the major IT challenges in scientific research at CERN and at several other European and international research laboratories and projects. Each challenge is exemplified through a set of concrete use cases drawn from the requirements of large-scale scientific programs. The paper is based on contributions from many researchers and IT experts of the participating laboratories, as well as input from the existing CERN openlab industrial sponsors. The views expressed in this document are those of the individual contributors and do not necessarily reflect the views of their organisations and/or affiliates.