
    Agile Data Architecture in Mining Industry for Continuously Business-IT Alignment: EA Perspective

    Data plays a vital role in mining enterprises, fostering innovation and business performance through precise decision-making. Scientists have created many kinds of database technology to support this, such as big data platforms and cloud data stores. Meanwhile, businesses face a constantly shifting landscape when using these technologies, and it remains hard to apply database technology properly because there are no model rules for planning that would let an enterprise choose suitable tools for its own case. This paper therefore aims to design an agile enterprise data architecture model (blueprint) for mining companies, based on various frameworks, tools, and methods/techniques, so that managers can govern their data assets for sustained business-IT alignment. The resulting data architecture reference model can be readily adopted by CIOs to move toward an integrated mining enterprise and can guide users in precise decision-making.
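    Such a blueprint can be made concrete as a machine-readable mapping from business capabilities to governed data technologies. The following minimal sketch (Python; all capability names and technology choices are hypothetical illustrations, not taken from the paper) shows how such model rules might be encoded and queried:

        # Minimal sketch of an enterprise data-architecture blueprint:
        # business capabilities mapped to governed data technologies.
        # All names below are hypothetical illustrations.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class DataTechnology:
            name: str              # e.g. a big data platform or cloud data store
            latency: str           # "batch" or "real-time"
            governance_owner: str  # organizational unit accountable for the data

        BLUEPRINT = {
            "geological-survey-analytics": DataTechnology("hadoop-data-lake", "batch", "exploration-IT"),
            "equipment-telemetry": DataTechnology("streaming-warehouse", "real-time", "operations-IT"),
            "financial-reporting": DataTechnology("relational-dwh", "batch", "corporate-IT"),
        }

        def approved_technology(capability: str) -> DataTechnology:
            """Return the governed technology choice for a business capability."""
            if capability not in BLUEPRINT:
                raise ValueError(f"no blueprint rule for capability: {capability}")
            return BLUEPRINT[capability]

        print(approved_technology("equipment-telemetry"))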

    Revisiting Ralph Sprague’s Framework for Developing Decision Support Systems

    Ralph H. Sprague Jr. was a leader in the MIS field and helped develop the conceptual foundation for decision support systems (DSS). In this paper, I pay homage to Sprague and his DSS contributions. I take a personal perspective based on my years of working with Sprague. I explore the history of DSS and its evolution. I also present and discuss Sprague’s DSS development framework with its dialog, data, and models (DDM) paradigm and characteristics. At its core, the development framework remains valid in today’s world of business intelligence and big data analytics. I present and discuss a contemporary reference architecture for business intelligence and analytics (BI/A) in the context of Sprague’s DSS development framework. The practice of decision support continues to evolve and can be described by a maturity model with DSS, enterprise data warehousing, real-time data warehousing, big data analytics, and the emerging cognitive generation as successive stages. I use a DSS perspective to describe and provide examples of what the forthcoming cognitive generation will bring.
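    Sprague’s DDM paradigm decomposes a DSS into a dialog (user interface) component, a data component, and a models component. As a rough illustration of that decomposition only (Python; the interfaces below are assumptions, not code from the paper):

        # Rough sketch of Sprague's dialog-data-models (DDM) decomposition of a DSS.
        # The component interfaces are illustrative assumptions.
        from typing import Protocol

        class DialogComponent(Protocol):
            def ask(self, prompt: str) -> str: ...        # interaction with the decision maker

        class DataComponent(Protocol):
            def query(self, question: str) -> list: ...   # access to internal and external data

        class ModelComponent(Protocol):
            def evaluate(self, rows: list) -> float: ...  # analytical and decision models

        class DecisionSupportSystem:
            """Wires the three DDM components together."""
            def __init__(self, dialog: DialogComponent, data: DataComponent, models: ModelComponent):
                self.dialog, self.data, self.models = dialog, data, models

            def support_decision(self) -> float:
                question = self.dialog.ask("What decision do you need support for?")
                return self.models.evaluate(self.data.query(question))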

    The Process of Approaching the Taxation of the Exploitation of Personal Data

    As Harakat M. (2015) pointed out, elucidating the fiscal impact of a given measure on economic and social growth, through the evaluation of the positive or negative distortions of the tax, requires taking into account the social cost and the tax architecture of security and transparency, in order to satisfy the four maxims put forward by Smith A. (1776): proportionality, stability, equity, and efficiency. To this end, the State has, when introducing or re-examining an instrument of state action, the possibility of examining its appropriateness: whether it has the expected effect, whether it is the most effective and least expensive way to achieve the objective fixed by law, or whether, on the contrary, other instruments would be more appropriate, with reference to the famous equity-efficiency duality. This article follows up on our previous work immersed in the literature on instruments for taxing the negative externalities of the exploitation of Big Data by data-driven business models. It proposes a narrative overview of the theory of externalities, attempting to draw lessons from environmental taxation, and then presents the various proposals for tax measures on Big Data, in order to prepare a better approach to the framework for adopting a tax measure on the negative externalities of Big Data in Morocco, which is the subject of our doctoral thesis. Keywords: Big Data, Value, Externality, Taxation, Policy Strategies, Free Labour. JEL Classification: F38, H20. Paper type: Theoretical Research.
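    The externality reasoning the article builds on can be summarized by the standard Pigouvian condition (a textbook formulation, not an equation from the article): the corrective tax equals the marginal external cost at the socially optimal quantity.

        \[
        MSC(q) = MPC(q) + MEC(q), \qquad
        t^{*} = MEC(q^{*}) \quad \text{where } MSB(q^{*}) = MSC(q^{*}).
        \]

    Levying t* per unit makes the private optimum, where MPC(q) + t* equals marginal benefit, coincide with the social optimum q*; applying this logic to Big Data requires identifying and valuing the externality, which the surveyed tax proposals attempt in different ways.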

    A Cloud-Edge Orchestration Platform for the Innovative Industrial Scenarios of the IoTwins Project

    The concept of digital twins has attracted growing interest not only in academia but also in industry, since the Internet of Things has enabled its cost-effective implementation. Digital twins (or digital models) are virtual representations of a physical product or process that integrate data from various sources, such as data APIs, historical data, embedded sensors, and open data, giving manufacturers an unprecedented view of how their products are performing. The EU-funded IoTwins project plans to build testbeds for digital twins that run real-time computation as close to the data origin as possible (e.g., on IoT gateways or edge nodes), while batch-wise tasks such as Big Data analytics and Machine Learning model training run on the Cloud, where computing resources are abundant. In this paper, the basic concepts of the IoTwins project, its reference architecture, functionalities, and components are presented and discussed.
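    The edge/cloud split follows a simple placement rule: latency-sensitive work stays near the data source, batch work goes to the cloud. A minimal sketch of such a placement policy (Python; illustrative only, the IoTwins platform's actual APIs are not shown here):

        # Minimal sketch of a cloud-edge placement policy in the spirit of IoTwins:
        # latency-sensitive tasks run at the edge, batch tasks in the cloud.
        # Task fields and tier names are illustrative assumptions.
        from dataclasses import dataclass

        @dataclass
        class Task:
            name: str
            latency_sensitive: bool  # e.g. real-time anomaly detection on sensor data
            batch: bool              # e.g. ML model training over historical data

        def place(task: Task) -> str:
            """Return the execution tier for a task."""
            if task.latency_sensitive:
                return "edge"        # close to the data origin (IoT gateway / edge node)
            if task.batch:
                return "cloud"       # abundant compute for analytics and training
            return "edge"            # default: keep traffic off the backhaul

        print(place(Task("vibration-anomaly-detection", True, False)))  # -> edge
        print(place(Task("twin-model-training", False, True)))          # -> cloud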

    High Performance Pre-Computing: Prototype Application to a Coastal Flooding Decision Tool

    After defining the High Performance Pre-Computing (HPPC) concept, the aim of the present study is to develop a prototype in order to assess the benefits of this concept. Our application case addresses the geophysical issue of coastal flooding. It is an example of an alert system based on the HPPC architecture, i.e., on pre-computed scenarios. The prototype provides scientists with an ergonomic, on-demand tool for running scenarios of any implemented numerical model. These runs are available through a web application that submits the corresponding jobs to the remote French public cluster HPC@LR. In this study we simulate wave propagation over a Mediterranean grid using the wave model WaveWatch III®. A reference simulation using usual conditions is approximated with the k-NN algorithm over 12, 98, and then 980 pre-computed scenarios. This simple experiment demonstrates how useful pre-computing scenarios is for alert systems, provided that enough relevant scenarios are pre-computed. Research therefore continues on each critical point of the HPPC architecture, such as the design of experiments, the approximation of results by meta-models, and the search for the closest scenarios in this big data context.
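    The core approximation step is a nearest-neighbour lookup: given current forcing conditions, find the k pre-computed scenarios closest in parameter space and combine their stored results. A minimal sketch (Python with scikit-learn; the parameter vector and the unweighted averaging rule are illustrative assumptions, not the study's exact setup):

        # Minimal k-NN sketch: approximate a simulation result from pre-computed scenarios.
        # Scenario parameters and the simple averaging rule are illustrative assumptions.
        import numpy as np
        from sklearn.neighbors import NearestNeighbors

        rng = np.random.default_rng(0)

        # Stand-in library of pre-computed scenarios: each row is a forcing-condition
        # vector (e.g. wind speed, wind direction, offshore wave height), with one
        # stored simulation output (e.g. peak nearshore wave height) per scenario.
        scenario_params = rng.uniform(0.0, 1.0, size=(980, 3))
        scenario_outputs = scenario_params.sum(axis=1)      # stand-in for model output

        knn = NearestNeighbors(n_neighbors=3).fit(scenario_params)

        def approximate(conditions: np.ndarray) -> float:
            """Estimate the output for new conditions from the k closest scenarios."""
            dist, idx = knn.kneighbors(conditions.reshape(1, -1))
            return float(scenario_outputs[idx[0]].mean())   # unweighted k-NN average

        print(approximate(np.array([0.2, 0.5, 0.7])))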

    Bosch's industry 4.0 advanced Data Analytics: historical and predictive data integration for decision support

    Industry 4.0, characterized by the development of automation and data-exchange technologies, has contributed to an increase in the volume of data generated from various sources, with great speed and variety. Organizations need to collect, store, process, and analyse this data in order to extract meaningful insights from it. By overcoming the challenges imposed by what is currently known as Big Data, organizations take a step towards optimizing business processes. This paper proposes a Big Data Analytics architecture as an artefact for the integration of historical data (from the organizational business processes) and predictive data (obtained by the use of Machine Learning models), providing an advanced data analytics environment for decision support. To support data integration in a Big Data Warehouse, a data modelling method is also proposed. These proposals were implemented and validated with a demonstration case in a multinational organization, Bosch Car Multimedia in Braga. The obtained results highlight the ability to take advantage of large amounts of historical data enhanced with predictions that support complex decision-support scenarios. This work has been supported by FCT - Fundação para a Ciência e a Tecnologia within the Project Scope UIDB/00319/2020, the doctoral scholarships PD/BDE/135100/2017 and PD/BDE/135105/2017, and European Structural and Investment Funds in the FEDER component, through the Operational Competitiveness and Internationalization Programme (COMPETE 2020) [Project nº 039479; Funding Reference: POCI-01-0247-FEDER-039479]. The authors also wish to thank the automotive electronics company staff involved with this project for providing the data and valuable domain feedback. This paper uses icons made by Freepik, from www.flaticon.com.
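    At the heart of the proposal is a warehouse model in which historical facts and model-generated predictions sit side by side and can be queried together. A minimal sketch of one way to shape such a table (Python with pandas; the column names and the record_type flag are hypothetical illustrations, not the paper's schema):

        # Minimal sketch: integrating historical facts and ML predictions in one
        # table so decision-support queries can span both. Columns are hypothetical.
        import pandas as pd

        historical = pd.DataFrame({
            "timestamp": pd.to_datetime(["2021-01-01", "2021-01-02"]),
            "line_output": [1040, 987],
            "record_type": "historical",   # measured on the shop floor
        })

        predicted = pd.DataFrame({
            "timestamp": pd.to_datetime(["2021-01-03", "2021-01-04"]),
            "line_output": [1012.5, 998.3],
            "record_type": "predicted",    # produced by an ML model
        })

        # One unified fact table: the record_type flag keeps provenance explicit,
        # so a dashboard can show history and forecast on a single time axis.
        facts = pd.concat([historical, predicted], ignore_index=True)
        print(facts.sort_values("timestamp"))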

    Enterprise Composition Architecture for Micro-Granular Digital Services and Products

    The digitization of our society changes the way we live, work, learn, communicate, and collaborate. This defines the strategic context for composing resilient enterprise architectures for micro-granular digital services and products. The change from a closed-world modeling perspective to a more flexible open-world composition and evolution of system architectures defines the moving context for adaptable systems, which are essential to enable the digital transformation. Enterprises are presently transforming their strategy and culture, together with their processes and information systems, to become more digital. The digital transformation deeply disrupts existing enterprises and economies. For years, new business opportunities have appeared that use the potential of the Internet and related digital technologies, such as the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems. Digitization fosters the development of IT systems with many rather small and distributed structures, such as the Internet of Things or mobile systems. In this paper, we focus on the continuous bottom-up integration of micro-granular architectures for a huge number of dynamically growing systems and services, such as the Internet of Things and Microservices, as part of a new digital enterprise architecture. To integrate micro-granular architecture models into living architectural model versions, we extend traditional enterprise architecture reference models with state-of-the-art elements for agile architectural engineering to support the digitalization of services, their related products, and their processes.
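    The bottom-up integration described here can be pictured as continuously folding many small, independently evolving service models into one living enterprise architecture model. A rough sketch of that composition step (Python; all types and fields are illustrative assumptions, not the paper's metamodel):

        # Rough sketch: bottom-up composition of micro-granular service models
        # into a living enterprise architecture model. Names are illustrative.
        from dataclasses import dataclass, field

        @dataclass
        class MicroArchitectureModel:
            service: str            # e.g. a Microservice or an IoT device model
            version: int
            interfaces: list

        @dataclass
        class EnterpriseArchitectureModel:
            models: dict = field(default_factory=dict)  # service name -> latest model

            def integrate(self, m: MicroArchitectureModel) -> None:
                """Fold a new or updated micro-granular model into the living EA model."""
                current = self.models.get(m.service)
                if current is None or m.version > current.version:
                    self.models[m.service] = m          # keep only the newest version

        ea = EnterpriseArchitectureModel()
        ea.integrate(MicroArchitectureModel("telemetry-ingest", 1, ["mqtt"]))
        ea.integrate(MicroArchitectureModel("telemetry-ingest", 2, ["mqtt", "http"]))
        print(len(ea.models), ea.models["telemetry-ingest"].version)  # -> 1 2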