50,496 research outputs found

    DESIGN OF A BUSINESS INTELLIGENCE SYSTEM FOR THE HERBAL MEDICINE (JAMU) PRODUCT SUPPLY CHAIN BASED ON PENTAHO BUSINESS INTELLIGENCE

    This paper designs a business intelligence system for the supply chain of herbal (jamu) products in Indonesia. The system is designed using a system-entity approach to obtain the system attributes, followed by conceptual design using Business Process Model and Notation (BPMN) 2.0 and the Unified Modeling Language (UML); the design is finally integrated with the Pentaho Business Intelligence suite, covering data integration and business analytics. The resulting design is able to transform raw data into information and interpret it visually across the herbal product supply chain for the purposes of identifying and developing new business strategy opportunities.
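
    As a rough illustration of the transformation step such a design implies, the sketch below aggregates raw shipment records into decision-ready indicators using Python and pandas. The column names and figures are invented for illustration; in the actual system, Pentaho Data Integration would perform this work.

        import pandas as pd

        # Hypothetical raw supply-chain records; the columns are assumptions,
        # standing in for the data a Pentaho extract step would pull.
        raw = pd.DataFrame({
            "product": ["tolak_angin", "tolak_angin", "kunyit_asam"],
            "region": ["Jawa Tengah", "Jawa Barat", "Jawa Tengah"],
            "units_shipped": [1200, 800, 450],
            "lead_time_days": [3, 5, 4],
        })

        # Transform raw rows into indicators for the visual/dashboard layer.
        summary = (raw.groupby(["region", "product"], as_index=False)
                      .agg(total_units=("units_shipped", "sum"),
                           avg_lead_time=("lead_time_days", "mean")))
        print(summary)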

    A conceptual method for data integration in business analytics

    Today many organizations operate in dynamic, rapidly changing environments and highly competitive markets. Consequently, fast and accurate fact-based decisions can be an important success factor. The basis for such decisions is usually business information produced by business intelligence and business analytics. One of the challenges in creating high-quality information for business decisions is consolidating data that is spread across multiple heterogeneous systems throughout the organization, in one or many different locations. Typically, ETL processes (Extraction, Transformation and Loading) are used to merge heterogeneous data from one or more data sources into a target system to form data repositories, data marts, or data warehouses. Due to the lack of common methods or approaches for systematically managing such ETL processes, and due to the high complexity of integrating data from multiple sources into one common, unified view, it is difficult for both professionals and less experienced users to consolidate data successfully. Currently, the analysis process is often performed without any predefined framework and rests on informal knowledge rather than a scientific methodology. For commercial tools that support the data integration process, including visualization of the integration, reuse of analysis sequences and automatic translation of the visual description into executable code, the major problem is that the metadata used for data integration generally represents only syntactic knowledge. Semantic information about the data structure is typically available only in rudimentary form, even though it plays a significant role in defining the analysis model and evaluating the results. Against this background, Grossmann formulated the "Conceptual Approach for Data Integration for Business Analytics". It aims to reduce the complexity of analytical processes and to support professionals in their work, thereby also making business analytics more accessible to less experienced users in different domains. The idea is to incorporate detailed knowledge about the data in business analytics, especially information about semantics. The approach focuses on including a more structured description of the transformation process in business analytics, in which information about dependencies and side effects of the algorithms is also included. Furthermore, it incorporates the concept of meta-modelling: it presents a framework including the modelling concepts for data integration for business analytics.
    The goal of this thesis is to develop a meta-model prototype that supports data integration for business analytics based on Grossmann's approach. It focuses on the intellectual process of transforming the theoretical method into a conceptual model that can be applied within a framework of modelling methods and that fits the specific concepts of the meta-model platform used. The result is a prototype based on a generic conceptual method that is independent of any execution platform; there are no predefined granularity levels, and the objects of the model are reusable across the different phases of the data integration process. The prototype is deployed on the Open Model Platform, an initiative started at the University of Vienna that aims to extend the usage of modelling methods and models and to make them more accessible to users by offering a framework covering all kinds of modelling activities useful for business applications.
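
    To make the consolidation problem concrete, here is a minimal Python sketch, not Grossmann's meta-model itself: two heterogeneous sources are mapped to one unified target schema, and a small semantic-metadata dictionary travels alongside the purely syntactic column mapping. All names and values are invented.

        # Two heterogeneous sources with different schemas (all invented).
        crm_rows = [{"cust_id": 1, "rev_eur": 1000.0}]
        erp_rows = [{"customer": 1, "revenue": "950,50"}]  # locale-specific text

        # Syntactic column mapping plus explicit semantic annotations -- the
        # kind of knowledge the conceptual method asks us to model.
        mapping = {
            "crm": {"cust_id": "customer_id", "rev_eur": "revenue_eur"},
            "erp": {"customer": "customer_id", "revenue": "revenue_eur"},
        }
        semantics = {"revenue_eur": {"unit": "EUR",
                                     "meaning": "net revenue per period"}}

        def to_target(source: str, row: dict) -> dict:
            out = {}
            for col, target in mapping[source].items():
                value = row[col]
                if target == "revenue_eur" and isinstance(value, str):
                    value = float(value.replace(",", "."))  # documented side effect
                out[target] = value
            return out

        consolidated = ([to_target("crm", r) for r in crm_rows] +
                        [to_target("erp", r) for r in erp_rows])
        print(consolidated)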

    An artificial intelligence-based collaboration approach in industrial IoT manufacturing: key concepts, architectural extensions and potential applications

    The digitization of the manufacturing industry has led to leaner and more efficient production under the Industry 4.0 concept. Nowadays, datasets collected from shop-floor assets and information technology (IT) systems are used in data-driven analytics efforts to support more informed business intelligence decisions. However, these results are currently used only in isolated and dispersed parts of the production process. At the same time, full integration of artificial intelligence (AI) in all parts of manufacturing systems is still lacking. In this context, the goal of this manuscript is to present a more holistic integration of AI by promoting collaboration. To this end, collaboration is understood as a multi-dimensional conceptual term that covers all important enablers for AI adoption in manufacturing contexts and is promoted in terms of business intelligence optimization, human-in-the-loop and secure federation across manufacturing sites. To address these challenges, the proposed architectural approach builds on three technical pillars: (1) components that extend the functionality of the existing layers in the Reference Architectural Model for Industry 4.0; (2) definition of new layers for collaboration by means of human-in-the-loop and federation; (3) AI-powered mechanisms for addressing security concerns. In addition, system implementation aspects are discussed, and potential applications in industrial environments, as well as business impacts, are presented.
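
    The federation pillar can be read as a federated-learning setup, in which sites share model parameters instead of raw shop-floor data. The sketch below shows plain federated averaging in Python with NumPy as one possible mechanism; the weights and sample counts are invented, and this does not reproduce the paper's actual architecture.

        import numpy as np

        # Hypothetical model parameters trained locally at three sites;
        # raw shop-floor data never leaves a site, only parameters do.
        site_weights = [np.array([0.9, 1.1]),
                        np.array([1.0, 0.8]),
                        np.array([1.2, 1.0])]
        site_samples = np.array([500, 300, 200])  # local training-set sizes

        # Federated averaging: weight each site's parameters by sample count,
        # then broadcast the global model back for the next training round.
        global_weights = np.average(np.stack(site_weights), axis=0,
                                    weights=site_samples)
        print(global_weights)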

    Student-Centered Learning: Functional Requirements for Integrated Systems to Optimize Learning

    The realities of the 21st-century learner require that schools and educators fundamentally change their practice. "Educators must produce college- and career-ready graduates that reflect the future these students will face. And, they must facilitate learning through means that align with the defining attributes of this generation of learners."

    Today, we know more than ever about how students learn, acknowledging that the process isn't the same for every student and doesn't remain the same for each individual, depending upon maturation and the content being learned. We know that students want to progress at a pace that allows them to master new concepts and skills, to access a variety of resources, to receive timely feedback on their progress, to demonstrate their knowledge in multiple ways, and to get direction, support and feedback from, as well as collaborate with, experts, teachers, tutors and other students. The result is a growing demand for student-centered, transformative digital learning using competency education as an underpinning.

    iNACOL released this paper to illustrate the technical requirements and functionalities that learning management systems need in order to shift toward student-centered instructional models. This comprehensive framework will help districts and schools determine what systems to use and integrate as they begin their journey toward student-centered learning, as well as how systems integration aligns with their organizational vision, educational goals and strategic plans.

    Educators can use this report to optimize student learning and promote innovation in their own student-centered learning environments. The report will help school leaders understand the complex technologies needed to optimize personalized learning and how to use data and analytics to improve practices, and it can assist technology leaders in re-engineering systems to support the key nuances of student-centered learning.
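
    A competency-based system of the kind described must at minimum track mastery per student per competency across multiple forms of evidence. The Python sketch below illustrates such a record; the fields and threshold are illustrative assumptions, not iNACOL's specification.

        from dataclasses import dataclass, field

        @dataclass
        class CompetencyRecord:
            """One student's progress on one competency (illustrative)."""
            student_id: str
            competency: str
            mastery: float = 0.0                    # 0.0..1.0
            evidence: list = field(default_factory=list)

            def add_evidence(self, source: str, score: float) -> None:
                # Keep every artifact; mastery reflects the best demonstration,
                # so students can show knowledge in multiple ways.
                self.evidence.append((source, score))
                self.mastery = max(self.mastery, score)

            def ready_to_advance(self, threshold: float = 0.8) -> bool:
                # Progression is gated by demonstrated mastery, not seat time.
                return self.mastery >= threshold

        rec = CompetencyRecord("s-001", "fractions.addition")
        rec.add_evidence("project", 0.85)
        print(rec.ready_to_advance())  # True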

    Data and Predictive Analytics Use for Logistics and Supply Chain Management

    Purpose: The purpose of this paper is to explore the social process of Big Data and predictive analytics (BDPA) use for logistics and supply chain management (LSCM), focusing on the interactions among technology, human behavior and organizational context that occur at the technology's post-adoption phases in retail supply chain (RSC) organizations.
    Design/methodology/approach: The authors follow a grounded theory approach for theory building based on interviews with senior managers of 15 organizations positioned across multiple echelons in the RSC.
    Findings: Findings reveal how user involvement shapes BDPA to fit organizational structures and how changes made to the technology retroactively affect its design and institutional properties. Findings also reveal previously unreported aspects of BDPA use for LSCM, including the presence of temporal and spatial discontinuities in technology use across RSC organizations.
    Practical implications: This study unveils that it is impossible to design a BDPA technology ready for immediate use. The emergent process framework shows that institutional and social factors make BDPA use specific to the organization, as the technology comes to reflect the properties of the organization and the wider social environment, beyond what its designers originally intended. BDPA is thus not easily transferable among collaborating RSC organizations and requires managerial attention to the institutional context within which its usage takes place.
    Originality/value: The literature describes why organizations will use BDPA but fails to provide adequate insight into how BDPA use occurs. The authors address the "how" and bring a social perspective into a technology-centric area.

    Adding Value to Statistics in the Data Revolution Age

    Like many statistical offices, and in line with the European Statistical System commitment to Vision 2020, Istat has since the second half of 2014 implemented an internal standardisation and industrialisation process within the framework of a common Business Architecture. Istat's modernisation programme aims at building services and infrastructures within a plug-and-play framework to foster innovation, promote reuse and move towards full integration and interoperability of statistical processes, consistent with a service-oriented architecture. This is expected to lead to higher effectiveness and productivity by improving the quality of statistical information and reducing the response burden. This paper addresses the strategy adopted by Istat, which focuses on exploiting administrative data and new data sources in order to achieve its key goals and enhance value for users. The strategy is based on priorities that centre services on users and stakeholders, as well as on Linked Open Data, to allow machine-to-machine data and metadata integration through the definition of common statistical ontologies and semantics.
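
    Machine-to-machine metadata exchange of this kind is typically realized with RDF. The sketch below describes a dataset in Python with rdflib, using the DCAT and Dublin Core vocabularies as stand-ins for a full statistical ontology; the dataset URI and property values are invented examples, not Istat's actual publication.

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, DCAT, DCTERMS

        EX = Namespace("http://example.org/statistics/")  # placeholder namespace
        g = Graph()
        dataset = EX["resident-population-2014"]          # invented dataset URI

        # Publish machine-readable metadata about a statistical dataset so
        # other systems can discover and integrate it automatically.
        g.add((dataset, RDF.type, DCAT.Dataset))
        g.add((dataset, DCTERMS.title, Literal("Resident population, 2014")))
        g.add((dataset, DCTERMS.publisher, Literal("Istat")))

        print(g.serialize(format="turtle"))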

    Overcoming Barriers in Supply Chain Analytics—Investigating Measures in LSCM Organizations

    While supply chain analytics shows promise regarding value, benefits, and increases in performance for logistics and supply chain management (LSCM) organizations, those organizations are often either reluctant to invest or unable to achieve the returns they aspire to. This article systematically explores the barriers LSCM organizations experience in employing supply chain analytics that contribute to such reluctance and unachieved returns, as well as measures to overcome these barriers. It therefore aims to systemize the barriers and measures and to allocate measures to barriers, in order to give organizations directions on how to cope with their individual barriers. Using grounded theory based on 12 in-depth interviews, and Q-methodology to synthesize the results, the article derives core categories for the barriers and measures, and their impacts and relationships are mapped based on empirical evidence from various actors along the supply chain. As a result, the article presents the core categories of barriers and measures, including their effects on different phases of the analytics-solution life cycle, the explanation of these effects, and accompanying examples. Finally, to provide directions to organizations, the article offers recommendations for overcoming the identified barriers.
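
    The allocation idea can be pictured as a simple lookup from an organization's individual barriers to candidate measures, as in the Python sketch below; the category names are placeholders, not the article's actual empirically derived coding.

        # Hypothetical core categories; the article's actual coding differs.
        measures_for_barrier = {
            "lack_of_skills": ["targeted training", "external partnerships"],
            "poor_data_quality": ["data governance", "master-data cleansing"],
            "unclear_business_case": ["pilot project with defined KPIs"],
        }

        def recommend(barriers: list) -> dict:
            """Allocate candidate measures to an organization's barriers."""
            return {b: measures_for_barrier.get(b, ["no mapped measure"])
                    for b in barriers}

        print(recommend(["poor_data_quality", "lack_of_skills"]))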

    Current Advancements of and Future Developments for Fourth Party Logistics in a Digital Future

    This paper aims to analyze the potential future of the 4PL concept based on expert opinions, with special regard to the influence of digitalization and the disruptive transformation of supply chains it brings. Service arrangements, provider capabilities and benefits resulting from a 4PL partnership are compared in current and future configurations. The research follows an explorative mixed-methods approach: semi-structured interviews followed by an expert panel, which built the basis for an online survey questionnaire inquiring about important future aspects of the 4PL concept among a sample of respondents from multinational companies. Our results show a clear trend away from simply organizing transportation and logistics activities towards the provision of an IT platform as well as further value-added services such as planning, analytics and monitoring. Along with this, IT capabilities appear to be an important differentiator for 4PL providers in the future. Moreover, relationships between 4PL providers and their clients are becoming closer and more strategic, leading customers to value not only direct cost reductions but also improvements resulting from optimized operations through superior analysis and planning functions.